Michael Ten @lemmy.world to Technology@lemmy.world · English · 1 year ago

Opera is testing letting you download LLMs for local use, a first for a major browser

www.zdnet.com

You can now use LLMs in Opera locally, meaning without having to send information to a server.
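The article doesn’t spell out Opera’s mechanism, but “running an LLM locally” generally boils down to loading a (usually quantized) model file and doing inference on your own hardware, so nothing is sent to a remote server. A minimal sketch, assuming the llama-cpp-python bindings and an already-downloaded GGUF model; the file path and prompt are placeholders, not anything Opera-specific:

```python
# Minimal local-inference sketch (assumes llama-cpp-python is installed and a
# quantized GGUF model file is available locally). Nothing here leaves the machine.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b.Q4_K_M.gguf",  # hypothetical local model file
    n_ctx=2048,       # context window size
    n_gpu_layers=-1,  # offload all layers to the GPU when built with GPU support
)

result = llm(
    "Summarize why local LLM inference avoids sending data to a server.",
    max_tokens=128,
)
print(result["choices"][0]["text"])
```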
  • cmnybo@discuss.tchncs.de · 1 year ago

    I haven’t used Opera since they switched from their own engine to Chromium. They are now owned by a Chinese company, so it probably has at least as much tracking built into it as Google Chrome now.

    • squid_slime@lemm.ee · 1 year ago

      I miss old Opera, before the buyout.

      • lemmyng@lemmy.ca · 1 year ago

        That’s essentially Vivaldi now.

        • squid_slime@lemm.ee · 1 year ago

          Apart from it being Chromium-based 😕

          • coolmojo@lemmy.world · 1 year ago

            Have a look at Otter Browser. It aims to replicate the old interface. It uses QtWebEngine, as Presto was closed source. It has been in development for 10 years now, and it is open source.

            • baduhai@sopuli.xyz · 1 year ago

              QtWebEngine is Chromium :(

              It’s Chromium all the way down.

              • coolmojo@lemmy.world · 1 year ago

                Qt WebEngine uses code from the Chromium project. However, it does not contain all of Chrome/Chromium: binary files are stripped out, and auxiliary services that talk to Google platforms are stripped out (source).

                • baduhai@sopuli.xyz · 1 year ago (edited)

                  While that’s one of the reasons I don’t want to use Chromium, it’s not actually the main reason; if it were, I’d just use Ungoogled Chromium. I just want more web engines, and I don’t want Google to monopolise the internet.

  • reddig33@lemmy.world · 1 year ago

    Why the hell do I need this in a web browser? Why isn’t it a standalone app?

    • GlitterInfection@lemmy.world · 1 year ago

      If you think of LLMs as a thing to replace search bars then this kind of makes sense.

      • noodle (he/him)@lemm.ee · 1 year ago

        “If you think of LLMs as a thing to replace search bars”

        I don’t.

      • reddig33@lemmy.world · 1 year ago

        Just more unnecessary browser bloat.

        • GlitterInfection@lemmy.world · 1 year ago

          Like search bars.

          • Plopp@lemmy.world · 1 year ago

            The more search bars the faster your internet becomes!

            • GlitterInfection@lemmy.world · 1 year ago

              This is true. I asked my LLM.

    • FaceDeer@fedia.io · 1 year ago

      There are plenty of stand-alone LLM apps.

  • Bandicoot_Academic@lemmy.one · 1 year ago

    Interesting, but I’m curious about the performance.

    A bigger LLM (Mixtral) already struggles to run on my mid-range gaming PC. Trying to run an LLM that isn’t terrible on a standard laptop wouldn’t be a good experience.

    • tal@lemmy.today · 1 year ago

      I have no idea how this is set up to work technically, but most of the heavy lifting is gonna be on the GPU. I’m not sure that it matters much whether the browser is what’s pushing data to the GPU or some other package.

      • Bandicoot_Academic@lemmy.one · 1 year ago

        Most people probably don’t have a dedicated GPU, and an iGPU is probably not powerful enough to run an LLM at decent speed. Also, a decent model requires something like 20GB of RAM, which most people don’t have.

        • douglasg14b@lemmy.world · 1 year ago

          It doesn’t just require 20GB of RAM, it requires that in VRAM, which is a much higher barrier to entry.
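For context on the memory figures discussed above, weight memory scales roughly as parameter count times bits per parameter. A rough back-of-envelope sketch, counting weights only (the KV cache and runtime overhead add more on top); parameter counts are approximate:

```python
# Back-of-envelope estimate of LLM weight memory (weights only; KV cache and
# other runtime overhead are not included). Parameter counts are approximate.
def weight_memory_gb(params_billion: float, bits_per_param: int) -> float:
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for name, params in [("7B", 7), ("13B", 13), ("Mixtral 8x7B (~47B)", 47)]:
    fp16 = weight_memory_gb(params, 16)
    q4 = weight_memory_gb(params, 4)
    print(f"{name}: ~{fp16:.0f} GB at FP16, ~{q4:.0f} GB at 4-bit")
```

For example, a 13B model at FP16 needs roughly 26 GB for its weights alone, which is why quantized models are the usual choice for consumer hardware.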

  • folak@lemmy.world · 1 year ago

    https://rentry.co/operagx

  • ColdWater@lemmy.ca · 1 year ago

    What’s an LLM?

    • tal@lemmy.today · 1 year ago (edited)

      https://en.wikipedia.org/wiki/Large_language_model

      A lot of the “AI” stuff that’s been in the news recently, chatbots and image generation and such, is based on LLMs.

  • Gunpachi@lemmings.world · 1 year ago

    That’s a cool feature for sure, but I don’t trust Opera.

  • essteeyou@lemmy.world · 1 year ago

    Can’t they just stick to normal browser things like gaming integrations?
