• Bandicoot_Academic@lemmy.one
    8 months ago

    Most people probably don’t have a dedicated GPU, and an iGPU is probably not powerful enough to run an LLM at decent speed. Also, a decent model requires something like 20GB of RAM, which most people don’t have.

    • douglasg14b@lemmy.world
      8 months ago

      It doesn’t just require 20GB of RAM, it requires that in VRAM, which is a much higher barrier to entry.
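
      The ~20GB figure follows from a common rule of thumb: weight memory ≈ parameter count × bytes per weight. A minimal sketch of that arithmetic (the model sizes and precisions below are illustrative assumptions, not figures for any specific model):

      ```python
      # Back-of-the-envelope estimate of memory needed just to hold
      # an LLM's weights, at several precisions. Activations, KV cache,
      # and runtime overhead add more on top of this.

      def weight_memory_gib(n_params_billion: float, bits_per_weight: int) -> float:
          """Approximate GiB required for the weights alone."""
          total_bytes = n_params_billion * 1e9 * bits_per_weight / 8
          return total_bytes / 1024**3

      for params in (7, 13, 70):          # assumed example model sizes
          for bits in (16, 8, 4):         # fp16, int8, 4-bit quantized
              gib = weight_memory_gib(params, bits)
              print(f"{params}B @ {bits}-bit: ~{gib:.1f} GiB")
      ```

      By this estimate a 13B model at fp16 needs roughly 24 GiB for weights alone, which is why 4-bit quantization (bringing that down to ~6 GiB) is what makes consumer-GPU inference feasible at all.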