Michael Ten @lemmy.world to Technology@lemmy.world · English · 8 months ago
Opera is testing letting you download LLMs for local use, a first for a major browser (www.zdnet.com)
26 comments
Bandicoot_Academic@lemmy.one · English · 8 months ago
Most people probably don’t have a dedicated GPU, and an iGPU is probably not powerful enough to run an LLM at decent speed. Also, a decent model requires around 20GB of RAM, which most people don’t have.
douglasg14b@lemmy.world · English · 8 months ago
It doesn’t just require 20GB of RAM, it requires that in VRAM, which is a much higher barrier to entry.
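For context on where figures like 20GB come from: a rough back-of-envelope (my own sketch, not from the article or the commenters) is weights-only memory = parameter count × bytes per parameter, ignoring KV cache and runtime overhead. The 13B parameter count below is a hypothetical example:

```python
def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    """Memory for model weights alone, in GiB (ignores KV cache and overhead)."""
    return params_billion * 1e9 * bytes_per_param / 2**30

# A hypothetical 13B-parameter model at common precisions:
for label, bpp in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"13B @ {label}: ~{weights_gib(13, bpp):.1f} GiB")
```

At fp16 a 13B model already needs roughly 24 GiB for weights alone, which is why quantized (8-bit or 4-bit) builds are what make local use plausible on consumer hardware.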