• 0 Posts
  • 16 Comments
Joined 1 year ago
Cake day: July 2nd, 2023


  • I agree. I think people might have the idea that the state dictates the content, but that’s not at all how it works in well-functioning democracies. It’s there to serve the public interest: to have a relatively unbiased news outlet that’s accessible to all and has no (or few) commercial interests. It coexists with commercial news outlets.


  • Sure, but I’m just playing around with small quantized models on my laptop with integrated graphics, and the RAM was insanely cheap. I’m simply curious what LLMs that can run on such hardware are capable of. For example, Llama 3.2 3B only needs about 3.5 GB of RAM and runs at about 10 tokens per second; while it’s in no way comparable to the LLMs I use for my day-to-day tasks, it doesn’t seem to be that bad (a minimal sketch of that kind of setup follows after this comment). Llama 3.1 8B runs at about half that speed, which is a bit slow, but still bearable. Anything bigger than that is too slow to be useful, but still interesting to try for comparison.

    I’ve got an old desktop with a pretty decent GPU in it with 24 GB of VRAM, but it’s collecting dust. It’s noisy and power-hungry (older-generation dual-socket Intel Xeon) and still incapable of running large LLMs without additional GPUs. Even if it were capable, I wouldn’t want it turned on all the time due to the noise and heat in my home office, so I haven’t even tried running anything on it yet.
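
    For illustration, here’s a minimal sketch of loading a small quantized model on CPU-only hardware, assuming llama-cpp-python and a locally downloaded GGUF quantization of Llama 3.2 3B; the file name and thread count are hypothetical and would need to match your own download and machine.

    ```python
    # Minimal CPU-only sketch using llama-cpp-python (assumed installed).
    # The GGUF file name is hypothetical; use whatever quantization you downloaded.
    from llama_cpp import Llama

    llm = Llama(
        model_path="llama-3.2-3b-instruct-q4_k_m.gguf",  # roughly 2 GB of quantized weights
        n_ctx=2048,   # modest context window to keep RAM usage low
        n_threads=4,  # CPU threads; the integrated GPU isn't used
    )

    out = llm("Explain in one sentence why quantization reduces memory use.", max_tokens=128)
    print(out["choices"][0]["text"])
    ```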


  • The only time I can remember 16 GB not being sufficient for me is when I tried to run an LLM that required a tad more than 11 GB and I had just under 11 GB of memory available due to the other applications that were running.

    I guess my usage is relatively lightweight: a browser with at most about 100 open tabs, a terminal, a couple of other applications (some of them Electron-based) and sometimes a VM that I allocate maybe 4 GB to. And the occasional Age of Empires II DE, which even runs fine on my other laptop from 2016 with 16 GB of RAM in it. I still ordered 32 GB so I can play around with local LLMs a bit more (a quick way to check free memory before loading a model is sketched below).
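
    Along those lines, here’s a rough sketch of checking available memory before loading a model, assuming psutil is installed; the ~11 GB requirement is just the figure from the anecdote above.

    ```python
    # Rough pre-flight check before loading a quantized model (psutil assumed installed).
    import psutil

    required_gb = 11.2  # approximate RAM the model needs (illustrative figure)
    available_gb = psutil.virtual_memory().available / 1024**3

    if available_gb < required_gb:
        print(f"Only {available_gb:.1f} GB free; need ~{required_gb} GB - close some apps first.")
    else:
        print(f"{available_gb:.1f} GB free - should be enough to load the model.")
    ```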


  • I’m not going to defend Apple’s profit maximization strategy here, but I disagree. Most people won’t end up buying a cable and adapter because they already have one, and in contrast to those pieces made of plastic and metal, the packaging is mostly made of paper. I’m pretty confident that the reduction in plastic and metal makes up for the extra packaging produced for the minority that does buy a cable and/or adapter.


  • For me they only work in relatively quiet environments, or with earplugs. As soon as a car drives by, it completely drowns out the sound. With music that might not be an issue, but with podcasts or calls it’s very annoying. I bought earplugs especially for this, as my other earbuds have issues with wind while running, but it does feel like that defeats the purpose a bit. I guess turning them all the way up would also work, but that doesn’t feel healthy. Other than that I like them, and the mic quality is also good according to people I’ve spoken with over the phone.


  • He’s comparing one state to one country (Sweden) and then adds that Europe is not small, which is fair, because the caption says that the “European” mind can’t comprehend this. Europe as a continent is about as big as the US, the European Union is less than half the size of the US, and the individual countries are of course way smaller than the US. Since the EU has open borders, I’d say that comparing the US to the EU is fair, and EU member states can be compared to US states. For example: France is about as large as Texas, Germany about as large as Montana, and Italy is comparable to New Mexico. There’s a lot of movement between EU countries, and some people cross a border every day to go to work or buy groceries. The highway/road just continues without interruption.

    Europe as a continent is meaningless, though, and then you might as well include Asia, as Europe isn’t an actual continent (Eurasia is the world’s largest continent). You could drive all the way to eastern China if you’d like, but you’d be crossing multiple borders with border controls and visa requirements, so that makes it incomparable to driving within the US.