• 4 Posts
  • 2.09K Comments
Joined 2 years ago
Cake day: March 22nd, 2024


  • Actually… I have quite a negative perception of GIMP. I’m primarily a Linux user, but I just remember it as something that has always felt obtuse to use, been missing something I need, or been sluggish for the narrower processing I’m trying to do.

    AFAIK that perception is more pronounced outside Linux.

    I don’t care about a brand either way. But if the GIMP project is ready, I think a “fresh start” to draw in users without any preconceived notions is a good thing.



  • This is commonly cited, but not strictly true.

    Prompt processing is completely compute limited. And at high batch sizes, where the weights are read once for many tokens generated in parallel, token generation is also quite compute limited. Obviously you want enough bandwidth to match the compute, but it’s very compute heavy.

    You can see this for yourself. Try ~10 prompts in parallel on a CPU in llama.cpp, and it will slow to a crawl, while a GPU with a narrow bus won’t slow down much.

    Training is a bit more complicated, but that’s not doable on CPUs anyway.

    Now, local inference (aka a batch size of 1), past prompt processing, is heavily bandwidth limited. This is why hybrid inference works alright on CPUs. But this doesn’t really apply to servers, which process many users in parallel with each “pass”.
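    The batch-size effect above can be sketched with a back-of-the-envelope arithmetic-intensity calculation (illustrative numbers, not measurements of llama.cpp or any real model): each decode step reads the weights once, but does one matrix–vector product per sequence in the batch, so FLOPs per byte of weights moved grows linearly with batch size.

    ```python
    # Rough arithmetic-intensity sketch for one decode step through a single
    # (d x d) fp16 weight matrix. Assumed model: batched matvec, weights
    # streamed from memory once per step and shared across the whole batch.
    #   FLOPs per step  ~ 2 * d * d * batch   (one multiply-add per weight, per sequence)
    #   Bytes per step  ~ 2 * d * d           (fp16 weights read once)

    def arithmetic_intensity(d: int, batch: int, bytes_per_weight: int = 2) -> float:
        """FLOPs per byte of weight traffic for one batched decode step."""
        flops = 2 * d * d * batch
        bytes_moved = bytes_per_weight * d * d  # weights read once, batch-independent
        return flops / bytes_moved

    d = 4096  # illustrative hidden size
    for b in (1, 8, 64):
        print(f"batch={b:3d}  intensity={arithmetic_intensity(d, b):6.1f} FLOPs/byte")
    ```

    With fp16 weights the intensity comes out to exactly the batch size: around 1 FLOP/byte at batch 1, which is far below the ridge point of typical GPUs (tens to hundreds of FLOPs per byte), so single-user decoding sits on the bandwidth-limited side of the roofline. Push the batch up and the same step crosses into compute-limited territory, which is the comment's point about servers.
    
    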



  • No. Not even close. Non-US models are trained (and run) on peanuts compared to big US models, because they don’t have mega GPU farms and have no other option. Deepseek in particular went all-in on software architecture efficiency.

    …Ironically, the Nvidia GPU embargo was the best thing that ever happened to the Chinese devs. It made them thrifty.

    Many tried to warn US regulators of this, but they had AI Bros whispering in their ears. The US tech system is just too screwed up, I guess.







  • Well, don’t use Twitter.

    I don’t mean to be grating, I mean to be blunt. Whatever you are doing here:

    “…but if you as an LGBT person answer to the homophobic conservatives with the same energy…”

    …does not matter, because the algorithms are skewed, too. No “defending” you do will be shown to users who might actually change their opinion over what you say, as that wouldn’t be “engaging.”

    Don’t believe me? Look at any neutral content (like NASA’s Artemis posts) while logged out over a VPN, then from your account.

    “there’s huge accounts on X that are dedicated to spread extreme hate even wishing death on other people.”

    There is no “fighting” this on Twitter, and there is no balancing; that’s the illusion. There is no free speech on Twitter even if you’re never censored, so the only way to win is to leave, and deprive them of your engagement.



  • Anarchist types are concerned about government-backed crypto coins: you lose the fungibility/anonymity of physical dollars, but don’t get any of the freedom and separation from centralization that crypto supposedly represents.

    Plus all the potential for oligarch corruption, like current crypto has. Yeah, it’s like the worst of everything, by design.





  • Were they robbed?

    I bet they were.

    Say what you will about cash, but some hacker isn’t taking paper bills from across the planet via some technical exploit way over my head. With Ethereum, the only thing protecting your money from the entire internet is you and your understanding of its complicated intricacies… and when it’s lost, no one is coming to help you.

    They might get my credit card, yeah. But that’s either my own dumb fault, or a very rich bank’s problem.


    …It’s great for scammers, though. Crypto’s like a wet dream for them. And I find it remarkable the crypto community sees that as a feature, not a bug, and somehow thinks the whole world must see it that way.



  • The problem isn’t restricted to one sex/gender.

    Perhaps not.

    …And yeah, there’s all sorts of flavors of hostility even here on Lemmy.

    But the manosphere seems to be “winning” their culture war, at least here in the US, compared to whatever the equivalent problem on the other end is. That feels like the bigger problem.