I was fine with you knowing this information until “The hard part is always moving the body unnoticed.” Always? As in every single time?
It was a bit tongue-in-cheek. Edge can download Linux. Rufus puts it on a USB stick, and goodbye Windows. Then I can use my computer.
Rufus is the first (and only) program I install on Windows =]
Yup. Stopped answering questions as soon as they did that.
In response to:
Moreover, the claim that they can harm the software is unwarranted because it is OPEN and many eyes are on it.
The xz attack was an intentional backdoor put into a project that was “OPEN and many eyes are on it.” Also, it was discovered because of how it behaved at runtime (an odd slowdown in ssh logins), not because someone found it in the source. The original assumption has been proven wrong.
The xz attack was an attack on open source, and it would be silly to assume it was unique.
This opinion is a breath of fresh air compared to the rest of tech journalism screaming “AI software engineer” after each new model release.
Get it in the schools. It’s a bad habit from many people’s childhood that they need to break. Make that original habit not suck.
You want to see a picture of me when I was younger?
I’m not sure what metric you’re using to determine this. The bottom line is, if you’re trying to get the CPU to really fly, using memory efficiently is just as important as (if not more important than) the actual instructions you send to it. The reason is the high latency of going out to main memory. This is performance 101.
Just wanted to point out that the number 1 performance blocker in the CPU is memory. In the general case, if you’re wasting memory, you’re wasting CPU. These two things really cannot be talked about in isolation.
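To make that concrete, here’s a rough Python sketch of my own (the matrix size and function names are just for illustration): both loops do the exact same arithmetic on the same data, but one walks memory in order while the other jumps around, and the cache-hostile one comes out noticeably slower. Interpreter overhead hides part of the effect; in a compiled language the gap is much bigger.

```python
import time

N = 3000  # 3000 x 3000 floats; a few hundred MB once boxed as Python objects
m = [[float(i * N + j) for j in range(N)] for i in range(N)]

def sum_row_major(m):
    """Inner loop walks consecutive elements of one row: cache/prefetch friendly."""
    s = 0.0
    for row in m:
        for x in row:
            s += x
    return s

def sum_col_major(m):
    """Same arithmetic, but each step jumps to a different row: scattered accesses."""
    s = 0.0
    n = len(m)
    for j in range(n):
        for i in range(n):
            s += m[i][j]
    return s

for name, fn in (("row-major", sum_row_major), ("column-major", sum_col_major)):
    start = time.perf_counter()
    fn(m)
    print(f"{name}: {time.perf_counter() - start:.2f}s")
```

Same instructions, same data, different memory access pattern; the timing difference is all cache and latency.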
Guy from '95: “I bet it’s lightning fast though…”
No dude. It peaks pretty soon. In my time, Microsoft is touting a chat program that starts in under 10 seconds. And they’re genuinely proud of it.
Then, they look confused when I tell them I don’t want the thing connected to the Internet.
100% this. The base algorithms used in LLMs have been around for at least 15 years. What we have now is only slightly different from what we had then. The latest advancement was training a model on stupid amounts of data scraped off the Internet. And it took all of that data to make something that gave you half-decent results. There isn’t much juice left to squeeze here, but so many people are assuming exponential growth and “just wait until the AI trains other AI.”
It’s really like 10% new tech and 90% hype/marketing. The worst part is that it’s got so many people fooled that you hear these dumb takes from respectable journalists interviewing “tech” journalists, which just perpetuates the hype. Now your boss/manager is buying in =]
I can’t believe I’m actually upvoting that statement… coming from a former Windows nerd (until 7).
Preach brother!
In before a Microsoft apologist drops in to tell us how sick they are of Lemmy nerds suggesting Linux, then proceeds to cry about the terminal and give reasons that read like a textbook definition of Stockholm syndrome. Point them to this comment when they get here, please.
Countdown until Google shittymorphs me looking for cooking recipes.
…Always had been
I used to buy broken video cards on eBay for ~$25-50. The ones that run but shut off usually just have clogged heat sinks. No tools or parts required. Just blow out the dust. Obviously more risky, but sometimes you can hit gold.