No, he stuffs a whole bunch of animals in a ship on a mountain, and months later there are only 2 of each that come out, after which he dismantles the ship as his friends and neighbors laugh at him.
Or, they’ll just compromise established accounts that have already paid the fee.
GPTs are designed with translation in mind, so I could see them being extremely useful for getting instruction on a topic in a native language other than English.
But they haven’t been around long enough for the novelty factor to wear off.
It’s like computers in the 1980s… people played Oregon Trail on them, but they didn’t really help much with general education.
Fast forward to today, and computers are at the core of many facets of education, giving students access to knowledge and skills they’d otherwise never encounter.
GPTs will eventually go the same way.
I unfortunately did… it imploded.
I was thinking pre-SmartTV flatscreen….
I’ve used Mac OS since 1988, BSD/UNIX since 1990, Linux since 1993 and Windows since 1995.
Due to my expertise with all these operating systems, I make enough money to have a life.
I predominantly use Apple hardware, partly because I can easily run all these OSes on it.
With an Epson, HP, Lexmark or Brother printer?
I sense another meme coming on….
I hear you. I’ve never had a FB account, but my parents do. They’re in their 80s.
Was that the older generation you were referring to?
I’d say the window of overlap for “look at the computer” and “information superhighway” was actually pretty small for most people.
Maybe 1996-2001?
Then factor in how old the people who would have done this were during that period. Being generous, I’d say 9-18. At different ages in that range, “going to my friend’s place to look at the computer” would have been a euphemism for different things.
But the birth years there would range from 1977 to 1992, which is actually pretty impressive for a cultural moment. Essentially, most millennials.
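If you want to check that arithmetic, here’s the quick version (the window and age range are the same guesses as above, nothing more):

```python
# Sanity-checking the birth-year range from the guesses above.
window_start, window_end = 1996, 2001  # assumed overlap window
age_min, age_max = 9, 18               # generous age range for the kids involved

# Someone born late 1977 could still have been 18 for part of 1996.
oldest_birth = window_start - age_max - 1   # -> 1977
youngest_birth = window_end - age_min       # 9 years old in 2001 -> 1992
print(oldest_birth, youngest_birth)         # 1977 1992
```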
Indeed :D — that’s why I left it :)
No, I wrote youngest instead of oldest. And you’re the first person to both catch it and comment. Added an edit :)
To be fair, Millennials tend to forget us Gen-Xers exist almost as much as the Boomers do. The crazy thing these days though is that outspoken millennials still think of Boomers as being anyone over the age of 40. The youngest boomer is 60 this year. The youngest millennial is 40.
[edit] I thought “surely I couldn’t have written ‘youngest millennial’” — and there it is. Oldest. Oldest millennial.
That’s my point. The post could easily apply to me and I’m not a young millennial, or a millennial at all. There’s no correlation.
Lots of things came earlier. I’m just trying to figure out what being a “young millennial” has to do with a preference for vs computer games. We Gen-Xers were playing PVP games before young millennials were born. Atari was a staple of our childhood. Nintendo multitap was a thing in 1985.
Actually, the first PVP game I played was Netrek in 1990 — forgot about that one. We generally didn’t start calling them PVP games until 1993-ish.
I spent a lot of time on MUDs in the 90s too…. They generally had mobkill and PVP zones.
I don’t get this.
I was playing PVP games in 1993. On the Internet.
I played my first offline video game in 1983.
Most video games I play today are offline on my phone, with a few PVP games in the browser on my computer.
What does being a millennial have to do with any of that?
Because iCloud was a smashing success for Apple when they used this technique?
At least iOS and macOS don’t keep asking after you say no, the way Windows does. Not until you change something in your iCloud configuration, anyway.
The most interesting eagle encounter stories I have are when I was alone (apart from the eagle).
Only rich white men who think the right way, mind you.
Spot-on.
I spend a lot of time training people how to properly review code, and the only real way to get good at it is by writing and reviewing a lot of code.
An LLM trains on a lot of code, but it does no review per se… unlike some other ML systems, there are no positive and negative feedback signals in place to improve quality.
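To make that contrast concrete, here’s a toy sketch in plain Python (the function names and numbers are mine, and real training pipelines are vastly more involved): vanilla next-token training always nudges the model toward the code it saw, good or bad, whereas a reward-signed update, the kind of positive/negative feedback loop I mean, can push a badly-reviewed example’s probability down instead of up.

```python
def next_token_update(p_observed, lr=0.1):
    """Plain next-token training: the gradient of -log(p) always pushes
    the observed token's logit up, with no judgment of code quality."""
    return lr * (1 - p_observed)  # always a positive nudge

def reward_weighted_update(p_token, reward, lr=0.1):
    """REINFORCE-style update: the same nudge, but signed by a review
    verdict. reward=+1 for code judged good, -1 for code judged bad."""
    return lr * reward * (1 - p_token)

print(next_token_update(0.4))             # +0.06: imitate, no feedback
print(reward_weighted_update(0.4, +1.0))  # +0.06: reviewed as good
print(reward_weighted_update(0.4, -1.0))  # -0.06: reviewed as bad, push down
```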
Unfortunately, AI is now equated with LLMs and diffusion models instead of machine learning in general.