HOLD ON I NEED PUT ON MY PUSSY PACK.
U wot
Uh, ACKUALLY, these should be called GNU/Linux because without the Global Nutrition United’s packaging, these cookies would be impossible to ship on their own
haha, yeah I am well aware I could do something like that. Unfortunately, once you start working for larger companies, your options for solving problems typically shrink dramatically and also need to fit into neat little boxes that someone else already drew. And our environment rules are so draconian that we cannot use k8s to its fullest anyhow. Most of the people I work with have never actually touched k8s, much less any kind of server-oriented UNIX. Thanks for the advice though.
This kinda functionality is surprisingly apropos to a problem I have at work, I realize. And yet, I have k8s. More and more I am appreciating the niche systemd can fill with pets instead of cattle, and I wish corps weren’t jumping immediately to managed k8s and all of the complexity it entails.
It produces somewhat passable mediocrity, very quickly, when used directly for such things. The stories it writes from the simplest of prompts are always shallow and full of cliché (and over-represented words like “delve”). To get it to write good prose basically requires breaking the activity of writing down into a stream of tiny constituent tasks and then treating the model like the machine it is. And this hack generalizes to other tasks, too, including writing code. It isn’t alive. It isn’t even thinking. But if you treat these things as rigid robots getting specific work done, you can make them do real things. The problem is asking experts to do all of that labor to hyper-segment the work and micromanage the robot. Doing that is actually more work than just asking the expert to do the task themselves. It is still a very rough tool. It will definitely not replace the intern just yet. At least my interns submit code changes that compile.
Don’t worry, human toil isn’t going anywhere. All of this stuff is super new and still comparatively useless. Right now, the early adopters are mostly remixing what has worked reliably. We have yet to see truly novel applications. What you will see in the near future is lots of “enhanced” products that you can talk to. Whether you want to or not. The human jobs lost to the first wave of AI automation will likely be in the call center. Important industries such as agriculture are already so hyper-automated that it will take an enormous investment to close the last 2%. Many, many industries will be that way, even after AI. And for a slightly more cynical take: human labor will never go away because having power over machines isn’t the same as having power over other humans. We won’t let computers make us all useless.
You’re aware Linux basically runs the Internet/world, right?
Billions of devices run Linux. It is an amazing feat!
I recently set up magnetico and tuned its crawling to not be super disruptive to my network (my ISP’s shitty router doesn’t have enough RAM to maintain a stateful firewall for NAT for all the sockets magnetico likes to open).
And slowly, I’ve been accumulating torrent hashes. In a couple of months, I’m up to 118k+. I’ve considered trying to merge in other people’s magnetico databases. The point is to maintain my own search for torrents to avoid the whack-a-mole that stupid governments play with torrent search sites.
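If I ever try the merge, my rough plan is something like this Python sketch. Caveat: the torrents table and its info_hash/name/total_size/discovered_on columns are what I remember of magnetico’s SQLite schema, not something I’ve re-checked, and it ignores the files table entirely (those rows would need their torrent ids remapped):

    # Sketch: pull another magnetico SQLite database into mine.
    # Assumed (unverified) schema: a `torrents` table with a unique
    # `info_hash` column; the `files` table is deliberately ignored here.
    import sqlite3

    def merge(mine: str, theirs: str) -> None:
        con = sqlite3.connect(mine)
        con.execute("ATTACH DATABASE ? AS theirs", (theirs,))
        with con:
            # INSERT OR IGNORE leans on the unique constraint on info_hash
            # to silently skip hashes I have already crawled myself.
            con.execute(
                "INSERT OR IGNORE INTO torrents (info_hash, name, total_size, discovered_on) "
                "SELECT info_hash, name, total_size, discovered_on FROM theirs.torrents"
            )
        con.execute("DETACH DATABASE theirs")
        con.close()

    # merge("magnetico.sqlite3", "someone_elses.sqlite3")  # placeholder paths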
A buddy of mine swears by usenet and uses a pretty cheap option for access.
All of that said about piracy: Support creators in your life. Cut off parasites.
+1 to FreeTube. It just sucks that all these players typically use the same YouTube.js lib. So when Google fucks with that lib (because they definitely play cat-and-mouse with that shit), it breaks so many of them at once.
This is a solvable problem. Just make a LoRA of the Alice character. For modifications to the character, you might also need to make more LoRAs, but again totally doable. Then at runtime, you are just shuffling LoRAs when you need to generate.
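Something like this diffusers sketch is the shape of it; the base model id and LoRA paths are placeholders, and I’m assuming the PEFT-backed load_lora_weights/set_adapters API that newer diffusers releases expose:

    # Sketch of "shuffle LoRAs at runtime" for a consistent character.
    # Model id and LoRA directories are placeholders; the Alice LoRAs
    # would be ones you trained yourself.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    # One LoRA for the base character, another for a modified look.
    pipe.load_lora_weights("./loras/alice_base", adapter_name="alice")
    pipe.load_lora_weights("./loras/alice_armored", adapter_name="alice_armored")

    # Swap which adapter is active per generation instead of retraining anything.
    pipe.set_adapters(["alice"], adapter_weights=[1.0])
    base = pipe("alice walking through a forest").images[0]

    pipe.set_adapters(["alice_armored"], adapter_weights=[1.0])
    modified = pipe("alice in plate armor, castle courtyard").images[0]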
You’re correct that it will struggle to give you exactly what you want, because you need to have some “machine sympathy.” If you think in smaller steps and get the machine to do those smaller, more doable steps, you can eventually accomplish the overall goal. It is the difference between asking a model to write a story and asking it to first generate characters, a scenario, and a plot, and then using that as context to write just a small part of the story. The first story will be bland and incoherent after a while. The second, through better context control, will weave you a pretty consistent story.
These models are not magic (even though it feels like it). That they follow instructions at all is amazing, but they simply will not grasp the nuance of the overall picture and accomplish it unaided. If you think of them as natural-language processors capable of simple, mechanical tasks and drive them mechanistically, you’ll get much better results.
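A toy version of what I mean, in Python; complete() is a hypothetical stand-in for whatever model call you actually use, so treat this as the shape of the approach rather than working code:

    # Drive it mechanistically: generate characters, a scenario, and a plot
    # first, then feed all of that back as context to write one small beat
    # at a time. complete() is a placeholder for your LLM call of choice.
    def complete(prompt: str) -> str:
        raise NotImplementedError("plug in your LLM call of choice here")

    def write_story(premise: str, beats: int = 6) -> str:
        characters = complete(
            f"For this premise: {premise}\nList three characters with a name, a motive, and one flaw each."
        )
        scenario = complete(
            f"Characters:\n{characters}\nDescribe the setting and the inciting incident in five sentences."
        )
        plot = complete(
            f"Characters:\n{characters}\nScenario:\n{scenario}\nOutline the plot as {beats} numbered beats."
        )
        scenes = []
        for beat in range(1, beats + 1):
            # Each call only has to write one small, well-scoped chunk,
            # with all prior structure pinned in the context.
            scenes.append(complete(
                f"Characters:\n{characters}\nScenario:\n{scenario}\nPlot:\n{plot}\n"
                f"Write only beat {beat} as a roughly 300 word scene, consistent with everything above."
            ))
        return "\n\n".join(scenes)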
Maybe the problem is that I’m too close to the specific problem. AI tooling might be better for open-ended or free-association “why not try glue on pizza” type discussions, but when you already know “send exactly 4-7-Q-unicorn emoji in this field or the transaction is converted from USD to KPW” having to coax the machine to come to that conclusion 100% of the time is harder than just doing it yourself.
I, too, work in fintech. I agree with this analysis. That said, we currently have a large mishmash of regexes doing classification, and they aren’t bulletproof. It would be useful to try something like a fine-tuned BERT model for classifying the transactions that pass through the regex net without getting classified. And the PoC would be just context-stuffing some examples into a few-shot prompt for an LLM with a constrained grammar (just the classification, plz). Our finance generalists basically have to do this same process by hand, and it would be nice to augment their productivity with a hint: “The computer thinks it might be this kinda transaction”
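The PoC I’m picturing is roughly this; the labels, the regex rules, the example rows, and the complete() call are all placeholders, and a real constrained grammar would replace the “check the answer is a known label” hack at the end:

    # Regexes first; anything that falls through goes to a few-shot prompt
    # whose output should be restricted to our known labels.
    import re

    LABELS = ["payroll", "vendor_payment", "refund", "unknown"]  # placeholder taxonomy
    RULES = [
        (re.compile(r"\bPAYROLL\b", re.I), "payroll"),
        (re.compile(r"\bREFUND\b", re.I), "refund"),
    ]

    FEW_SHOT = (
        "Classify the bank transaction into exactly one label from "
        f"{LABELS}.\n"
        "Description: ACME CORP PAYROLL 0923 -> payroll\n"
        "Description: STRIPE REFUND ORDER 1181 -> refund\n"
    )

    def complete(prompt: str) -> str:
        raise NotImplementedError("swap in the constrained-decoding LLM call here")

    def classify(description: str) -> str:
        for pattern, label in RULES:
            if pattern.search(description):
                return label
        guess = complete(FEW_SHOT + f"Description: {description} -> ").strip()
        # Without a real constrained grammar, at least refuse labels we don't know.
        return guess if guess in LABELS else "unknown"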
Leading to either having to carefully double-check what it suggests, or having to fix bugs in code that I wrote but didn’t actually write.
100% this. A recent JetBrains update turned on the AI shitcomplete (I guess my org decided to pay for it). Not only is it slow af, but in trying it, I discovered that I have to fight the suggestions because they are just wrong. And what is terrible is that I know my coworkers will definitely use it, and I’ll be stuck fixing their low-skill shit that is now riddled with subtle AI shitcomplete. The tools are simply not ready, and anyone who tells you they are does not have the skill or experience to back up that assertion.
This take is so naive. You really think the advertisers will give up their current, rich sources of data for Mozilla’s watered-down crap? Given the current market share, no one is going to pay a premium for this little data. Or do you think the people who came up with everything creep.js does in order to track you will suddenly grow some ethics and stop doing that just because Mozilla is selling my data in aggregate? Not only is this a dumb idea that won’t even work (like just about every other non-browser thing they have tried), but they also felt selling my data was within their rights.
Mozilla Corp was never entitled to my data, whether to sell it in aggregate or to keep itself in for-profit business.
…
Riding through the village, Christmas Eve.
You can say there's no such thing as Elminster,
But as for me and Grandpa, we believe.

Grandma got run over by a dragon
Walking home from the tavern, Christmas Eve. (On her way home)
You can say there's no such thing as Elminster, (Say there's no Elminster)
But as for me and Grandpa, we believe. (Lord, we believe)

She'd been drinkin' too much ale,
And we'd begged her not to roam.
But she'd left her sword behind,
So she stumbled out into the snow.

Grandma got run over by a dragon
Walking home from the tavern, Christmas Eve. (On her way home)
You can say there's no such thing as Elminster, (Say there's no Elminster)
But as for me and Grandpa, we believe. (Lord, we believe)

Now we're all so proud of Grandpa,
He's been takin' this so well.
See him in there watchin' the dice roll,
Drinkin' ale and singin' with the halfling fell.

It's not Christmas without Grandma.
All the villagers are in shock.
And we just can't help but wonder:
Should we bury her treasure or give it to the clerk?

Grandma got run over by a dragon
Walking home from the tavern, Christmas Eve. (Midnight before Christmas)
You can say there's no such thing as Elminster, (Say there's no Elminster)
But as for me and Grandpa, we believe. (Lord, we believe)

Now the feast is on the table
And the meat pies made of turkey.
And a red and golden candle
That would have just matched the hair in grandma's curly wig.

I've warned all my friends and comrades.
"Better watch out for yourselves.
They should never give a license,
To a dragon that breathes fire and smokes with elves."

Grandma got run over by a dragon
Walking home from the tavern, Christmas Eve. (Minding her own business)
You can say there's no such thing as Elminster, (What do you mean there's no Elminster?)
But as for me and Grandpa, we believe. (Lord, we believe)

Oh, as for me and Grandpa, we believe!