• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: July 23rd, 2023


  • Let me weigh in with something. The hard part about programming is not the code. It is understanding all the edge cases, designing flexible solutions, and so much more.

    I have seen many organizations with tens of really capable programmers who can implement anything. But most management barely knows what they want or what the actual end goal is. Managers can’t deliver perfect products every time even with really skilled programmers, so if I subtract programmers from the equation and substitute a magic box that delivers code to managers whenever they ask for it, the managers won’t do much better. The biggest problem is not knowing what to ask for, and even if you DO know what to ask for, management typically ignores all the fine details.

    By the time there is an AI intelligent enough to coordinate a large technical operation, AIs will be capable of replacing attorneys, congressmen, patent examiners, middle managers, etc. This would really take a GENERAL artificial intelligence, and you’d be wildly optimistic to say we are anywhere close to having one of those available on the open market.






  • yarr@feddit.nl to Microblog Memes@lemmy.world · “Not Asking” · 4 points · 7 months ago

    That’s because journalism has more or less lost all semblance of integrity, so it’s turned into “what cheap clickbait can I crap out today to maximize my clicks?” That’s why instead of the hard-hitting investigation and journalism we got with Watergate, we get “TRUMP = LITERAL NAZI, CLICK HERE TO FIND OUT WHY”




  • Why do people care so much that it’s an app? If it were not an app, would everyone have been buying it in droves?

    At least part of this is due to a direct quote from the CEO claiming they need a VERY bespoke Android version for it to run, which is clearly bullshit because you can run the APK on devices other than the Rabbit R1 hardware.

    Since Rabbit was at least partially funded by the “Cyber Manufacture Co” rug-pull, and they suffered NO penalty for it, the CEO has taken this as a sign the market will tolerate his scams. You should view the Rabbit R1 through the lens of it being a former “web3” company, and I’m sure the shady legacy remains inside that company.

    Since the Rabbit sells for $199 with NO monthly charge, there is basically no viable funding model for this company. Every single request you send the Rabbit costs them money. So it’s only a matter of time before the R1 itself is “rugged”, whether that means suddenly requiring a monthly fee OR just shutting down entirely.

    My guess is that, like the Humane Pin, they wanted a monthly fee, but if they had charged one the R1 would have sold even worse (since it’s basically entirely broken out of the box). If these guys make it 3 years I’ll be surprised. And since the R1 does nothing locally, it turns into a nice paperweight when they eventually pull THIS rug.




  • It’s their job to block that content before it reaches an audience

    The problem is (or isn’t, depending on your perspective) that it is NOT their job. Facebook, YouTube, and Reddit are private companies that have the right to develop and enforce their own community guidelines or terms of service, which dictate what type of content can be posted on their platforms. This includes blocking or removing content they deem harmful, objectionable, or radicalizing. While these platforms are protected under Section 230 of the Communications Decency Act (CDA), which provides immunity from liability for user-generated content, this protection does not extend to knowingly facilitating or encouraging illegal activities.

    There isn’t specific U.S. legislation requiring social media platforms like Facebook, YouTube, and Reddit to block radicalizing content. However, many countries, including the United Kingdom and Australia, have enacted laws that hold platforms accountable if they fail to remove extremist content. In the United States, there have been proposals to amend or repeal Section 230 of the CDA to make tech companies more responsible for moderating the content on their sites.