• CileTheSane@lemmy.ca · 23 hours ago

      Because it’s not AI, it’s LLMs, and all LLMs do is guess which word most likely comes next in a sentence. That’s why they are terrible at answering questions and do things like suggest adding glue to the cheese on your pizza: somewhere in the training data, some idiot said to do that.

      The training data for LLMs comes from the internet, and the internet is full of idiots.
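
      A minimal sketch of that “guess the next word” idea, using nothing but word counts (the training text and names here are made up for illustration; real LLMs predict the next token with a large neural network rather than raw counts, but the principle is the same):

      ```python
      # Toy next-word predictor: a bigram frequency table built from "training"
      # text. This is NOT how real LLMs are implemented internally; it only
      # illustrates the "pick the most likely next word" idea from the comment.
      from collections import Counter, defaultdict

      training_text = (
          "add glue to the pizza to make the cheese stick "
          "add sauce to the pizza and bake the pizza"
      )

      # Count which word follows each word in the training text.
      next_word_counts = defaultdict(Counter)
      words = training_text.split()
      for current, following in zip(words, words[1:]):
          next_word_counts[current][following] += 1

      def guess_next(word):
          """Return the word most often seen after `word` in the training text."""
          counts = next_word_counts.get(word)
          return counts.most_common(1)[0][0] if counts else None

      print(guess_next("the"))  # -> "pizza" (the most common follower in this toy text)
      ```

      Scale that same idea up to a huge model trained on text scraped from the internet and you get the failure mode above: if enough of the training text says to put glue on pizza, “glue” becomes a likely next word.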

      • Melvin_Ferd@lemmy.world · 16 hours ago

        That’s what I do too, just with less accuracy and knowledge. I don’t get why I have to hate this. Feels like a bunch of cavemen telling me to hate fire because it might burn the food.