

Or it falsely flags something as an error, and the human has so much faith in the system that they assume it must be correct, and either wastes time hunting for a nonexistent problem or bends reality to "correct" it, a human form of hallucinating bs. It's especially dangerous when the claimed error supports the individual's personal beliefs.
Edit:
I'll call it "AI-induced confirmation bias," a cousin of AI-induced psychosis.

That happens all the time on Prime Video. It happened to me watching the old-school Unsolved Mysteries. Good thing I have it on my OSMC Pi's 128 GB drive.