AIs don't have to hallucinate, because somewhere, at some time, some fucksmith will have given an answer to the question. Taking all human utterances as true is one of the stupidest things that AIs do. There's a lot that I've said that I wouldn't recommend as "truth". Like this.
The Google AI isn’t hallucinating about glue in pizza; it’s just over-indexing an 11-year-old Reddit post by a dude named fucksmith.