Why AI’s hallucinations are like the illusions of narcissism | Psyche Ideas
Persistent hallucination
In the medical sense, a hallucination is something that appears real even as you know very well it isn’t. Since we have no direct insight into the “inner mental life” of models, we call every false fact they spit out a form of hallucination. The meaning of the word is shifting from the medical sense...
LLM problems observed in humans
When you repeatedly prompt an LLM over a long timescale — whether you’re discussing your delusional beliefs or pursuing a romantic fantasy (“AI girl/boyfriends”) — you are filling it up with your communicative intent. The work that comes out the other side — the transformation of your prompts into a response — is a mirror that you’re holding up to...
AI psychosis and the warped mirror
Now, even this isn’t new — throughout the historical record, we find many examples of small groups of people who coalesced around a shared delusion. The difference is that old-timey people had to luck into finding someone else who shared their delusion, while modern, internet-enabled people can just use the Reddit search bar.
There are many examples of...