As anyone who has used LLM-driven chatbots knows, it’s easy to feel like there’s a real person on the other side of the screen. During the emotional upheaval of losing a loved one, indulging this fantasy could be especially problematic. That’s why simulations must make clear that they’re not a person, Xygkou said.
The emotional tension rises when we try to reconcile these free-to-use chatbots with the endless media hype around AI, the philosophical polarisation between doom and abundance, the thinly veiled shilling on X, and the tasteless slop almost everywhere else.
A chatbot based on someone’s data is like an improv actor who has studied a backstory or character sketch in order to performatively represent a character based on that person: a Civil War soldier at a historical reenactment, an Elvis impersonator, or King Pentheus in Dionysus in 69 (1969), the participatory rendition of Euripides’ play The Ba…