It is predictable, then, that users are consistently fooled into believing that their AI companions are conscious persons, capable of feeling real emotions.
We tell ourselves that humans do something clever or tactical because our brains have simulated that a course of action will produce favourable outcomes; but when we learn that ants do the same thing by enacting preprogrammed responses to pheromones, we insist that surely that doesn't count.
The next time you're driving home or sitting down to enjoy a meal, spare a thought for the ways in which the neurochemical soup in your brain mimics your gut: helping you digest complex patterns of information as you navigate the intricacies of your daily life.
A substantial proportion of users voluntarily signal departure with a farewell message, especially when they are more engaged. This behavior reflects the social framing of AI companions as conversational partners rather than transactional tools.
As conversational AI agents become more interactive and personalized, they will surpass human influencers in their ability to shape our decisions without our realizing it.
"A lot of how we feel," she explains, "is all about the systems we interact with, whether they're other people or technology. We're interacting increasingly with technology over time, and I think things like how much time we spend staring at screens, our phones and social media is impacting our mental health. So maybe, by having machines that have..."