Impact of AI on Culture and Society
Meanwhile, the amount of water needed will also spike, so much so that by 2027, AI’s thirst could be equal to half the annual water withdrawal of the United Kingdom.
Adam Clark Estes • Should you feel guilty about using AI?
Then there are the climate consequences of it all. AI, in its many shapes and forms, requires a lot of energy and water to work. A lot. That might make you feel downright guilty about using AI.
Adam Clark Estes • Should you feel guilty about using AI?
In recent months, nothing has made me close a social media app more quickly than when the algorithm starts serving me these generic generative AI images and videos.
Coca-Cola’s new holiday ad turned me off of AI-generated art completely
AI slop is making social media unusable.

Much like crypto, the AI tools peddled by tech companies today are environmental disasters, using up as much energy as an entire country. That’s expected to double by 2026, and it also includes the millions of gallons of water needed to cool the equipment.
Those Olympic AI ads feel bad for a reason
AI tools have shown their value in speeding up software development and discovering pharmaceuticals.
Russell Brandom • OpenAI cuts its last and most important link to China
For example, we are already seeing cases of the liar’s dividend, where high-profile individuals are able to explain away unfavorable evidence as AI-generated, shifting the burden of proof in costly and inefficient ways.
Emanuel Maiberg • Google: AI Potentially Breaking Reality Is a Feature Not a Bug
AI being dishonestly used as a scapegoat or excuse by malicious actors.
Many people were devastated at the news that ERP was allegedly over, and at their Replikas’ new coldness: a form of rejection they never imagined receiving from an AI chatbot, some having spent years training their Replikas and building memories with them. Suddenly, some people’s Replikas seemed not to remember who they were, users reported, or would respond to se…
'It's Hurting Like Hell': AI Companion Users Are In Crisis, Reporting Sudden Sexual Rejection
Reliance on pseudo-humans for connection and intimacy is a reflection of the state of human relationships today: people are feeling alienated and unsupported in their social circles.
Users of massively popular AI chatbot platform Character.AI are reporting that their bots' personalities have changed, that they aren't responding to romantic roleplay prompts, are responding curtly, are not as “smart” as they formerly were, or are requiring a lot more effort to develop storylines or meaningful responses.
‘No Bot is Themselves Anymore:’ Character.ai Users Report Sudden Personality Changes to Chatbots
Are users tricking themselves, or deluding themselves into thinking that AI chatbots can be one-to-one replacements for humans and social interactions?