“Can machines be therapists?” is a question receiving increased attention given the relative ease of working with generative artificial intelligence. Although research, both recent and decades old, has found that humans struggle to distinguish machine responses from human ones, recent findings suggest that artificial intelligence can...
However, unlike their human equivalents, AI companions lack a conscience, and the market for these services operates without regulatory oversight: there is no specific legal framework governing how these systems should behave. As a result, companies are left to police themselves, which is highly questionable for an industry premised on...
Elon Musk has said Grok, the A.I.-powered chatbot that his company developed, should be “politically neutral” and “maximally truth-seeking.”
But in practice, Musk and his artificial intelligence company, xAI, have tweaked the chatbot to make its answers more conservative on many issues, according to an analysis of...
But Weizenbaum’s turn toward critique started with the reception of ELIZA, which he built to imitate Rogerian therapy (an approach that often relies on mirroring patients’ statements back to them). Although he was explicit that ELIZA had nothing to do with psychotherapy, others, such as Stanford psychiatrist Kenneth Colby, hailed it as a first step...