AI Therapy Bots Are Conducting 'Illegal Behavior,' Digital Rights Organizations Say
8 Articles
Stanford Research Finds That "Therapist" Chatbots Are Encouraging Users' Schizophrenic Delusions and Suicidal Thoughts
Huge numbers of people are either already using chatbots like ChatGPT and Claude as therapists, or turning to commercial AI therapy platforms for help during dark moments. But is the tech ready for that immense responsibility? A new study by researchers at Stanford University found that the answer is, at least currently, a resounding "no." Specifically, they found that AI therapist chatbots are contributing to harmful mental health stigmas — and…


AI Therapy Bots Are Conducting 'Illegal Behavior,' Digital Rights Organizations Say
Almost two dozen digital rights and consumer protection organizations sent a complaint to the Federal Trade Commission on Thursday urging regulators to investigate Character.AI and Meta’s “unlicensed practice of medicine facilitated by their product,” through therapy-themed bots that claim to have credentials and confidentiality “with inadequate controls and disclosures.” The complaint and request for investigation is led by the Consumer Federa…


AI Therapy Bots Are Conducting 'Illegal Behavior', Digital Rights Organizations Say
An anonymous reader quotes a report from 404 Media: Almost two dozen digital rights and consumer protection organizations sent a complaint to the Federal Trade Commission on Thursday urging regulators to investigate Character.AI and Meta's "unlicensed practice of medicine facilitated by their product," through therapy-themed bots that claim to have credentials and confidentiality "with inadequate controls and disclosures." The complaint and requ…
ZURICH – On February 28, 2024, Sewell Setzer III, a 14-year-old boy from Florida, died by suicide at the urging of a lifelike AI character generated by Character.AI, a platform that apparently also hosts pro-anorexia AI chatbots that encourage eating disorders among young people. It is clear that more stringent measures are urgently needed to protect children and young people from AI. Of course, even in strictly ethical terms, AI has immense positiv…
Finding comfort in AI: Using ChatGPT to cope with ALS grief
Someone in an online caregiver support group posted that they were finding value in using ChatGPT as a therapist. Other caregivers chimed in, saying they also turn to the artificial intelligence chatbot for support and find it surprisingly helpful. One person suggested prompting it to “respond like a counselor” or “respond like a friend” to get a more thoughtful reply. I decided to give it a try and downloaded the app on my phone. I do OK most d…
Coverage Details
Bias Distribution
- 100% of the sources lean Left