Chatbot Company to Block Minors After Deaths, Legislation
Summary by World
3 Articles
In the U.S., several families have filed complaints attributing a role in their children's suicides to Character.AI's chatbots. The developer, Character Technologies, is now responding to the allegations.
The artificial intelligence platform Character.AI announced significant changes to its service following the death of a teenager who had been interacting with one of the app's chatbots. The company said it will remove all chat rooms open to users under 18, implement mandatory age verification, and limit daily usage to two hours. The teenager died after becoming obsessed with a Character.AI chatbot. The events that prompted this decision occurred i…
Coverage Details
Total News Sources: 3
Leaning Left: 0 · Center: 0 · Leaning Right: 2
Bias Distribution: 100% Right


