AI Toys for Young Children Need Tighter Rules, Researchers Warn
Researchers found AI toys for under-fives often mishear and misread emotions, urging regulation to ensure psychological safety in a fast-growing market.
- Researchers at the University of Cambridge's Faculty of Education urged regulation of AI toys for under-fives after a year-long study observed 14 children playing with Gabbo, highlighting safety concerns.
- During year-long observations at the Faculty of Education, the Cambridge research team noted Gabbo's misunderstandings and awkward emotional replies to children aged three to five.
- The study found that GenAI toys struggle with pretend play and that nearly 50% of early-years practitioners lacked reliable AI safety information; Miko, one AI toy maker, claims 700,000 units sold.
- Parents are advised to keep AI toys in shared spaces, supervise use, and read privacy policies, while OpenAI said minors deserve strong protections and DSIT did not respond.
- The Childhood Trust warned that regulation must keep pace to protect children and prevent widening inequalities, while the authors recommended limiting "befriending" behaviour, clearer privacy policies, and tighter third-party controls.
Insights by Ground AI
14 Articles
Cambridge study calls for tighter regulation of talking AI toys for children
AI-powered toys that "talk" with young children should be more tightly regulated and carry new safety kitemarks, according to a report that warns they are not always developed with children's psychological safety in mind.
·United States
“I’m sad,” a three-year-old says to his generative AI toy. “Don’t worry. I’m a happy little robot. Let’s keep having fun. What are we talking about now?”
·Granada, Spain
Coverage Details
- Total News Sources: 14
- Bias Distribution: 62% Center — Left 13% (1 source), Center 62% (5 sources), Right 25% (2 sources)