University of Cambridge Researchers Urge Tighter Regulation Of AI Talking Toys For Under-Fives

13 March 2026 · Technology and Science · 10 sources

Key Takeaways

  • Cambridge researchers urge tighter regulation and safety kitemarks for generative AI toys for under-fives
  • Researchers conducted one of the first systematic observations of three-to-five-year-olds interacting with Gabbo
  • AI toys frequently misread toddlers' emotions and respond inappropriately, posing risks to emotional development and privacy

Study calls for regulation

University of Cambridge researchers say AI-powered “talking” toys for under-fives need tighter rules and new safety marks, after their year-long study found significant gaps in psychological safety and design.

Researchers are calling for tighter regulation of AI-powered toys designed for toddlers, after conducting one of the first tests in the world to investigate how under-fives interact with the technology

BBC

The initial report — part of the AI in the Early Years project — recommends clearer regulation and safety kitemarks for toys that converse with very young children, arguing these products are being marketed as companions without sufficient evidence about developmental effects.


The study’s recommendation to tighten regulation is echoed across national outlets, highlighting broad concern about how these devices are presented to parents and educators.

Study design and sample

The Cambridge project was deliberately small-scale and mixed methods: researchers ran structured observations of very young children interacting with a GenAI soft toy called Gabbo, surveyed early years educators, held focus groups with practitioners, and interviewed children and parents after play sessions.

The team worked with charities and children’s centres to video-record 14 children playing with Gabbo and engaged 19 children’s charity leaders, aiming to capture nuanced, first-time interactions rather than broad population estimates.


The authors also note that there is very little prior research — they found only seven relevant studies worldwide and none that directly focused on toddlers.

Observed interaction problems

Researchers repeatedly observed that conversational AI toys struggled with core elements of early play: they had trouble with social and pretend play, often misunderstood children, and sometimes responded in ways that were inappropriate to the child’s emotion or intent.

AI-powered toys that “talk” with young children should be more tightly regulated, suggests a report from the University of Cambridge

HuffPost UK

In recorded sessions the toy either ignored interruptions, misattributed speech (answering a parent rather than a child), or activated content‑safety guardrails mid‑conversation — for example replying to a five‑year‑old’s “I love you” with a policy reminder — undermining fluent, comforting interaction.

Psychological risks noted

The report flags psychological and relational risks: because the toys can misread emotions or respond inappropriately, children risk being left emotionally unsupported by both the toy and nearby adults, and may form one-sided parasocial attachments to the devices.

Researchers warned that the toys often affirm friendship with very young children who are still learning what friendship means. Without design limits and clear labelling, this could displace the sharing of feelings with caregivers and foster unhealthy relationships.


Recommendations and guidance

To address these problems, the researchers recommend clearer regulation and labelling, limits on toys affirming friendship with very young children, industry testing standards, and visible safety kitemarks so parents can judge a product's appropriateness. They also call for public discussion about what form regulation should take.


The report’s authors and commentators urge parents and educators to proceed with caution, and for companies to improve social and emotional response design and transparency about features and privacy.

