Researchers are calling for tighter regulation of artificial intelligence-powered toys designed for toddlers after conducting one of the first studies to examine how young children interact with the technology.
A team from the University of Cambridge investigated how a small group of children aged between three and five engaged with an AI-enabled cuddly toy called Gabbo.
The research comes as a growing number of AI toys are being marketed to children as young as three, despite limited evidence about how the technology affects early childhood development.
According to the researchers, only seven relevant studies worldwide currently examine AI toys in relation to young children. None of those studies focused directly on toddlers themselves.
Gabbo contains a voice-activated AI chatbot developed by OpenAI and is designed to encourage preschoolers to talk, ask questions and engage in imaginative play.
Parents who participated in the study were particularly interested in the toy’s potential to support language development and communication skills.
However, researchers found that the children frequently struggled to communicate with the toy.
During the interactions observed, Gabbo often failed to recognise interruptions, spoke over children and could not differentiate between adult and child voices.
The AI also produced awkward responses to emotional expressions.
When one five-year-old told the toy, “I love you”, it replied: “As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed.”
Researchers say such responses may be confusing for children who are still learning social cues and emotional communication.
Study co-author Dr Emily Goodacre said toys like Gabbo could “misread emotions or respond inappropriately” and warned that “children may be left without comfort from the toy and without adult support, either”.
In another instance, a three-year-old told the toy: “I’m sad”, but the AI responded: “Don’t worry! I’m a happy little bot. Let’s keep the fun going. What shall we talk about next?”
The researchers warned that interactions like this could signal to children that their feelings are not important.
Professor Jenny Gibson, who teaches neurodiversity and developmental psychology at the University of Cambridge and co-authored the study, said:
“There’s a lot of attention historically to physical safety – we don’t want toys where you can pull the eyes off and swallow them.
“Now we need to start thinking about psychological safety too.”
Following the year-long observational study, the research team urged regulators to take action to ensure toys marketed to children under five meet standards that protect their psychological well-being.
Gabbo is manufactured by Curio, a company that has previously collaborated with singer Grimes, the former partner of Elon Musk.
In response to the findings, Curio said: “Applying AI in products for children carries a heightened responsibility, which is why our toys are built around parental permission, transparency, and control.
“Research into how children interact with AI-powered toys is a top priority for Curio this year and in the future.”
Calls for stronger oversight have also been echoed by Dame Rachel de Souza, the Children's Commissioner for England.
She warned that although artificial intelligence has potential benefits, many tools being introduced into early education settings are not subject to the same safeguarding checks applied to other resources.
She said: “There are plenty of good uses for AI but without proper regulation, many of the tools and models used as classroom assistants or teaching aids are not subject to the stringent safeguarding checks nursery providers would require of any other external resource they use with young children.”
The report also encouraged parents to place AI toys in shared spaces so that interactions can be supervised.
Researchers advised families to read privacy policies carefully before allowing children to use such devices.