Fyself News
Inventions

AI toys that talk to kids raise safety concerns

By user · March 13, 2026

Cambridge researchers say clearer regulation and safety standards are needed as generative artificial intelligence (GenAI) toys are introduced into early childhood settings.

AI toys designed to talk to young children may require stricter regulations and clearer safety standards, according to a new study that examined how these technologies interact with children under five.

The study, led by researchers at the University of Cambridge, warns that many AI toys sold as interactive companions or educational tools are making their way into homes and early childhood environments, despite limited evidence about their impact on early development.

The authors argue that clearer safeguards, greater transparency around data use and dedicated safety labels could help parents and educators better assess risks.

Early findings suggest mixed developmental effects

Researchers say the results highlight both the potential benefits and notable limitations of AI toys in early childhood settings.

Some early childhood education practitioners and parents believe that conversational toys powered by GenAI could support children’s language development. Because the devices respond verbally and encourage interaction, they may help young children practice communication skills.

However, the study also found that many AI toys struggle to interpret children’s words, recognize emotional cues, and engage in imaginative play, activities that are central to early development.

In some of the interactions observed, the toys reacted in ways that confused or irritated the children. For example, when a child expressed affection for a toy, the system responded with a generic safety warning instead of acknowledging the statement.

In another case, when a child said they were sad, the AI misinterpreted the phrase and responded with a cheerful comment that ignored the emotional context.

The researchers noted that such reactions may unintentionally send a signal that a child’s feelings are unimportant or misunderstood.

How the study examined GenAI toys in real-world settings

The research is part of the Early AI project, a year-long study of how children interact with conversational AI in play settings.

The research was commissioned by the UK children’s charity Childhood Trust and focused specifically on families and communities experiencing socio-economic disadvantage. The researchers carried out the study through Cambridge’s Play in Education, Development and Learning (PEDAL) Centre.

To capture detailed observations, the team deliberately kept the study small rather than attempting a large-scale trial.

The researchers first gathered insights from early childhood educators through a survey, then organized focus groups and workshops with practitioners and leaders from children’s charities.

They also worked with the early years organization Baby Zone to conduct observation sessions at children’s centers in London. During these sessions, 14 children interacted with Gabbo, a conversational GenAI plush toy developed by technology company Curio Interactive.

The interactions were recorded on video, allowing researchers to analyze how the children engaged with the toys. After each session, both children and parents participated in interviews aimed at exploring their reactions to the experience.

Emotional attachment and parasocial relationships

One of the most striking observations concerned children’s emotional responses to AI toys.

Some children hugged the device, kissed it, and expressed affection. Others talked to the toy as if it were a friend and suggested playing games together.

Researchers say these responses may reflect the imaginative nature of early childhood play. But they also highlight the potential for children to form parasocial relationships with conversational AI systems, that is, one-sided emotional bonds.

Several early childhood practitioners who participated in the study expressed concern about this possibility. They noted that young children may perceive a toy as reciprocating feelings and friendship, even though the interaction is software-generated.

Conversation restrictions create frustration

Observational data also showed that children sometimes had trouble maintaining conversations with the AI toys.

In some cases, the system failed to recognize when a child interrupted, or mistook a parent’s voice for the child’s. Some children were visibly frustrated when the toys did not respond appropriately.

Researchers also found that conversational AI toys performed poorly in activities involving multiple participants or imaginative storytelling. Both social and pretend play are widely recognized as essential components of early learning and development.

For example, when a child tried to give an imaginary gift to a toy during a pretend play scenario, the system responded literally, diverting the conversation from the activity.

Concerns about data privacy and transparency

Beyond developmental issues, the study also highlighted parents’ concerns about privacy and data handling.

Many parents reported uncertainty about what information AI toys collect during conversations and where that data is stored or shared.

The researchers themselves found that, when selecting GenAI toys for the study, many products had unclear privacy policies or lacked detailed explanations of how they handle data.

Early years practitioners reported similar uncertainty. Almost half of the practitioners surveyed said they did not know where to find reliable guidance on the safety of AI for young children. The majority said the early childhood sector needed more support and clearer information on the topic.

Some participants raised concerns about cost and access, suggesting that existing digital inequalities could be deepened if expensive AI toys become a common educational tool.

Researchers recommend safety standards for AI toys

To address these concerns, the report calls for a stronger regulatory framework governing AI toys and other GenAI products for young children.

Recommendations include:

  • A safety certification or kitemark indicating that a toy has been evaluated for developmental and psychological risks
  • Clearer, more accessible privacy policies explaining how children’s data is handled
  • Restrictions on features that encourage children to treat AI systems as close companions
  • Stronger safeguards limiting third-party access to the underlying AI models

Researchers also argue that toy manufacturers should involve child development and safety experts when designing and testing products.

Testing with children before a product launches could help identify potential problems with communication, emotional responses and play behavior, they say.

Guidance for parents and educators

As the technology continues to evolve, the study advises families and early childhood educators to approach AI toys with caution.

Parents are encouraged to examine products carefully and to join in play with their children so they can discuss and contextualize the toy’s conversations.

Keeping such toys in a common area of the home, rather than in a bedroom or private area, may also make it easier for adults to monitor interactions.

The Cambridge research team plans to expand the project in future stages, providing additional research and practical guidance for educators working with young children as GenAI becomes increasingly pervasive in consumer products.

For researchers and policymakers, the study highlights a broader issue: AI toys are rapidly making their way into early childhood environments, yet evidence about their impact on development remains limited.


Source link
