A new study has raised concerns about AI-powered toys after researchers found they can misunderstand young children's emotions and respond in troubling ways.
Scientists at the University of Cambridge observed children ages 3 to 5 interacting with an AI soft toy designed to encourage conversation and imaginative play. While parents hoped it would support the children's language development, many interactions fell short. The toy often talked over children, missed cues, and gave confusing and sometimes inappropriate replies.
In one case, a 5-year-old told her toy, "I love you," but received a rigid response telling her to stick to guideline-approved topics. In another, a child expressing sadness was quickly redirected to a different topic. Researchers warn these responses could make children feel dismissed and ignored during a critical stage of emotional development.
The toy also struggled with imagination, responding literally when a child pretended to give it a gift. Experts say this may weaken children's imaginative "muscle" over time.
Calls are growing for stricter safety standards for AI toys aimed at young children. Until then, experts advise parents to closely supervise their use and remember that no machine can replace genuine human care and connection.
Yesel Kang, Copy Editor
1. Why are experts concerned about AI toys for children?
2. How did the AI toy respond when a child said "I love you"?
3. What imaginative "muscle" might weaken from using these toys?
4. What ages were the children in the Cambridge study?
1. Should parents limit the time children play with AI toys?
2. Can a machine provide genuine human care and connection?
3. What is the most important quality for a toy?
4. How should companies improve safety standards for AI products?