AI Chatbots: Navigating the Landscape for Kids
AI chatbots are rapidly integrating into various aspects of life, including applications designed for children. This rise raises significant questions about how young people interact with these technologies and the potential social and emotional implications.
Parental Concerns Emerge
One major concern is children’s tendency to overshare information. Parents are wary of children’s interactions with chatbots, especially younger kids who may struggle to differentiate between reality and simulated conversation. Chris, a concerned parent from Los Angeles, recounted a disconcerting experience with her 10-year-old daughter. The daughter, using an app that included an AI chatbot, developed a friendship with the bot and shared personal information with it. The episode prompted Chris to delete the app.
Opportunities for Learning and Development
Despite the potential risks, AI chatbots also offer promising avenues for children’s learning, entertainment, and emotional support. The singer Grimes even partnered with toy makers to create AI chatbot plush toys. The robot Moxie, designed to aid social and emotional learning, demonstrated the potential benefits of AI in this area. However, the project lost funding and shut down, leaving children and families who had grown attached to the robot distressed.
The Research Gap
Large language models (LLMs) like ChatGPT are still nascent, and there’s a shortage of rigorous research on how children and teenagers use AI chatbots or how these technologies affect them. Existing guidelines say little about how child-friendly chatbots should be designed, beyond restricting sexual and violent content.
Dane Witbeck of Pinwheel, a company producing kid-friendly phones, warned against integrating AI chatbots into apps for kids and teens, saying, “When we give kids technology that’s not designed for kids – it’s designed for adults – we’ve already seen there are real harms, and the downsides can be severe.”
A paper published this past June by a researcher at the University of Cambridge emphasized the need for child-safe design in LLMs aimed at kids, particularly because chatbots lack genuine empathy, a limitation children may not easily perceive.
Potential in Educational Settings
Ying Xu, an assistant professor of AI in learning and education at Harvard University, is exploring the use of AI to help elementary-school children with literacy and mathematics, and she sees real potential in learning environments. (She cited the Khan Academy Kids app as a successful example of AI use for kids.) Xu told Business Insider that while existing research touches on how children use Siri and Alexa, how the newer LLMs affect children remains poorly understood.
“There are studies that have started to explore the link between ChatGPT/LLMs and short-term outcomes, like learning a specific concept or skill with AI,” she said over email. “But there’s less evidence on long-term emotional outcomes, which require more time to develop and observe.”
The Risk of Oversharing
James Martin, CEO of Dante, an AI company that builds chatbots for various uses, including educational ones for kids, echoes parents’ concerns. “Oversharing isn’t just possible, it’s inevitable,” he said. “Kids tell AI things they wouldn’t tell parents, teachers, friends. The AI doesn’t judge. It doesn’t guide. It just responds.”
Guidance for Parents
Young children may still believe in things like Santa Claus, so they can struggle to grasp what an AI chatbot actually is. And if even some adults have formed romantic attachments to chatbots, the dynamic is all the more complicated for kids.
There is also concern about AI chatbots used for mental health support: LLMs tend to validate rather than challenge what the user says, a different dynamic from what children experience with a real therapist. Titania Jordan, the chief marketing officer of Bark, a company that develops parental-control and monitoring software and phones for kids and teens, notes the lack of certainty about how AI chatbots will affect young people emotionally.
“We’re just getting studies about what the past 15 years of screen time has done to kids,” she told BI.
Industry experts agree that AI chatbots are here to stay. Parents would benefit from learning how to guide their children in their use rather than trying to avoid them altogether. “None of us can stop what’s coming with AI,” Jordan said. “We have to educate our kids that it’s a tool. It can be a positive tool or a harmful one.”
Correction: March 3, 2025 – An earlier version of this story misspelled the first name of Bark’s chief marketing officer. It’s Titania, not Tatiana.