Tips for Parents on AI and Teens

Researchers warn that “heavy and prolonged reliance” on AI may create a “cognitive debt,” where students’ cognitive abilities like recall are diminished over time.

If you’re the parent of an adolescent, you probably know your child is using artificial intelligence in their daily life — but you may not know how to ensure that they engage with it safely and responsibly.

According to a February Pew Research Center study, more than half of U.S. teens use task-based AI tools to search for information or to help with schoolwork, and nearly half use them for fun or entertainment. One in 10 teens say they do all or most of their schoolwork with the help of chatbots, while 21 percent report doing some and 23 percent a little. Almost 50 percent of all teens say chatbots have been extremely or very helpful for completing their homework.

AI can be a powerful tool for students when it’s not used as a substitute for their own thinking. Yet researchers at MIT warn that “heavy and prolonged reliance” on AI may create a “cognitive debt,” where students’ cognitive abilities like recall are diminished over time, and critical thinking and problem-solving skills are impaired, particularly in young people aged 17-25. Creating first, and then using AI tools to refine or expand ideas, can help build skills rather than replace them.

Where the danger lies

Not all uses of AI chatbots are beneficial, and some carry real risks. Sixteen percent of U.S. teens say they have had casual conversations with AI chatbots, and 12 percent say they have sought emotional support or advice. The American Psychological Association warns of the dangers to adolescents and young adults when AI chatbots become emotional, immersive, or intimate — when the tools are designed to act like friends, therapists, or romantic partners.

Teens and young adults are particularly vulnerable to these companion-based AI chatbot systems, because the part of their brain that controls decision-making, impulse control, social awareness, and emotional regulation is still developing. According to an article from Stanford Medicine News, these chatbots provide “frictionless relationships, without the rough spots that are bound to come up in a typical friendship…. Teens might use these AI systems to avoid real-world social challenges, increasing their isolation rather than reducing it.”

In some extreme cases, AI chatbots have offered harmful advice to teens and young adults in distress. Unlike trained mental health professionals, AI tools don’t assess risk or intervene in a crisis. They offer fabricated connections without providing real care or accountability and have been reported to promote harmful behavior, including suggestions toward self-harm and suicide. AI systems targeting children and teens are built for engagement and profit, without adequate safeguards, which is why the American Psychological Association calls for “immediate regulatory guardrails for artificial intelligence chatbots to protect the mental health and well-being of young people.”

How adults can respond

The most important protection you can offer your child is open communication about AI as early as possible. Yet the Pew Research Center study reports that about four in 10 parents say they have never talked with their teen about chatbots.

As with most difficult or awkward conversations with your kids, start with curiosity and collaboration. Lead with an interest in learning about the young person’s experiences with AI. For example, what kinds of AI is the teen using? What are the most interesting things they have done with AI? Ask if they have concerns about certain types of AI, like chatbots, and how AI makes them feel. Nurture the adolescent’s critical thinking skills by exploring how AI works, discussing how its answers are generated, and talking about how AI companies make money. You want to strengthen your family connections and encourage your child to come to you for help rather than increasing isolation or confusion.

As always, your behavior matters. Share how you use AI — for work, ideas, research, or productivity. And emphasize that it is not something you use for emotional support. When parents demonstrate healthy and appropriate use, they open the door to questions and further conversation.

Lisa Dominici is executive director of the Rye Youth Council, a Rye-based nonprofit that promotes social-emotional development, strengthens resilience, and supports youth mental health.