Experts detail the pros and cons of AI and young people’s well-being

As the rapid advancement and accessibility of generative artificial intelligence (AI) tools such as ChatGPT transform how children interact, learn and socialize, researchers and parents alike have begun to question the impacts these shifts will have on children’s development and socialization.

Despite potential benefits, some worry that younger children in particular may face reduced opportunities for critical social interactions essential for the development of emotional intelligence, empathy and communication skills.

The Brookings Institution’s Center for Universal Education hosted a panel discussion on May 6 featuring experts in early learning, mental health and technology, and research on AI companions to explore the impacts of AI on students’ lives.

Below are some of the key themes that emerged from the conversation.

Real relationships matter

Surveys show that 45 percent of high school students already turn to AI companions on platforms such as OpenAI’s ChatGPT to deal with friendship, relationship and mental health issues.

“Relationships predict success in school,” said Isabelle Hau, Stanford Accelerator for Learning executive director. “The presence of relationships and the number of those strong relationships — and this is for high school students — is one of the strongest predictors of academic motivation, engagement and persistence. And yet despite all we know, we continue to treat relationships as invisible in our systems of learning.”

Hau cited several studies which have found that while social robots can reduce feelings of isolation, particularly among older adults, they also pose a risk of creating dependency and can blur the boundaries between real and artificial relationships.

This issue is even more pressing among young children, as about 90 percent of the brain is estimated to develop before age 5. Hau pointed to a recent study from Common Sense Media that found 40 percent of children have their own device by age 2, and separate research that found the average American adult picks up their phone or other device 205 times per day.

“What’s clear from research that’s emerging … we are not just designing tools, we are shaping patterns of connection, and if AI becomes a substitute rather than a scaffold for human relationships, we risk automating our own humanity,” she said. “My belief is that we must reimagine learning and care as fundamentally relational. What if we trained educators not only for instruction, but for connection? What if we were to measure not just literacy scores, but a strength of connection in our schools? Ideally, I would like all of us to grow our human relational intelligence with the same urgency that we all give to artificial intelligence because in this world of rapid automation, our humanity, I believe, will be our superpower.”

Benefits to mental health

Drew Barvir, co-founder and CEO of Sonar Mental Health, is seeking to do just that in his work, which relies on both AI and humans to help support the mental health and well-being of young people. Sonar partners with school districts to offer students 24/7 chat-based support with a real human on the other end who is being supported by AI.

“What this looks like is, you’ve got a person receiving the message [from a student] and responding, but they have what we call our well-being companion co-pilot on the other side of their computer screen where they can see summaries of past conversations with the student, recommendations on how to respond — whether that’s using resources, pulling in context from the student, pulling in clinical recommendations that we’ve built into our system or even making suggestions on tone and style with the student based on those past conversations,” Barvir explained. “We’ve been trying to be as rigorous as we can around measurement and have seen outcomes such as reduced clinical referrals, reduced disciplinary rates, improved grades, attendance, etc.”

He said that one goal is to help make systems more preventative by helping to identify challenges, offer support earlier and escalate those cases that need it to real people, be it a counselor, teacher, parent or therapist.

The second goal is to help young people build skills and confidence to tackle challenges in their lives on their own, be it having a difficult conversation, giving a presentation in front of their class or going off to college.

Using AI in ways that support connection is critical to meeting the massive need nationwide, Barvir said. “Frankly, we just don’t have the clinicians, the support systems, to be able to address those needs.”

He noted extreme gaps in access to mental health supports in rural and low-income communities. “You see 60-plus day wait times or lack of access altogether, and so AI or technology-enabled solutions are an incredible way to help increase access at a bare minimum to support that can then hopefully help debottleneck the system to help escalate to providers for those who need it,” Barvir said.

“Of course, human interactions and clinicians are needed in particular for higher acuity cases, but for those that are perhaps dealing with day-to-day mild to moderate challenges, there’s an opportunity to help skill-build and help work through those situations in a way that’s super accessible and cost effective,” he continued.

Legal implications

“What are the concerns with AI companions?” asked Gaia Bernstein, a law professor specializing in technology, privacy and policy, and co-director of both the Institute for Privacy Protection and the Gibbons Institute of Law, Science and Technology at Seton Hall Law School. “The first one is no guard rails.”

Bernstein highlighted several recent lawsuits, including one filed by a mother whose son died by suicide after forming an intimate relationship with a chatbot made by Character.ai, and another filed against the same company after its bot suggested to a teen that murdering his parents was a “reasonable response” to them limiting his screen time.

“Kids’ brains are not as developed as adults or even teens, especially in areas of emotional regulation, risky behavior, decision-making, all of this has an impact and how vulnerable they are to these bots,” Bernstein said. “Another thing that these bots do is they make the kids emotionally dependent on them, and they isolate them from family or friends.”

Echoing Hau’s concerns about bots replacing real life connections, Bernstein noted that in order to mine data from users, this technology is developed with the intention of keeping users engaged for extended periods of time. For young people, especially those dealing with loneliness, this can become an easy trap to fall into.

“These AI companions tend to say what we want to hear, they affirm what we say, they’re much easier than real life companions,” Bernstein said. “And if you think about kids, you know it’s not fun to be in middle school. Life is difficult, relationships are difficult, why bother having friends? Why bother to learn how to have relationships if you can have a friend that’s easy to get along with? Why fall in love as a teenager with all the heartbreak if you can have an intimate relationship with a bot who is always nice to you?”