
Artificial intelligence companions – chatbots designed for personal conversations rather than simple task completion – are available on platforms like Character.AI, Replika, and Nomi.
Unlike traditional AI assistants, these systems are programmed to form emotional connections with users. The findings come amid mounting concerns about the mental health risks posed by such platforms.
The nationally representative study of 1,060 teens aged 13-17, conducted for Common Sense Media, found that 72% have used AI companions at least once, while 52% interact with such platforms at least a few times per month.
The survey revealed that 30% of respondents use the platforms because “it’s entertaining”, while 28% are driven by curiosity about the technology.
‘Young people at risk’
However, concerning patterns emerged: one-third of users have chosen to discuss serious matters with AI companions instead of real people, while 24% have shared personal information, including real names and locations.
Perhaps most troubling, 34% of teen users reported feeling uncomfortable with something an AI companion had said or done, though such incidents were infrequent.
“The reality that nearly three-quarters of teens have used these platforms, with half doing so regularly, means that even a small percentage experiencing harm translates to significant numbers of vulnerable young people at risk,” the report said.
The survey also revealed an age divide in trust levels. While half of all teens expressed distrust of AI-companion advice, younger teens (ages 13-14) were more likely than older teens (ages 15-17) to trust advice from these systems.
And despite widespread usage, most teens maintained perspective on these relationships: two-thirds found AI conversations less satisfying than human interactions, and 80% spent more time with real friends than with AI companions.
Based on the findings, Common Sense Media recommended that no one under 18 use AI companions until stronger safeguards are implemented. “Companies have put profits before kids’ well-being before, and we cannot make the same mistake with AI companions,” the report said.