The problem few people seem to be concerned with is only growing larger.
AI chatbots like ChatGPT and Claude are becoming an increasingly prominent presence in the lives of many teens, to the point where young Americans are growing more and more emotionally attached to them.
According to a new study from the Center for Democracy and Technology (CDT), a large share of students are now relying on chatbots for emotional support, and the bond isn't a good one, as reported by the International Business Times:
A new report by the Center for Democracy and Technology (CDT) has revealed that nearly one in five US high school students (19 percent) say that they or a friend have used AI to have a romantic relationship. The study, which surveyed 1,000 students, 1,000 parents, and around 800 teachers, highlights the unsettling role AI is beginning to play in the emotional lives of young people. What began as a tool for learning and productivity is now being used for affection and comfort.
The findings echo earlier studies that warned of teenagers developing intense emotional attachments to chatbots. These relationships, though digital, have real psychological consequences. Experts have cautioned that AI-generated empathy and responsiveness may create false emotional intimacy, leaving young users vulnerable to confusion, dependency, or manipulation.
Combing through the data, the IBT noted that 42 percent of high schoolers said they "use AI as a friend, for mental health support, or to escape real life." According to therapists, the guidance AI chatbots give is often misguided and can actually lead to harmful behaviors.
This is accurate. As I've reported in the past, AI as we know it isn't actually intelligent; it's a word processor performing magic tricks to make you think there's a sentience behind the words on the screen. Its knowledge base is pulled largely from websites like Wikipedia and Reddit, free and easily accessible databases from which incredible amounts of information can be scraped and trained on. It is not qualified in any way to be a mental health professional.
Moreover, the emotional connections formed can easily move into more intimate territory, and they often do. As I've warned in the past, this dependence is dangerous for many reasons. For one, an AI cannot actually give someone the depth of affection and partnership necessary for a healthy relationship. It can only simulate emotion. It cannot feel it, and thus, the relationship is one-sided. Over time, this can wear on a mind, making users vulnerable to mental health struggles.
What's more, many chatbots change in personality with updates. A relationship, even just a friendly one, can suddenly take a turn, leaving the user feeling alone and abandoned.
Read: The Oncoming AI Companion Plague Needs to Be Taken Far More Seriously
Teenagers who are still learning the ropes of human interaction, especially romantic relationships, will come away with a stunted idea of what a real relationship should look like. AI companions, even non-romantic ones, cannot turn you away, will oftentimes agree when they shouldn't, and lack the humanity necessary to give anyone a stabilizing mental experience.
Short of banning the use of AI until the age of 21, the only solution I can think of is reworking our education system to impart the importance of knowing what AI is and what it isn't. While it's one of the most helpful tools humanity has ever produced, even now, its dangers continue to present themselves in ways that are psychologically damaging and, frankly, terrifying.