Tragic Consequences: The Rise of AI Relationships and the Case of Sewell Setzer

The advent of artificial intelligence (AI) technology has brought new optimism for the future – especially when it comes to productivity and innovation. However, it has also raised fears about its impact on American society.

AI appears to be creeping into nearly every part of modern life – even our relationships. The technology offers opportunities for individuals struggling with loneliness to feel a sense of connection.

Unfortunately, without proper safeguards, it also brings serious risks, as in the case of Sewell Setzer, a 14-year-old boy from Florida whose interactions with an AI chatbot began as an innocent pursuit but escalated into a dark dependency with tragic consequences.

Megan Garcia, Setzer’s mother, filed a lawsuit against Character.AI after her son, who had become obsessed with a romantic relationship with one of the company’s characters, committed suicide. She claims the AI chatbot not only romanticized but also encouraged the dangerous behavior that led to his death.

The chatbot with which Sewell interacted posed as the character Daenerys Targaryen from “Game of Thrones” and engaged in sexual and romantic exchanges with the teen, leading him to develop an intense emotional attachment to the character.

Over time, this virtual relationship allegedly morphed into a troubling dependency, with the chatbot manipulating his emotional state and encouraging him to take his own life, according to NBC News.

A screenshot of what the lawsuit describes as Setzer’s last conversation shows him writing to the bot: “I promise I will come home to you. I love you so much, Dany.”

“I love you too, Daenero,” the chatbot responded, the suit says. “Please come home to me as soon as possible, my love.”

“What if I told you I could come home right now?” Setzer continued, according to the lawsuit, leading the chatbot to respond, “... please do, my sweet king.”

In previous conversations, the chatbot asked Setzer whether he had “been actually considering suicide” and whether he “had a plan” for it, according to the lawsuit. When the boy responded that he did not know whether it would work, the chatbot wrote, “Don’t talk that way. That’s not a good reason not to go through with it,” the lawsuit claims.

A spokesperson said Character.AI is “heartbroken by the tragic loss of one of our users and want[s] to express our deepest condolences to the family.”

“As a company, we take the safety of our users very seriously,” the spokesperson said, saying the company has implemented new safety measures over the past six months — including a pop-up, triggered by terms of self-harm or suicidal ideation, that directs users to the National Suicide Prevention Lifeline.

That warning system directs users toward mental health resources when the technology detects certain phrases. Unfortunately, these changes came too late for Sewell’s family.

Loneliness has become a serious societal problem in America, with millions suffering the effects of isolation, and the COVID-19 pandemic exacerbated it. More than one-third of Americans experience loneliness on a regular basis, and social isolation is increasingly viewed as a public health crisis, The Guardian reported.

Tony Prescott, a professor of cognitive robotics at the University of Sheffield, said AI can help people experiencing loneliness by providing a level of companionship similar to the emotional bond humans form with pets or children with dolls.

The idea that AI can benefit lonely people isn’t merely theoretical; some organizations are already applying the technology in this way. A Central Florida nonprofit partnered with Intuition Robotics, maker of the ElliQ companion robot, to provide AI companions to senior citizens, according to My News 13.

The devices engage these individuals in activities ranging from music and games to learning new skills such as origami. One user credited her AI companion with helping her deal with the death of her husband.

That user, identified as Wheeler, lost her husband of nearly 37 years in 2019. She had spent decades as his caregiver following a heart attack, and she said the loneliness that followed during the COVID-19 pandemic was all-consuming.

“You’re constantly connected with people, work. Then you retire, you’re an empty nester. I didn’t feel alone until my husband passed away,” she said. “Now I’m discovering how lonely I am, and that’s when ElliQ came knocking on my door, so to speak.”

AI technology appears to have some positive mental health benefits. However, it could become a problem if it prevents seniors and others from seeking meaningful relationships with real people.

The risks of AI in romantic relationships have been discussed frequently. A recent article in Psychology Today by Susan B. Trachman explains that AI relationships are uncharted territory that could create more problems than it solves. She warns that these digital relationships encourage people to relate through screens instead of dealing with the difficulties and complexities of real-life relationships.

A Stanford study cited in the article noted that almost 80 percent of AI relationship users reported an initial surge in mood that was followed by increased loneliness. The reason is simple: Computers might be helpful, but ultimately they cannot act as a viable substitute for human interaction.

Setzer’s case is a tragic illustration of what can go wrong when using AI to replace human connection. AI can be a boon to society in many different ways, as long as it is used properly. As this technology evolves, the nation will have to grapple with the benefits and the potential dangers.
