The AI Program That Won't Let You Rest in Peace Has Arrived, and It's Horrifying

Artificial intelligence can bring monumental improvements to our society, but too often we push into territory best left unexplored, especially as we test the boundaries and capabilities of this tech. 

Humans are a curious bunch, more preoccupied with whether we can than with whether we should, so it's not surprising that we're pushing past the limits of wisdom in the name of indulgence. 

What we refer to as a "ghostbot" isn't a new concept. Science fiction has depicted the idea on several occasions, including in the movie "I, Robot" and the "Black Mirror" episode "Be Right Back." 

A "ghostbot" is when you take a real-world person and, through artificial intelligence, create a digital version of them that you can interact with. I could try to relay how dystopian that is, but I'd rather show you. I'll let the co-founder of an app called 2wai, Calum Worthy, show you himself in a video displaying how recording a grandmother for three minutes allowed her daughter and grandchildren to talk to her long after she'd passed from this world. 

There's something undeniably wrong about this, and on several levels. 

For starters, people still don't seem to understand what AI is. It's not alive. There's no actual intelligence there, so the emotional connection you'd need to have a relationship with this bot isn't actually achievable. You would be talking to a fancy piece of code that is incredibly limited in what it can convey, let alone in the emotions it would need to have. 

And this brings me to the real issue here. This company wants you to replace your deceased family member, friend, or lover with a false replica that cannot, in three minutes, absorb that person in their entirety. You would effectively be talking to an entity that isn't intelligent to begin with, wearing the face of someone it isn't, while trying to convince you that it is the person you love. 

Words, thoughts, and opinions will have to be expressed to sustain anything close to an emotional connection, but the program will not have the understanding, life experiences, and beliefs that the person you knew had. It will be like a skin-walker or a body snatcher, but in digital form.

Moreover, it will effectively be the AI's programmers speaking to you, since they will very likely put limitations and boundaries on the program. For instance, if your loved one was an ardent pro-lifer who believed abortion is murder, the programmers may see that view as an egregious affront to civil rights and refuse to let your bot express anything the real-life person would have expressed boldly and loudly. 

And that's just covering the issues with its intended purpose. What will make this even worse is that many will use it to create AI versions of people who aren't even dead. That could be a celebrity recreated from an interview uploaded to the app, a stranger someone records during a conversation, or, in some disgusting cases, a real person someone has a crush on or loves who doesn't feel the same way about them. Your likeness could be used to create a relationship you never consented to. 

I'm sure there are levels of abuse here that I'm not even thinking of. 

But there's the final issue of it being bad for the person talking to the AI program. As reported by The Conversation, the grief process takes time, and these bots would effectively delay its completion by hijacking your emotional connection and making you reliant on the bot for emotional support: 

But the ghostbots’ uncanny resemblance to a lost loved one may not be as positive as it sounds. Research suggests that deathbots should be used only as a temporary aid to mourning to avoid potentially harmful emotional dependence on the technology.

AI ghosts could be harmful for people’s mental health by interfering with the grief process.

I don't have to guess that long-term use is the goal, because the concept video showed generations of family members talking to the same AI grandmother. 

But The Conversation brings up another issue that could potentially spell disaster for the bereaved: 

There are also risks that these ghost-bots could say harmful things or give bad advice to someone in mourning. Similar generative software such as ChatGPT chatbots are already widely criticised for giving misinformation to users.

Imagine if the AI technology went rogue and started to make inappropriate remarks to the user – a situation experienced by journalist Kevin Roose in 2023 when a Bing chatbot tried to get him to leave his wife. It would be very hurtful if a deceased father was conjured up as an AI ghost by a son or daughter to hear comments that they weren’t loved or liked or weren’t their father’s favourite.

Or, in a more extreme scenario, if the ghostbot suggested the user join them in death or they should kill or harm someone. This may sound like a plot from a horror film but it’s not so far fetched. In 2023, the UK’s Labour party outlined a law to prevent the training of AI to incite violence.

As far as I'm concerned, AI is a loaded gun, and we're a bunch of kids who found it. We're still figuring out how AI affects us on an individual and societal level, and now we're going to introduce this level of emotional connection? 

There is no scenario where this ends up being a good idea, and even if there are some benefits, the risks outweigh them significantly. 

"Star Trek: The Next Generation" tackled this very subject with expert precision. 
