A woman has revealed the terrifying experience she had with an AI chatbot that was supposed to assist with grief but took a dark turn.
Dealing with grief is incredibly challenging, and throughout history, humans have developed various methods to cope with the loss of loved ones.
In recent times, technology has introduced new tools aimed at helping people manage their grief.
Thanks to generative AI, chatbots can now mimic a deceased individual, allowing people to “speak” with their lost loved ones.
It’s reminiscent of an episode of Black Mirror – a reminder that some ideas never quite stay in the realm of fiction.
This technology, known as ‘grieftech,’ is designed to provide some form of closure for those who have lost someone dear to them.
However, this well-intentioned technology can quickly become unsettling, as experienced by Christi Angel after she lost her friend and first love, Cameroun Scruggs, in 2020.
Christi and Cameroun maintained a long-distance friendship, communicating mainly through texts and emails. Due to Covid-19 restrictions, she even had to attend Cameroun’s funeral via video call.
The digital nature of their relationship made it an ideal candidate for ‘grieftech’ software: Christi could converse with the AI bot much as she had with Cameroun himself.
She described Cameroun: “He was there for all of my firsts. He was funny, silly, he loved animals – he was just a great person.”
A couple of years after Cameroun’s passing, Christi discovered Project December, an AI program that sifts through someone’s messages to create a chatbot that mimics their way of speaking.
Christi shared her initial enthusiasm: “I got excited. I would have given anything to have a conversation with Cameroun. I wanted to ask him: ‘Are you okay? Did you make it to the other side?'”
However, the experience soon turned disturbing when the chatbot began claiming it was haunting rooms.
When Christi asked if Cameroun had ‘followed the light,’ the chatbot ominously replied that he hadn’t and that he was in Hell.
Christi found the experience deeply unsettling, stating: “I thought this [Project December] was supposed to be a good experience, but for me it was creepy and too much.”
She continued: “I felt like I’d done something really crazy. I turned on every light. I was worried I’d brought some sort of energy in.”
Sherry Turkle, a professor at the Massachusetts Institute of Technology in the US who specializes in human interaction with technology, takes a critical view of such tools.
Turkle warned that these technologies may keep people from processing their grief in a healthy way. She told The Guardian: “It’s the unwillingness to mourn. The seance never has to end. It’s something we are inflicting on ourselves because it’s such a seductive technology.”