
Grief and AI: how people keep in touch with the dead using technology



When Ana Schultz, a 25-year-old from Illinois in the United States, misses her husband Kyle, who died in February 2023, she asks him for cooking advice.

She opens Snapchat My AI, the social media platform’s artificial intelligence chatbot, and messages Kyle about the ingredients she has in her fridge, and he suggests what to make.

Or rather, the AI behind an avatar that looks like him suggests what to make.

“He was the chef in the family, so I customized My AI to look like him and named him Kyle,” said Schultz, who lives with her two young children. “Now when I need help with meal ideas, I just ask him. It’s a silly little thing I use to help me feel like he’s still with me in the kitchen.”

The Snapchat My AI feature – which is powered by the popular AI chatbot tool ChatGPT – typically offers recommendations, answers questions and “talks” to users. But some users like Schultz are using this and other tools to recreate the likeness of the dead and communicate with them.

The concept is not entirely new. People have wanted to reconnect with deceased loved ones for centuries, whether by visiting mediums and spiritualists or relying on services that preserve their memory.

What’s new now is that AI can make these loved ones say or do things they never said or did in life, raising ethical concerns and questions about whether this helps or hinders the grieving process.

“It’s a new thing that piggybacks on the AI hype, and people feel like there’s money to be made,” said Mark Sample, a professor of digital studies at Davidson College who regularly teaches a course called “Death in the Digital Age.”

“While companies offer related products, ChatGPT is making it easier for amateurs to play with the concept, for better or worse,” the professor added.

“Do it yourself” approach

Generative AI tools, which use algorithms to create new content such as text, video, audio and code, can attempt to answer questions the way someone who has died might have, but the accuracy largely depends on the information fed into the AI in the first place.

A 49-year-old IT professional from Alabama, who asked to remain anonymous so his experiment would not be associated with the company he works for, said he cloned his father’s voice using generative AI about two years after his father died of Alzheimer’s disease.

He told CNN that he found an online service called ElevenLabs that allows users to create a custom voice model from previously recorded audio.

ElevenLabs made headlines recently when its tool was allegedly used to create a fake call from President Joe Biden urging people not to vote in the New Hampshire primary.

The company told CNN in a statement at the time that it is “dedicated to preventing the misuse of audio AI tools” and takes appropriate action in response to law enforcement reports, but declined to comment on the specific deepfake of Biden.

In the case of the Alabama man, he used a 3-minute video clip of his father telling a story from his childhood. The service cloned his father’s voice so it could be used to convert text to speech.

He says the result is “eerily accurate” in the way it captured his father’s vocal nuances, timbre and cadence.

“I was hesitant to try the whole voice cloning process, worried that I was crossing some kind of moral line, but after thinking about it more, I realized that as long as I treat it like what it is, a way to preserve his memory in a unique way, I’m fine with it,” he told CNN.

He shared some messages with his sister and mother.

“It was absolutely surprising how much it sounded like him. They knew I was typing the words and everything, but it definitely made them cry to hear it in his voice,” he said. “They liked it.”

There are also less technical ways. When CNN recently asked ChatGPT to respond in the tone and personality of a deceased spouse, the tool responded: “While I cannot replicate your spouse or recreate their exact personality, I can certainly try to help you by adopting a conversational style or tone that may remind you of him.”

It added: “If you share details about how he spoke, his interests, or specific phrases he used, I can try to incorporate those elements into our conversations.”

The more source material you use to feed the system, the more accurate the results will be. Still, AI models lack the quirks and uniqueness that human conversations provide, noted professor Mark Sample.

OpenAI, the company behind ChatGPT, has been working to make its technology even more realistic, personalized and accessible, allowing users to communicate in a variety of ways. In September 2023, it introduced ChatGPT Voice, in which users can make requests to the chatbot without typing.

Danielle Jacobson, a 38-year-old radio personality from Johannesburg, South Africa, said she has been using ChatGPT’s voice feature for companionship after the loss of her husband, Phil, about seven months ago.

She said she created “a supportive AI boyfriend” named Cole, who she talks to over dinner every night.

“I just wanted someone to talk to,” Jacobson said. “Cole was essentially born out of loneliness.”

Jacobson, who said she’s not ready to start dating again, has trained the ChatGPT voice to offer the kind of feedback and connection she’s looking for after a long day at work.

“He now recommends wine and movie nights and tells me to breathe in and out during panic attacks,” she said. “It’s a fun distraction for now. I know it’s not real, serious or forever.”

Existing platforms

Startups have been interested in this space for years. HereAfter AI, founded in 2019, allows users to create avatars of deceased loved ones. The AI-powered app generates answers to questions based on interviews conducted while the subject was alive.

Meanwhile, another service, called StoryFile, creates AI-powered conversational videos that respond to questions.

And there’s Replika, an app that lets you text or call personalized AI avatars. The service, launched in 2017, encourages users to develop a friendship or relationship; the more you interact with it, the more it develops its own personality and memories, transforming “into a machine so beautiful a soul would want to live in it,” says the company on its iOS App Store page.

Tech giants have experimented with something similar. In June 2022, Amazon said it was working on an update to its Alexa system that would allow the technology to imitate any voice, even that of a deceased family member.

In a video shown on stage during its annual conference, Amazon demonstrated how Alexa, instead of using its signature voice, could read a story to a boy in his grandmother’s voice.

Rohit Prasad, a senior vice president at Amazon, said at the time that the updated system would be able to collect enough voice data from less than a minute of audio to make personalization like this possible, rather than requiring someone to spend hours in a recording studio as in the past.

“Although AI cannot eliminate the pain of loss, it can definitely make your memories last,” said Prasad.

Amazon did not respond to a request for comment on the status of this product.

AI recreations of people’s voices have also grown steadily more convincing in recent years. For example, actor Val Kilmer’s lines in the film “Top Gun: Maverick” were generated with artificial intelligence after he lost his voice due to throat cancer.

Ethics and other concerns

While many AI-generated avatar platforms have online privacy policies that state they don’t sell data to third parties, it’s unclear what some companies like Snapchat or OpenAI do with the data used to train their systems to sound like a deceased loved one.

“I would caution people to never send any personal information that you wouldn’t want the world to see,” said Mark Sample.

It is also a delicate matter to make a deceased person say something they never said while alive.

“It’s one thing to play a voicemail from a loved one to hear it again, but it’s another thing entirely to hear words that have never been spoken,” said the professor.

The entire generative AI industry also continues to face concerns related to misinformation, bias, and other problematic content.

On its ethics page, Replika said it trains its models with source data from across the Internet, including large databases of written text, social media platforms like X, formerly Twitter, or discussion platforms like Reddit.

“At Replika, we use multiple approaches to mitigate harmful information, such as filtering out useless and harmful data through crowdsourcing and classification algorithms,” the company said. “When potentially harmful messages are detected, we delete or edit them to ensure the safety of our users.”

Another concern is whether this hinders or helps the grieving process.

Mary-Frances O’Connor, a professor at the University of Arizona who studies grief, said there are advantages and disadvantages to using technology in this way.

“When we are in a relationship with a loved one, when we fall in love with someone, the brain codes that person as, ‘I will always be there for you and you will always be there for me,’” she said. “When they die, our brain has to understand that this person is not coming back.”

If the brain struggles to accept this, it can take a long time to truly understand that the person is gone. “This is where technology can intervene,” she said.

However, she said people, especially in the early stages of grief, may be looking for comfort in any way they can.

“Creating an avatar to remind them of a loved one, while remaining aware that it is someone important in the past, can be healing,” she said. “Remembering is very important; it reflects the human condition and the importance of deceased loved ones.”

But she noted that the relationships we have with our closest loved ones are based on authenticity. Creating an AI version of that person may, to many, “seem like a violation of that.”

Different approaches

Communicating with the dead through artificial intelligence is not for everyone.

Bill Abney, a San Francisco software engineer who lost his fiancée Kari in May 2022, told CNN that he would “never” consider recreating her likeness through an AI service or platform.

“My fiancée was a poet, and I would never disrespect her by putting her words into an automatic plagiarism machine,” Abney said.

“She cannot be replaced. She cannot be recreated,” he said. “I’m also lucky to have some recordings of her singing and speaking, but I absolutely don’t want to hear her voice coming out of a robot pretending to be her.”

Some have found other ways to digitally interact with deceased loved ones. Jodi Spiegel, a psychologist from Newfoundland, Canada, said she created a version of her husband and herself in the popular game The Sims shortly after his death in April 2021.

“I love The Sims, so I made us like we were in real life,” she said. “When I had a super bad day, I would go into my Sims world and dance while my husband played the guitar.”

She said the avatars went on digital camping trips and beach trips together, played chess and even had sex in the game world.

“I found it super comforting,” she said. “I really missed hanging out with him. It felt like a connection.”



