Sat. Nov 2nd, 2024
Occasional Digest - a story for you

CREEPY AI chatbots of the dead could give bereaved families an “unwanted digital haunting” with serious psychological consequences, experts have warned.

So-called AI “ghostbots” may even digitally stalk their loved ones from beyond the grave.

Experts fear people could be manipulated by AI chatbots. Credit: Alamy

Online platforms allowing people to virtually reincarnate lost relatives have exploded in recent years.

The eerie tech is capable of simulating language patterns and personality traits of a dead person using the digital footprint they have left behind.

But researchers at Cambridge University say AI chatbots need new safety measures to prevent causing psychological harm.

They fear some users may also develop “strong emotional bonds” with the simulation, making them particularly vulnerable to being manipulated.

“It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations,” said Dr Tomasz Hollanek, who co-authored the paper.

“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost.

“The potential psychological effect, particularly at an already difficult time, could be devastating.”

The study by Cambridge’s Leverhulme Centre for the Future of Intelligence goes on to say that the area is “high risk”.

They fear companies could also use deadbots to shamelessly advertise products to users in the manner of a departed loved one, or distress children by insisting a dead parent is still “with you”.

When the living sign up to be virtually re-created after they die, firms could use their chatbot to spam surviving family and friends with unsolicited notifications, reminders and updates about the services they provide – akin to being digitally “stalked by the dead” – the researchers write in the journal Philosophy and Technology.

People who are initially comforted by the deadbot may get drained by daily interactions that become an “overwhelming emotional weight”.

And living family may have no way to cut off the service if their now-deceased loved one signed a lengthy contract with a digital afterlife service.

“Rapid advancements in generative AI mean that nearly anyone with internet access and some basic know-how can revive a deceased loved one,” said co-author Dr Katarzyna Nowaczyk-Basinska.

“This area of AI is an ethical minefield.

“It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.

“At the same time, a person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner.

“The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”

Dr Hollanek added that there should be ways of “retiring deadbots in a dignified way”, which may require some “form of digital funeral”.

The researchers recommend age restrictions for deadbots, and also call for “meaningful transparency” to ensure users are consistently aware that they are interacting with an AI.

Dr Nowaczyk-Basinska said: “We need to start thinking now about how we mitigate the social and psychological risks of digital immortality, because the technology is already here.”
