Do you ever wish you could talk to a loved one who has passed? With posthumous AI avatars, you can – kind of. These AI chatbots aren’t really long-lost loved ones, but they can be very convincing. While the idea is enticing to many people who are grieving, there are also some serious risks associated with the technology. As AI becomes more and more common, it’s important to consider the estate planning implications.
How Do Posthumous AI Avatars Work?
Posthumous AI avatars are essentially AI chatbots that have been trained to replicate the appearance, voice and personality of a real person, so loved ones can talk to them after they have passed on.
According to The New York Times, when Peter Listro was diagnosed with terminal blood cancer, he, his wife and his son decided to use StoryFile to create an AI avatar of Peter. StoryFile sent a producer to interview Peter and collect material to train the AI.
The StoryFile AI avatar created in Peter’s likeness will only answer the questions he answered during the extensive interview process. Every response is exactly what Peter wanted to say, but it also means the AI simply cannot address questions he was never asked. However, StoryFile plans to roll out a generative AI chatbot trained on additional materials, such as social media posts and emails, so it can answer questions the individual did not cover during an interview session.
Some companies work differently. Eternos lets people train an AI on their voice and personality by following the platform’s simple prompts. Once training is complete, users can invite others to chat with the AI. Other posthumous AI tools include HereAfter and Re;Memory2.
The Pros and Cons of Grief Tech
Posthumous AIs, sometimes called grief tech, griefbots or ghostbots, may provide comfort to bereaved loved ones and help keep the memory and legacy of deceased family members alive. Someone who has been diagnosed with a terminal illness could train the AI on messages that they want to give their families at important milestones, such as weddings, graduations, and births.
However, the technology is not without critics. The University of Alabama at Birmingham Institute for Human Rights Blog notes that most companies offering these AIs do so on a subscription basis, so they are financially incentivized to keep grieving customers using the technology as long as possible. There is also a risk that the AI will oversimplify the memory of the deceased, and that users will become emotionally dependent on the chatbot.
Researchers from the University of Cambridge say that without safety protocols, these AI chatbots could cause psychological harm and end up “haunting” surviving loved ones. Companies could exploit AI chatbots to spam family and friends with unwanted notifications and offers.
The consequences could be serious. Scientific American warns that AI chatbots may be fueling psychotic thinking as more and more people engage in long conversations with AI companions. There have been multiple reports of users falling into delusional spirals.
Do You Want to Preserve Your Memory with an AI Chatbot?
At this point, you may be thinking that preserving your legacy with an AI chatbot sounds like a beautiful way to be there for your loved ones – or you may be thinking it sounds like a dystopian nightmare. Either way, as this technology takes off, it’s important to acknowledge it as part of your estate plan. Otherwise, your loved ones may make decisions that do not reflect your wishes.
- If you don’t want an AI chatbot made in your likeness, make this clear. Although many of the griefbots currently available are designed to be created by the person they represent, it’s possible that someone could create an AI of someone else based on existing videos and correspondence. If you don’t want your loved ones to do this, you should let them know. Also consider providing alternatives, such as regular (non-interactive) videos to be saved for special milestones in the future.
- If you do want an AI chatbot, research your options carefully. Several AI companies are mentioned in this article, but these mentions are not endorsements. Do your own research and go with a company you trust to respect your legacy and your loved ones. Also consider what type of AI you want. Do you want it to stick to statements you prepared in advance, or do you want it to be capable of improvising based on your profile? Carefully consider the risks inherent in the latter option, including the possibility of emotional dependency and responses that do not reflect your real personality and memories.
Does your estate plan cover all the bases? There’s a lot to consider, and new trends like posthumous AI avatars add to the complexity. An estate planning lawyer can guide you through the estate planning process. Contact Skinner Law.