Chatting With the Digital Afterlife

Companies are now creating digital versions of dead people using artificial intelligence. These AI recreations, called deadbots, are built from the emails, social media posts, photos, and videos that deceased individuals leave behind. Generative AI and deepfake technology let these digital ghosts hold text conversations, speak aloud, or even appear as holograms.
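For readers curious about the mechanics, here is a minimal sketch of how a text-based deadbot could work in principle: a general-purpose language model is conditioned on a persona assembled from messages the person left behind. The sample archive, the OpenAI Python client, and the model name are all illustrative assumptions on our part; this is not the actual method of Project December or any other commercial service.

```python
# Illustrative sketch only: condition a general-purpose LLM on a persona
# built from a person's archived writing. Commercial deadbot services are
# far more elaborate, and their exact methods are proprietary.

from openai import OpenAI  # assumes the `openai` package is installed

# Hypothetical sample of archived messages (emails, chats, posts).
archived_messages = [
    "Don't worry so much, kiddo. Things have a way of working out.",
    "Made the lemon cake again. Still can't match your grandmother's.",
    "Call me when you land, okay? Love you.",
]

def build_persona_prompt(name: str, samples: list[str]) -> str:
    """Fold writing samples into a system prompt asking the model
    to imitate the person's conversational voice."""
    joined = "\n".join(f"- {s}" for s in samples)
    return (
        f"You are simulating the conversational style of {name}, "
        f"based only on these examples of how they wrote:\n{joined}\n"
        "Reply in their voice. Keep responses short and personal."
    )

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": build_persona_prompt("Dad", archived_messages)},
        {"role": "user",
         "content": "Hi Dad. I got the job. I wish you were here."},
    ],
)
print(response.choices[0].message.content)
```

Even this toy version makes the article's core tension concrete: a few dozen lines of code and someone's message history are enough to produce replies in their voice, with no mechanism anywhere for the deceased to consent.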

The technology’s spreading fast. Project December and similar platforms let users chat with simulated versions of dead loved ones, and some people even record responses and design their own digital avatars before they die. Almost anyone with an internet connection can now create or interact with a deadbot. Creating one costs anywhere from a few hundred dollars to $15,000, depending on complexity and location.

While deadbots might comfort grieving families at first, they can cause serious problems. Daily interactions with a digital version of a deceased relative often become emotionally overwhelming, and what starts as comfort can turn into psychological distress or unhealthy dependence. Mental health experts warn that deadbots may keep people from moving through grief naturally. Some users feel helpless when they can’t shut off unwanted AI simulations of their loved ones, and the systems blur the line between authentic and artificial relationships.

The deadbot industry, sometimes called “Death Capitalism” or “Grief Tech,” is raising alarm bells about ethics and privacy. Companies sell subscriptions and contracts for these digital afterlife services, and some even offer deadbots as surprise gifts to unprepared family members. The biggest ethical concern is consent: it’s nearly impossible to know whether the dead would have wanted AI versions of themselves created. Their private data gets used without their permission, and families often can’t control, edit, or delete these digital ghosts once they’re made. Researchers at Cambridge’s Leverhulme Centre for the Future of Intelligence published a comprehensive study of these psychological and social risks in May 2024.

The industry’s growing with little regulation. Companies sometimes create deadbots without checking whether the deceased person would have wanted one. Security risks exist too: sensitive personal data could be misused or exploited commercially.

Generations are split on deadbots. Some see them as helpful tools for remembering loved ones; others find them disrespectful or disturbing. Either way, these AI ghosts are changing how society handles death and memory. They’re blurring the line between the living and the dead in digital spaces, forcing people to rethink traditional mourning rituals and what it means to preserve someone’s legacy.
