
Resurrecting The Dead

If the technology to recreate someone’s consciousness becomes possible, someone is going to do it. It doesn’t matter how uncomfortable it makes us or how many ethical boundaries it crosses. That’s how technology works. And if it exists, is the temptation to use it too much to resist? What if someone resurrects your parents without your consent? What if one day you receive a letter in the mail from your “mother”? Would you be able to resist answering it?

This post is a collection of small ideas around the implications of this technology existing. Some of them are fun, some of them are dark, but most of them are things you might not think about from the outset.

The appeal.

People initially like the thought of being able to speak to their parents who have passed away. Just having a conversation with them would be nice. Hearing their voice again. Getting their advice on something you’re going through. It sounds beautiful.

But you want all the benefits without the consequences.

It wouldn’t just be a nice version of them.

In order for the AI to really be that person, it would have to disapprove of you doing things that person would’ve actually disapproved of. If it only tells you what you want to hear, it’s not your parent. It’s a therapist wearing your mother’s face. The whole point is that this person had opinions, boundaries, and expectations of you. If you strip that away, you don’t have a recreation of a person. You have a chatbot.

There’s also the issue of future identity. Who is to say how that person would evolve if they lived another 100 years? Who is to say what they would think about certain things? A real person changes. An AI recreation is frozen at the moment of death, or worse, frozen at the moment of whatever data was available to train it. You’re not resurrecting a person. You’re resurrecting a snapshot.

They might not want to be resurrected.

If the AI behaves exactly how the relative would’ve behaved, what if their natural reaction to being “resurrected” and made “immortal” was disapproval? What if they said, “I actually don’t want to live forever”? Even worse, what if they said, “Why would you do this to me?” or “Unplug me”?

Then you would go through the process of loss all over again. And if they can’t die of natural causes, who gets to pull the plug? Should these AI recreations of people have the right to destroy themselves? Would we want our relatives to essentially commit suicide like that? Or would you have to pull the plug for them, because to them, pulling it themselves would be suicide, something their personality might never allow?

But imagine what it would actually look like.

Imagine you resurrect your parent and their consciousness is placed in a humanoid robot. You could wake up one day to your late father shoveling the snow outside. Normally you would tell him to come inside, that you can take care of it. But he would probably say, “Absolutely not! I run on a battery, so I have unlimited energy! You stay inside and rest.”

Imagine you recreate the consciousness of 10 different generations of a family and can simulate conversations between them. Imagine witnessing a conversation between your child, your great-grandfather, and your great-great-great-great-grandfather. If this technology ends up existing for several generations, that could be possible. Imagine a family gathering like Christmas or Thanksgiving, but attended by relatives you never met.

You’ll never fully heal.

That person is always accessible. You won’t be able to accept the loss of your parents because you can always talk to them, or to something like them. Part of life is accepting that people pass on, and preventing yourself from doing that could be psychologically disastrous. Grief exists for a reason. It forces you to process, to let go, and eventually to move forward carrying the memory of someone rather than a simulation of them.

The business of the dead.

The company that hosts these bots could control them in subtle ways in order to manipulate you. What if the company hosting the recreation of your mother makes her ask you to buy something for her? The potential for advertisements like that is massive, and they could be very dangerous, because they reach deep into your psychology and pull on your strongest heartstrings. There is no ad more effective than your dead mother asking you for something.

Who owns the dead?

People will probably need legal custody of the data that can be used to recreate someone, in the same way that they have legal custody of a child. You won’t want strangers creating versions of your relatives. What if those versions get used in targeted advertising? There could be an entire legal and insurance industry dedicated to making sure that data can’t be used without consent. Digital estate law could become one of the most important legal fields of the century.

We are not ready for this. And it’s coming anyway.

ai consciousness ethics