But does this mean that you are immortal? Right now you're probably thinking 'hey, no, that program's not me', but what if it is? What if all you are is a series of algorithms that say 'you'll be nice to people' and 'you oppose abortion'? If so, you could easily be controlled by a computer program.
So why haven't we done it already? We're trying. I read in New Scientist that some researchers recently made a program that could simulate some aspects of human personality, but the problem is the sheer number of personality bits we'd have to program to create a full personality. Apparently we wouldn't have enough information even if we asked a person a question every five minutes for their whole life.
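The scale of that problem can be roughed out in a few lines. This is only an illustration, and the figures (16 waking hours a day, an 80-year lifespan) are my own assumptions, not anything from the article:

```python
# Rough arithmetic for the "one question every five minutes" idea.
# The 16 waking hours and 80-year lifespan are illustrative guesses.
answers_per_hour = 60 // 5           # one answer every 5 minutes = 12 an hour
waking_hours_per_day = 16
years = 80

total = answers_per_hour * waking_hours_per_day * 365 * years
print(total)  # 5606400 - about 5.6 million answers in a lifetime
```

Even a lifetime of constant questioning yields only a few million data points, which is apparently still not enough to pin down a full personality.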
So how do we get around this? The only way is to create better algorithms that guess the answers to lots of the questions, based on the answers already given by the person whose personality we're trying to digitalise. For example, if they said they were against eating dogs, you could automatically guess that they were against eating cats. You could also draw on recent surveys of people similar to them to get a general idea of the likely opinion. And you could fill in details wholesale: if they say they're left-wing, you could automatically generate all the standard left-wing opinions and - unless specified otherwise - assume the person holds those beliefs too.
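Here's a minimal sketch of that guessing idea in Python. The rule tables (`IMPLIES`, `ALIGNMENT_DEFAULTS`) and the specific opinions in them are invented for illustration; a real system would need far richer rules:

```python
# Direct implications: holding one opinion suggests holding another.
IMPLIES = {
    "against eating dogs": ["against eating cats"],
}

# Broad alignments expand into a bundle of default opinions,
# applied only when the person hasn't specified otherwise.
ALIGNMENT_DEFAULTS = {
    "left-wing": ["supports higher public spending",
                  "supports stronger unions"],
}

def infer_opinions(stated):
    """Expand a set of stated opinions with implied and default ones."""
    opinions = set(stated)
    # Apply direct implications.
    for opinion in list(opinions):
        opinions.update(IMPLIES.get(opinion, []))
    # Fill in alignment defaults unless explicitly contradicted.
    for opinion in list(opinions):
        for default in ALIGNMENT_DEFAULTS.get(opinion, []):
            if "not " + default not in opinions:
                opinions.add(default)
    return opinions

print(sorted(infer_opinions({"against eating dogs", "left-wing"})))
```

A handful of stated answers expands into a larger set of guessed ones, which is exactly how you'd cut down the millions of questions you'd otherwise have to ask.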
But is that computer actually you? Instinctively, you'd say no. I would; I'd hate to think that a computer could keep living as 'me' after I'm dead. Your identity is your core possession, and you wouldn't want it exploited after you die. For example, what if someone used your digitised personality to run a public robot, like a domestic housebot? If you have a kind personality, they could strip out all the bad bits and make a robot that acts like you, but follows orders.
Forget DNA exploitation; once you put your personality into a digital personality bank, you have no idea what it could be used for. Besides, death is a big part of life. If your death were 'bypassed' by a robot that acted like you for your family, there'd be no grief. Whilst I haven't felt grief properly myself, it seems clear that people can't go on 'living' forever, and you'd want to be known for what you did in your lifetime. Death puts a definite cap on life; a cap that all of us dread and fear, but a cap that gives life a clear beginning, middle and end. Life shouldn't go on forever, especially when you don't know whether your family is going to resurrect 'you' through a computer program. Don't we have the right to die?
It seems I've strayed onto the topic of death again; not because I have some obsession with it, but because it's a definite point in time - yet something so indefinite in its definition - and a very philosophical topic too. Sorry if it's too near the knuckle for you; no offence meant.