Let’s do a simple thought experiment. Assume that we currently have the hardware and software to build ourselves a Matrix-like virtual reality simulation that runs on your home PC. It is not an MMO system like the Wachowskis’ Matrix though – it is a single-player world, but populated with very advanced, Turing-grade AIs that are almost indistinguishable from real people. In fact, the simulation cheats a little bit by peeking at your long-term memories. So for example it can realistically simulate a meeting with your old college buddy – and when you say “remember that time we got drunk and puked all over John’s car” it will actually look up that event, and extrapolate your friend’s reaction based on it.
The whole simulation is built to be 100% realistic. If you cut yourself, you will feel genuine pain. If you die in the simulation, you probably won’t die IRL, but it will feel real right up until the last second. You may even go into shock because of the experience.
Let’s say one day you wake up and decide to simulate your life without any modifications. So you plug yourself into your machine, launch the VR program, and find yourself back in your room. You walk up to your computer, plug yourself in, launch another VR simulation, and you are back in your room again. Assume that the hardware can take it – the simulation optimizes everything, so even though you recursively jacked in 8 or 9 times, it is only running one layer (I mean, there is nothing happening in all the other virtual worlds where you are just sitting in a chair with a wire sticking out of your head).
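That optimization is a bit like tail-call elimination: since every outer world is frozen to “person in chair”, the simulator doesn’t need to keep them running – a depth counter is enough. Here is a minimal, entirely hypothetical sketch of that idea (the class, method names, and exit phrase are all made up for illustration):

```python
# Hypothetical sketch: nested jack-ins don't each cost a full world.
# Only the innermost layer is simulated; every outer layer is just
# "you, motionless in a chair", so the stack collapses to a counter.

class Simulation:
    def __init__(self):
        self.depth = 0  # how many layers deep you have jacked in

    def jack_in(self):
        # Instead of spawning a whole new world, just bump the counter.
        self.depth += 1

    def try_exit(self, gesture: str) -> bool:
        # The exit routine must be performed exactly; sloppy input
        # is ignored (the software errs on the side of caution).
        if gesture != "there is no place like home":
            return False
        if self.depth > 0:
            self.depth -= 1
        return True

    def in_real_world(self) -> bool:
        # The catch of the whole thought experiment: *you* never
        # get to call this function from inside.
        return self.depth == 0

sim = Simulation()
for _ in range(9):  # jack in 8 or 9 times without counting
    sim.jack_in()
```

The punchline is the last method: the simulator knows the depth, but the person in the chair has no read access to it.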
Repeat this a bunch of times, going deeper and deeper into the simulated world. Don’t count how many times you do it. Stop at a random point, and start going back out. In this VR you don’t have an overlay HUD or any user interface – to exit you have to do something specific: for example, click your heels 3 times while saying “there is no place like home”. Or just concentrate on the word “QUIT” real hard. Whatever – the point is that it is not a mechanical thing – it is an action or process, and it must be performed very exactly. If you are sloppy, the simulation may ignore the input (too many people were annoyed when they got booted out of their masturbatory virtual fantasies due to falsely interpreted input, so the software was fine-tuned to err on the side of caution).
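“Erring on the side of caution” would presumably mean something like a very high similarity threshold on the input parser: anything short of a near-perfect performance gets treated as noise. A toy sketch, assuming a made-up phrase and a made-up strictness value:

```python
from difflib import SequenceMatcher

EXIT_PHRASE = "there is no place like home"
# Hypothetical tuning: after too many accidental exits, anything
# short of a near-perfect performance is ignored.
STRICTNESS = 0.98

def attempt_exit(performed: str) -> bool:
    """Return True only if the exit routine was performed exactly enough."""
    score = SequenceMatcher(None, EXIT_PHRASE, performed.lower()).ratio()
    return score >= STRICTNESS

exact = attempt_exit("there is no place like home")   # perfect performance
sloppy = attempt_exit("theres no place like home")    # one slurred word
```

Note the asymmetry: a false positive (an accidental exit) annoys the user once, but a false negative (a real exit attempt silently ignored) is exactly what makes the question below so uncomfortable.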
Here is the question: given the imperfect exit routine, the fact that you were not counting how many times you jacked in, and that there is no user interface whatsoever, how do you know you are out of the simulation and in the real world?
I mean, how many times do you try the exit routine before you assume you are out? What if there is a parser glitch? What if you are somehow messing it up each time? What if someone accessed your machine while you were jacked in, and changed the exit routine to something else?
How do you know if the splitting headache or stomach pain is just an ordinary migraine or indigestion, or a symptom of your body slowly getting dehydrated or starving in the real world? How often do you check and perform the exit routine?
I have another scenario for you, which is a variation on the one above. Let’s say we have Total Recall-like technology that can give you false memories that are nearly indistinguishable from real ones. It is a bit more advanced than that, however, since the memories don’t follow a rigid script but rather are built by a powerful AI director, based on your personality, memories, and behavioral traits. Just like the VR technology above, the simulator can sift through all of your memories and dig out useful bits to build a very convincing and very real simulation.
You can of course specify parameters and a desired outcome. It can be anything from spending the night with that girl you have a crush on, to becoming an astronaut and walking on the surface of Mars. The AI will take the starting point and the desired outcome and figure out a realistic scenario in which you end up achieving that outcome. It is not a real-time simulation, and you are not actually in control – the AI director manufactures the whole thing and makes the decisions for you to maximize the impact of the simulated scenario. This means you can’t possibly fuck up your dream date, or mess up that big audition that lands you a kick ass role in a great movie and makes you a super star. When you later recall these memories, it will feel like you made all the relevant choices, even though you didn’t. The whole process of implanting memories takes only a couple of minutes, even though you may get a whole lifetime of experiences that way.
So you buy yourself a fake memory implant, and you set it up to simulate a whole new life for you. It will start at early childhood, and then proceed until you are about 30 years old. You set the career choice to be random, and maximize random encounters and entropic events to make this experience as different and unexpected from your current life as possible. You also require that the last fake memory implanted is one in which you go and purchase a fake memory implant with the exact same set of instructions, and run it.
Assuming that the memory machine is perfectly capable of such recursion and it doesn’t crash or run out of resources, what will happen when you start the implant process? Since you are running this experiment at home, and no one will actually check up on you until some time the next day, you may actually get hundreds if not thousands of lifetimes recursively implanted. Each one feeds off the experiences and desires of the last one, and so the personality you assume in these memories starts diverging from your original one quite rapidly.
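The divergence can be pictured as a random walk: if each implanted lifetime nudges every personality trait a little, hundreds of lifetimes compound into a large drift. A toy model (the trait vector, drift magnitude, and depth are all invented numbers, not claims about real minds):

```python
import random

def run_implant(personality, depth, max_depth):
    """Recursively implant lifetimes. Each lifetime perturbs the
    personality the AI director feeds into the next one, so the
    final personality drifts away from the original."""
    if depth == max_depth:  # someone finally finds you and unplugs you
        return personality, depth
    # each simulated lifetime nudges every trait by a small random amount
    drifted = [t + random.uniform(-0.2, 0.2) for t in personality]
    return run_implant(drifted, depth + 1, max_depth)

random.seed(42)                      # reproducible toy run
original = [0.5, 0.5, 0.5]           # hypothetical trait vector
final, lifetimes = run_implant(original, 0, 500)
```

Even with tiny per-lifetime nudges, a random walk this long typically wanders far from its starting point – which is the “who will you be?” question in miniature.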
What happens when someone finally finds you and disconnects you? Who will you be? Will you even remember your original self, or will you be mostly a product of the AI director that orchestrated your memories?
I guess to answer such a question we would need to know the actual storage capacity of the human brain. Given such a torrent of new experiences, what would happen to the old ones? Would you completely forget your original life? Would you only remember the last few iterations? Let’s assume that there is no way to distinguish the implanted memories from the real ones – they are biologically identical and carry the exact same amount of emotional weight and meaning.
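One crude way to model the “only the last few iterations survive” possibility is a fixed-size buffer that silently evicts the oldest entries. The capacity figure here is completely made up – it just makes the eviction visible:

```python
from collections import deque

# Made-up assumption: the brain holds only the N most recent
# lifetimes' worth of memories; older ones get overwritten.
BRAIN_CAPACITY = 5  # lifetimes' worth of storage (arbitrary number)

memories = deque(maxlen=BRAIN_CAPACITY)
memories.append("original life")
for i in range(1, 1001):  # a thousand implanted lifetimes
    memories.append(f"implanted lifetime {i}")

still_you = "original life" in memories  # overwritten long ago
```

Under this model the answer is bleak: after a thousand iterations the original life is not merely hard to recall, it is simply no longer stored.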
Anyways, those are my two interesting scenarios for you to ponder today.