Dec 22, 2012 17:29
The problem with the idea that "The Universe May Be A Simulation" is that it seems to miss several critical issues of simulation. That said, there are some definite points in its favor.
On the "pro" side, our physical universe is highly quantized: energy has discrete states, we have a lower bound for measurable space (the Planck length, and correspondingly the Planck area and Planck volume), an upper limit for the energy within a given unit of space (the Planck energy), and even a minimum measurable unit of time, the Planck time (bet you didn't see that coming). Current simulation theories commonly assume that if the universe is a simulation, it's modeled on an equally spaced 4-dimensional grid whose cells are Planck units in both space and time.
The idea that the universe is an infinite grid of infinitesimal units is not new; it has been around for a while. More recent theories of space-time that attempt to describe this have moved on to regular or irregular lattices of other polyhedra, such as Weaire-Phelan foam, and/or to higher-dimensional lattices. (Space-filling polyhedra in n dimensions for n > 3 are left as an exercise for the reader.)
But I'm not at all convinced that - even if it were determined that the universe is an orderly array of Planck-scale hypercubic cells - we are living in a simulation. Let's look at the issues of simulation.
Simulation involves trade-offs in space (memory) and time (processing power), and there are two more real-world trade-offs: the size of the computation machinery, and the ratio of simulation time to real-world time. Let's address these.
Let's be optimistic here. Let's assume that rather than allocating memory for the entire grid, the simulation keeps the grid sparse and only assigns memory for the particles in the known universe. That's roughly 10^80 particles (a 1 followed by 80 zeros; 1E80 for short), plus the memory required for each particle. Under any reasonable assumption, any given memory cell must have a mass equivalent to at least one fundamental particle, and is more likely to be several orders of magnitude larger. That gives us a lower bound of the mass of the entire visible universe just for the memory of our universe's simulation.
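As a rough sketch of that memory bound (the 1E80 particle count and the use of a proton as a stand-in "fundamental particle" are illustrative assumptions, not measured values):

```python
# Back-of-envelope lower bound on the simulator's memory mass.
# Assumptions: ~1e80 particles in the visible universe, and each
# memory cell masses at least one fundamental particle (here a
# proton, as a convenient stand-in).

PARTICLES_IN_UNIVERSE = 1e80   # common order-of-magnitude estimate
PROTON_MASS_KG = 1.67e-27      # mass of one proton, in kilograms

# One memory cell per simulated particle is the most optimistic case.
memory_mass_kg = PARTICLES_IN_UNIVERSE * PROTON_MASS_KG
print(f"minimum memory mass: {memory_mass_kg:.2e} kg")  # ~1.67e+53 kg
```

That result is on the order of common estimates for the mass of ordinary matter in the observable universe, which is the point of the argument.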
What about computation time? There are information-theoretical limits to computation. An ideal quantum computer could store one bit of information in one particle. I have no idea what the resolution or range of numbers should be for quantum states, but I understand that the simplest real particles are believed to be described by 12-14 quantum numbers. Let's pick an arbitrarily high resolution and assign 128 bits per quantum number. That means the simplest particle would be simulated by 1536-1792 particles. Information theory also tells us that the computational speed of this quantum computer is related to temperature and pressure. I believe the only way you could get anything approaching Planck-scale computational rates out of a quantum computer would be in a black hole, but my grasp of information theory is too weak to calculate what the ideal computation rate of realistic materials would be.
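That arithmetic can be spelled out directly (128 bits per quantum number is the arbitrary resolution chosen above, and 1 bit per particle is the ideal-quantum-computer assumption):

```python
# How many storage particles does one simulated particle need?
# Assumptions from the text: 12-14 quantum numbers per particle,
# 128 bits per quantum number, 1 bit stored per physical particle.

BITS_PER_QUANTUM_NUMBER = 128
BITS_PER_STORAGE_PARTICLE = 1  # ideal quantum computer

for quantum_numbers in (12, 14):
    bits = quantum_numbers * BITS_PER_QUANTUM_NUMBER
    storage_particles = bits // BITS_PER_STORAGE_PARTICLE
    print(f"{quantum_numbers} quantum numbers -> {storage_particles} storage particles")
# 12 quantum numbers -> 1536 storage particles
# 14 quantum numbers -> 1792 storage particles
```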
Long story short: simulation always trades time and space for the ability to control the parameters of your simulation. Simulating the universe would require something orders of magnitude more massive than the universe itself.