The premise stems from philosophy: are the universe and human reality truly external phenomena? The philosopher and mathematician René Descartes hypothesized a demon that presented his bodily senses with a complete illusion of the external world, hiding the true world from him1,2. The Hindu school of Advaita Vedanta proposed that the conscious experiences comprising human reality are the product of a complex illusory power that veils human minds from true reality3. Diverse thinkers have entertained the nature of reality, and epistemologically there is consensus that simulated beings cannot distinguish a “false” reality from a genuine one4. However, contemporary computational physics may suggest otherwise.
In its modern form, the Simulation Argument envisions a technologically mature civilization simulating the universe on computing engines. Nick Bostrom provocatively argued that it may even be likely that our reality is a computer simulation5. He claimed that technology can progress far enough to yield enormous computing power, so one of three propositions must be true: civilizations never acquire such technology, civilizations that do acquire it choose not to run such simulations, or most beings with our kind of experiences live in a simulation6. This leaves a puzzling conclusion: if the universe is simulated, its inhabitants will eventually simulate a universe of their own, whose inhabitants will in turn simulate another, ad infinitum. Hence all beings besides the original simulators would in fact be simulants.
Interestingly, we are already simulating the universe with supercomputers, albeit on a small scale. Salman Habib of Argonne National Laboratory leads a research team investigating the origins of the universe with the most intricate computer simulation yet attempted7. Employing known laws of physics, the Mira supercomputer begins with the Big Bang and tracks trillions of virtual particles and their interactions over 13 billion virtual years7,8. Though not a full-scale rendering of the universe, the Mira simulation and earlier attempts have successfully predicted phenomena observable in our universe7. Since small-scale universe simulations already work with current technology, simulating the entire universe appears to be a question of scale rather than feasibility.
Furthermore, consider trends in technological progress. In 1965, Intel co-founder Gordon E. Moore proposed his eponymous law: historically, the transistor density of integrated circuits doubles roughly every two years9. This pattern implies that computing power grows exponentially. If Moore’s Law holds, ever larger universe simulations become possible. From a technological standpoint, there is substantive reason to ask whether our universe is a simulation.
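To make the compounding concrete, here is a minimal sketch (illustrative arithmetic only, not a claim about actual chip roadmaps) of the growth factor that a two-year doubling period implies:

```python
# Growth factor implied by Moore's Law: density doubles every ~2 years.
def density_multiplier(years, doubling_period=2.0):
    """Factor by which transistor density grows over `years`."""
    return 2 ** (years / doubling_period)

# Two decades of doubling every two years is ten doublings:
print(density_multiplier(20))  # 2**10 = 1024x
```

At that rate, twenty years buys three orders of magnitude in density, which is why extrapolations of simulation capability grow so quickly.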
If the universe is a simulation, are there indicators that simulants can recognize? Philosophers have generally held that, without an external standard, simulant status is impossible to distinguish from non-simulant status4. However, Beane et al. claim otherwise: by assuming the universe is simulated using lattice gauge theory techniques, they sought to predict deviations from a non-simulated universe10.
Silas Beane of the University of Bonn and colleagues argue that no matter the scale of the simulation, whether the behavior of individual atoms or of quarks is calculated, space-time must be imposed on an underlying lattice framework rather than being truly continuous10. Beane et al. calculated the consequences such a grid imposes on physical behavior. They found that a lattice framework by its nature imposes a fundamental upper limit on the energy particles can have, a departure from the predictions of continuum quantum chromodynamics10. This is because nothing in the simulation can be resolved below the scale of the grid itself, and infinitesimally small particles behave differently depending on whether they travel parallel or oblique to the gridlines.
What do these findings mean for the simulation hypothesis? Strangely, there are observed phenomena involving high-energy particles that are not entirely consistent with modern physics. The paradoxical Greisen–Zatsepin–Kuzmin (GZK) limit is a seemingly arbitrary upper limit on the energy of cosmic-ray particles, one that does not fit neatly with quantum gravity theory11,12,13. Interestingly, Beane et al. predicted the GZK limit by assuming an underlying lattice framework10. They also predict that cosmic rays should deflect less when traveling parallel to a lattice boundary.
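As a back-of-envelope illustration (my own arithmetic, not a calculation taken from the paper), one can ask what lattice spacing a would be needed to place a cutoff of the form E_max ≈ πħc/a at the GZK energy of roughly 5 × 10^19 eV:

```python
# If the GZK cutoff (~5e19 eV) were a lattice cutoff E_max = pi * hbar * c / a,
# what lattice spacing a would that imply? Order-of-magnitude sketch only.
import math

HBAR_C_EV_M = 197.327e6 * 1e-15  # hbar*c in eV·m (197.327 MeV·fm)
E_GZK_EV = 5e19                  # approximate GZK cutoff energy in eV

a = math.pi * HBAR_C_EV_M / E_GZK_EV  # implied lattice spacing in metres
print(f"implied lattice spacing: {a:.2e} m")  # ~1.2e-26 m
```

The implied spacing, around 10^-26 m, is many orders of magnitude below any experimentally probed distance scale, which is why such a lattice could so far have escaped direct detection.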
Unfortunately, the simulation hypothesis remains unresolved. If other deviations from the laws of physics predicted by the simulation hypothesis, such as direction-dependent cosmic-ray deflection, are observed, the hypothesis would gain credence. Such anomalies, however, may simply reflect our ignorance of certain phenomena. And if a fundamentally different type of computer were employed for universe simulation, it may be difficult, if not impossible, to distinguish the paradoxical phenomena of a genuine universe from the indicators of a simulation.
- Kennington, Richard M. (2004). “The Finitude of Descartes’ Evil Genius.” On Modern Origins: Essays in Early Modern Philosophy. Lexington Books. p. 146. ISBN 0-7391-0815-8.
- Descartes, R. (1641) Meditations on First Philosophy, in The Philosophical Writings of René Descartes, trans. by J. Cottingham, R. Stoothoff and D. Murdoch, Cambridge: Cambridge University Press, 1984, vol. 2, 1-62.
- Deussen, Paul and Geden, A. S. (2010) The Philosophy of the Upanishads. Cosimo Classics. p. 86. ISBN 1616402407.
- Warburg, B. (1942). The Relativity of Reality. Reflections on the Limitations of Thought and the Genesis of the Need of Causality: by René Laforgue. Translated by Anne Jouard. New York: Nervous and Mental Disease Monographs, 1940. 92 pp. Psychoanal Q., 11:562.
- Bostrom, Nick. (2003) The Simulation Argument: Why the Probability that You are Living in the Matrix is Quite High. Times Higher Educational Supplement, May 16, 2003.
- Bostrom, Nick. (2003) Are You Living In a Computer Simulation? Philosophical Quarterly, 2003, Vol. 53, No. 211, pp. 243-255.
- Habib, Salman. (2012) Cosmic web, multistream flows, and tessellations. Physical Review D, vol. 85, issue 8, id. 083005.
- Jackson, Joab (2012). “United States Commissions Beefy IBM Supercomputer”. PC World. Retrieved 23 August 2012.
- Moore, Gordon E. (1965). “Cramming more components onto integrated circuits” (PDF). Electronics Magazine. p. 4. Retrieved 2006-11-11.
- Beane, Silas R., Davoudi, Zohreh, Savage, Martin J. (2012) Constraints on the Universe as a Numerical Simulation. arXiv:1210.1847. Retrieved 21 October 2012.
- Greisen, Kenneth. (1966) “End to the Cosmic-Ray Spectrum?”. Physical Review Letters 16 (17): 748–750. Bibcode 1966PhRvL..16..748G. doi:10.1103/PhysRevLett.16.748
- Zatsepin, G. T.; Kuz’min, V. A. (1966) “Upper Limit of the Spectrum of Cosmic Rays”. Journal of Experimental and Theoretical Physics Letters 4: 78–80. Bibcode 1966JETPL...4...78Z.
- Smolin, Lee. (2006) The Trouble With Physics. Houghton Mifflin Harcourt. p. 222 (pbk). ISBN 978-0-618-55105-7.
- Image Credit (Public Domain): NASA. “CMB Timeline300 no WMAP.” Wikimedia Commons. Last modified October 26, 2010.
- Image Credit (Creative Commons): Argonne National Laboratory. “IBM Blue Gene P supercomputer.” Wikimedia Commons. Last modified April 4, 2009.