BoA Physics
by wjw on September 17, 2016
Friend of the blog Michael Wester tells me that noted science foundation Bank of America has issued a report to its clients stating that there is a 20-50% chance we are living in a Matrix-style simulation. (Where are my cool shades and shiny leather coat?) The report cites Elon Musk, Neil deGrasse Tyson, philosopher Nick Bostrom, and others.
You may well wonder why the Bank of America takes such an interest in the question. Perhaps they want to assure their clients that any money lost by BoA never really existed.
Still, you may remember my Implied Spaces, where I suggested that the existence of cosmological constants would point to the artificiality of the universe, because such a constant would just be an arbitrary number or formula inserted by the creators to make their experiment work the way they wanted it to.
Still, “artificial” is not the same thing as “simulation.” To run a computer simulation of the entire universe would run up against the Bekenstein Bound, which limits the amount of information that can be contained in a finite region of space holding a finite amount of energy. To describe a single hydrogen atom takes roughly a megabyte of information, so to describe the actions and interactions of all particles in the universe would take . . . well, a heck of a lot of megabytes. Any computer big enough to run a simulation of the universe would have to be larger than the universe itself, and with a lot more energy.
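(A quick sanity check on that megabyte figure, with the caveat that this is my own back-of-the-envelope reading of the bound, not anything stated in the post: take R as the Bohr radius and E as the hydrogen atom’s rest-mass energy, and the arithmetic looks like this.)

```python
# Rough Bekenstein-bound estimate for one hydrogen atom:
#   N <= 2*pi*R*E / (hbar * c * ln 2)   [bits]
# Assumptions: R = Bohr radius, E = rest-mass energy of the atom.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
R = 5.29177e-11          # Bohr radius, m
m_H = 1.6735575e-27      # mass of a hydrogen atom, kg

E = m_H * c**2                                         # rest energy, J
bits = 2 * math.pi * R * E / (hbar * c * math.log(2))
print(f"{bits:.2e} bits, about {bits / 8 / 1e6:.2f} MB")
# prints roughly 2.3e6 bits, i.e. a few hundred kilobytes
```

That comes out to a few hundred kilobytes, the same order of magnitude as the megabyte figure, which is close enough for the “heck of a lot of megabytes” conclusion either way.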
Personally I think we’re all living in the Hardwired universe, albeit a few years before the action of the book begins.
Just curious how you came to the 1-MB-per-hydrogen-atom value? Also, a simulation of the entire universe wouldn’t be necessary; you wouldn’t even need a simulation of the entire earth. All you’d need to do is pay close attention to the observational capability of all individual humans and their instruments, and increase the functionality of your system at the time of those observations.

If, for one example, someone wants to take high-resolution scans of a small part of the night sky, then the simulation would have to radically increase its immediate capability to handle that measurement. Once the measurement is over, the extra processing power and storage could be released, and some sort of snapshot of the information at that point could be kept for the next time a similar measurement is made (a rough sketch of this scheme follows after this comment). If someone measures subatomic particles, then the same thing could apply. Nobody is constantly measuring, at high resolution, everything that’s happening in, say, the center of the earth, simply because we haven’t gotten that far technologically speaking.
Of course, this proves nothing for or against the possibility of us being in a big simulation or a hardwired universe, but I suspect it would be a much easier way to make it feasible than trying to constantly simulate every particle at every point, everywhere.
Just my own meandering thoughts. Take it with a grain of salt – I’m no genius.
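A minimal sketch of the commenter’s “only refine what’s being observed” scheme, purely for illustration (the Region class and every name in it are invented here, and nothing in it comes from the post or the BofA report): regions sit at a coarse default, get refined only while a measurement is in progress, and keep a snapshot so later measurements stay consistent.

```python
# Purely illustrative sketch of observation-driven level of detail.
# Nothing here is from the post; all names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Region:
    name: str
    detail: str = "coarse"                        # cheap default description
    snapshot: dict = field(default_factory=dict)  # cache of past measurements

    def observe(self, resolution: str) -> dict:
        # Spin up full fidelity only for the duration of the measurement.
        self.detail = resolution
        result = {"region": self.name, "resolution": resolution}
        # Keep a snapshot so the next, similar measurement agrees with this one,
        # then drop back to the cheap coarse description.
        self.snapshot = result
        self.detail = "coarse"
        return result

sky_patch = Region("small patch of night sky")
print(sky_patch.observe("high-resolution scan"))  # fidelity spikes only here
print(sky_patch.detail)                           # back to "coarse" afterward
```

The point is just that the expensive state exists only while someone is looking; the open question, as the reply below notes, is knowing where everyone is looking at all times.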
A Hardwired world… oh, crap. Yeah, I’ve been noticing for a few years now that we seem to have transitioned to a cyberpunk dystopia, and I Am Not Amused, as the Lady said. It *sure* ain’t the future I expected my kids to have.
Oh, and this is NOT the Real 21st Century, I want the REAL one back, NOW, thankyouveddymuch. And where’s the website that I can buy my ticket on the PanAm shuttle to the Wheel?
mark “next year in orbit”
Mason, that particular statistic came from an article by Stephen Baxter in an anthology called Exploring the Matrix, to which I also contributed.
The problem with simulating only those parts of the universe that someone is viewing at the time is that you have to know where everyone is looking at all times, and if they’re looking =anywhere else,= the whole charade becomes obvious.
In fact, why bother to create a huge simulation when the sinister tricksters behind this all can make use of brains-in-bell-jars, as in Descartes’ original thought experiment, and as in the Matrix movie, for that matter. Flooding a single mind with false data is a lot more cost-effective in terms of computing power, or so I’d imagine.
One more thing that I thought of last night: if it’s all a simulation, then why do people die, for no good reason, at like 36, or 43, or….? Or why are MS and diabetes and obesity here – they don’t make good-looking characters….
mark
Reasonable, WJW. I’m just trying to assume true “free will” on the part of the simulated beings but full fidelity in their field of view. You’re correct that it would take a massive amount of processing power, but I’m trying to reduce it to the minimum necessary to at least make it feasible. Of course, whoever created the simulation may have broken the Moore’s Law barrier and have sufficient power, memory, and processing capability to make this as large (or as subatomic) a simulation as they want or need.
In graduate quantum electrodynamics classes, I had a discussion about it with my professor, and it’s absolutely a possibility that our universe exists simply because it happened to be the one that popped up out of the void and had self-aware beings there to collapse their own wave functions, forcing the state to hold instead of dropping away. Of course, that has nothing to do with your initial question, but it’s just an interesting side topic.
And for anyone reading this, just a jumping-off point for research outside of BofA:
http://www.simulation-argument.com/simulation.html