
"Assume the speed of light (or ...) is the processing speed limit for the computer which simulates the universe."

Wholly unfounded assumption. If the universe is a simulation, there's no reason why any particular constant would be the processing speed limit.

Indeed, even talking about processing speed limit is probably an incoherent concept when considering the question of whether the universe is a simulation. If our universe is a simulation, then some other entity is simulating our universe. We know absolutely nothing about the world of that entity, and have no grounds to conclude or assume anything whatsoever about the machinery of the simulator.

The egregious mistake in this "test" is the implicit assumption that the machinery some entity (in a universe we know nothing about) would use to simulate our universe would be a computer of the same basic architecture as the one sitting in your bedroom.



Moreover, the simulation speed would not be related to any "speed" inside the simulation. If the processing speed decreases, the simulation slows down, but nobody inside could possibly notice it. If you stop the simulation, everything stops, everyone is "frozen", and there's no way to detect something like that from inside the simulation.


Exactly. Maybe calculating one iteration of our universe normally takes a second for the "outside computer", but because of very heavy load it now takes a year. We'd be none the wiser.
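
The point above can be illustrated with a toy sketch (my own illustration, not anyone's actual model): internal "time" advances one tick per iteration, so the history observed from inside is identical no matter how long the host takes per step.

```python
import time

def run_simulation(steps, host_delay):
    """Toy 'universe': internal time advances one tick per iteration,
    independent of how long the host spends on each iteration."""
    history = []
    internal_time = 0
    for _ in range(steps):
        time.sleep(host_delay)  # host load: fast or slow, the inside can't tell
        internal_time += 1
        history.append(internal_time)
    return history

# Same internal history whether the host is idle or heavily loaded:
fast = run_simulation(5, 0.0)
slow = run_simulation(5, 0.01)
assert fast == slow  # inside observers see identical ticks: [1, 2, 3, 4, 5]
```

Only the wall-clock time of the outer machine differs between the two runs; no quantity defined inside the simulation changes.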


It's possible to form a hypothesis about the nature of the machine simulating our universe. An experiment could therefore provide corroboration for that specific hypothesis.

It then becomes a case of deciding how strong the corroboration is and applying liberal slashes with Occam's Razor to see if an alternate hypothesis also fits the results.



