I've always wondered if all the weirdness that occurs at small scales, where classical mechanics breaks down, is just the effect of a sort of spatial aliasing: the continuous universe being undersampled onto some kind of discrete substrate.
> However, Savage said, there are signatures of resource constraints in present-day simulations that are likely to exist as well in simulations in the distant future, including the imprint of an underlying lattice if one is used to model the space-time continuum.
They did their experiment and did not find anything that would indicate an underlying lattice, but I can't find the paper right now.
I'm also ignorant of (though extremely curious about!) real physics, and have had the exact same thought since learning audio DSP principles! I'm always amazed that quantum mechanics was developed before Claude Shannon's theory of information and Hartley and Nyquist's sampling theory. I guess the question is whether Planck time is an actual physical limit or "just" about measurement.
In general, I'm curious how devices that detect/measure things at super small scales can have both very high accuracy ratings and very high confidence.
Presumably these devices measure things that were previously unmeasurable - or at least not measurable with as good accuracy.
I mean, I get that we have hypotheses and reason to believe nature is going to behave in some way. Then you build the device to measure it, it comes within some range that's not surprising, and it's in line with previous devices that weren't as accurate.
If you're building a conventional scale - it just seems more reasonable that you can have high confidence and high accuracy because the stuff you're measuring is big enough to physically see and interact with and there's almost limitless things you could use to cross-reference, etc.
Is there an ELI5 for how you can measure subatomic things with ridiculously high accuracy and confidence?
I guess I'm just in complete awe of how this is possible - not doubting that it is.
At a large scale it is also very difficult to achieve high accuracy. Simply defining a meter or kilogram is a high-difficulty task; however, there is a canonical "kilogram" that you can go and visit in France, along with a carefully maintained set of copies built from the canonical "kilogram", which in turn are used to make the calibration weights we all work with. We then measure/calibrate the accuracy of any weight-measuring system by how well it tracks the canonical kilogram.
Similarly, for any "new" measurement, there will be a number of "calibration" measurements performed to ensure that the results are in-line with other measurements of well known things.
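A minimal sketch of that calibration-chain idea, with every number and name invented (an illustration, not real metrology): a scale is compared against reference weights that are themselves traceable to the canonical kilogram, and the fitted correction is then applied to new readings.

```python
# Toy calibration chain (all numbers invented): fit a multiplicative correction
# for our scale from reference weights, then apply it to new readings.

reference_masses = [1.000000, 2.000000, 5.000000]   # certified values, kg
scale_readings   = [1.000213, 2.000431, 5.001044]   # what our scale reports, kg

# Least-squares estimate of the scale's multiplicative error.
scale_factor = sum(r * m for r, m in zip(scale_readings, reference_masses)) \
             / sum(r * r for r in scale_readings)

def calibrated(reading_kg: float) -> float:
    """Map a raw reading to a value traceable to the reference weights."""
    return reading_kg * scale_factor

print(f"scale factor: {scale_factor:.6f}")
print(f"a raw 3.000650 kg reading becomes {calibrated(3.000650):.6f} kg")
```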
Ultimately everything has to be convertible to everything else. Kinetic energy must be convertible to particles, and even time must be inherently material if it can interact with space and matter.
If spacetime is continuous you effectively get infinite precision -> information density - at every point (of which there are an infinite number).
This seems unlikely for a number of reasons.
This doesn't mean spacetime is a nice even grid, but it does suggest it comes in discrete lumps of something, even if that something is actually some kind of substrate that holds the information which defines relationships between lumps.
> If spacetime is continuous you effectively get infinite precision -> information density - at every point (of which there are an infinite number).
This would be true if objects existed at perfectly local points. However, we know that a perfectly localised wavefunction has spatial frequency components that add up to infinite energy; any wavefunction with finite energy is band-limited. At non-zero temperature the Shannon-Hartley theorem gives a finite bit-rate density over frequencies, and since the wavefunction is band-limited it can therefore only carry a finite amount of information.
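A back-of-the-envelope illustration of the Shannon-Hartley bound (not a physics calculation; the bandwidth and SNR below are arbitrary): as long as the band and the signal-to-noise ratio are finite, so is the bit rate.

```python
import math

# Shannon-Hartley: a channel of bandwidth B (Hz) with signal-to-noise ratio SNR
# carries at most C = B * log2(1 + SNR) bits per second -- finite as long as
# both B and SNR are finite. The numbers below are arbitrary.

def capacity_bits_per_s(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1.0 + snr_linear)

B   = 1.0e9   # 1 GHz of bandwidth (arbitrary)
SNR = 1.0e3   # ~30 dB signal-to-noise ratio (arbitrary)

print(f"capacity ~ {capacity_bits_per_s(B, SNR):.3e} bits/s")
# Each extra bit of precision per sample costs roughly 6 dB more SNR,
# so "infinite precision" would need infinite signal above the noise.
```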
This and the comment under it are making my point for me. Relativity assumes spacetime is continuous. Quantum theory implies that quantum phenomena are bandlimited and therefore the information spacetime can hold is limited.
The difference is we know what the quantised components of field theory are. We don't have any idea what the quantised components of spacetime are supposed to be, or how they operate.
The various causal propagation theories (like causal dynamical triangulation) may be the first attempts at this, but it's going to be hard to get further without experiments that can probe that level - which is very difficult given the energies involved. Without that, we're just guessing.
I often wonder if we simply constructed math the wrong way around.
People tend to mentally construct the natural numbers from set theory, the integers from the naturals, the rationals from the integers, the reals from the rationals, and so on.
But what if there is some universe (in the math sense), which is actually complete and decidable, it's just that the moment you take discrete subsets of it, you also remove the connections that make it consistent or complete.
The very act of formalising mathematical concepts into words and paper is a quantisation step after all, because both are symbols.
Maybe there are proofs that can be intuited (assuming brains are continuous in some sense) but neither verbalised nor formalised.
> But what if there is some universe (in the math sense), which is actually complete and decidable, it's just that the moment you take discrete subsets of it, you also remove the connections that make it consistent or complete.
Presumably this hypothetical universe implements arithmetic, so it’s not complete.
Incompleteness means that there are true statements for which there are no proofs.
But that doesn't preclude proofs that are beyond the proof system you proved incompleteness for.
A non-discrete/symbolic proof might exist after all.
It would therefore not be the existence of the natural number subset that causes undecidability, but the missing parts of the non-natural superset required to talk about the proofs that cause undecidability.
The incompleteness proof arrives at a Russell-paradox-like contradiction, caused by a sentence of the form "I am not provable" encoded via Gödel numbering onto Peano arithmetic.
But proof by contradiction itself is problematic, because it relies on the law of the excluded middle, which only holds in classical two-valued logic.
If you construct math from the top down rather than from the bottom up, then it is natural that it also has a multi-valued logic.
In fact, it would also have infinite-valued logic. Infinite sentences, infinite theorems.
Such math is inexpressible for us, because we rely on discrete descriptions.
But that doesn't preclude its existence.
Why should it though? What evidence do we have, if we can't express it or fit it into our existing mathematical framework.
To illustrate, think of Gödel's approach and turn it backwards for a second.
Instead of taking predicate logic and assigning each sentence a natural number, imagine that predicate/classical logic is a different view on the natural numbers.
Now that means there might also be a logical interpretation of the reals, the hyperreals, and so on. (We could do the same with the alephs; in fact, they might be the more fundamental objects.)
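To make the forward direction concrete, here's a toy Gödel-style numbering (the symbol table and encoding are made up, and real constructions are more careful, but the idea is the same): every formula becomes a natural number, and unique factorisation gets you back.

```python
# Toy Gödel numbering: map each symbol to a small integer code, then encode a
# formula s1 s2 ... sn as 2^c1 * 3^c2 * 5^c3 * ...  (symbol table is invented).

SYMBOLS = {'0': 1, 'S': 2, '=': 3, '+': 4, '(': 5, ')': 6, 'x': 7}
PRIMES  = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

def godel_number(formula: str) -> int:
    n = 1
    for p, symbol in zip(PRIMES, formula):
        n *= p ** SYMBOLS[symbol]
    return n

def decode(n: int) -> str:
    # Unique factorisation lets us recover the formula from its number.
    inverse = {code: sym for sym, code in SYMBOLS.items()}
    out = []
    for p in PRIMES:
        exponent = 0
        while n % p == 0:
            n //= p
            exponent += 1
        if exponent == 0:
            break
        out.append(inverse[exponent])
    return ''.join(out)

g = godel_number('S(0)=x')
print(g, decode(g))   # every formula becomes a natural number, and back again
```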
Indeed! I suspect/realize that the incompleteness proofs are in fact artifacts of our discrete symbolization - a symbolization that has nonetheless been incredibly effective for reasoning about continuous 'phenomena'.
Perhaps that is exactly what mathematics is? and all it can be? Or perhaps there is a 'higher mathematics' that we cannot reach yet? (or ever?)
Yeah, it's frustratingly difficult to investigate these avenues of thought, because they are almost by definition out of our grasp. Even worse, the barrier to esoteric pseudo-science is very thin.
How do you know arithmetic is not complete or consistent? In ZFC arithmetic is complete and arithmetic is consistent. This assumes that ZFC is consistent. The second-order Peano axioms are categorical, so I assume you mean only the first-order theory.
At any rate, what does any of this have to do with information capacity in the universe? Is the information capacity of the universe related to the consistency/completeness of arithmetic?
> How do you know arithmetic is not complete or consistent? In ZFC arithmetic is complete and arithmetic is consistent. This assumes that ZFC is consistent.
Aren’t you just begging the question by assuming ZFC is consistent to demonstrate that arithmetic is consistent?
I don’t think so. The point is, saying “arithmetic is inconsistent” doesn’t mean anything without talking about where this theory resides. The larger point is that this has no relationship to whether or not the universe is infinite so it shouldn’t be talked about at all within that context.
Suppose for a moment that it makes sense to say arithmetic is part of the fabric of the universe (whatever that is supposed to mean). How would one know if arithmetic is consistent or complete within the context of being part of the universe? Suppose ZFC is part of the universe. Then arithmetic (the model of it as being part of the universe) is complete and consistent. Now what? I claim nothing of consequence follows from this in relation to whether or not space is continuous.
Arithmetic is complete in ZFC. Well, in each model of ZFC sits a model of arithmetic (the first-order Peano axioms) and that model is complete. The incompleteness theorem doesn’t apply in this case because the incompleteness theorem is a statement about the first-order Peano axioms on their own, not about the situation in which they reside inside a larger theory (which in our case is ZFC). The Peano axioms are not able to prove their own completeness, but if they reside in a larger theory then that larger theory may be able to prove their completeness.
If ZFC (or some other theory) implements arithmetic, then the first incompleteness theorem says that if ZFC is consistent then there must be true sentences in ZFC (not necessarily sentences of arithmetic) that can’t be proved in ZFC. Correct?
Yes! ZFC can’t prove its own consistency or completeness, but a larger theory can do this. I think ZFC + an inaccessible cardinal can prove ZFC is consistent.
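For reference, the standard statements being leaned on here, paraphrased from memory:

```latex
% Gödel II, paraphrased:
T \text{ consistent, recursively axiomatizable, interprets PA}
  \;\Rightarrow\; T \nvdash \operatorname{Con}(T)
% but one level up:
\mathrm{ZFC} \vdash \operatorname{Con}(\mathrm{PA}), \qquad
\mathrm{ZFC} + \text{``an inaccessible cardinal exists''} \vdash \operatorname{Con}(\mathrm{ZFC})
```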
I had two points in these posts. One is that none of this pertains to whether or not space is continuous. The other is that the statement “arithmetic is consistent” is provable in some contexts. It depends on which theory one is actually dealing with. In PA it’s not provable, but in ZFC it is.
If the universe “contains” a model of PA then is that model consistent or not? How does one know? (I doubt it’s meaningful to say that the universe contains PA though.)
> I had two points in these posts. One is that none of this pertains to whether or not space is continuous.
To me, “space is continuous” seems like a proposition that must be demonstrated a posteriori and I wouldn’t expect properties of formal systems to serve as evidence for the claim. So I agree.
> The other is that the statement “arithmetic is consistent” is provable in some contexts.
Agreed.
> If the universe “contains” a model of PA then is that model consistent or not? How does one know? (I doubt it’s meaningful to say that the universe contains PA though.)
Okay, yes, how can we interact with or measure PA as implemented by the universe? How do we (or can we) meaningfully talk about the universe implementing PA?
Yes those questions seem interesting to me, but I don’t have anything intelligible to say about them.
> If spacetime is continuous you effectively get infinite precision -> information density - at every point (of which there are an infinite number).
Even if space is continuous, that doesn't mean we can get information in and out of it in infinite precision.
Look at quantum physics. Maxwell's equations don't suggest the existence of photons (quantized information). But atoms being atoms, they can only emit and absorb in quanta.
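As a tiny worked example of what "in quanta" means numerically (constants rounded, the wavelength picked arbitrarily):

```python
# Energy of a single photon: E = h * f = h * c / wavelength.
# Constants rounded; 500 nm (green light) chosen arbitrarily.

h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
wavelength = 500e-9  # 500 nm, in m

E = h * c / wavelength
print(f"one 500 nm photon carries ~{E:.2e} J (~{E / 1.602e-19:.2f} eV); "
      f"an atom emits or absorbs whole photons like this, never fractions of one")
```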
Obviously that "something" is the float type used to calculate the simulation. Probably some ultra dimensional IEEE style spec that some CPU vendor intern booked anyways. ;)
I'm pretty sure my math is right here, so… it would be more appropriate to say they failed to shell out for the 256-bit processor, because at 256 bits per int, a vector of 3 ints can easily encode any location in the observable universe as Planck-length coordinates.
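A quick sanity check of that arithmetic, using rounded figures for the diameter of the observable universe and the Planck length:

```python
import math

# How many bits does one Planck-length-resolution coordinate need to span the
# observable universe? Rounded figures; only the order of magnitude matters.

observable_universe_m = 8.8e26     # ~diameter of the observable universe, m
planck_length_m       = 1.616e-35  # Planck length, m

steps = observable_universe_m / planck_length_m
bits_per_axis = math.log2(steps)

print(f"~{steps:.2e} Planck lengths across -> ~{bits_per_axis:.0f} bits per axis")
# ~205 bits per axis, so a 3-vector of 256-bit ints has room to spare.
```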
Ya, but the observable universe only exists for cache-locality reasons. You don't want to have to transfer information across too many processors, as you're bus-bandwidth limited. The actual universe is much larger.
We should be careful then not to overload the 256-bit range. If we probe too closely at the QM level or use too many quantum computers, it might overload the local processor node and crash it. It'd be a bad day for everyone.
Someone else asked about their hand moving in a pixelated versus continuous way in reality, and it occurred to me that if spacetime were discrete that would be a good reason for entanglement, intuitively speaking. Otherwise there wouldn't be an obvious way for information to be transferred across the discrete points? Maybe I'm wrong about this but it seems that way on first thought.
That doesn't mean anything has to be a particular way, but it at least would be intuitively consistent to me.
When I think about that, I wonder if that 'quantization of space' is what determines the speed of light. And perhaps explains why inertial and gravitational mass are identical.
Since gravity is the curvature of spacetime, quantizing gravity would mean quantizing spacetime (or quantizing geometry) also, which would lead to there being smallest units of space and time, perhaps somewhere around the Planck length and Planck time.
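For scale, those Planck units fall straight out of the constants. A quick back-of-the-envelope (constants rounded):

```python
import math

# Planck length l_P = sqrt(hbar * G / c^3), Planck time t_P = sqrt(hbar * G / c^5).
# Constants rounded to a few significant figures.

hbar = 1.055e-34   # reduced Planck constant, J*s
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s

l_planck = math.sqrt(hbar * G / c**3)
t_planck = math.sqrt(hbar * G / c**5)

print(f"Planck length ~ {l_planck:.3e} m")   # ~1.6e-35 m
print(f"Planck time   ~ {t_planck:.3e} s")   # ~5.4e-44 s
```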
Look at it with the analogy of vinyl records and digital representations of that music: you can reach a point at which you are quantizing something in a way that represents the original with no discernible difference (maybe 192 kHz 32-bit float for some, but still quantized).
Equally, you hit limits in human perception, and even more so in technology/physics.
Maybe the universe is always N+1, with N being the best sampling rate known to man. Sure, we can infer, but when you want to know the answer to the exact decimal point, sometimes you have to accept that 1/3 is 1/3 and never exactly 0.333333333333, however much recurring you have.
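To put rough numbers on the vinyl/digital analogy (the signal and bit depths below are arbitrary): quantization error shrinks with bit depth but never vanishes, and some exact values like 1/3 are never represented exactly at any finite precision.

```python
import math

# Two things the analogy leans on, with arbitrary numbers:
# 1) quantizing amplitude introduces an error bounded by the step size;
# 2) some exact values (like 1/3) are never stored exactly at finite depth.

def quantize(x: float, bits: int) -> float:
    levels = 2 ** (bits - 1)          # signed uniform quantizer
    return round(x * levels) / levels

x = math.sin(1.0)                     # an arbitrary "analog" sample in [-1, 1]
for bits in (8, 16, 24):
    print(f"{bits:2d}-bit quantization error: {abs(x - quantize(x, bits)):.2e}")

print(f"1/3 stored as a double: {1/3:.20f}  (close, never exact)")
```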
That was my point: just because you can only perceive up to one limit does not mean there is nothing beyond it, like ultrasound. Anyhow, it was an analogy about perception, sample rates, and what you're measuring. Clearly I didn't make that clear, and sorry for that.
Given that the real number line exists (as a mathematical construct), and has powerful properties (as a mathematical construct), it would be surprising if 'nature' did not take advantage of it.
If space is relative (i.e. not a separate thing in itself), which it seems to be, then there is a shortest possible distance, and that delimits what a chunk of space is.
One thing that's always irritated me about popular explanations of string theory, like Brian Greene's, is that they'll use phrases like "It turns out that..."
I think, "Wait a second! Nothing has 'turned out', because there are no experiments."
So now there will be. I applaud this effort. If something is untestable, then ignorant people who equate "faith in religion" with "faith in science" are right. Let's get some data that can only be explained by Theory X, and then see if Theory X predicts more things that also turn out to be true. If Theory X turns out to be one of the many variations of string theory, then you've got something there.
Yeah, I agree, I don't think it's a good construction even in the field. It's always clearer and less annoying for the reader if it says "it follows from some math that" or something like that.
In the same vein it bugs me when physicists “explain” the interior of black holes. Until we have a quantum theory of gravity we really don’t know what’s inside a black hole.
Question in this regard: current theory seems to predict a singularity with “infinite” space-time curvature (IIUC) at the center of a black hole. This seems utterly improbable to me. Is it generally accepted to treat this as a given, or is it considered a weird quirk of general relativity that people are trying to overcome?
In this universe of ours, there isn't anything profoundly improbable about a point particle with a mass of a million suns. We commonly treat electrons as point particles with tiny mass - and there isn't yet any hint in theory or experiment that this model is incorrect (when the particle's dynamics are calculated through QM).
That being said, physicists dislike discontinuities. A theoretical point particle with high mass means that there must be a mechanism to merge large quantities of mass into black hole matter. If it is a point particle with extremely large gravitational fields, these final interactions would also require some version of quantum gravity to predict. In fact, it's entirely plausible that what we observe as black holes are actually not GR singularities at all https://en.wikipedia.org/wiki/Black_hole#Alternatives
There is hope for improving observational data on black hole interiors by measuring the gravitational waves of merging black holes. Different types of black hole singularities will have different event horizon shapes, and the merger of these black holes should produce different gravitational waves, amongst other observable effects. https://en.wikipedia.org/wiki/Kerr_metric
This confirms that I'm ignorant of real physics.