Does time really flow? New clues from intuitionist math (quantamagazine.org)
153 points by nikolasavic on April 12, 2020 | 105 comments



Here's what looks like the original paper; it's a mere 3 pages long:

https://arxiv.org/abs/2002.01653

It's a lot of prose with hand-wavy analogies. My takeaway is that the author seems to have become enamored with intuitionistic logic without having much concrete experience working with it.

For anyone intrigued by constructive mathematics, there's a nice talk and paper by Andrej Bauer (nice coincidence) called "Five Stages of Accepting Constructive Mathematics." It's a nice mix of prose and rigor of varying levels:

http://math.andrej.com/2016/10/10/five-stages-of-accepting-c...

The metamath[0] proof verifier also has a database of theorems on intuitionistic logic:

http://us.metamath.org/ileuni/mmil.html

It can be neat to compare proofs and theorems there with their counterparts in the classical logic database.

[0]:http://us.metamath.org/


The article argues for intuitionistic mathematics, not constructive mathematics. This is important. Just as classical mathematics is constructive math plus some arbitrary unprovable assumption (the law of excluded middle), intuitionistic mathematics is constructive math with another unprovable assumption (the existence of choice sequences).

The article also claimed that physics assumes classical mathematics, which is wrong. The equations of physical models and their solutions stay exactly the same in constructive mathematics as in classical mathematics.


I'm not sure the original paper has enough substance to precisely pin down the author's preferred axioms. In practice, the adjectives "intuitionistic" and "constructive" are used with enough author-specific meanings that you just need to read their definition each time.

FWIW, the term "intuitionistic mathematics" sounds a bit odd to my ear. Usually you hear about "intuitionistic logic" or "constructive logic" which are both part of the larger program of "constructive mathematics."

Anyway, the "Five Stages" paper I link does a good job of introducing the broader ideas of constructive mathematics and, perhaps, gives a taste of what Gisin is excited about.

> The equations of physical models and their solutions stay exactly the same in constructive mathematics as in classical mathematics.

I'm not a cosmologist, but AFAIU the Hilbert formalism of QM relies on Hilbert spaces always having a basis, which is famously equivalent to the Axiom of Choice. I'm not sure you can formulate the concept of self-adjoint operators without that, at least for arbitrary Hilbert spaces.

My suspicion is that AoC simply broadens the class of phase spaces to include pathological ones that "don't actually matter," so your point probably stands in all practical applications. However, it would take (hard) work to carefully extricate AoC from the current formalisms.


I got the impression that intuitionistic mathematical analysis refers to Brouwer's original papers, which in modern terminology is constructive mathematical analysis plus an axiom of choice sequences. Intuitionistic logic, on the other hand, is classical logic without the law of excluded middle. This is what constructive mathematics uses, and it is what Brouwer used as well.

And yes, since in practice all physically measurable spaces are finite-dimensional, using a formalism that works assuming AoC for arbitrary spaces is OK. The only danger is that one reads too much into it and assumes that the real world is like that.


EDIT to clarify the parent comment: essentially the paper claims that using classical mathematical analysis (constructive analysis plus the Axiom of Choice) when interpreting the equations of physical models is a poor choice. It is better to use intuitionistic mathematical analysis (constructive analysis plus the axiom of choice sequences), as it provides a better match with our intuition.

But to me the extra axioms are still arbitrary. As all our measurements have finite precision, those extra axioms do not matter in real calculations. If some extra axiom helps to discover results that are also applicable to the world of finite precision, then go for it. But don't claim that the world is such.


Thanks. I gave up about 20% in, thinking that whatever it is, it's either BS or over my head. Which is where much cosmology and theoretical physics leaves me. And yes, that probably says more about me than them ;)

But whatever, I rather think that subjective time involves movement at the speed of light through some n-dimensional manifold. But maybe that just reflects the SF that I've read. Such as Stephenson and Egan.


Hah, I came here to say much the same thing about the flow of time, much for the same reason (Stephenson, Egan, Asimov).


Ah yes, Asimov. It's been decades since I read his work. What would you recommend?


The Intelligent Man’s Guide to Science


Thanks. Somehow I never heard of that. It seems interesting, albeit 60 years outdated. But then there's the 1984 update.


The Dead Past is also a fantastic (short) story by Asimov. /2c


> We envision two possible mechanisms that could explain the actualization of the variables:
>
> 1. The actualization happens spontaneously as time passes. This view is compatible with reductionism and it does not necessarily require any effects of top-down causation. Note that this mechanism resembles, in the context of quantum mechanics, objective collapse models such as the “continuous spontaneous localization” (CSL) [23, 25].
>
> 2. The actualization happens when a higher level requires it. This means that when a higher level of description (e.g., the macroscopic measurement apparatus) requires some physical quantity pertaining to the lower-level description to acquire a determined value, then the lower level must get determined. In quantum mechanics a similar explanation is provided by the Copenhagen interpretation and, more explicitly, by the model in Ref. [37].

https://arxiv.org/abs/1909.03697

Seems like a framework for the Simulation Theory.


Yeah, I did have this idea at one point that if we're in a simulation, then there is a cost to simulating things at finer and finer detail, which could manifest itself in the higher energy requirements we see in particle accelerators. But we aren't actually learning anything new at those levels: maybe we'll find some new fundamental particles that make up the electron, for example, and we'll find new particles that make up those, none of which even existed until we started poking at them, since they aren't necessary for the simulation to run.


Regarding the cost of simulation, you might be interested in what Dr. Michio Kaku said during his AMA.

>I mentioned that a digital computer cannot simulate even a simple reality, since there are too many molecules to keep track of, far greater than the capabilities of any digital computer. We need a quantum computer to simulate quantum reality, and hence, once again, the weather is the smallest object that can simulate the weather. Therefore, I don’t think we live in a simulation, unless the simulation is the universe itself. -Dr.Michio Kaku


But a digital computer in this world at present may not be a good reference for the capability of the computer that supposedly simulates this world. We know nothing about the "real" world that computer resides in, and nothing about that computer.


We do know that the real world is richer than the simulated world, since it holds a computer that runs the simulated world. Therefore if you exist, then it's more likely that you're the result of evolution in the real world than the result of evolution in the simulated world.

Imagine the warehouse-size computer that is needed to simulate a bacterium here on Earth. Computers are dusty, and dust contains bacteria, so if you're a bacterium, then it's more likely that you're one of the billions of bacteria in the dust on the computer, than the bacterium being simulated by the computer. The same reasoning should hold for other worlds.


This is faulty reasoning. The game "The Sims" has sold over 200 million copies. If the average number of characters created per game is over 40, then there have been more Sims characters than people in the world. Add in a few more games and there have been more game characters than people who have ever lived. And that's with computing being in its infancy, not even 80 years old yet. Give it 1000 years and it's not even close.

Additionally, your fidelity argument is backwards. The fact that the simulation is simpler than the real world means we can fit many more people/entities into it, because the computer doesn't have to simulate at full fidelity.


This is true if you assume that the simulating world is what’s being simulated in the simulated world. That is, if bacteria exist only in the simulated world, then if you’re a bacterium there’s 100% probability that you’re simulated.


That's a powerful argument against the usual anthropic argument for the world being a simulation. I haven't encountered it before but it makes total sense.


The assumption that the real world is richer than the simulated world is just that, an assumption. For one, it assumes that both are finite.


It's logically necessary, not just an assumption. The simulated world with all its richness is by definition a strict subset of the simulating world. So the latter must be richer than the former.


Only if you talk about the simulated features of the simulated world, rather than compare the "simulated world as seen by its inhabitants" with the simulating world.

We don't have dragons on earth, but I can simulate dragons.

In the sense that this simulation exists in our world, you are right that the simulating world will then always be "richer", because it contains the simulation.

But if I could enter the simulated world, I could ride dragons. I can't ride dragons in "our" world, so in that sense it is clear that we can simulate things that do not have a concrete existence in our own world, and to me at least that would make the simulated world "richer" in that respect, by making things possible in the simulation that require you to be in the simulation for them to be possible.

Similarly, we can clearly simulate something with more detail - e.g. we could simulate a world where our elementary particles can be subdivided endlessly, if we choose to. In the simulating world this would "just" be a simulation, but in the simulated world it would be that world's reality.

There is no reason why, with sufficient resources and time dilation, the simulating world could not simulate a world equivalent to itself, so it could well be turtles all the way down.


I believe that is not correct. You can put new features that do not exist in the physical world into a simulation. For example, you can double the number of quarks in a proton, as long as you define a mathematically consistent interaction that allows it.

Then you created a richer simulated world.


This is not true for infinite sets where a subset can be equivalent to the whole.


Could you possibly explain/reason why this must be, without using "by definition"? Many people in this thread agree with you on this, but I don't understand it (see my other comment using a video game analogy).

Is the richness you describe in your comment implicitly constrained to that which exists physically perhaps?


Unless I misunderstand what your conception of a simulation is, I don't see why a virtual world is limited by the constraints of the parent world, any more than video games are limited by the constraints of our world?

I would think this would apply to the individual molecule tracking requirement above as well.


If we don't know anything about the "real" (or at least, "realer") world, I don't see how we can argue for a simulation either.

If we don't know the "real" rules, how can we make any proposition? At best we might argue that our world is inconsistent, but we cannot say that's any more or less likely than a simulation (who says the world has to make any sense?).

The only semi-useful simulation arguments have to assume the "real" world is sorta like our world or at least functioning according to math. But that returns us to the simulation cost problem.


If we have posited a thing with unknown properties and say maybe that is generating our universe, we've pretty much taken the opposite position to Occam's Razor, which is often stated as "Entities should not be multiplied without necessity."

https://en.wikipedia.org/wiki/Occam%27s_razor


Occam's Razor is a good principle for choosing scientific hypotheses. But I think when it comes to studying big questions like the fundamental fabric of this physical world, we are at a loss here. Essentially we elect to believe, without justification, that the world has to follow a reductionist paradigm - if the world can be made simpler, it must not be more complex.


Right, but that is then a totally unfalsifiable idea. You can't prove the non-existence of something you can't begin to define.


I guess that's the gist of all these arguments -- they are ideas not something we can prove or disprove.


Even a Commodore 64 could simulate our universe given enough time and memory. Our only clock source is the passage of time itself so even if it was being simulated extremely slowly we would never notice.


> Even a Commodore 64 could simulate our universe given enough time and memory

Given enough energy, we ourself could snap a universe into existence as well by clapping our hands?


A C64 would need infinitely more memory to address the infinite memory. I'm not sure whether that means uncountably infinite memory.


Not really. Imagine something like a Huffman encoding. Every node in a tree can have a finite address while the tree has infinitely many nodes.
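
A quick Python sketch of that point (my own toy, not anything from the thread): in an unbounded binary tree every node has a finite address, namely the path of turns from the root, which is roughly the prefix-code idea behind Huffman coding.

    def node_value(path):
        """Materialise the node at a finite address like "0110" in an infinite binary tree."""
        value = 1                          # the root
        for turn in path:
            value = 2 * value + int(turn)  # left child = 2v, right child = 2v + 1
        return value

    print(node_value(""))      # 1: the root
    print(node_value("0110"))  # 22: a node four levels down, named by just 4 bits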


Thank you for this quote. I've long felt this. Another thing about the simulation hypothesis is that it seemingly offers no explanatory power. That is, I'm not sure how a universe that is "simulated" and one that isn't (thinking about it more, I'm not actually sure what the difference is) would actually differ. Maybe it would explain the observer effect in QM because of lazy loading or something?


The simulation hypothesis is not intended to provide explanatory power on the matter you're looking for. It provides explanatory power on other questions though.

A simulation is also indistinguishable from the real thing from the inside, or it wouldn't be a faithful simulation.


This assumes we're simulated from a universe with identical laws and finitude, which I would never even consider.


This is flawed thinking I've seen show up in a lot of arguments. I even had this argument with Kurzweil at some point.

If we want to create a realistic simulation, we only need to simulate things with sufficient fidelity to convince observers.

That basically means:

- Only things that have been observed need to gain some form of persistence

- Each simulated entity should have error margins that account for their continued existence, or their allowable location and other factors. Once an error margin for an aspect of a simulated entity exceeds the factors of the pseudo-random instantiation of that entity or class of entities, you do not need to explicitly retain or simulate the state of that entity. E.g. you don't need to retain the state of an "NPC human" longer than ~120 years at most; most of them you can discard much sooner; most details you never need to generate in the first place because they are not known to the entity that observes them.

- Over time, perturb the simulation back towards the outcome of a static generator function that takes time as a parameter, to wipe out stored state and simulation work. That is, "undo" all changes the "players" in the simulation make, in as plausible a way as possible.

Do that, and you only need to simulate the "players" you want to simulate and those entities closest in time and space to them most of the time. There'll be a "halo" of other entities that slowly reverts to the baseline simulation.

I keep wanting to use this approach as a simulation basis for a game, in fact.

E.g. consider Elite: when you leave a star system, all the ships in it despawn. If you go straight back, it breaks immersion. The approach above, applied to it, would instead record a cone from the last known position of each ship that grows over time according to likely speeds and directions. As more and more of the larger end of the cone expands outside the star system, you increase the chance that the ship despawns. If you return rapidly, you find ships have moved a believable distance.

Or Minecraft: currently mobs despawn outside of a given area, and simulation only runs within a certain number of chunks from the player. But most things stay exactly as they were, and so all player modifications must be saved. For many things, though, you can create functions that make the world more dynamic with less simulation. E.g. if trees slowly grow, age, fall down and rot, and new trees grow, then the longer it has been since a player was in a given chunk, the more of the player's modifications can be discarded and replaced by a generator function with time as one of its parameters.
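
A rough Python sketch of that idea (toy code with invented names, not anything from an actual engine): the baseline world is a pure function of (seed, chunk, time), player edits live in a thin overlay, and once enough in-world time has passed that their absence is plausible, the stored edit is simply dropped.

    import hashlib
    import random

    SEED = 42
    DECAY_TICKS = 1_000      # after this long an edit may be reabsorbed

    def baseline(chunk, tick):
        """Deterministic baseline state of a chunk at a given tick."""
        digest = hashlib.sha256(f"{SEED}:{chunk}:{tick // DECAY_TICKS}".encode()).digest()
        return {"trees": random.Random(digest).randint(0, 20)}

    player_edits = {}        # (chunk, key) -> (value, tick_of_edit)

    def observe(chunk, key, now):
        if (chunk, key) in player_edits:
            value, when = player_edits[(chunk, key)]
            if now - when <= DECAY_TICKS:
                return value                     # edit still remembered
            del player_edits[(chunk, key)]       # plausibly decayed: forget it
        return baseline(chunk, now)[key]         # back to the generator

    player_edits[((0, 0), "trees")] = (3, 0)     # player chopped trees at tick 0
    print(observe((0, 0), "trees", 10))          # 3: the overlay still applies
    print(observe((0, 0), "trees", 5_000))       # regenerated from the baseline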

We've pruned simulations since some of the earliest generative games.

To address the quote: we don't need to simulate the weather in its full apparent complexity, as long as the observed aspects of the weather fall within parameters that make them plausible but sufficiently unpredictable.

For most details of the weather, that just means they need to conform to the overall parameters. E.g. you don't need to simulate the raindrops a "player" sees; just the overall volume of rain in the overall area if it is near a "player". Even less detail if further away.


"To address the quote: We don't need to simulate the weather in its full apparent complexity, as long as the observed aspects of the weather falls within parameters that makes them plausible but sufficiently unpredictable."

What's the point of a simulation pretending to simulate a complex system when it could have just implemented a simpler system in the first place?

Also, what is "observable"? If you mean "what people [well, simulation subjects] might look at", recall that people do take past data of the weather, including data that wasn't collected at the time (e.g. tree ring data).

Assuming that the simulation can't examine future actions of the simulated people (if you could, why bother with a simulation?), it would need to simulate pretty deeply too, to be "plausible" with everything they might collect in the future. It isn't clear at all that a good simplification exists that would do that more cheaply.


> What's the point of a simulation pretending to simulate a complex system when it could have just implemented a simpler system in the first place?

Why are you assuming the simpler system would achieve the goal of the simulation? If the intent is to see how a "player" responds to an environment that is a realistic reproduction of a given environment, then placing them in a simulation that isn't a realistic reproduction may be insufficient.

> Also, what is "observable"? If you mean "what people [well, simulation subjects] might look at", recall that people do take past data of the weather, including data that wasn't collected at the time (e.g. tree ring data).

The only "observable" data that complicates things are data that has been affected by "players", and where the effect has been observed by them and may be remembered by them. Anything that has not been affected by "players" can, assuming a simulation that consists of a deterministic base parameterised by time with a layer of "player modifications", be generated purely from the simulation. Anything that has been affected by "players" requires you to record those changes and alter the simulation based on those recorded differences.

Recording those changes has a significant "cost" because if they have knock on effects you also need to account for those knock on effects.

But as long as you store that state, you can still recreate any past state.

(consider here Minecraft as a gross simplification: if I chop down a tree but leave a stump, and then walk outside of the chunks the simulation runs in, and walk back again, I still expect to see the stump; this requires storing state where you otherwise could have re-generated the chunk purely from the seeds of the simulator - the ability to minimize that state is what has the potential to make a complex simulation viable)

What I was getting at was that a way of optimizing away most of that stored state is to keep track of when you can plausibly perturb what should be pseudo-random values in the simulation to "nudge" things back towards the base outcome of the simulator. When you can, you can then throw away saved state going forwards from that point in the simulation.

(e.g. add decay and growth, and the changing environment means you can bit by bit throw away state where throwing it away can be justified by the passage of time altering the environment)

The recording of things like weather data has no impact on that. You're not changing what the weather was. You're reducing the cost of recording aberrations from the baseline simulation to cut the cost of simulation going forwards.


"Why are you assuming the simpler system would achieve the goal of the simulation? If the intent is to see how a "player" responds to an environment that is a realistic reproduction of a given environment, then placing them in a simulation that isn't a realistic reproduction may be insufficient."

In a chaotic quantum system, everything has an influence, including the butterfly 2000km from where you are sitting now.

We can't 'just implement the parts mattering to the players', because the butterfly could cause a storm in your location a month from now, and we can't know whether it does or does not without actually computing it.

So if the system is modeling all the butterflies, you have a simulation cost problem. If it is not modeling the butterflies, then the system does not truly implement 'the weather' but a simpler system, at which point it would have been easier to simulate a simpler system with less noise.

"The only "observable" data that complicates things are data that has been affected by "players", and where the effect has been observed by them and may be remembered by them."

Let's look at the example of past data. Quite a lot of it is not generated by "players" (let's assume humans are the "players" here). Tree ring data, geological formations... How does our simulation deal with that?

A) Calculate everything in advance. That's not very promising.

B) Calculate everything the 'players' may care about in advance.

In some systems, that's equivalent to A. Even in other systems, the players pick their own motivation and tests. If the simulation is of any worth, we may not know in advance what tests they chose, so that's not promising either.

C) When the players are somewhere, calculate the past of anything that might be examined by them.

That would require enormous amounts of state or massive calculation. Calculating the past of quantum systems can be as complicated as approach A.

D) Read the players' mind and just give them what they want (why not? It's not outside the scope of a full simulation).

There are some coordination issues here (what if different players expect different things? Or if a player finds a flaw in the original calculation?), but that seems manageable. Even more so if we are allowed to rewrite the players' minds. Of course, that means there'll never be a proof of the simulation, since it will be prevented/rewritten immediately...

E) Don't bother with consistency, lie to the players all the time and hope they can rationalize everything. I'd have expected the world to be weirder and more inconsistent than it is, in that case.


> We can't 'just implement the parts mattering to the players', because the butterfly could cause a storm in your location a month from now, and we can't know whether it does or does not without actually computing it.

Because a central aspect of quantum mechanics is that we cannot possibly predict what "should" have happened, it follows that as long as the simulated outcome does not introduce predictability that shouldn't be there, a generated outcome is impossible to distinguish from a real one.

But that does of course not matter, because we have nothing to compare it against, so we don't know what the outcome "should" have been.

If we are in a simulation, then quantum mechanics could very well simply be the simulator's attempt at an information barrier to prevent "obvious" patterns that would make it easy to discern that we're in a simulator, as well as a barrier that provides a lower limit on what the simulator needs to be able to simulate.

But there is another problem with this argument:

> So if the system is modeling all the butterflies, you have a simulation cost problem. If it is not modeling the butterflies, then the system does not truly implement 'the weather' but a simpler system, at which point it would have been easier to simulate a simpler system with less noise.

This argument rests on the assumption that a "simpler system" that would be "easier to simulate" would have less noise. That's a shaky assumption in itself.

But we tend to interpret "clean" data as artificial, and simple mechanisms also make it easier to spot inconsistencies. If we are the way we are because we are simulations of entities outside "our" universe, and said entities don't want us to realize we're in a simulation, then they would want to create simulations that look noisy.

At the same time, if the intent is to simulate the world the "outside" entities live in, then there is no way to avoid noise if these systems are noisy in their world.

Luckily that does not mean the noise needs to be perfect, and apparent/pseudo-random "noise" is cheap and easy to generate from very small functions. We're getting pretty good at using noise to generate terrain simulations that look increasingly plausible from small inputs, for example. It is in fact a significantly simpler approach than trying to model everything perfectly, because noise can hide a lot of imperfections and generate very complex-looking features from deceptively simple inputs.

And again, what you need to remember is that we don't have a basis for comparison. We don't have pictures of the "outside world" to compare to in order to look for flaws in the simulation. If we are in a simulation, then for all we know the simulation is really awful and full of flaws, but close enough for the purposes of those running it.

> Let's look at the example of past data. Quite a lot of it is not generated by "players" (let's assume humans are the "players" here). Tree ring data, geological formations... How does our simulation deal with that?

You're missing:

F) Use an approach at least as old as Elite: layered generation (Elite uses a pseudo-RNG, but it doesn't need to be), where you generate values dependent on the higher-level simulation to "drill down" to the state at any given point in time. You can make this just as consistent as you want, because at each level you can, if you want, use continuous functions that make it appear like reasonable causal chains. If you have to mix in "player aberrations" you need to actually simulate those steps and save the results (which is why you then want to keep track of a timeline for how rapidly you can perturb out the player aberrations), but everything else can be thrown away and recalculated depending on what is cost effective.

You want geological formations? Use a generator function that, given a time t and a position, returns the placement of tectonic plates and other geological features. This is no different from how e.g. Minecraft generates terrain, other than adding a time parameter to allow things like tectonic shift and erosion to be layered into the generator.

You want tree rings? Look up the landscape data for the current chunk, run a generator function that returns the trees in that chunk and their data, look up the tree the player has felled, record the aberration and generate the data for the felled tree. Plenty of game engines already give you different amounts of resources for felling trees of different sizes; this is not a hard problem. And the believability of environmental simulators is already a multi-billion dollar business today. Now keep in mind that for the purposes of the simulation argument, for the odds to be heavily in favour of a simulation, at least two simulations of this place and time only need to be run before the heat death of the universe. That's a lot of time.
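
A hypothetical Python toy of that layered generation (invented names, not real engine code): each level, terrain to trees to one tree's rings, is a pure function of the seed, a position and a time, so any detail can be recomputed on demand and thrown away, with only player aberrations needing real storage.

    import hashlib
    import random

    SEED = 1234

    def rng_for(*keys):
        """A deterministic RNG derived from the world seed plus arbitrary keys."""
        digest = hashlib.sha256(repr((SEED,) + keys).encode()).digest()
        return random.Random(digest)

    def terrain(region, t):
        # Coarse layer: only changes on geological timescales (every 10k ticks here).
        return {"fertility": rng_for("terrain", region, t // 10_000).random()}

    def trees(region, t):
        # Finer layer, derived from the terrain layer above it.
        r = rng_for("trees", region, t // 1_000)
        count = 1 + int(terrain(region, t)["fertility"] * 9)
        return [{"age": r.randint(1, 300)} for _ in range(count)]

    def tree_rings(region, index, t):
        # Finest layer: generated only when a "player" actually looks.
        return trees(region, t)[index]["age"]    # say, one ring per year

    print(tree_rings(region=(3, 7), index=0, t=123_456))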

But a lot of the perceived complexity of that also rests on the assumption of a world roughly like how we perceive it: billions of people etc. But without knowing how many "players" there are vs "NPCs", it's impossible to estimate the complexity of the simulation. Most of us only ever see a tiny part of the world, and meet a tiny number of people, and interact deeply with a much smaller number of people. If it's all a simulation, you don't know how much it is actually simulating.

Maybe you're alone in here. Maybe I am. Maybe you didn't exist before you read this comment. Maybe you won't exist after you read it.


This makes the assumption that the "player" is somehow a privileged entity whose interactions are deemed more important than those of other elements of the simulation. A sort of "The Matrix" style simulation.

But from what we can tell, we are not special. We are made of the same energy and matter as everything else and our awareness of the world doesn't seem to be unique either. If there is no privileged "player", then you cannot make the kinds of optimizations you're talking about because any given arrangement of matter and/or energy is potentially aware.


We have no basis for telling whether or not we are special. In fact, we have no basis for telling whether other people are "players" or "NPCs". If we are in a simulation, you have no basis for knowing if we are "made of the same energy and matter" or not, or if you're simulated by the same code, or if the simulation just feeds you sensory data to make it appear that way.


But unlike a game, the observers aren't limited to a few perspectives in our 'real' universe.

You may say thousands are observing a game on Twitch, perhaps even from different camera angles of different players in a multiplayer game.

But the observers of the real universe are of far greater magnitude, and I often feel this game-assets-based reasoning fails to account for the scale of our universe due to Earth bias.


No, this concept does not fail to account for that. It explicitly accounts for it; things examined more closely are rendered with more fidelity.

The simulation would still be a complicated program beyond our ability to understand, but there are information-theoretic reasons to believe it's of a finite complexity, of a size that a more intelligent being might find perfectly manageable.

Human sensoriums are information-theoretically bounded to be quite a bit smaller than you may think. People often don't really "get" how information works. It doesn't matter that the particle accelerator has numbers on a screen that claim it's collecting petabytes of data per second... you aren't collecting petabytes of data per second. All you ever see is a tiny slice of that thing's output, bounded by the amount the screen can carry and you can perceive. A minimal simulation of the accelerator's output is a great deal smaller than the actual data it is collecting. The bare minimum information necessary to simulate your sensorium is not really all that much, on the order of something that could fit on a modern WiFi signal, and not even necessarily the latest standards. While we have no idea how to write a program that could simulate it as if a universe was behind it for that cheap, that doesn't mean no entity in the multiverse does.

The game analogy is a bit deceptive in that it invites you to imagine it all works exactly like modern games, but you have to imagine something wildly more sophisticated, and while not necessarily "unbounded" in resources, certainly a great deal more blessed than any current computer cluster we have.


> But unlike a game, the observers aren't limited to a few perspectives in our 'real' universe.

How do you know?

How many "players" or "observers" are there? How do you know that?

How large a proportion of our universe do they observe?

The reality is that we do not know. We don't know if there's even a single "player" or "observer" over time, because for all we know only brief disconnected "slices" of time are ever simulated for single observers - we never directly observe the past and present at the same time; we only "know" the past from our memory in the present, and we have no way of knowing if that memory is an accurate representation of anything. Nor do we have a way of knowing if our sensory input is an accurate representation of anything.


I guess it boils down to whether or not we're willing to accept infinite regress. Of course, if not, one must necessarily accept that reality 'comes from' nothing.

Although I'm a VR developer, I can't believe it's "all turtles all the way down" :@)

Actually I've been working on a theory to define randomness from complete uncertainty, without the "fair coin" metaphor based on constructivism, which is deterministic.


>if not, one must necessarily accept that reality 'comes from' nothing.

UDASSA (http://fennetic.net/irc/finney.org/~hal/udassa/) is the closest thing I've seen to a believable idea about how reality could come from nothing. It's basically a more specific version of the mathematical universe hypothesis.

>Actually I've been working on a theory to define randomness from complete uncertainty

If you're trying to make a deterministic theory of physics and eschew randomness from the world, then you're going to run into big problems from quantum mechanics. The idea that results from random quantum interactions comes from hard-to-measure factors in the world that we just happen to be uncertain about is called a hidden variable theory, and Bell's Theorem shows that those theories are incompatible with the principle of locality. Though there is one popular local and deterministic interpretation of quantum mechanics that avoids this issue: the Many-Worlds Interpretation. I wrote a post in another thread recently that's relevant: https://news.ycombinator.com/item?id=22845499.


But wouldn't a simulation be simulating the laws of an "actual" world? Why would the actual world have such a cost? Oppositely, if the actual world had such a cost, that's equal evidence that we are in said actual world.


Why would they? People make simulations of worlds that aren’t like the real world all the time.


Sort of a "turtles all of the way down", but only if you actually go looking for the turtles. This entire article makes my brain hurt in ways I'm really not used to. (I'm used to brain-hurt, you can't be intellectually curious without experiencing that occasional side-effect. It's just this particular flavor of it is... disconcerting)


In Fall, Stephenson argues that the speed of light reflects resource limitations in the machine(s) running our reality.


When looking for explanations to a mystery, the best debugging method is to look at edge-cases. You eliminate all unimportant noisy data and begin to focus on the data that at least has the potential to cause the mystery. Ideally that data doesn’t change. Ideally it is a constant signal within the noise. Some underlying and unchanging constant.

The speed of light has always struck me as one of the more constant, arbitrary, and yet vital pieces of data we know about our universe. Suitable for debugging whatever kind of reality it is that we find ourselves inhabiting.


The question I always end up with - how many simulations deep are we? Universe A builds a simulator to see how it works, it then builds a simulator and so on. We can never communicate with the original universe, so we build our own simulation.

As always if simulating universes is possible then the probability we're in level 0 tends to 0.


There could be no original universe. Just consider an infinite sequence of simulations. Or universe A could be simulating universe B, which in turn simulates A. This is not possible with finite systems, but the moment one considers infinite ones, all bets are off.


From Excession by Iain M Banks:

"Eventually in the progress of a technologically advanced society, occasionally after some sort of limited access to hyperspace, more usually after theoretical work, it was realized that the soap bubble was not alone. The expanding universe lay inside a larger one, which in turn was entirely enclosed by a bubble of space-time with a still greater diameter. The same applied within the universe you happened to find yourself on/in; there were smaller, younger universes inside it, nested within like layers of paper round a much-wrapped spherical present."

"In the very centre of all the concentric, inflating universes lay the place they had each originated from, where every now and again a cosmic fireball blinked into existence, detonating once more to produce another universe, its successive outpourings of creation like the explosions of some vast combustion engine, and the universes its pulsing exhaust."

"There was more; complications in seven dimensions and beyond that involved a giant torus on which the 3-D universe could be described as a circle, contained and containing other nested tori, with further implications of whole populations of such meta-Realities… but the implications of multiple, concentric, sequential universes was generally considered enough to be going on with for the moment."



I often think that the flow of time is really just an artifact. At any given point in time, a human has a memory whose state is dependent upon the points in time before it. So at any given point in time, it appears as though we've travelled through time up until that point. However, as long as causality is preserved, I don't really see any necessity at all for movement through time. I mean, the phrase is self-referential - "movement through time". Movement is defined by change in position over time. I think from a practical perspective, the concept of time "flowing" or "moving" is really just saying that causality is preserved. There is an order. That's all. And from an experiential perspective, as long as you have causality and memory, it's going to appear as if time "flows". I'm not sure there is any meaning to be derived beyond that.


> I often think that the flow of time is really just an artifact

All human concepts are artifacts. "Flow" is a concept that presupposes the concept of "time", i.e. it is self-referential as you say. "Movement" presupposes "time". Citation marks are used here to highlight the conceptual nature of these terms - to say that time is something other than our concepts is true since the phenomenon is different than the concept, but once you conceptualize those differences you are again creating artifacts.

This process follows a common pattern, but usually it just means one has found a new way of looking at a phenomenon which is different from conventional concepts, usually involving the removal of some dimension from view and viewing it as a two-dimensional or three-dimensional object, or similar (even 11-dimensional). It can be a truer view, if it explains something causally that previously was just guessed or misunderstood, but usually it is just an expression of some hoping-for-a-goldmine theoretical framework that is going nowhere.

Some phenomena are especially prone to these kinds of gymnastics - economics comes to mind, attempts at creating historical models another. They take extremely complex and volatile phenomena and extract (reduce them to) a few dimensions, which may explain things within a very limited frame of reference, not in their totality. It is certainly true of that which we denote as "time", a phenomenon that we have no problem communicating about (we know what "time" is) but which at its core, as a phenomenon, is a mystery to us.

The insight that terms differ from, and can never fully explain phenomena in and of themselves, is an insight that comes with philosophy. At least since Kant. Science tends to downplay this, to the detriment of our understanding of the world.


I don't think it's necessary to introduce human perception to illustrate the directionality of time.

Mathematically, there are innumerable physical laws where time is a variable, where the current state of something is dependent upon the previous state - and specifically _independent_ of the future state.

...or maybe that too is just an illusion. Perhaps we misinterpret those equations (eg simple Dynamics in Physics I).

...but regardless, discussing human perception is an unnecessary complication in the discussion.


That doesn't explain why humans have the concept and sensation of "present".


This is due to the (extremely) limited capacity of consciousness. It is only able to handle a window of a few seconds.


Talk about ADHD!

Although, I like to think I'm able to conceptualize my whole life up to now as a stretch of time that is the present, which becomes smudged to something smaller than a point if looked at from afar, in a broader context of past, present and future. The missing link is the inherent uncertainty of the potential effects that my actions cause in the future. So, I want to argue it's not the time perception for which I have limited capacity, but the possibilities, which are sheer endless. Of course, considerations of the past are pretty much as uncertain as the future. Hence feelings of remorse can cover both hindsight and fear.

Of course, lower beings, users under acute drug overdose or other reduced states of consciousness have shorter feedback loops. Say you are half-asleep and consistently dip in and out of consciousness because the darn bird outside started chirping again; then each moment will be largely disconnected. Vice versa, prolonged states of consistency (wording?) may count as one. What I'm saying is, time is the rate of change.

If v = s/t, logically t = s/v.

If we ask how long, the answer is usually measured in time. The way doesn't change, it's constant. The speed has a theoretical maximum. And what changes is the scenery outside, the worsening nausea, the numbing legs.

Whereas trolling on HN feels like almost no time passes, because it takes almost no effort, and changes hardly anything, except that I might grow tired (avoiding thoughts about Heisenberg and Einstein, och)


I think the 'frame rate' of the human mind is around 13 milliseconds. We fill in the gaps so our perception is of a continuous reality.


I can't find the source, but here's a reference:

https://www.dailymail.co.uk/sciencetech/article-2542583/Scie...


True. Why now? Why not any other "time"? These are questions that are very similar to "Why 'I'?", "Why not anyone else?". Maybe time and consciousness are deeply related, or even based on the same principle?


> A real number with infinite digits can’t be physically relevant.

but

> Popescu objects to the idea that digits of real numbers count as information.

I don't know where to stand. What about information encoded in geometry, like pi? If I get a spherical system, in a small enough space - no, in any space - then there's a "cutoff" to the actual number of digits of pi. Because a chunk of spacetime can't contain infinite information? Sounds good.

> Quantum math bundles energy and other quantities into packets, which are more like whole numbers rather than a continuum. And infinite numbers get truncated inside black holes.

Layman here, but AFAIK concepts like black holes aren't consistent with quantum mechanics, so I'm not sure it's wise to use concepts from both theories at the same time. (I.e. QM predicts that wave functions evolve deterministically, but GR predicts information loss in black holes; these two views conflict.)

It's way beyond my grasp but some theories seems to quantize space, I wonder how those agree with the notion of "thickness".

I'm disappointed that the article doesn't point to any mathematical theory that models the "thickness" that comes from removing the law of excluded middle.


>then there's a "cutoff" to the actual number of digits of pi. Because a chunk of spacetime can't contain infinite information? Sounds good.

It's more subtle than that. The "infinite digits" of pi isn't information, no more so than the endless decimal 1/3 = 0.333... is "infinite information". You can't use it to "store" anything. This is a distinct notion from the practical reality that real spacetime is quantized. An alternate universe with un-quantized spacetime might, or might not, allow you to store infinite information in a chunk - but every digit of pi would be relevant there.


Would it be correct to say that, under intuitionist thinking, actually constructing 1/3 = 0.333... (on paper, in a computer, whatever) would take infinite information (not to mention energy and space)?

Though if I understand correctly, intuitionist math would also hold that true infinite 0.3333.... also cannot be constructed?


1/3 is a rational number, so intuitionism is fine with it. It's also fine with "computable reals": real numbers that have an algorithm for computing them (that algorithm can be represented with a finite amount of information). What cannot be constructed in intuitionism is a "non-computable real": a real number for which there is no algorithm to compute it. The vast majority of classical real numbers are non-computable.

Another way to look at it: if we have a limiting process that approaches some number X, classically we could use the law of the excluded middle to prove that it must eventually reach X, hence X "exists". Under intuitionism however we cannot say that X "exists" or construct X, all we can do is construct the process that approaches X and say this process exists.
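
A small Python sketch of that constructive reading (my own illustration, not from the article): a computable real is just a program that, given n, hands back a rational within 2^-n of the number; only the approximating process is ever constructed.

    from fractions import Fraction

    def sqrt2_approx(n):
        """Return a rational within 2**-n of sqrt(2), by bisection."""
        lo, hi = Fraction(1), Fraction(2)
        while hi - lo > Fraction(1, 2 ** n):
            mid = (lo + hi) / 2
            if mid * mid <= 2:
                lo = mid          # sqrt(2) is in the upper half
            else:
                hi = mid          # sqrt(2) is in the lower half
        return lo

    print(float(sqrt2_approx(20)))   # 1.41421..., to any precision you ask for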


You can write down an algorithm to generate the digits of .33..., so that set of digits exists as a "potential infinity". Same with Pi and the square root of two.

It is numbers that haven't been constructed that intuitionist mathematics doesn't generally think have been proven to exist.


> exists as a "potential infinity".

Not a mathematician nor a physical theorist by any means, but some might regard the putting together of "exists" and "potential infinity" (even in scare quotes) as an oxymoron. It's an endless discussion, of course; I personally think it all boils down to Zeno's paradox remaining, well, an unsolved paradox for the foreseeable future.


Imagine the lazy Fibonacci series. As long as you keep taking a number, the next one in the series is generated. It’s not incorrect to say it’s “potentially infinite”. And it exists, as the live algorithm that keeps cranking as long as you put in energy.
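
For concreteness, here is that lazy Fibonacci series as a Python generator; nothing is computed until the next value is pulled, so the sequence stays a potential infinity.

    def fib():
        a, b = 0, 1
        while True:           # unbounded, but only runs when asked for a value
            yield a
            a, b = b, a + b

    gen = fib()
    print([next(gen) for _ in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]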


There must be a philosophical term for it but imho "potentially existing" (or having the potentiality of being constructed) is not the same thing as actually "existing" (in the reality that surrounds us).

Leaving aside the fact that we're not even sure numbers "exist", for better or worse, their "existence" is just us abstracting away some quantities of different stuff (we've passed from counting cows or sheep on clay tablets 5000 years ago to believing that there could actually be an infinite number for us to count to).

And yes, I do believe there's a huge impedance mismatch between the world as we experience it around us and the different theoretical constructs that we now call physics or maths. I'm a Hume-ian, a guy who didn't take mathematical induction for granted (presumably not the Fibonacci series either).


Essentially Pi is the name of an algorithm that generates its digits.


Or 1.000000..... for that matter. Which seems absurd on its face.


Pi is computable. What is computable, can be represented in finite information (the algorithm used to compute the number). Almost all real numbers are not computable.
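
To make "finite information" concrete, here is a Python sketch of Gibbons' unbounded spigot, which streams the decimal digits of pi using only integer arithmetic; the infinite digit sequence is pinned down by a few lines of code.

    def pi_digits():
        """Stream the decimal digits of pi (Gibbons' unbounded spigot)."""
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4 * q + r - t < n * t:
                yield n                               # next digit is certain: emit it
                q, r, t, k, n, l = (10 * q, 10 * (r - n * t), t, k,
                                    (10 * (3 * q + r)) // t - 10 * n, l)
            else:                                     # otherwise refine the state
                q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                    (q * (7 * k + 2) + r * l) // (t * l), l + 2)

    gen = pi_digits()
    print([next(gen) for _ in range(10)])   # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]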


Maybe this will help people, maybe just make things worse, but for what it's worth, here's my $0.02: I read the first half (with an early tangent to wikipedia for "intuitionist math") feeling profoundly uncomfortable with the entire premise.

Then at the halfway mark, I realized that intuitionist math feels a lot like David Hume's approach to metaphysics and epistemology, which always felt right to me.

Intuitionist math still makes me feel uncomfortable, but now at least it also seems consistent with a framework of thought that doesn't. I'm not sure I've ever been quite so profoundly intellectually ambivalent.


The first smell for me was the name "intuitionist". The concepts involved make a lot of sense to me, though. This kind of number system follows the rule of "TANSTAAFL" (Robert Heinlein's "there ain't no such thing as a free lunch"), namely that you cannot have zero-cost infinities as real numbers require.

As a side note, if this notion pans out... welcome back Free Will.


Intuitionism should be familiar: it's effectively the logic underlying most programming. It's basically what allowed theorem provers like Coq to extract a runnable OCaml program from a logic proof.


I watched Conway's 6th lecture about Free Will today, and was thinking about the consequences of his theorem. As I understand, he proved that Free Will is mutually exclusive with determinism: "Everything happens for a reason."

Then I started thinking about time and causality. Does time really exist? Time dilation really exists, after all. What if there were an elementary particle that has no Free Will? Would that make it eternal?

I feel that eternity and infinity are deeply connected, but I'm not mathematically smart enough to prove it. If you'd like to discuss further though, please send me an email!

https://www.youtube.com/watch?v=IgvkhgE1Cps&list=PLhsb6tmzSp...


“Information is destroyed as you go forward in time; it’s not destroyed as you move through space,” Oppenheim said.

Strangely, this reminds me of Stephen King's "The Langoliers".


Destruction is part of the process of creation.


> If numbers are finite and limited in their precision, then nature itself is inherently imprecise, and thus unpredictable.

Is this not what Planck's constant implies? We can only know position and/or motion to a certain degree, and not exactly? Does not quantum mechanics already include this idea?


How does discreteness imply imprecision and randomness? If anything, it should indicate some degree of certainty.


I'm an amateur, but I like to think about this:

If spacetime is quantized, then the speed of light would be 1 Planck length / 1 Planck time. Assuming spacetime is actually quantized to that metric, we can then ask: how does something move at 2/3c? Or two discrete Planck lengths in 3 discrete Planck times?

In one instance it could be:

t=0,x=0, t=1,x=0, t=2,x=1, t=3,x=2

It could also do:

t=0,x=0, t=1,x=1, t=2,x=1, t=3,x=2.

It implies a hidden variable, or at the very least a hidden phase of some sort. All sorts of oddness abounds when you consider all velocities are then quantized fractional values of c.
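
A toy Python illustration of that hidden phase (my own sketch, not from the thread): moving 2 cells every 3 ticks on a fully quantised grid can be done with a Bresenham-style error accumulator, and the accumulator's starting value is exactly the hidden variable that picks between the two step patterns above.

    def quantised_walk(num, den, ticks, phase=0):
        """Yield (t, x) for a walker moving num/den cells per tick on a grid."""
        x, err = 0, phase
        for t in range(ticks + 1):
            yield t, x
            err += num
            if err >= den:        # enough accumulated "phase": take one step
                err -= den
                x += 1

    print(list(quantised_walk(2, 3, 3)))           # [(0, 0), (1, 0), (2, 1), (3, 2)]
    print(list(quantised_walk(2, 3, 3, phase=1)))  # [(0, 0), (1, 1), (2, 1), (3, 2)]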


If you can have real numbers at each spacetime point (as opposed to boolean values) then you can easily get a speed less than c. This is similar to simulating the wave equation on a grid on a computer.


But then your 2/3c is an average, it's not the real speed of the object at any actual instant.


It depends on your rules for rounding I think.


You're thinking of the Heisenberg Uncertainty Principle


Pretty interesting:

>"The dependence of intuitionism on time is essential: statements can become provable in the course of time and therefore might become intuitionistically valid while not having been so before." https://plato.stanford.edu/entries/intuitionism/


Now tie this with computability theory. Almost all real numbers are uncomputable. It's obvious that they don't "exist" in any sense if by "exist" we mean "computable". Now consider that entropy is information, is the measure of randomness (or uncomputability), is time. What is uncomputable coming into existence is the passage of time.


> Hilbert and his supporters clearly won that second debate. Hence, time was expulsed from mathematics and mathematical objects came to be seen as existing in some idealized Platonistic world.

I guess that was so because the view was already common before. It shouldn't be difficult to find evidence that the line had better read "... continued to be seen as ...".


We know that our experience of physical reality is flawed (thanks to quantum mechanics and relativity). We also know that it's not the only possible way to experience it (thanks to drugs). Does it matter then if physics is formulated in a way that's a closer match to our experience?


Not the only possible way to experience it, or is it messing with a functioning motor?


I do like the idea of viewing time in the context of physical information. That makes a lot of intuitive* sense to me as a layperson

*Intuitive in the colloquial sense of the term, not in the way it's being used in this article


Arxiv is great, but why is the media writing up non-peer-reviewed articles?


My key takeaway from the article: Ooooh. That looks like a great chair!


TL;DR A reformulation of physics based on intuitionist mathematics seems to aid human intuition, makes no impact on calculations.


It might make no impact on calculations (I'm not really qualified to say that) but it seems like it would make a big impact on epistemology and determinism. I think?


Not really. The math is chosen to fit the physics, remember, not the physics chosen to fit the math.


I think you're right about that for determinism, but the acceptance of intuitionist math requires a different epistemology. It is literally a different set of assumptions on what is knowable and what is true.


None of our instruments are infinite precision, nor are very many physical constants, so it sort of makes sense that tossing out real numbers doesn’t make any difference.



