Hacker News
Noether's Theorem Revolutionized Physics (quantamagazine.org)
133 points by MindGods 1 day ago | 77 comments






I think he sets quite a high bar here. Like, just going through the basics of Noether’s Theorem is so far beyond what normally passes for popular science (which usually boils down to “the universe is really big you guys” and “quantum stuff is weird and nobody understands it”). Like, just using words like “conservation laws” and “Lagrangian” is risky already.

Personally, I would have liked it to dig deeper (I've heard the basics of Noether's theorem before, but am not a physicist and haven't studied it in any great depth), but Quanta is not a scientific journal; it's a pop-sci magazine. The article is a great intro.


It's for people who know about conservation laws and think they are fundamental and somehow inviolable.

I followed the link; this is what he means by 'cool stuff that's missing':

> In short:

> The key to Noether’s theorem is the requirement that we can freely reinterpret observables as symmetry generators, and vice versa — in a way that’s consistent with the action of symmetry generators on both observables and symmetry generators.

> In classical mechanics this is achieved by a hybrid structure: a Poisson algebra, whose elements are both observables and symmetry generators.

> In an algebraic approach to quantum theory, this requirement singles out complex quantum mechanics. i =√−1 turns observables into symmetry generators, and vice versa.
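
For concreteness (my own gloss in standard textbook notation, not something from the linked post), the same object plays both roles in each setting:

    classical:  dA/dt = {A, H}         (the observable H generates time evolution through the Poisson bracket)
    quantum:    dA/dt = (i/ħ)[H, A]    (the self-adjoint observable H becomes a generator only after multiplying by i, since the symmetry is exp(−iHt/ħ))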

How would one explain this to the audience of Quanta Magazine?


How?

First have a clear statement of the context. Second, define and explain all the obscure terminology and mathematical notation -- at least give good references.

Gee, in one discussion of Lagrangians, saw "configuration space" and "constraints". Okay, studied topological spaces, vector spaces, inner product spaces, measure spaces, Banach spaces, Hilbert spaces, probability spaces, but never saw a definition or explanation of a "configuration space".

"Constraints"? Kuhn-Tucker optimization theory has a lot on constraints, for one of the issues there was a question, and I solved and published it. Thought I had some background in "constraints", but the Lagrangian explanations didn't make clear what they meant by "constraints".

A "symmetry generator"? This is the first I ever heard of any such thing although ugrad math honors paper was on group representations.

Then there was "phase space": What does that have to do with "phase" in light waves and sound waves? Sounds like one word with two different meanings?

E.g., "Poisson algebra": Okay, there is the Poisson stochastic process, e.g., seems to get assumed in "half-life" calculations, and with more in

Erhan Çinlar, Introduction to Stochastic Processes, ISBN 0-13-498089-1, Prentice-Hall, Englewood Cliffs, NJ, 1975.

and there is abstract algebra, e.g., groups, rings, fields, ... But a "Poisson algebra"?


I wonder if the cut off point of science popularisation is related to the point where maths becomes the most useful way to explain what's going on?

Yeah, I think that's probably true, and it's why I greatly admire efforts by people like Steven Strogatz and especially Sean Carroll, who are leading the way from no-maths pop-sci to high-school-maths pop-sci: you know you don't actually want to work with the maths, but you can start to get an appreciation for its components and what the implications are.

Sean Carroll’s Biggest Ideas in the Universe YouTube series is fantastic. Just enough math to be interesting but nothing requiring a math degree.

https://youtu.be/HI09kat_GeI


Optimal would be something like 3blue1brown math animations, but for physics instead of pure mathematics.

This is a booming genre. For example this one popped up in my recommendations yesterday https://www.youtube.com/watch?v=uVKMY-WTrVo

I'd definitely watch that.

Feynman might have disagreed.

Are you referring to his quote "If all of mathematics disappeared, physics would be set back by exactly one week," or his more general belief that if you can't explain something in simple terms, you don't understand it?

There's some related comments by him here: https://kottke.org/17/06/if-you-cant-explain-something-in-si...

Maybe the issue is that some things are too complicated to be explained without maths and aren't accessible to lay people as they're so counter-intuitive.


Mostly the latter. Doing the maths doesn’t always mean understanding something. As a simple example it’s pretty easy to treat gravitation of a spherical object mathematically as a point source, but it was quite a conceptual leap, and it doesn’t mean you understand why you can treat it as a point source (and when you can’t).

I'm also reminded of the scene in "Severance" where the members of MDR are unable to explain their job to another department.


It's strange as I typically think the opposite - you don't really understand something until you can measure and quantify it. e.g. we don't have a handle on consciousness yet as we can't reliably measure it.

However, when it comes to a "deep" understanding, then maybe numbers aren't enough and we need to be able to intuit what's going on that creates those particular numbers.


If you read the Baez article, he references a great article by Atiyah; I've posted it: https://news.ycombinator.com/item?id=42989419

“inverse temperature is imaginary time” is in the final sentence of the abstract of the paper he links - cool

https://arxiv.org/pdf/2006.14741


Here is a cosmological issue arising from Noether's theorem: an expanding universe breaks time-translation symmetry, so it might not conserve energy.

This looks like it actually happens. Photons traveling through empty space undergo cosmological redshift, losing energy over time. The energy does not appear to go anywhere; it is just gone.

I have no idea why this example is not more widely discussed.


I think it's widely known amongst physicists that energy conservation doesn't hold at cosmological scales.

I've not heard of redshift described as a case of this -- I'd imagine because the scales at which conservation breaks down are, to my recollection, not ones where you could observe red-shifted photons, or anything at all, because these "scales" entail causal isolation. E.g., two regions of the universe which are totally causally isolated from each other may, taken together, violate various conservation laws.

However, I do not recall seeing any reason for the latter claim; it's something I took to be implied by the kinds of conservation violation that GR entails (i.e., GR is a locally-conservative theory).


The necessity of time symmetry for energy conservation can be a little overstated. As long as the laws are holonomic, there will be a conserved quantity corresponding to the motion. You can call that quantity the energy. It won't be conserved if you jump forwards in time without allowing the state to change, but that can't ever happen, so is it really an issue?
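
For reference, the standard bookkeeping behind this (the textbook relation, not the parent's argument): for a Lagrangian L(q, q̇, t), the energy function

    h = Σᵢ q̇ᵢ ∂L/∂q̇ᵢ − L,   with   dh/dt = −∂L/∂t   along solutions,

is a constant of the motion exactly when L has no explicit time dependence.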

The problem with 'motion' as the load-bearing property here is that there's an infinite number of derivatives at stake: x', x'', x''', etc. -- what is "the motion" ? What is supposed to be conserved about that motion?

The intuition here seems to be that there's a continuity of some property involved in the "transmission of matter over time" which is unbroken, but its not clear what this is supposed to imply.

It doesn't seem to imply, for example, that the universe operates like a closed system of motion simply because this property (whatever it is) is unbroken. There can be "global motion" without the need for discontinuity/randomization/discretization/etc. in the trajectories of matter.


You are confusing the map with the territory.

Under our best current theory (the map), general relativity, total energy might not be conserved globally, but the divergence of the stress-energy-momentum tensor is zero, meaning that energy is conserved locally within a small region of spacetime.
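
In symbols (standard GR notation, added here for reference): ∇_μ T^{μν} = 0. The vanishing covariant divergence gives local conservation, but unlike the flat-space ∂_μ T^{μν} = 0 it generally cannot be integrated up into a globally conserved total energy, because the curvature terms get in the way (unless the spacetime has a suitable Killing vector).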

Physics is about producing models that make accurate predictions, it is a map, not the territory itself.

The 'crisis in cosmology' e.g. Hubble tension is most likely a sign that current models of the universe are incomplete.

Energy is conserved in static spacetimes and asymptotically flat spacetimes.

The Friedmann-Robertson-Walker spacetimes that cosmology often uses are neither static nor asymptotically flat.

It is widely discussed, but all models are wrong, some are useful.

Noether's theorem is a powerful tool for finding useful models.


> Physics is about producing models that make accurate predictions

Very few models in physics ever make accurate predictions -- only in very limited experimental circumstances, mostly ones inaccessible at the time these models were developed.

The ability to craft these experimental conditions, which enable accurate prediction, is predicated on the models actually describing reality. How else would one control the innumerable causes, and construct the relevant devices, if these causes did not exist and the devices weren't constructed to measure reality?

No no, the hard sciences are not concerned with prediction at all. They are concerned with explanation -- it is engineers who worry about predictions, and they quickly find that vast areas of science -- esp. physics -- are nearly impossible to use for predictive accuracy.


I mostly agree with you, but I think of prediction versus explanation as more of a spectrum where you can weight both. I mostly think about it from a machine learning perspective: if you do the matrix inversion you can say exactly where the coefficients come from, but with a random forest you might only get a SHAP value, and a transformer will never give you an exact answer as to how it arrived at a solution, since it is measuring a latent space. In physics you want a system of equations that can describe some dynamics, and if it is terrible at doing that, you are not going to trust the model much. But the power of a model comes from its predictive ability. The Ptolemaic model mostly gets the planets right, but for the wrong reasons; Newton's law of gravitation gets them mostly right for the right reasons, and it didn't need regular adjustments like the Ptolemaic model did. So in that example both predictive ability and explainability are important, in different ways.

I recommend Galit Shmueli's paper "To Explain or to Predict?". I also like the "Two Cultures" paper by Leo Breiman. These are both machine learning / statistics views on this topic.


Techniques (eg., of ML or non-ML) do not decide between explanation and prediction. It's common in ML to speak like many computer scientists do, completely ignorantly of science, and suppose somehow it is the algorithm or how we "care about" it which matters -- no.

It is entirely due to the experimental conditions which are a causal semantics on the data, not given in the data or in the algorithm -- something the experimenter or scientist will be aware of, but nothing the computer scientist will even have access to.

Regression is explanatory if the data set is causal, has been causally controlled, the data represents measures of causal properties, these measures are reliable under the experimental conditions, the variables in question stand in causal relationships, and so on. These conditions are entirely absent from the data and from the algorithm, and from anything to do with ML.

In the large majority of cases where ML is applied, the data might as well be a teen survey in Cosmo magazine and the line drawn through it an instrumental bit of pseudoscience. This is why the field is not part of scientific statistics -- it aims to address "data as number", not "data as causal measure". The computer scientist thinks that ML can be applied to mathematics, or to games like chess, which is nonsense scientifically (since there are no empirical measures of the causal properties of chess).

ML is the algorithms of statistics without any awareness, or use of, any scientific conditions on the data generating process.


Western reductionism/Laplacian determinism was falsified several times by counterexamples like quantum superposition, Cantor diagonalization, etc.

The System Identification Problem has also been shown to be equivalent to the halting problem.

https://philarchive.org/rec/DIEEOT-2

The common joke about spherical cows in physics also points to the predictive, descriptive nature of the field.

The equivalence of various QM interpretations also suggests that scientific realist views are incorrect.

Rice's theorem, Gödel, the Wada property, etc. also demonstrate the problems of confusing the map with the territory.

There are further topics like indecomposable continua that arise frequently and naturally in non-pathological dynamical systems, especially with time-delayed ODEs, Hamiltonian systems, etc.

Are you arguing that Hamiltonian systems aren't 'physics'?

The value of Western reductionism is in finding 'effective procedures', but teaching it as being reality is more about didactic convention and convenience.

'Hard science' is a term for studying the universe through theories, hypotheses, and experiments.

It is still about making predictions that match observations.

That is, it is descriptive vs. prescriptive.


You're assuming an idealization is a fictionalization, rather than a way of getting at an essential property (ie., a stable, real, causal feature) of a system.

Treating a cow as spherical is a means of selecting its real property of volume, as it is causally efficacious in, say, a gravitational field -- whilst discarding its accidental, random variations in volume across all cows.

That we can treat cows as spherical and obtain the relevant dynamics should show that this early-20th-century instrumentalism is false. By idealization one selects the actual properties of objects for explanatory modelling -- one does not invent them or otherwise construct a merely instrumental fiction. Cows have volume, whose variation is accidental across cows; expressing that volume as a sphere selects for their essential volume.

Very few, if any, theories of physics are predictive in almost any situation without this idealization -- because it is impossible to describe, e.g., the volume of any actual cow. An actual cow has uneven density, shape, etc., and would require a significant amount of data to describe -- nearly all of which does not bear on the role its mass plays in a gravitational field.

What idealization does is create hypothetical scenarios in which, in imagination, all irrelevant causes are controlled and all accidental properties are uniform (or of a known distribution) -- so that the model can focus on explaining the target essential property in question.

These hypotheticals are not inventions; they are means of targeting what is being explained.

If you look at the predictive accuracy of scientific models as applied in any actual scenario, they fall apart -- almost nothing at all can be predicted, because all actual situations comprise innumerable accidental features which cannot be modelled.


Determinism is falsified by quantum measurement, not superposition (which is deterministic), and even then only pragmatically, rather than philosophically.

Quantum superposition is indecomposable; measurement is an artifact of the Copenhagen interpretation.

You can view wave function collapse as invalidating your priors and it still works without observation.

The special case is quantum superposition being an indecomposable function, or one you can't curry, if you prefer.

QM is actually lucky that there are only two exit basins; n≥3 is where you get the stronger form of indeterminism.

Classical chaos is deterministic with infinite precision, but riddled basins require absolute precision, which is stronger, and the Wada property is still not deterministic even with absolute precision.

An example of the Newton fractal and the Wada property:

https://gauss.math.yale.edu/fractals/MandelSet/RealNewton/NM...

Binary black holes and the Wada property

https://arxiv.org/abs/1807.10741

Think of it as going north from Mexico, crossing the border and finding yourself in Canada.

You can also approach it from KAM and porous sets etc... if that works better for you.


Classical mechanics makes sufficiently accurate predictions to enable essentially all the engineering we do, and it was invented after we had been engineering things for a few millennia.

If you construct highly controlled experimental conditions predicated on classical mechanics being true, then in those scenarios, the model predicts.

But in almost all cases it fails to predict, because the situation is vastly too complex to model. You are only able to construct devices (e.g., steam engines, balloons, etc.) which are "simple" in the relevant ways because classical mechanics successfully explains real properties of objects.

If it didn't, you'd have no idea how to turn an ordinary situation like "dropping some objects off a cliff" into one where you could actually predict where they will land (i.e., by waiting for a day with no wind, by shaping the objects to limit drag, and so on -- without controlling for these accidental features, you'd not be able to predict where anything would land other than "down there somewhere").


We have been dropping artillery shells relatively accurately using classical mechanics for a couple of decades now.

By constructing a gun and a shell -- this is my point. We engineer situations we can predict.

At those scales (cosmology + field theory), energy is derived and defined as the "thing" that is conserved once you apply time symmetry.

Also you kinda want time symmetry in a physical system otherwise you have no guarantee that today's laws of physics will be valid tomorrow.


As much as we want time symmetry, we still live in an expanding universe. So we have at least some asymmetry.

Time symmetry applies to the physical laws governing the universe, not the distribution of mass in it.

For a simpler example look at Newton's 3 laws. They have time symmetry but still allow for things to move around!


It also doesn't apply to the structure of space-time. Which is not a situation that Newton's laws can deal with.

"as the Universe expands, photons lose energy. But that doesn't mean energy isn't conserved; it means that the energy goes into the Universe's expansion itself, in the form of work."

-- https://www.forbes.com/sites/startswithabang/2015/12/19/ask-...
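
The bookkeeping behind that claim, under the standard FRW assumptions (my own sketch, not from the linked article): take a comoving volume V ∝ a³ of radiation with pressure p = ρ/3 and apply dE = −p dV:

    d(ρ a³) = −(ρ/3) d(a³)   ⇒   ρ ∝ a⁻⁴

The energy density falls by one extra factor of a beyond the volume growth, which is exactly the 1/a redshift of each photon's energy.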


That only works if radiation density and expansion are connected. General Relativity says that they aren't.

How does general relativity say that?

The term for the stress-energy tensor has the wrong impact on expansion for a simple relationship of any kind.

You can rescue it with pseudo-tensors, but then you force the existence of a preferred reference frame. Which undermines the principles of GR.

Some still argue for that. Others don't.


It's widely discussed in physics, I don't know why you would have the impression it isn't.

Wave-function collapse also violates conservation of energy. (Unless you believe in Many-Worlds, that is.)

A state whose expected value of the energy can change post-measurement isn't considered to have an energy, but rather an expected energy.

This is about the energy of the system as described by the Schrödinger equation.

If a measurement changes the expected value of the energy, one or both of the initial and final states won't be time-invariant, and you're talking about the time-invariant Schrodinger equation.

I recommend the book "Einstein's Tutor", which came out last year.

https://lee-phillips.org/noether/

This is probably the best layman’s approach to Noether, her impact, and how she probably didn’t think much about the theorem later because she wasn’t interested in physics and abstract mathematics was her consuming passion.


I approve this message.

The beauty and power of the Noether Theorem is what pushed me to theoretical physics.

I consider it one of mankind's greatest achievements.


TLDR: Do an experiment, then move 10 meters to the left (or rotate 90 degrees, or wait a few days) and do it again. The results don't change, because the laws of physics don't change. This realization alone is enough to produce conservation laws. Translational and rotational symmetries produce conservation of linear and angular momentum, and time symmetry produces conservation of energy. Each symmetry you find leads to new physics.
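
A two-line version of the simplest case (textbook sketch, assuming you take the Euler-Lagrange equation as given): if the Lagrangian doesn't change when you shift x, then

    ∂L/∂x = 0   ⇒   d/dt (∂L/∂ẋ) = ∂L/∂x = 0,

so p = ∂L/∂ẋ, the momentum, is constant along every trajectory. The same pattern with rotations gives angular momentum, and with time shifts (plus a small extra term) gives energy.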

It's such an aha moment.

PBS Space Time: https://www.youtube.com/watch?v=04ERSb06dOg


It also works backwards: for (most) conserved quantities, you can also find a symmetry.

> Each symmetry you find leads to new physics.

There are a few caveats and asterisks there. E.g., Noether's theorem only applies to continuous symmetries; it has nothing to say about mirror symmetry or time-reversal symmetry.


Another point to appreciate is how universal this principle of symmetry is. It is used in every branch of physics, from classical physics (the Lagrangian formulation) to quantum physics (Feynman's path integral formulation), and from conservation of momentum to conservation of electric charge (the U(1) symmetry) of fundamental particles. The fact that she was able to do this as a woman 100 years ago is also amazing.

I wonder what the conservation law is connected to the 'analysis invariance', i.e. the fact that no matter how well you've thought through everything beforehand, there will still be some recalcitrant pocket of the experiment that behaves confusingly. Maybe that's the 'conservation of surprise'.

Well, I have heard the term 'conservation of misery' quite a few times since starting my PhD.

Symmetries produce conservation laws if you accept and understand Lagrangian mechanics. That's a big asterisk IMO especially if you've never heard of Lagrangian mechanics and then you try to understand Noether's theorem.

Doesn't getting from Newton to Lagrange already rely on the existence of conservation laws? Apparently if we take Lagrange as fundamental, then it works, and a variation of it works in quantum mechanics, so it does seem to be fundamental, but if you're trying to get from Newton's laws to Noether's theorem, you can't get from here to there without fully grasping Lagrange first.
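
If it helps to see the correspondence without any Lagrangian machinery, here's a toy numerical sanity check (my own plain-Python script, not from the article): a particle orbiting a fixed center is rotationally symmetric but not translationally symmetric, so its angular momentum should stay flat while a component of linear momentum wanders.

    # Unit-mass particle in a central potential V = -1/r, force center pinned at the origin.
    # Rotational symmetry -> angular momentum L is conserved; no translational symmetry
    # -> linear momentum p_x is not. Velocity-Verlet integrator, no dependencies.
    def accel(x, y):
        r3 = (x * x + y * y) ** 1.5
        return -x / r3, -y / r3

    x, y, vx, vy = 1.0, 0.0, 0.0, 0.8   # an arbitrary bound (elliptical) orbit
    dt = 1e-4
    for step in range(200001):
        if step % 50000 == 0:
            L = x * vy - y * vx          # the Noether charge of rotations
            print(f"t={step*dt:5.1f}  L={L:.6f}  p_x={vx:+.6f}")
        ax, ay = accel(x, y)             # velocity-Verlet step
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay
        x += dt * vx; y += dt * vy
        ax, ay = accel(x, y)
        vx += 0.5 * dt * ax; vy += 0.5 * dt * ay

Run it and the L column should stay (up to floating-point noise) at its initial value of 0.8, while p_x swings back and forth over the orbit.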


I've never understood why Marie Curie is so celebrated in the popular press, but Noether is largely ignored. Noether's work is much more important, IMHO.

Noether did maths, Marie Curie did physics, physics is more popular than maths.

It is easy to see why: symmetries and conserved properties are boring; nuclear power, atomic bombs, and deadly radiation are exciting.

Einstein is a rare instance of a popular theoretical physicist. But all the mathematicians whose work led to the theories of relativity are largely unknown to the public.


The article begins with claims of violation of conservation of energy but no examples. Then it posits that violations of energy conservation are explained by symmetries. What does that mean in English? Is there something else that changes in concert with changes of energy?

Noether proved that for every symmetry of the action there is a conserved quantity. The action is an expression (a functional of the coordinates) from which the equations of motion can be derived. Examples of symmetries of the action are time translation, spatial translation, spatial rotation, and complex phase rotation. The corresponding conserved quantities are energy, linear momentum, angular momentum, and charge.
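
The one-line version of what she proved, in the Lagrangian setting (standard statement, added here for reference): if the action is unchanged under an infinitesimal shift of the coordinates q → q + ε K(q), then

    Q = Σᵢ (∂L/∂q̇ᵢ) Kᵢ(q)

is constant along every solution of the equations of motion. Plugging in a constant shift, a rotation, or (with a small extra term) a time shift for K reproduces momentum, angular momentum, and energy, respectively.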

In general relativity, symmetries that exist in the action for a flat spacetime are violated in curved spacetime. So, in curved spacetime, the corresponding quantities are not conserved. One example is energy. The reason has to do with the fact that integrating a tensor along a closed loop in curved spacetime might yield a nonzero result due to the curvature itself, rather than due to the dynamics of what is being integrated.

I don't think that the article goes into any of this. The introduction about general relativity did seem like a curve ball, even if it's historically accurate.


“Noether” nominative determinism strikes again in physics.

> In the fall of 1915, the foundations of physics began to crack. Einstein’s new theory of gravity seemed to imply that it should be possible to create and destroy energy, a result that threatened to upend two centuries of thinking in physics.

Not just seems to imply -- it does imply[0]. Does that mean that we can build a machine that generates energy and negentropy forever (e.g. an artificial Sun), and thus that we can outlive the heat-death of the rest of the Universe? Yes, absolutely. But there are other existential threats, like the collapse of the false vacuum. In the end, it is not known whether we have limited or unlimited time here, and Noether's theorem doesn't answer that.

[0] : https://www.google.com/search?q=general+relativity+and+conse...


That some violation of energy conservation occurs (if we don't count energy from large-scale spacetime-shape stuff -- which may be sensible, because AIUI you can't really obtain it as just a sum of local quantities, and it depends on boundary conditions or something) doesn't imply it is possible to exploit it to obtain more energy.

Does this violation even ever result in more usable energy rather than less?

Like, red-shifting photons reduces their energy…

I suppose if we wanted to do the opposite, it would be making the contraction of space result in photons being blue-shifted, but uh…

Well, that would result in things getting closer together, and unlike expansion, that seems to run into a limit at some point?

I don’t think the laws of physics as they currently are, are sufficient to support an eternity of life (or civilization). For there to be hope of that, it must be hope of something or someone outside of the laws of physics we inhabit (or are well-approximated as inhabiting).

A new heaven and a new earth.


That's wrong, because the quoted part is wrong. Relativity doesn't say you can create or destroy energy. It only says that you can convert mass to energy (and vice-versa) - because in the end they are actually the same thing. And together, they are conserved. That means we still can't have perpetuum mobile stuff unfortunately.

You're talking about E=mc^2, which follows from special relativity. That was revealed in 1905; 1915 marked the advent of general relativity, where energy conservation no longer holds.

The time translation invariance which gives rise to the conservation law is a special case of GR's broader energy-momentum conservation, namely the static one where gravity and such are disregarded altogether as in the Standard Model.

This all ties back to the present crisis of foundations, as string theory and other approaches to reconciling GR with the Standard Model strain at the edges of what Noetherian tools can yield. (see: supersymmetry)


Nope. It's just a bit more complex to define what "energy" even is on a dynamical spacetime (remember that our usual constant known as time is part of a varying field in GR). But there's nothing stopping you from coming up with an equivalent conserved current due to a global symmetry as laid out by Noether. This fact is even used e.g. in the Hamiltonian formulation of GR. See here for a detailed explanation: https://physics.stackexchange.com/questions/2597/energy-cons...

This is an old misunderstanding that dates back to the early stages of GR research and has nothing to do with any current crisis.


> remember that our usual constant known as time is part of a varying field in GR

...and so you have to pick an appropriate underlying vector field you call time to cancel this out and get back the invariant, throwing a wrench into calculations... as a reply to the post you linked points out. At the end of the day, you haven't demonstrated that it preserves the invariant so much as you've changed the question to find another conserved quantity and called that energy instead. This lines up with my broad observation that we're out of runway for the 20th century's symmetry-reliant problem solving and hence have to be increasingly clever with setups to apply generalizations of them.

I definitely learned something new today, though. To boot, these pseudotensors are tamer than I thought they'd be -- I expected calculational hacks with no formal analogues, explored only in old papers, but sections on jet bundles are something I'd expect in a differential geometry text. Maybe we'll see progress along these lines in the next couple of decades.


That was Einstein's 1905 paper about SR, not his 1915 paper about gravity.

Not much meat in the article, unfortunately. It's far too short to contain anything substantial.


It's sad she died when she did. Had it been a few years later, penicillin would likely have saved her.


What about information? We know that it is conserved, so what is the corresponding symmetry?


Time symmetry I think? Any computation has to be reversible in theory?

"Like most Jewish academics in Germany, Emmy Noether was fired after the Nazis came to power in 1933. She left later that year for Bryn Mawr College in the U.S [...]"

Compared to what's happening now, it's totally frightening.


Also, before that she worked for several years as an unpaid faculty member: she joined the faculty because David Hilbert recognised the importance of her work, but some Prussian regional bureaucrat would have had to sign off on her getting a paid position, and they and/or the university didn't believe that a woman should teach at a university. So one of the giants of abstract algebra, who made a key discovery in physics, got screwed over for being a woman and then screwed over again for being Jewish.

Incidentally, there was a report on Friday that German research institutions are seeing a substantial increase in applications from the US, presumably due to the US government cutting research funding.


