Physical Intuition, Not Mathematics (2011) (realphysics.blogspot.com)
105 points by k0doque on July 4, 2015 | 65 comments



Feynman also said the following (in his "The Character of Physical Law" lectures):

Every one of our laws is a purely mathematical statement in rather complex and abstruse mathematics. Newton's statement of the law of gravitation is relatively simple mathematics. It gets more and more abstruse and more and more difficult as we go on. Why? I have not the slightest idea. It is only my purpose here to tell you about this fact. The burden of the lecture is just to emphasize the fact that it is impossible to explain honestly the beauties of the laws of nature in a way that people can feel, without their having some deep understanding of mathematics.


This is very important. Many students hit a wall when physics passes beyond lay intuition and whatever math they already know well. Advancing in mathematics is essential to feeling comfortable in advanced physics.


I discovered this last year after arrogantly jumping into the first volume of Feynman's Lectures on Physics.

Fifty pages in, I decided to take a step back and read a calculus book first; but wait, my algebra and trig are crap, so back to the basics. So yesterday I hit LCM and GCD applications and factoring, which are very basic. I'll probably resume the initial book in a couple of years or so...


Math is wide and deep. You won’t need to cover every topic in math to get going with physics. If you really are interested in physics, there are many things in math which are, well, less important for doing basic physics, for example LCM, GCD and factoring. I guess these things are somewhat important in computer science, but I never encountered them in a physics problem. So to get started with physics, I would suggest that you focus mainly on analysis (differentiation and integration) and vector algebra, plus maybe the basics of complex numbers. This can be learned relatively quickly.

With these you should be able to follow the Feynman lectures or watch the very fine "Theoretical Minimum" series by Susskind (http://theoreticalminimum.com).


Thanks for the pointers. Much appreciated.

I'm doing a full review of mathematics at the moment. Not in depth, more of a "here's an application of the GCD function" level, so I know which tools to use to solve specific problems. All this is beneficial for the day job as well, which expects to see some value from my time spent, even though I'm not being totally honest with them about the objective. Realistically, I want to think abstractly in the terms of mathematics and develop some intuition.

Was completely unaware of the Theoretical Minimum series. Thanks for that.

Edit: I'm reading Mathematics: From the Birth of Numbers by Jan Gullberg as a text. Wonderful book. Covers just about everything and is beautifully written by a non-mathematician, with nothing taken for granted and no particular educational level targeted. In fact the foreword is mainly bitching about the education system. Slightly worried I will get distracted by this book, but that's never a loss!


I do not know "Mathematics: From the Birth of Numbers", but judging from the Amazon quick view it seems to cover a lot of ground (BTW: one thing I missed in my list is the basics of differential equations).

Over 1000 pages is quite a long read, though. I never managed to read a (science) book as big as that from cover to cover myself. One thing I learned through the years is to never use only one book for learning. Books have different styles, and not every style fits every student. Additionally, one book might be good on one specific topic and weak on another. So nowadays I always use a couple of books (or online resources) to learn a new topic.


It's huge, yes, but a lot of it is fluff and history. That does serve to keep it interesting, however.

Quick page shot to show the scope and density: http://i.imgur.com/sV1WYFd.jpg

I have a number of other books that I use as references as well, so no problems there (Calculus for the Practical Man has some different insights). Oh, and betterexplained.com.


On a related note, Mary Boas's text, Mathematical Methods in the Physical Sciences, does a great job of giving you the necessary bag of tricks to learn all of undergraduate-level physics (and probably much more) without diving too deep into any single topic. It should be sufficient to give you lots of intuition until you decide to pursue something at much greater depth (although doing that alone, without a professor/PI/expert of some sort, is realistically almost certainly a waste of effort).


You will certainly need them if you have any interest in quantum information.


If one has any interest in quantum field theory, I would add complex analysis (contour integrals in particular) to that list.


Perhaps my math book can help you learn the basics a little faster: http://noBSgui.de/to/MATHandPHYSICS/

One suggestion: make sure you don't just read about math, but also try solving exercises and problems. These are very important for actually learning the material.


Thanks - that looks excellent. I will definitely take a look at your book this evening.

Agree with solving problems; this was what was missing from my school education. Literally rote and box ticking with zero applications.


I think having a healthy relationship with your intuition is just as important as having intuition: knowing when to trust it and when not to. Intuition often happily leads you very far down the wrong path. Math can, too, obviously, but doing math properly involves many self-checks. You frame the problem in many different ways and can see if they line up. Your intuition is just the way you see the world; if the way you see the world happens to be wrong, you'll think the wrong thing. For instance, I might intuitively think that if I digitize a band-limited analog signal I'm throwing something useful away, that I'm throwing away the data between the samples. There are clearly wiggles there... those wiggles must encode something! It turns out, though, that a digitally sampled band-limited signal can be perfectly reproduced. Perfectly. That's totally counter-intuitive, by which I mean there's little real-world experience that would tell you otherwise!
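
Here is a minimal NumPy sketch of that claim, for anyone who wants to poke at it. The signal, sampling rate, window length and test instant are arbitrary choices of mine, and a finite window only approximates the ideal (infinite) reconstruction sum:

    import numpy as np

    fs = 10.0                      # sampling rate in Hz
    f = 3.0                        # signal frequency, comfortably below the Nyquist limit fs/2
    n = np.arange(-2000, 2000)     # sample indices; a long window stands in for an infinite one
    samples = np.sin(2 * np.pi * f * n / fs)

    # Whittaker-Shannon (sinc) interpolation recovers the value *between* the samples.
    t = 0.123                      # an instant that is not a sample time
    reconstructed = np.sum(samples * np.sinc(fs * t - n))
    exact = np.sin(2 * np.pi * f * t)
    print(reconstructed, exact)    # the two agree closely; exactly so in the infinite-window limit

The "wiggles between the samples" are pinned down by the band limit, which is why nothing useful is lost.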

So, I think you should think about intuition, math, whatever else you have in your pocket as tools. They give the right answer when used correctly and sometimes give the wrong answer even when you're sure you're using them correctly. I think though that intuition CAN be more insidious because it's what you've experienced! It HAS to be true, you think.


The problem in discussing this subject is the idea of "intuition" itself because the term doesn't have an unambiguous definition. Meaning varies from one instance to another, even one moment to the next, and disputes arise from imprecise communication.

In the example, I think what Feynman is describing is what we commonly call "visualization": being able to "see" the problem in the imagination. That is no less a form of abstraction, and one that is often vital to problem-solving. Of course, not every problem yields to this approach, but it is a powerful feature of our basic cognitive tool set.

Einstein wrote that his early success in formulating special relativity was the outcome of his intuition about the physical properties of the light and other phenomena he studied. Later on, the mathematical abstractions became more powerful, and at that point intuition about the "physical" nature of phenomena was insufficient for understanding.

But I think there are forms of intuition that apply to very abstract ideas, or what seem to be so to us. I once heard a physicist say "we never really understand higher mathematics, we just get used to it". Feynman would probably have agreed with that sentiment.

"Getting used to it" is really the equivalent of developing an intuition about the subject. I remember first learning about programming recursive functions, mind boggling in the beginning. After a while, it began to "sink in", that is, it became intuitive, I no longer had trouble "seeing" how it worked. The key is familiarity, something once strange is now digestible.

So there's nothing binary about intuition. It covers many forms of thought and incorporates both reasoning and emotion, a "feel" for the problem in question. There are limits to our abilities (at the highest level we call it genius), but there are no clear boundaries.


I agree that there are various patterns of thought that can be labelled intuition. I would like to add that you have to practice to develop any of these different types of intuition. This, I think, is what Feynman is saying, and he is emphasizing the type of intuition (running the experiment in your head) that is very useful for physicists.

In every field there is probably a different type of intuition that is useful. In abstract math maybe 'getting used to it' very fast is very useful. In geometric math maybe being able to visualize objects in space is useful. In programming maybe seeing the code run in your head is useful.


You don't see the code run in your head, though. You get a feel for how this design pattern works at a glance (intuition), or you carefully step through the code like a debugger.


This was reaffirming. I spent several hours last night unable to sleep, trying to solve a problem. As you describe: I had a design and kept throwing scenarios at it and debugging it.


> "we never really understand higher mathematics, we just get used to it"

Correct me if I am wrong, but I believe this statement is attributed to von Neumann. It has really stuck with me over the years, just because of how true it is.

I agree that what is important is self-consistency, precision and rigor. Intuition is for the birds. As you have stated, it is ambiguous and often deceptive. Mathematical abstractions enable us to circumvent these pitfalls.


I just started reading this book, "Thinking Physics", which teaches physics the way Feynman talks about, by trying to build intuition. It asks short questions like Feynman's about the table, and expects you to think about the answer for a while, before giving it to you and presenting the physics behind it.

http://www.amazon.com/Thinking-Physics-Understandable-Practi...


Even very advanced mathematics has this feel. You can get a ways with formulae and the like, and it may even be that there is no other language in which to express yourself, but ultimately you build intuition and, as they say, the best proofs arise as a way of making something perfectly obvious apparent.

There's a strong argument to be made here that all human reasoning is embodied. This isn't the same as a weaker claim, which would now be left trying to explain why humans can talk about experiences we can never have, like the behavior at the surface of the sun. Instead, I mean more fundamentally that our brain is one designed to operate in our universe, and that our universe plays by many nice rules. Things decompose and move, time flows, causality dominates. It increasingly appears that all of the tools needed to understand our universe are within these simple forces your brain can't not build an intuition for. To fail to do so would lead to a catastrophic inability to function.


While this is a lovely sentiment and a perfectly fine way to approach the sciences (and also engineering-oriented disciplines), not all physicists think that way.

There is a lot of beauty in deriving laws of nature without relying on physical intuition. A lot of beautiful results are based on purely requiring laws to be self-consistent and seeing that only one possible law is self-consistent. For instance, check out what Scott Aaronson says about probability in quantum mechanics. While Feynman in his famous lectures just says that quantum mechanics is counter-intuitive and you are not supposed to truly understand it, Scott Aaronson uses math to explain how to correct your intuition (and I am stressing, this is not just about learning the math, it is about basing your intuition on the math, not on everyday experience).


QM is counter-intuitive only insofar as your intuition is naturally wired for a classical mechanics world. Given enough practice, QM eventually becomes second nature too. You pretty much _need_ to go through that process to function at the highest levels.

There's a good post by Terence Tao about this topic, I think it was posted here some time ago:

https://terrytao.wordpress.com/career-advice/there%E2%80%99s...


The math behind quantum mechanics is intuitive if you view it as a generalization of probability theory to allow negative numbers. The counter-intuitive part is that the resulting system turns out to be a good model of the world.


http://www.scottaaronson.com/democritus/lec9.html

The above is a good accessible exposition of this perspective. Though it would help to have some acquaintance with ordinary probability as well as basic linear algebra.
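
A toy NumPy illustration of that "probability with minus signs" view (the matrices, state and values here are my own example, not taken from the lecture): a classical "fair coin" applied twice stays a 50/50 mixture, while the amplitude version (a Hadamard-like matrix with a negative entry) applied twice cancels a branch and returns you to the starting state.

    import numpy as np

    start = np.array([1.0, 0.0])                     # definitely in state 0

    coin = np.array([[0.5, 0.5],
                     [0.5, 0.5]])                    # classical fair coin as a stochastic matrix
    hadamard = np.array([[1.0, 1.0],
                         [1.0, -1.0]]) / np.sqrt(2)  # "fair coin" on amplitudes, note the minus sign

    print(coin @ coin @ start)                       # [0.5 0.5]  -- still maximally uncertain

    amplitudes = hadamard @ hadamard @ start
    print(np.abs(amplitudes) ** 2)                   # [1. 0.]    -- interference undoes the "flip"

In this framing, classical probabilities are nonnegative and sum to 1, while amplitudes may be negative (with squares summing to 1); the minus sign is what makes cancellation possible.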


Wrong on two levels:

Our physical intuitions are Galilean, not classical, mechanics (that is, they are non-Newtonian). For example, our intuitions tell us that an object set in motion eventually slows down and stops. That's Galilean (also termed "folk physics" or "naive physics", usually by cognitive scientists).

Most of us had to study formal physics to advance to Newtonian classical mechanics.

Quantum mechanics (QM) is completely non-intuitive, at least as far as intuition about either folk physics or Newtonian classical physics is concerned. IIRC Feynman says as much in his book "QED: The Strange Theory of Light and Matter", Chapter 28 beginning:

"I think I can safely say that nobody understands quantum mechanics. —Richard Feynman

The quantum theory is not explicable in commonsense terms..."

Of course Feynman used his real-world physical intuition all the way through to his most abstract work. In one case he characterised the internal structure of the proton as being like "marbles inside a tin can." Try and write those equations!


There's a lot wrong with what you wrote, but the most glaring is your use of the term "Galilean mechanics".

> Galilean invariance or Galilean relativity states that the laws of motion are the same in all inertial frames. Galileo Galilei first described this principle in 1632 in his Dialogue Concerning the Two Chief World Systems using the example of a ship travelling at constant velocity, without rocking, on a smooth sea; any observer doing experiments below the deck would not be able to tell whether the ship was moving or stationary.

> the term Galilean invariance today usually refers to this principle as applied to Newtonian mechanics, that is, Newton's laws hold in all inertial frames.

https://en.wikipedia.org/wiki/Galilean_invariance


I'm not sure why you're being downvoted; you're absolutely right that, intuitively, humans seem to understand physics in an Aristotelian manner. It's quite obvious why; Aristotelian physics is an efficiently computable approximation of real physics that works OK for caveman-level technology on Earth.

This is very similar to how scientists for a long time believed in classical Newtonian mechanics, because it's a reasonable approximation of the truth at large scales and low velocities.


His comment has technical errors, but more importantly it just doesn't address the parent's argument. It's a non sequitur.


I think you're calling Aristotelian physics "Galilean mechanics."


You're right, my bad! I should have said "Aristotelian mechanics" instead of "Galilean mechanics". Galileo shared those physical intuitions but, being the scientist he was, corrected them.


There is a way to use intuition in QM too, although it obviously isn't the physical kind (tables and flipping them over). It tends to be mathematical, but it certainly isn't rigorous chugging through equations. The canonical examples of this are the test questions you see in undergraduate QM courses that show you a graph of a potential in space and then ask you to sketch the wavefunction in different regions. You have to have an idea of what a wavefunction does in certain regions: does it oscillate? decay? how many nodes does it have? etc.


I can sketch you a graph of (x-1)(x-2)(x-3)(x-4), but it isn't because of intuition. It is because I know how to read off key characteristics of a polynomial (roots and asymptotic behaviour) and can smoothly interpolate (simple interpolation is maybe intuitive).


Sure, but I'm not sure how that is related. What I'm essentially talking about is having a DE of the form -i\hbar \phi_t - b\phi_{xx} + V(x)\phi = 0 and graphing \phi as you vary V(x). That is not obvious unless you know how solutions of the equation behave for different simple cases of V(x), plus continuity. That, I argue, is intuition.
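
To make that concrete, here is a rough finite-difference sketch in NumPy of the time-independent form, -\phi'' + V(x)\phi = E\phi in arbitrary units (the step potential, energy and starting values are just example choices of mine): \phi oscillates where E > V(x) and the exponential solutions take over where E < V(x), which is exactly the qualitative picture the exam question asks you to sketch.

    import numpy as np

    dx = 0.01
    x = np.arange(0.0, 20.0, dx)
    V = np.where(x < 10.0, 0.0, 4.0)      # potential step: V = 0 on the left, V = 4 on the right
    E = 1.0                               # energy below the step height

    # Discretized phi'' = (V - E) * phi, integrated left to right.
    phi = np.zeros_like(x)
    phi[0], phi[1] = 0.0, 1e-3            # arbitrary small starting values
    for i in range(1, len(x) - 1):
        phi[i + 1] = 2 * phi[i] - phi[i - 1] + dx**2 * (V[i] - E) * phi[i]

    # Left of the step phi keeps changing sign (oscillation); right of the step the
    # exponentially growing/decaying solutions dominate.
    print("sign changes left of the step:", int(np.sum(np.diff(np.sign(phi[x < 10.0])) != 0)))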

Actually, what you mentioned sounds like intuition to me. You didn't hand-plot the polynomial point by point, so you are relying on intuition over rigour, which I think is what the OP's quote from Feynman referred to.


Related to this, there's a fair amount of research on the connection between embodied cognition and the learning of physics (and mathematics), as well as examples of 'embodying' physical forces and laws.

An example from Hans Freudenthal is a standard physics question. If there are books on top of a table, what are the forces acting on the books? Almost all students draw the downward force of gravity, but some forget about the force the table exerts upward back on the books. You can have students get on their hands and knees and put books on their back, or have them lie on their back and hold up books with their hands and arms. They 'embody' the table, in a sense. When you add a second book, you feel that you have to exert more effort (which correlates with force) to hold the books up.
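
The balance the students are being asked to feel, as a tiny sketch (the masses here are made-up numbers): the upward push from the table, or your back, equals the total weight, and it grows when you add the second book.

    g = 9.81                              # m/s^2
    book_masses = [0.5, 0.8]              # kg; add or remove a book and watch the load change
    normal_force = sum(book_masses) * g   # upward force needed to hold the books in place
    print(normal_force)                   # about 12.8 N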


The problem I have always had with the analogy of a person holding up the books with their back is that a table doesn't have to do "work" to hold up the books, but a person does. The table isn't burning any calories.


The table has tension, though. As opposed to a table made of paper, which would not exert enough force on the book: the book would overcome the table's tension and set the table in motion until the floor exerted an upward force.


Yeah, so think of your muscles as temporarily pretending to act like wood (rigid); it just so happens that humans have to expend energy in order to carry out this feat.


This is discussed by Feynman as “physiological” work. See: http://www.feynmanlectures.caltech.edu/I_14.html#Ch14-S1


A problem is that the reverse is also true. Simple example: stretch a rope around the earth so that it fits tightly, then add 1 m to its length. Can a cat go under the rope? (It's always a cat.) Intuitively that seems like a "No". The simplest of formulae says "Yes": the rope rises by 1 m / (2π) ≈ 16 cm. Here, pretty much everyone has to resist intuition and trust the numbers.

I think the best way to conceptualize intuition is as some sort of unconscious sub-process carried out by the brain at all times, one that has to be shut down or de-prioritized most of the time to allow hard, "empirically validated" processes to dominate the global activity (similar to early-life "pruning"). As with all neuronal processes, these need to be activated to remain available.
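
The arithmetic behind that "Yes", as a quick sanity check (the Earth radius below is just for concreteness; the clearance does not depend on it):

    import math

    earth_radius = 6_371_000.0                    # metres
    circumference = 2 * math.pi * earth_radius
    new_radius = (circumference + 1.0) / (2 * math.pi)
    print(new_radius - earth_radius)              # ~0.159 m, i.e. about 16 cm of clearance everywhere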


Intuitively it can also be answered: suppose you pinch the rope, so it is tight (you step on it and pull from between your two feet, for example). You'd have a handle for the earth that will go to about just above your knee. A cat fits!


Good point. That's the power of thought-experiments! It's about stimulating intuition.


Maybe you're right, but your example is not a good one. Your problem reduces to: does 1 m of rope provide enough to cover a cat? You actually have greater than 1 m of rope to work with, since the cat has width.

Maybe a better example is: what is the best way to accelerate a ball horizontally from a fixed height? I.e., how can we most efficiently translate potential energy into kinetic energy in the horizontal direction?


Does anyone have a good theory for what they think intuition is?

I'm interested in process. What is the mapping algorithm that transforms problem into solution? While intuition appears to be magic, I believe that there is a very concrete process happening in our subconscious.

My personal guess is that we're transforming the problem into a format more suited for different modules of our brain to process.

For the table problem we apply 2 transformations. One is referred to as a calculation, the other as intuition.

"Calculation" involves transforming the table into symbols (mathematics) for the language/logical part of our brain to process; the other transformation involves turning the table into a sort of fuzzy 3D visualization for the imaginative/spatial part of our brain to analyze.

Both transformations yield transformed results. Symbols yield a symbolic/numeric solution, a fuzzy 3D visualization yields an equally fuzzy 3D solution.

Funny how two different processes that are both seemingly systematic are called different things. Why is one called intuition and the other not?


I am no expert, but it seems to me that intuition is what the machine learning algorithm develops as a model.

The model itself cannot explain how it arrived at its conclusions; that could only be done with the training data, which is long gone.

Similarly to machine learning, humans can develop intuition about things by learning and training in the subject (that's why I believe rote learning is actually quite useful).

Just as with intuition, the model crucially depends on (and varies with) the input data (experience), and there can be different models, but successful models (those that give good results on the training data) are quite similar in appearance.

Of course, the big disadvantage of intuition is that you cannot explain it to others, even if it works. They have to believe that you are an expert and made correct judgements (that you have the correct model). That's why science (and especially mathematics) has tried to formalize the process, so that people can double-check the reasoning and don't have to rely on expert authority. That's why the two processes have different names, I think.


I agree. This is a good theory.


The idea of intuition is explored in psychology through Dual Process Theory. See "Dual-process accounts of reasoning" below.

https://en.wikipedia.org/wiki/Dual_process_theory#Dual-proce...

Kahneman and Tversky spent a good deal of time trying to identify the major heuristics employed (and biases caused) by the more "intuitive" of the two systems used for thinking.

(You may note that at the time this paper was published there was no concept of dual processes, and the phrases "System 1" and "System 2" are never used.)

http://psiexp.ss.uci.edu/research/teaching/Tversky_Kahneman_...


I think intuition is just using (or trying to use) a higher level of abstraction to approach a problem.


The technique served me very well doing physics and engineering in an earlier life. I struggle now with the chaos of software systems because I'm unable to "feel" what's going on.


It seems like a common intuition in software is the "code smell", where you can just look at some code and tell if it will be "maintainable" or "good", for some value of that.

I've come to think of it more as "familiarity", though: if you're familiar with writing stable and maintainable systems, you'll get the right smells. But some people have opposite feelings about systems, and it seems they are the people who have the opposite type of experience as well.

Other types of intuition I think are based mostly on pattern recognition. If you have a system that makes a set of choices that follows a well known pattern to you, your brain can start predicting what has been done where, and with enough confirmation, can start feeling confident about the behavior of other parts not yet scrutinized. Once again, if things are done in an unfamiliar way, all that evaporates and one needs to fall back to looking at the low level strategies and techniques.


I think there are two separate issues here. One is 'just' another way of understanding or visualizing a concept or process: "look the problem over and see if you can understand the way it behaves, roughly, when you change some of the numbers". Isn't this exactly what you do with a debugger when you step through the code? Or if I'm designing a data structure or algorithm, I have to play around in my head, or with pen and paper, before even thinking about going near a keyboard.

The other issue is 'allowing' oneself to imagine and visualize these things (math / physics equations / software). I don't think the point is to restrict yourself to things for which you have physical intuition; it's more that it's easier to go from physical experience -> intuition about physics phenomena that are closely related to physical experience -> intuition about math/physics/software that has no counterpart in our physical experience as humans, e.g. quantum mechanics.


As hard as the actual calculations may be, the main feature of a mechanics is that it is conceptually simple.

Now, software just isn't. Managing complexity requires a completely different intuition from exploring a simple space.


That's what this is about: http://worrydream.com/SeeingSpaces/


This is true of programming too. You're better off learning how to code first and then learning the theory. Once you've got a few simple projects under your belt, data structures and algorithms will make a whole lot more sense.

You need to learn arithmetic before you learn algebra.


I disagree. Programming languages are just formalisms for writing down the theory, and because they're made in part for machines to understand, they are not optimal for humans. I think you're better off learning the theory first using some high-level pseudocode before you battle with the quirkiness of real-world languages.

Data structures and algorithms are made to solve real problems. You don't have to be able to code up those problems to understand them and see how the algorithm works. Some of the cleverest theoretical CS guys that I know freely admit that they're bad programmers and couldn't implement the algorithms that they describe in their papers.


I think this is interesting because the implications go beyond just physics and mathematics.

Sometimes we learn things in one domain that could be easily applicable in others, yet since we've never practiced said things in another context, it's hard for us to make this connection. Often, all we need is a little nudge in the right direction, which could open to us a new world of insights that we didn't have before.


I had a bad anxiety attack one time. The room I was in felt like it was undulating; everything started to become fluid-like. I can't explain it, and should probably draw it, but years later I still wonder if that's the way the real world actually looks.

Never told anyone--out of fear it could happen again.


Intuitions are insights (hints) from so-called ancient, non-verbal (pre-linguistic) instinctive "knowledge" or "genetic memory". It is not only the kind of knowledge by which birds "know" how to make nests, or people know to run out of a building when an earthquake happens (without any prior training), but also intuitive knowledge about the nature of reality and the properties of the physical environment, which was "trained" before any language, even before the human species evolved (we share "pre-cortex" brain centers with our ancestors, which were shaped by the environment). This is where intuitions come from.


That is "instinct", a different concept from "intuition", thought overlapping in some aspect.


It strikes me as kind of silly the way he tries to separate the mathematics (i.e. what he calls simple calculations) from the physical intuition. All mathematics beyond, say, sophomore year in college is very intuitive, but at an abstract level (the earlier stuff too; the early calc/stats classes are just taught to scientists and engineers in a way that drops the intuition and makes it about pure mindless calculation). Mathematicians are primarily intuitive creatures who have simply developed, over years of hard work, intuition about abstract objects that are far more difficult to intuit about than, say, a table.


I would be interested to know more about how people think in the language of mathematics when deriving and reading mathematical formulas, as I really struggle with it if I don't get the intuition.


Well, as a working mathematician I find that the hard part is typically coming up with the right definitions. Once you have the right concepts clearly defined (in terms of things you already know), the actual new ideas tend to arise with ease.

Similarly, it is often the case that new fields of mathematics arise from someone defining a new concept that was previously imprecise. Once you have the right language to discuss something, discovering its properties becomes much more straightforward.


Mathematicians use and think about math in a way that is almost completely orthogonal to the way physicists use and think about it. To physicists math is more like a natural language, and definitions are conveniences. This is why mathematicians often say physicists can't do math: by their definition of math this is correct.

For physicists the math becomes a language and a safety net and a set of heuristics that let us simplify the problem to the point of being able to reason about it effectively. A great deal of what we use math for amounts to book-keeping. The human imagination is as capable of dreaming up impossibilities as it is incapable of dreaming up the way the universe actually is, and math helps us avoid doing the former while we use systematic observation, controlled experiment and Bayesian inference to figure out the latter.

Because "thinking about the mathematical representation of physical reality" is such a profoundly unnatural, unintuitive act, and because the math is so strict and simple, it is very hard to for us to use it to imagine impossibilities, whereas if you have a conversation with a layperson you will find they almost instantly run off the rails into nonsense because they don't have the math to keep them on track. So laypeople believe in perpetual motion machines and the like with surprising ease, because they "just make sense" to their intuition (which maps pretty well to Aristotle's physics).


Isn't this obvious? Why does someone have to say it?


The table is obvious because it's used as an example. Developing intuition for something like, say, relativity is not as obvious.


Relativity is impossible to imagine and intuit, as are particles and quantum phenomena. Physicists always tell the public "don't try to imagine" these things, precisely because that's the natural and obvious thing to do: to imagine and try to intuit.

The student in the example is probably crazy or fake.



