
Talking about not understanding rigor in the context of calculus is silly.

Modern formalism exists because of calculus!… and because contrary results were being obtained about continuity and derivatives due to subtly different conceptions of those terms.

Enter the Weierstrass function:

> Weierstrass's demonstration that continuity did not imply almost-everywhere differentiability upended mathematics, overturning several proofs that relied on geometric intuition and vague definitions of smoothness.

https://en.wikipedia.org/wiki/Weierstrass_function
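
For reference, the classic construction (the one on that page) is the series

    f(x) = \sum_{n=0}^{\infty} a^n \cos(b^n \pi x),   with Weierstrass's original conditions 0 < a < 1, b a positive odd integer, ab > 1 + 3\pi/2

which is continuous everywhere, being a uniformly convergent series of continuous functions, yet differentiable nowhere.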




I mean, I'm open to the idea that pedagogically full rigour is not always required (depending on the audience)... although there will always be certain people who are not going to be satisfied with seemingly "intuitive" explanations such as "infinitesimally small"; people are different, after all.

But instead of saying "I'm not proving things rigorously because [reasons]", he's claiming that rigour doesn't really exist or has no importance, which is kind of crazy for the reasons you mentioned.


Yep — I’m a big proponent of explaining the intuition; I even go for intuition first, because it’s only by having a notion of what you want to discuss that you can make a meaningful formalism. After all, how do you know what the right formalism would be without an intuitive notion of what you’re trying to model?

But when you want to resolve a question like “what can we say about continuity and differentiation?”, suddenly the answer depends intimately on your formalism. And when we have conflicting intuitions about “continuous” and “differentiable”, the only way to resolve the conflict is for us both to formalize our intuition in a model, then compare the two to see where we disagree.
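
To make that concrete, here is the standard formalization (just the textbook epsilon-delta definitions, nothing particular to this thread):

    continuous at x_0:      \forall \varepsilon > 0 \; \exists \delta > 0 : |x - x_0| < \delta \Rightarrow |f(x) - f(x_0)| < \varepsilon
    differentiable at x_0:  f'(x_0) = \lim_{h \to 0} \frac{f(x_0 + h) - f(x_0)}{h}  exists

The Weierstrass function above meets the first condition at every point and fails the second at every point, which is exactly the kind of statement that geometric intuition alone never settles.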

- - - - -

As an aside, you can formalize the notion of infinitesimals, but it requires a lot more machinery which introduces its own quirks. And it was only because we formalized calculus (and tried to formalize all of math) that we had sufficiently advanced model theory. I won’t claim to be an expert, but the topic is Nonstandard Analysis. I believe similar constructions show up in game theory and quantum mechanics.

https://en.wikipedia.org/wiki/Nonstandard_analysis
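
As a rough sketch of what that machinery buys you (glossing over the ultrapower construction of the hyperreals *R): for a nonzero infinitesimal ε, the derivative becomes the standard part of an actual quotient,

    f'(x) = \operatorname{st}\left( \frac{f(x + \varepsilon) - f(x)}{\varepsilon} \right)

where st(·) maps a finite hyperreal to the unique real number infinitely close to it, and the transfer principle is what pins down which manipulations of ε are and aren't allowed.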


You can introduce nonstandard analysis, but it's not necessarily incredibly intuitive either, even if you skip a formal construction (which requires either ultrafilters or model theory). I think that's still a bit removed from just giving ad-hoc reasoning about "infinitely small" quantities without specifying precisely what you can and cannot do with them.


Rigour is over-emphasized in the teaching of Mathematics, to the detriment of its Understanding. Teaching should always be conceptual first, before representing the concepts formally and applying rigour.

The best example is Faraday vs. Maxwell. In one communication Faraday actually says it "frightened" him to see Maxwell's mathematics, but that his "Conclusions" were so clear that he could think and work from them. Faraday thought and worked with concepts and mental models, while Maxwell gave them a formal and rigorous representation.



