I would hard disagree that undergrad level Analysis or even just the trickier corners of vector calculus are within the bounds of what programmers can easily pick up without dedicated and guided study. Everybody's gangster until they have to parameterize some bullshit helical structure in R^3.
Comparable levels of programming, what we expect of CS juniors, are regularly picked up by "the guy who is good with Excel" in office settings as it's mostly a function of experience and exposure, not theory.
And now my worthless anecdotal evidence: I taught myself into professional programming and it was a simple matter of banging my head against a wall until shit started working. The feedback loop, "did the thing crash or not", permitted me to learn on my own. I wouldn't even begin to understand how to teach myself Stokes' theorem or some shit, and I have zero ability to author the proofs required to reach the conclusions higher level mathematics are built on.
I think you're hitting the nail on the head here. Something about the learning process makes programming much easier to pick up.
What if we had something similar for mathematics?
Rapid feedback, error messages, maybe even linters and highlighting for the "mathematical syntax".
I've thought about this before and I think tools like this could unlock math for a lot of people, and also increase the effectiveness of professional mathematicians.
When learning math / seeing others learn math, I've noticed that simple errors such as typos often slow down or hinder understanding of the subject.
Take a look at the Natural Number Game! [1]
It does exactly that: "Rapid feedback, error messages, maybe even linters and highlighting for the "mathematical syntax"."
After you get the hang of the system, you can play with the interactive theorem prover behind it: Lean [2]. There are also plenty of other interactive theorem provers (Coq, Isabelle, HOL, Mizar, Metamath, ...) but Lean has a lot of traction amongst mathematicians at the moment.
There are no limits to the math you can do with this. There is mathlib [3], the main mathematical library. It covers a lot of undergraduate material [4], and plenty of stuff beyond that [5]. The community has even covered some state-of-the-art research math in Lean [6a, 6b].
You are very welcome to hang out on the leanprover Zulip [7] and ask questions about the Natural Number Game or anything else that is Lean-related.
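To give a flavour of the feedback loop, here is a minimal Natural Number Game-style proof in plain Lean 4 (no mathlib needed; the theorem name is mine). After every tactic, Lean either shows the remaining goal or reports an error on the spot:

    -- Zero is a left identity for addition, proved by induction.
    -- Each tactic either advances the goal or produces an error,
    -- which is exactly the "rapid feedback" being described.
    theorem zero_add' (n : Nat) : 0 + n = n := by
      induction n with
      | zero => rfl                         -- 0 + 0 reduces to 0
      | succ k ih => rw [Nat.add_succ, ih]  -- rewrite, then close with the IH

Delete the "ih" at the end and Lean immediately shows the goal it can no longer close; it's the same crash-or-not loop, just for proofs.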
Is it the learning process, or the subject itself?
Programming works with manmade abstractions, carefully designed to have very few interactions, to keep mutable state contained, and to have all the parts structured in a hierarchy without recursion.
In math you have systems of equations that all reference each other. And they all hold at the same time: there are no steps, no time, no lines of code, just five equations that all interact.
The atomic pieces are much larger and tied together and it inherently seems to require being able to fit nontrivial ideas in your head.
Programming lets you design a complex architecture one tiny piece at a time without considering other pieces.
Isn’t math just manmade abstractions as well? I’m not saying they’re equally hard (I think maths is harder too), but you seem to be overly simplifying programming, under the assumption that the system you are working with and building on is actually well designed, rather than what is actually more likely (a munge of SOME clean design with many layers of hacks on top).
I think Maths is harder because the abstractions are higher level, have less intuitive bases, the feedback loop is longer and doesn’t have robust testing. I don’t think the abstractions themselves are that much harder in general, but getting an intuition and doing anything useful (correctly) at a really high level of abstraction is quite difficult.
My view is that math is man made. It's a mash of notation for various ideas, more or less unambiguous, more or less rigorous. Some are more well designed than others. It's closer to natural language than programming languages.
Not every programming language has a concept of time. Sometimes that's a good thing, it depends on what you're used to.
Math arguably has less mutable state than most typical programing languages.
I'd love to play around with such tools, but I think they'd only get you so far before they'd start to become a hindrance.
The linter in mathematics is whether the other mathematician (whoever you're proving to) knows what you mean. If you're locked into a rigidly defined syntax, an obvious line of questioning is: what's not expressible in this syntax?
I fear that by the time the tooling was agreed on, built, and taught in schools, you'd have something like APL, which might be cool to code in, but from which the mathematical conversation would have moved on a while ago. Efforts like that, after all, are how math becomes engineering.
Consider, for instance, Russell's theory of types, which was interesting math at the time and now strikes the student with an engineering background as "pretty much just Java" (or any "normal" statically typed language).
If you're learning math for career reasons rather than just pure curiosity, engineering is the main/possibly only place you'd use it besides statistical analysis.
Yeah that sounds about right. I was more musing about the philosophical boundary between math and engineering (math being a creative pursuit and engineering being about outcomes).
I can't quite pin it down, like with a definition, but I'm tempted to say that if it has a linter it's not math anymore, even if it once was.
> I wouldn't even begin to understand how to self-teach myself Stokes Theorem or some shit
Input it into a proof assistant, and rely on the same sort of feedback "does the computer accept your proof, or get stuck". The hard job of formalizing stuff for this purpose has seen significant progress, e.g. by the Lean mathlib project.
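To make the "accepted or stuck" loop concrete, a toy Lean 4 example (nothing to do with Stokes itself; the statement is just for illustration):

    -- An attempt that gets stuck: rfl cannot prove commutativity,
    -- and Lean rejects it on the spot.
    --   example (a b : Nat) : a + b = b + a := by rfl
    -- An attempt that is accepted, by appealing to the library lemma:
    example (a b : Nat) : a + b = b + a := Nat.add_comm a b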
> Input it into a proof assistant, and rely on the same sort of feedback "does the computer accept your proof, or get stuck". The hard job of formalizing stuff for this purpose has seen significant progress, e.g. by the Lean mathlib project.
As a math teacher who disagrees with the premise of the GGP post ("This couldn't be more wrong. … I'm sorry, but mathematics is orders of magnitude more intensive and difficult than most programming"), and who thinks that any good programmer can learn mathematics (of course there are code wodgers out there who don't really understand their craft of programming, and so can't translate that knowledge into an understanding of mathematics), I think I also disagree with this. I've never tried it, but I can't imagine someone learning about Stokes's theorem in anything like this way. One of the many axes along which I imagine this failing is that the state of human readability in proof assistants is, well, let's say less well developed than the, cough, stellar state of the art in compiler error messages.
But, more importantly, you can, at least in principle, know every single step in a proof of Stokes's theorem without understanding in any real sense why it's true—and a proof assistant in particular will force you into the weeds of minutiae that absolutely do not help to build any intuitive picture—and, even if you manage in the process to piece together that understanding of why it's true, you will never thereby gain an understanding of why it's interesting (e.g., among other things, its connections to physics and the entrée it offers to differential geometry).
The flip side of a proof assistant's 'minutiae' (and there's plenty of room to disagree wrt. whether paying attention to those minutiae helps with gaining a better, more accurate intuition!) is its ease of refactoring a proof. A proof assistant can instantly tell you whether a seemingly nicer, better-abstracted proof B really manages to prove the same thing as proof A, something that's very hard to do without the use of precise formalized statements and automated checking.
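A toy sketch of that in plain Lean 4 (made-up theorem names): the same statement proved by hand and then "refactored" down to a library lemma, with Lean confirming that both proofs establish exactly the same thing:

    -- Proof A: associativity of addition, spelled out by induction.
    theorem add_assoc_by_hand (a b c : Nat) : (a + b) + c = a + (b + c) := by
      induction c with
      | zero => rfl
      | succ k ih => rw [Nat.add_succ, Nat.add_succ, Nat.add_succ, ih]

    -- Proof B: the "refactored" version, delegating to the library lemma.
    -- If B proved a subtly different statement, Lean would reject it.
    theorem add_assoc_refactored (a b c : Nat) : (a + b) + c = a + (b + c) :=
      Nat.add_assoc a b c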
Self-teach Stokes' theorem by inputting it into a proof assistant? Are you serious? That is very inefficient; the OP was talking about learning the kind of vector calculus taught in a first calculus sequence. I think just watching a short YouTube video and doing a few exercises will work and is a proven method. Proofs of theorems are very often much more complicated than applying them (understatement intended).
If you don't understand the proof of a theorem, you haven't really "learned" it in any real sense. Wrt. doing computational exercises in vector calculus, that requires knowing the "rules of the game", which is also something that you can test precisely in a proof assistant.
First, of course you can understand a theorem without knowing the proof! Uniqueness of prime factorization, for instance, is notoriously tricky to prove, but it would be a stretch to say people who haven't majored in maths haven't learned it in any real sense.
Second, even if you wanted to understand the proof of a theorem, doing it with a proof assistant is an atrocious way to go about it.
Many proofs of theorems in calculus would require topology to understand. You're suggesting that you cannot be competent in vector calculus without knowing the proofs at a professional level. I think I, along with probably everyone else, will have to disagree with that.
I would quibble with whether this is exactly equivalent.
In programming I knew I needed to sort a list or find a most efficient path because some practical problem I was trying to solve demanded that I do that. Frequently I had a basically crap but working independent solution before I learned the names "Dijkstra" or "A*". I independently discovered that I needed virtual interfaces before I knew them by that name ("I wish pointers to parent classes could call implementations in subclasses"), and then discovered language facilities for polymorphism and OOP.
Without formal or at least guided instruction I would never think to move towards or discover "I wonder if there's a relationship that makes these double integrals of curls of vector fields easier to solve for".
Programming has a high coupling between necessity, experience, and theory. In mathematics that coupling is much, much, much looser. Self learners in programming regularly re-discover and re-implement, typically less efficiently, all sorts of fundamentals of CS. The equivalent in mathematics rarely happens post-algebra.
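(For reference, the relationship about curls hinted at above is the classical Stokes/curl theorem: the circulation of a vector field F around the boundary of a surface S equals the flux of its curl through S,

    \oint_{\partial S} \mathbf{F} \cdot d\mathbf{r} = \iint_{S} (\nabla \times \mathbf{F}) \cdot d\mathbf{S}

which is exactly the kind of result you're unlikely to stumble on without guidance.)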
I think your comparison is a bit unfair. Essentially, CS is as hard as mathematics because it is mathematics.
For example, take any good static analyzer that implements abstract interpretation. It generally works using Galois connections, which is just abstract algebra.
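For a concrete flavour of the idea, here's a toy Python sketch (a made-up sign domain, not any particular analyzer's API): the abstraction map and the concretization map are the two halves of a Galois connection, and each abstract operation must over-approximate its concrete counterpart.

    # Toy "sign" abstract domain: concrete sets of ints are abstracted
    # to one of NEG, ZERO, POS, TOP. alpha/gamma are the two halves of
    # the Galois connection; abs_mul over-approximates concrete *.
    NEG, ZERO, POS, TOP = "NEG", "ZERO", "POS", "TOP"

    def alpha(values):
        """Abstraction: the most precise sign describing a set of ints."""
        signs = {NEG if v < 0 else ZERO if v == 0 else POS for v in values}
        return signs.pop() if len(signs) == 1 else TOP

    def in_gamma(sign, value):
        """Concretization, as a membership test: is value in gamma(sign)?"""
        return (sign == TOP
                or (sign == NEG and value < 0)
                or (sign == ZERO and value == 0)
                or (sign == POS and value > 0))

    def abs_mul(a, b):
        """Abstract multiplication: a sound over-approximation of concrete *."""
        if ZERO in (a, b):
            return ZERO
        if TOP in (a, b):
            return TOP
        return POS if a == b else NEG

    # Soundness on a few samples: every concrete product x * y lies in
    # the concretization of the abstract product of the abstractions.
    xs, ys = {-3, -1}, {2, 5}
    assert all(in_gamma(abs_mul(alpha(xs), alpha(ys)), x * y)
               for x in xs for y in ys)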
Dijkstra's algorithm or A* came pretty early in the history of CS. It would be fair to compare their difficulty to something similar in mathematics, say some basic results in Euclidean geometry.
CS may be mathematics but programming certainly isn't
If you're discussing pure CS (the thing you can write down in a book, and for which a computer is a largely theoretical device), then sure, CS is mathematics.
If we're talking about the practical reality that CS majors in America today are trying to achieve, and that their undergraduate programs are trying to prepare them for, that's becoming a working programmer, and it has very little relation to mathematics.
A* is already intermediate level programming. CS is math because it's what we call the parts of programming that are math.
But so much of programming isn't. It doesn't require deep understanding, static analysers are advanced level things that are way beyond what many working programmers ever encounter.
Do you have more information on this approach? Sounds very interesting. I've read and toyed a little with things like Lean and I'm interested in that field but the barrier seems a bit high (without pre-existing knowledge) to just "input" a theorem and toy with it.