Fantastic endeavour.
I'm someone who has had to force myself through mathematics for my career. I needed extra help at school, but had no real trouble in any other class.
I thought I was just too dumb for maths. I came to realise that my problem was not with mathematical concepts, which it turns out are fairly easy, but with the language of maths itself and how it is taught (with some exceptions, such as Strang, who is just a wonderful educator). Programming in particular really helped me to learn. That's not to say that certain areas aren't still difficult, but reformulating away from the traditional notation and teaching style has helped me.
There is so little mathematical notation to learn, though. A typical high school course has a couple of symbols and a few keywords to learn; that can't really be the main reason you had a hard time learning maths when other courses have hundreds to thousands of words you have to learn?
There are some big problems with mathematical notation.
The first is that it is terse--often excessively so. The clearest way this manifests is the sheer predilection for single-letter variable names. Every variable in every equation invariably gets reduced down to a single letter, and sometimes you need to go so far as distinguishing based on font face, boldness, or whatever random diacritic you can throw on that variable to keep the name a single letter. Even when it doesn't reach that extreme, it makes skimming difficult, because you now have to go back through the prose to figure out what 'H' means, and spotting the definition of a single-letter variable in prose is really easy, right? (No, no it is not).
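To make the 'H' complaint concrete (my example, not the parent's): Shannon entropy is conventionally written H = -Σ p_i log2(p_i), where code would carry the definitions along in the names.

```python
import math

# Spelled-out version of H = -sum_i p_i * log2(p_i).
# In the formula you must hunt through the prose to learn what
# 'H' and 'p' mean; here the names travel with the computation.
def shannon_entropy(probabilities):
    """Entropy of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.25, 0.25]))  # 1.5
```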
Another factor that can make it infuriating to read mathematical notation is the sheer overloading. Subscript notation is a good one for this--if you see a single letter subscripted by something else, it could mean getting a particular entry in a sequence of entries, it could be an index into a vector or a matrix or a tensor (and are you getting a row or a column or what? all of the above!), or it might be a derivative. Or maybe it's an elaboration of which of the possible terms that get summarized to that letter is actually being referred to (e.g., E_a for activation energy).
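Laid out side by side (my own summary of the readings listed above), the same subscript shape covers four unrelated things:

```latex
% One subscript shape, four unrelated readings:
$a_n$  % the n-th term of a sequence
$v_i$  % the i-th component of a vector
$f_x$  % the partial derivative of f with respect to x
$E_a$  % a mere label: "activation" energy
```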
Of course, the flip side of the notation problem is the fact that some concepts have multiple different notations. Multiplication is a common example: to multiply a and b, you can use a × b, a · b, or screw any symbol altogether and just write ab (thus a(b) is also a way to indicate multiplication, which totally has no potential for confusion with applying a function [1]). Differentiation is the worst; virtually every introduction to derivatives starts with "oh, there are multiple notations for this"--is it really necessary that so many need to exist?
[1] There was one paper I once read where I literally had to spend several minutes staring at a key formula trying to figure out if I was looking at function application or multiplication.
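For reference, a quick LaTeX summary (mine) of the variants in question:

```latex
% Multiplication of a and b, four ways:
$a \times b \qquad a \cdot b \qquad ab \qquad a(b)$

% The derivative of y = f(x), four common notations
% (Leibniz, Lagrange, Newton, Euler respectively):
$\frac{dy}{dx} \qquad f'(x) \qquad \dot{y} \qquad D_x f$
```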
> is it really necessary that so many need to exist?
Yes, it is necessary - often different contexts require different notation, sometimes as an abbreviation, sometimes not.
I do not believe that mathematical notation can be improved (a poor analogy would be trying to improve, say, the English language itself); what does evolve, though, is the understanding of mathematical objects and mathematical frameworks - which can often lead to simplifying things and, sometimes, notation, too.
> I do not believe that mathematical notation can be improved (a poor analogy would be trying to improve, say, the English language itself)
I strongly hope you mean that heavy-handed top-down approaches aren't likely to work. Because this reads as though you're saying we've somehow reached the optimal point on every axis for both mathematics and general-purpose communication.
The (original) purpose of mathematical notation is to facilitate (to some degree, automate) reasoning and calculations (mathematicians tend to conflate them) on paper. For the existing frameworks it’s already as good (efficient) as it gets, given the 2-dimensional nature of said paper. Would adding new characters beyond the existing set (which is already quite large), introducing new alphabets in addition to the Latin and Greek (OK, we have the aleph), adding more font styles and sizes - would any of that constitute an improvement worthy of note? We have already exhausted the freedom given to us by the 2-dimensional paper-space as a computational medium - consider, for example, machinery that heavily relies on graphs (diagram chasing; Dynkin diagrams, etc.) or tables (matrices, tensors, character tables of groups, etc.).
And that is perhaps why so many of us like programming: there is exactly one correct way to interpret code. And stylistic differences (different notation for the same thing) are usually discouraged with style guides.
> there is exactly one correct way to interpret code
No, there are different compilers, languages, or even language versions or compiler settings. Programming notation is way harder and more confusing to learn than the math equivalent just because there are so many more styles, languages, versions, etc. And we all know that learning all of those symbols and keywords isn't the main difficulty with learning programming or new programming languages.
Of course different languages have different syntax, but it is a pretty hard requirement that within a specific version of a language the syntax rules are both explicit and consistent.
Those are issues with sloppily written math papers, not with high school math. This repo is just high school and intro college course math; there isn't much overloading or confusion there.
I think a lot of difficulty arises in math notation because most people (myself included) read a lot of math but don't actually write much to express their own thoughts. I'm decent at engineering math, but grinding through problems, reading papers, and writing are all different skills.
I'm very similar. Show me a big mathematical formula and I'll have to dig into it for a while to understand it. Show me the code that implements the formula and it makes sense a lot quicker (for me anyway).
I am the exact opposite. I look at code written by some analyst or whatever at work, and it's usually borderline unreadable if you ask me. Pretty print it as latex and I can visualize what it actually does almost immediately.
Besides, I don't really care about "the code", i.e. the mechanics of computation, but rather the properties of that expression that exist regardless of how it is communicated, i.e. its shape or how it behaves asymptotically, etc.
I have had this same experience of finally understanding a mathematical idea by seeing it implemented in a programming language. You can always eventually understand a program, because there can't be any ambiguity or the compiler couldn't decide how to compile it. Math is not supposed to have ambiguity, but it arises because there are too many conventions and assumptions in the notation (granted, sometimes they are saying something broader than can be expressed in code). But higher mathematics as a field seems to me like a programmer who uses single-letter variables, never writes comments, and really likes clever bitwise operators.
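As one concrete instance (my example, not the parent's): the limit definition of the derivative stops being abstract once you can actually run it.

```python
def derivative(f, x, h=1e-6):
    """Approximate f'(x) with a symmetric difference quotient --
    the limit definition of the derivative, made executable."""
    return (f(x + h) - f(x - h)) / (2 * h)

print(derivative(lambda t: t**2, 3.0))  # ~6.0, since d/dt t^2 = 2t
```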
Every math paper I've read or written has written out definitions for all the notation it uses; many even have a section called "Notations". Don't you read math papers? Or do you mean applied math papers? Applied math is more like engineering, so they get sloppier with notation; I guess the same thing happens when computer scientists do maths.
But that isn't really a problem with math; it's a problem with sloppy people. Or you just didn't read the notation sections explaining everything.
As a programmer and occasional electronics hobbyist most math I encounter is in the context of whitepapers, or physics/EE texts or similar. I don’t really read papers in pure mathematics. Perhaps the difficulty is due to sloppiness by non professional mathematicians.
I found myself in a similar boat (I'm a software engineer with no college degree).
I did well in school with geometry, algebra, and pre-calculus, but I did so by memorizing, not by understanding.
A decade later I ended up going through a lot of Khan Academy videos to refresh myself and then diving into discrete math and linear algebra textbooks. It really helped me to finally understand the core mathematical concepts that we use in programming algorithms.
Math generally suffers from a great deal less precision than Haskell. It’s always a treat for me as a Haskell user to find a math or physics text that’s truly notationally precise. Only authors who really understand what they’re doing at a deep level can pull it off.
This is a very good point; I would also stress that mathematics itself is highly "functional" (in its modern form, anyway), and that it is stateless (unlike its representation in an imperative language).
I really like this. It took me a minute to figure out that having GitHub in dark mode was making the embedded math images effectively invisible. Just in case this trips anyone else up, switching to light mode will make this doc make a lot more sense :)
This document should really be published to a static site, where there's an opportunity to aid legibility. Personally I find the characters-per-line count too high, and there's no value in the GitHub branding encompassing the document.
Ouch! For me, looking at the formula I can instantly see what it is, whereas I have to think a bit when I look at the code.
Given how often I've used summations in physics/maths, I shudder at how verbose it would be in code. Something I could write in one line may take up a whole page.
The code on the right is an explanation or implementation you could jump into. The replacement for everyday use would be the function name, such as `summation` or `product`. A computer language is already used to communicate almost all math these days: LaTeX. A slightly lighter syntax, perhaps with optional Unicode names, built on a symbolic math package, would not be any more cumbersome.
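A minimal sketch of that point, treating `summation` as the hypothetical helper the parent names: the one-line formula S = Σ_{i=1}^{n} i² stays one line in code too, once the loop is behind a name.

```python
# A hypothetical `summation` helper hiding the loop machinery.
def summation(lower, upper, term):
    """Sum term(i) for i = lower .. upper, inclusive."""
    return sum(term(i) for i in range(lower, upper + 1))

S = summation(1, 10, lambda i: i**2)
print(S)  # 385
```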
Haha, I posted it, actually. I knew it would make it here again; this is where I first saw it years ago. I've seen the repo get about 1k stars since then, and I had to disable my likes notifications.
Most LaTeX symbols are named after their shape (e.g. \nabla or \Rightarrow) rather than their meaning, or after just one of their meanings (e.g. \times or \frac or \over or \binom). So I'm not sure Detexify really helps that much.
I am currently paying a math teacher to teach me derivatives all over again. I know how to solve them, since I am from a science background, but I don't know what they really mean.
Even simple things like dy/dx: is that d times y over d times x (multiplication), or is d a function? It took me two frustrating classes even to get the teacher to understand what I was trying to ask.
Math as algorithms would make self-learning so much easier and more fun. I would love the field of mathematics if it was not so ambiguous.
> I would love the field of mathematics if it was not so ambiguous.
IME the ambiguity is usually intentional. Take dy/dx, for instance. In some sense, d/dx is acting as an operator (or "function", to use programming terminology) acting on the function y to produce y'. However, the intuition here is that we're calculating a slope (i.e. a ratio), and the particular ratio we want is the limit of [a sequence of] Δy/Δx. Both of these viewpoints are important to keep in mind to understand calculus, and the point of the notation is to aid that intuition. There are, as I'm sure you're aware, alternative notations for derivatives, and the d/dx notation survives because it suggests things beyond its formal definition.
I think this becomes the most clear when we consider the chain rule, dx/dy * dy/dz = dx/dz. We can't just "cancel out" the dys (outside of exotic approaches such as nonstandard analysis), but allowing that sort of visual pun in our notation makes the formula easier to remember and gives a sense as to why it might be true. And, if we didn't know the chain rule ahead of time, it would likely prompt us to try and prove it in the first place.
The bigger story here is that a lot of mathematics runs on human intuition, and finding a notation which communicates and facilitates that intuition is quite important.
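A small sketch of the "operator" viewpoint and the cancellation pun in code (my example, using sympy, with the variables renamed so the rule reads dz/dx = dz/dy · dy/dx):

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2   # y = f(x)
g = y**3   # z = g(y)

# d/dx acting as an operator on f, producing y' = 2x:
df_dx = sp.diff(f, x)

# Chain rule: dz/dx = dz/dy * dy/dx, with y = f(x) substituted in.
chain = sp.diff(g, y).subs(y, f) * df_dx    # 3*x**4 * 2*x

# Differentiating the composition directly agrees,
# as if the "dy"s had cancelled:
direct = sp.diff(g.subs(y, f), x)           # 6*x**5
assert sp.simplify(chain - direct) == 0
```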
I think it would be neat to have a browser add-on that would explain the meaning, use, and code equivalent of a piece of mathematical notation when the user hovers over it.
Why would you want to have something like this translated into code:
a + b
-----
c + d
Math, in this sense, is already a language, and what you see is already code. (So, basically, all you are talking about is translation, like, from Ruby to Python.)
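For what it's worth, here is the flattened rendering (my translation, not the parent's); it shows what the 2-D layout buys you for free:

```python
a, b, c, d = 1.0, 2.0, 3.0, 4.0

# The stacked fraction above, flattened into one line --
# the parentheses now have to do the job the layout did.
result = (a + b) / (c + d)  # 3.0 / 7.0
```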
It could be ordered from most to least likely, like search results. Multiplication is more common, so put it at the top. People will assume it's not the right thing and work down the list.
I really wish that science as a whole would just move on from symbols to algorithms; it would considerably reduce the total information one needs to know before starting to study fields with more esoteric symbolic notations.
I am not sure how you would replace the structural formulas used in organic chemistry, for example. Going beyond science, most engineering disciplines use schematic notations that you would find esoteric, yet which have no chance of being translated into algorithms in any meaningful way.