As a programmer and hobbyist math reader, I found category theory to be very unrewarding (and I gave up on it) because of the lack of interesting theorems and lemmas. My takeaway was that there's the Yoneda lemma and really nothing interesting before you reach that. Like, CT describes a set of rules but very little emerges from those rules.
My complaint has nothing to do with whether CT is useful or practical. By contrast, abstract algebra can be taught (e.g., in Herstein's books) as pure abstraction (very dry and without presenting any real-world connections), but you reach Lagrange's theorem right away -- which is a simple-but-awesome result that will wake up your brain. You reach Cayley's theorem and others quickly, and each is more exciting than the last. And this is all while still in the realm of purely-abstract math.
> As a programmer and hobbyist math reader, I found category theory to be very unrewarding (and I gave up on it) because of the lack of interesting theorems and lemmas. My takeaway was that there's the Yoneda lemma and really nothing interesting before you reach that. Like, CT describes a set of rules but very little emerges from those rules.
The fact that counterintuitive results are so rare in CT -- that you rarely find yourself proving something difficult and mostly spend your time constructing effective tools -- is a feature, not a bug!
McBride's "Don't touch the green slime!" is a great paper/saying about this principle: work with the compiler, not against it. The compiler is very dumb, so it only understands trivial things.
There's a legacy pseudo-machismo culture of 'hard is good' in mathematics (though really, it's transversal to all of science) which rewards working hard rather than working smart.
> As a programmer and hobbyist math reader, I found category theory to be very unrewarding (and I gave up on it) because of the lack of interesting theorems and lemmas. My takeaway was that there's the Yoneda lemma and really nothing interesting before you reach that. Like, CT describes a set of rules but very little emerges from those rules.
This is correct. The only somewhat interesting result in category theory is the Yoneda lemma; everything else is machinery for diagram chasing. It's ubiquitous in the same way that short exact sequences are ubiquitous -- and about as interesting as short exact sequences.
I think most hobbyists or those who are intellectually curious would be better off studying physics or engineering first, and then picking up the math with applications along the way. For example, you can study multivariable calculus and then complex analysis, which naturally leads to questions of topology as you try to solve integral equations, and then obstruction theory can come up. Lots of cool stuff to study that is actually interesting. I would never foist category theory on someone who finds math to be interesting and enjoyable -- that would be like throwing a match that just caught flame into the rain.
My formulation is "if it's not trivial, it's probably not good" when I implement the necessary functions for a "type class" (to take a haskellism) to work. If your monad's `bind` implementation doesn't look like it could have been written by someone who only looked at the function's type, it's probably not right. Thanks for the link to the mcbride-ism:
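To make that concrete, here's a minimal sketch (my example, not from the thread): for `Maybe`, the type of `bind` essentially forces the implementation -- someone who only looked at the type would write exactly this.

```haskell
-- bind for Maybe: the types leave you almost no choice.
-- Given a Maybe a and a function a -> Maybe b, there is only one
-- sensible way to produce a Maybe b.
bindMaybe :: Maybe a -> (a -> Maybe b) -> Maybe b
bindMaybe Nothing  _ = Nothing  -- no a to feed to the function
bindMaybe (Just x) f = f x      -- the only way to get a Maybe b

main :: IO ()
main = do
  print (bindMaybe (Just 3) (\x -> Just (x + 1)))          -- Just 4
  print (bindMaybe (Nothing :: Maybe Int) (Just . (+ 1)))  -- Nothing
```

If your implementation needed cleverness beyond this, the types are probably telling you something is off.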
> As a programmer and hobbyist math reader, I found category theory to be very unrewarding (and I gave up on it) because of the lack of interesting theorems and lemmas. My takeaway was that there's the Yoneda lemma and really nothing interesting before you reach that
One of the most interesting things in CT is adjoints. They happen literally everywhere. For example, static analysis via abstract interpretation is an instance of an adjoint (which in this case is called a Galois connection). Free structures also give rise to adjoints.
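As a sketch of what that looks like in code (the tiny sign domain here is my own toy example, not anything from the thread): an abstraction function `alpha` and a concretization function `gamma` form a Galois connection when `alpha s ⊑ a` holds exactly when `s ⊆ gamma a`.

```haskell
import qualified Data.Set as Set

-- A toy sign domain for abstract interpretation.
data Sign = Bot | Neg | Zero | Pos | Top deriving (Eq, Show)

-- Partial order on the abstract domain: Bot below everything, Top above.
leq :: Sign -> Sign -> Bool
leq Bot _   = True
leq _   Top = True
leq a   b   = a == b

-- Abstraction: summarize a concrete set of ints by its sign.
alpha :: Set.Set Int -> Sign
alpha s
  | Set.null s    = Bot
  | all (< 0) xs  = Neg
  | all (== 0) xs = Zero
  | all (> 0) xs  = Pos
  | otherwise     = Top
  where xs = Set.toList s

-- Concretization: the set an abstract value stands for
-- (restricted to the finite universe [-10..10] so we can compute).
gamma :: Sign -> Set.Set Int
gamma Bot  = Set.empty
gamma Neg  = Set.fromList [-10 .. -1]
gamma Zero = Set.singleton 0
gamma Pos  = Set.fromList [1 .. 10]
gamma Top  = Set.fromList [-10 .. 10]

-- The Galois connection law: alpha s `leq` a  <=>  s `subsetOf` gamma a.
-- Holds for every s drawn from the universe [-10..10].
galoisLaw :: Set.Set Int -> Sign -> Bool
galoisLaw s a = (alpha s `leq` a) == (s `Set.isSubsetOf` gamma a)
```

The law is exactly the adjunction condition: abstracting and then comparing in the abstract domain agrees with comparing concretely.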
Since there's a lot of discussion in this thread, my opinion about category theory and programming is:
- You don't need category theory to be a programmer (or a good, 10x, or 100x programmer)
- It makes sense to learn category theory because it's beautiful, just as it makes sense to study art, literature, or philosophy. Not because it's useful, but because it gives you pleasure.
As with every abstraction, its value is in the value it brings you. That sounds a bit tautological, but I think that's the crux of the matter. If you like abstraction, and abstractions help you think, you are going to find value in them. If you feel they bring you nothing, then there is no need for you to use them.
My problem with category theory (my limited study of it, several years ago) was that it describes and defines a list of properties, but those properties don't combine to reveal any unexpected, exciting results.
Again, with my abstract algebra example from above: after just a couple of basic abstract algebra definitions, you learn about subgroups. Simple enough, and not particularly exciting so far. But then you quickly reach Lagrange's theorem, which shows that in a finite group, the order of any subgroup divides the order of the parent group. And that means... if the group has a prime number of elements, then it can't contain any (non-trivial) subgroups at all! That's super cool and not at all obvious from the original definitions of groups and subgroups. And it keeps going from there, with a brain-punishing amount of results that all just emerge from the basic definitions of a group, ring, and field.
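That consequence is easy to check by brute force (a quick sketch of my own, not part of the comment above): compute the order of the cyclic subgroup generated by each element of Z_n under addition mod n.

```haskell
import Data.List (nub)

-- Order of the cyclic subgroup of Z_n generated by g:
-- keep adding g mod n and count the distinct elements reached.
subgroupOrder :: Int -> Int -> Int
subgroupOrder n g = length (nub (take n (iterate step g)))
  where step x = (x + g) `mod` n

main :: IO ()
main = do
  -- In Z_12, every subgroup order divides 12 (Lagrange) ...
  print [subgroupOrder 12 g | g <- [1 .. 11]]
  -- ... but in Z_7 (prime order), every nonzero element
  -- generates the whole group: no proper nontrivial subgroups.
  print [subgroupOrder 7 g | g <- [1 .. 6]]
```

Of course a brute-force check is no proof, but it makes the "prime order means no nontrivial subgroups" punchline tangible.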
In contrast, category theory just felt empty. This is the definition of a functor. This is a monad. Here are different kinds of morphisms. Etc.
Dunno, maybe I just needed to keep reading. But my sense from flipping forward through my CT books is that it was mostly just more concept definitions.
>My problem with category theory (my limited study of it, several years ago) was that it describes and defines a list of properties, but those properties don't combine to reveal any unexpected, exciting results.
IMO, the main value of category theory is unifying existing math knowledge in one theory. That is, it helps you see connections between seemingly unrelated areas of math -- in some sense it's pure abstraction.
- An operation may be "functorial", meaning that it preserves more structure than perhaps originally thought. For instance, the "fundamental group" operation is indeed functorial, which means that it acts in a nice way on continuous maps as well as on topological spaces. Other examples are the tensor product, vector-space duality, forming function spaces in certain categories, etc.
- Two categories may be isomorphic, though this notion is so strict that it is rarely interesting.
- Two categories may be equivalent, which while a weaker notion than isomorphism, is sufficient for most things which can be expressed in categorical language to be true for both categories. This is helpful when one category is well-understood and the other one is an object of present interest. (One application is showing that the category of representations of a fixed quiver Q is Krull-Schmidt, by showing that it's equivalent to another category with the Krull-Schmidt property).
- A functor between two categories may admit a left-adjoint. It then immediately preserves all limits (it's "continuous") which immediately means that a great deal of structure gets preserved by it.
- A functor between two categories may preserve all limits. It may (under some circumstances, expressed by the "adjoint functor theorem") therefore admit a left adjoint. This may be a non-trivial fact of interest in its own right. It's related to dualities in optimisation and game theory.
- There are isolated results, like the Seifert-van Kampen theorem (which states that the fundamental-group functor preserves certain pushouts), which would be difficult to express without categorical language.
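The "functorial" point in the first bullet has a direct programming analogue (a Haskell sketch of my own, not from the list above): a `Functor` instance acts on functions (morphisms) as well as on types (objects), and the functor laws say precisely that identity and composition are preserved.

```haskell
-- fmap lifts a function a -> b to f a -> f b. The functor laws
--   fmap id == id
--   fmap (g . h) == fmap g . fmap h
-- are exactly "the operation preserves the structure of morphisms".
double :: Int -> Int
double = (* 2)

succ' :: Int -> Int
succ' = (+ 1)

main :: IO ()
main = do
  print (fmap (double . succ') (Just 3))       -- compose first, lift once: Just 8
  print ((fmap double . fmap succ') (Just 3))  -- lift twice: same answer, Just 8
```

The composition law is what lets you refactor chains of `fmap`s without changing behavior -- structure preservation cashing out as a program equivalence.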
Ultimately, Category Theory appears to be a language for compressing complicated facts about structure-preservation of operations.
Category theory is helpful in advanced algebra, and helpful too in advanced topology, and is in its absolute element in any area which combines the two, like algebraic topology and algebraic geometry. In the latter two areas, you've got lots of functors between algebraic categories, lots of functors between topological categories, and even functors going between algebraic categories and topological categories.
There's also categorical logic, which is where the CS-adjacent stuff seems to be found. But this is of little interest to everyday programming, and is very forbidding for people who lack the requisite mathematical maturity. Only the most dedicated should enter its harsh plains, and should expect to gain nothing without tremendous efforts and sacrifice.
Where do adjoint functors occur in CS? They occur in advanced algebra, and they occur in topology, but where else? And indeed, the fact that they preserve limits/colimits may help speed up communication and thinking. But I'm not seeing CS connections here.
>The first example is trivial, and serves as an illustrative but useless example (which I don't need).
Yep. Most examples of adjoints are trivial if you know the field of math where they arise. The interesting part is why they happen almost everywhere.
Another one is the free monoid, with its pair of free/forgetful functors. It sounds a bit mathematical, but for a type T the free monoid is just List<T> in a programming language with generics.
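A sketch of that correspondence in Haskell (the names here are mine): the universal property of the free monoid says that any function `a -> m` into a monoid `m` extends uniquely to a monoid homomorphism `[a] -> m` -- and that extension is exactly `foldMap`.

```haskell
import Data.Monoid (Sum (..))

-- The free monoid on a is [a], with (++) as the operation and [] as unit.
-- Universal property: any f :: a -> m into a monoid m extends uniquely to
-- a monoid homomorphism from the free monoid. The extension is foldMap.
extend :: Monoid m => (a -> m) -> ([a] -> m)
extend = foldMap

main :: IO ()
main = do
  -- Extending Sum turns list concatenation into addition:
  print (getSum (extend Sum [1, 2, 3, 4 :: Int]))          -- 10
  -- Homomorphism: extend f (xs ++ ys) == extend f xs <> extend f ys
  print (getSum (extend Sum ([1, 2] ++ [3, 4 :: Int])))    -- also 10
```

"Free" means the list remembers everything (the order and multiplicity of elements) and imposes no equations beyond the monoid laws, which is why the extension is unique.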
As a general programmer (e.g., a React programmer), there's not a lot of value. However, as I said, if you work in certain areas, e.g., programming languages or logic, it might even be a requirement to be productive.