To be honest, Category Theory is the last thing a software developer ("working hacker") should worry about learning. It is indeed a fun subject; it has grown into a large branch of algebra, it has connections with functional programming (lambda calculus), etc. But if one wants to be a good programmer, there is so much really useful, important (and difficult) stuff to learn before you can even come to appreciate what, if anything, knowing Category Theory can do to improve your software engineering skills, that I would dare to characterize it as an obscure subject that in all likelihood is not worth spending your time on.
Have you actually watched the presentation? If not, I encourage you to do so. Category theory and abstract algebra help us understand what it means to build composable and reusable software components with well-defined semantics. There is already ample focus on algorithms and discrete math in university courses, but very little of the above. This is despite the fact that modern software development is becoming more about composing existing code and less about reimplementing classic algorithms. So given the already ample focus on the topics you suggest, I think we can forgive Phil Wadler for giving a very brief glimpse into a promising subject that most hackers would dismiss as obscure/irrelevant.
I think that if you are the sort of person who really likes design patterns, the sort of applied category theory that the likes of Erik Meijer, Bartosz Milewski, Rúnar Bjarnason or Gabriel Gonzalez talk about is actually a good next step after the GoF Design Patterns :-)
There isn't a specific branch of mathematics you need to know in order to learn Category Theory. Its main role is in providing a way of describing other parts of mathematics (similar to the way set theory and certain logics are commonly used to formulate definitions and proofs). In other words, it's a kind of 'foundation,' so there isn't really anything under it. (This is my understanding anyway—please correct me if I'm wrong here.)
That said, it will probably be rather difficult to comprehend if you haven't been exposed to a good amount of other work in pure mathematics first. If you wanted one other, more introductory subject, I'd recommend checking out Abstract Algebra (much other mathematics is expressed in its terms too, so independent of Category Theory it would probably be time well spent: https://en.wikipedia.org/wiki/Abstract_algebra).
The purely mathematical subjects that are deemed important for programmers are parts of the so-called Discrete Mathematics. The Wikipedia article on it lists what is included. It is a vast collection of topics, but discerning what would be more important (or interesting) for you should not be hard.
Depends on what you do. I work in robotics, so probability and linear algebra are the two most indispensable subjects I know. Graph theory might be a little more generally applicable.
At the end of the day, you don't need a lot of math to write CRUD apps. The more math you know, though, the more approaches you'll have to formulate and solve problems.
As a working programmer that is curious about category theory, I feel like this misses the mark completely. I watched this yesterday, and this is what I remember:
1) I will tell you why category theory is relevant to you, the working programmer.
2) Here are the 2 most important data structures that you don't use in your day job.
3) Look at how they relate in this mathy not applicable way.
4) Conclusion: lambda calculus is awesome.
What I would like to see is:
1) I will tell you why category theory is relevant to you, the working programmer.
2) Here is a realistic but simplified problem and a solution written in python that is obviously the wrong approach to solving the problem. With category theory, we know this is the wrong approach because the math is wrong (and don't go into the math).
3) Here is another example that seems like a decent approach to solving the problem. With category theory, we know this is the wrong approach because the math is wrong (and don't go into the math). Doing it our way has these benefits...
4) Conclusion: learn category theory to become a better programmer.
If you can't do that, find some other way to illustrate the concrete benefits of learning category theory. If there aren't any concrete benefits, it's like telling baseball players they should learn physics to have a better understanding of baseball.
This is fantastically timed and very well explained. Our company is working deeply with Category Theory right now as we believe programming in the future will find itself focused primarily on the task of orchestrating AI to code.
Why don't things like Category Theory and Set Theory update their terminology to align with more modern ways of describing these concepts?
Why is a "function" an "arrow"? A "function" implies an operation that is part of a larger system of organized operations. An "arrow" implies stone age technology. One is infinitely more descriptive, in my opinion.
Category theory uses weird terminology all over the place partly because it needs to distance itself from concepts that are "sort of like in set theory but not quite" and there are a lot of concepts that need naming thusly. Browse https://ncatlab.org/nlab/show/category+theory and you'll see a ton of examples of this.
W.r.t the "modernity" of the terms: I think category theorists are glad if they find any word that fits the concept in their minds. "Mathspeak" is not supposed to be intuitive to outsiders (and everyone is such an outsider at one point), but once you get more into it the names do start to make sense.
Generally mathematics is a field that breaks everyday-intuition constantly. Having a technolect, a "foreign language in a language" if you will, helps coping with that.
> "Mathspeak" is not supposed to be intuitive to outsiders
Isn't that antithetical to the entire concept of an organized science? As I understand it, we're meant to make communicable models of the world around us that have predictive properties. If it's not possible to easily communicate your model, then there isn't much of a point in using the terminology.
Think back to Newton's notation for calculus. It is an improper and poor way to convey the information being discussed [0]. I don't know anyone who doesn't use Leibniz's notation in practice.
I don't think anyone can convince me that "Mathspeak" should be unintuitive.
Some examples of categories whose arrows aren't functions:
(0) Any monoid can be viewed as a category with a single object. The arrows from the object to itself are the monoid's elements.
(1) Any preorder can be viewed as a category such that, between any two objects, there is at most one arrow, and that arrow exists precisely when the source is less than or equal to the target.
(2) Given a directed graph, or more generally a quiver[0], there is a small category[1] whose objects are the quiver's nodes, and whose arrows are the paths (finite sequences of edges) from a source node to a target node.
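To make example (0) concrete, here's a minimal Python sketch (names are mine, purely illustrative): the monoid of integers under addition viewed as a one-object category, where the arrows are plain values, not functions, and composing arrows is just the monoid operation:

```python
# Sketch: the monoid (int, +, 0) viewed as a category with a single
# object "*". The arrows "* -> *" are the monoid's elements; composing
# two arrows applies the monoid operation, and the unit 0 is the
# identity arrow.

class OneObjectCategory:
    def __init__(self, op, identity):
        self.op = op              # associative binary operation
        self.identity = identity  # identity arrow (the monoid's unit)

    def compose(self, f, g):
        # f and g are arrows, i.e. plain monoid elements here
        return self.op(f, g)

M = OneObjectCategory(lambda a, b: a + b, 0)

# Composition is associative, and the unit really is an identity arrow:
assert M.compose(M.compose(2, 3), 4) == M.compose(2, M.compose(3, 4))
assert M.compose(M.identity, 5) == 5
assert M.compose(5, M.identity) == 5
```

Nothing in the category's definition cares that these arrows aren't functions; all that's required is associative composition with identities.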
State machines & object dependencies [and function composition, and... categories] are often drawn as directed graphs. When drawn like that, the circles/boxes represent objects, and the arrows represent arrows. The name comes from that kind of representation.
Then from what I'm understanding it's not an "arrow", it's something closer to a "step" or a "process". Some method to get from one location to another. An arrow means nothing without background information. If I say it takes "10 arrows to get from A to x(A)" as a way to explain there are 10 things happening in the function X, then wouldn't it be better to say it takes "10 steps/processes to get from A to x(A)"?
> The name comes from that kind of representation.
The representation, the way I conceive it, is a mode for displaying computation or data in a spatial manner. As such, relating a computational or mathematical process to spatial terminology would give you a "step", not a stick with a rock at the end of it.
Does step help when thinking of the objects as individual data elements, the arrows as cons-links, map being a functor that respects cons-links [i.e. does not change the structure of the list]? The problem here is that implying process, ordering, or anything else is too narrow of a definition for what an arrow/morphism can be used to represent. That's why arrow (a thing that points to another thing), or morphism (a thing that helps define the shape) are actually better terms. Because Category Theory isn't about the things, it's about the relationships between the things, how these can be composed, and how higher-order operators that respect these relationships [and the new ones they represent for even higher-order operators] can form new relationships which can also be composed.
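That "map respects the structure" claim can be checked directly. A small Python sketch (illustrative, using the built-in `map`): the functor laws say map preserves identity and composition, and it never changes the shape of the list:

```python
# Sketch: list's map as a functor. It respects the list's structure
# (same length, same positions) and satisfies the functor laws:
#   map(id) == id        and        map(g . f) == map(g) . map(f)

def compose(g, f):
    return lambda x: g(f(x))

xs = [1, 2, 3]
f = lambda x: x + 1
g = lambda x: x * 2

# Identity law: mapping the identity function changes nothing.
assert list(map(lambda x: x, xs)) == xs

# Composition law: mapping a composite == composing the maps.
assert list(map(compose(g, f), xs)) == list(map(g, map(f, xs)))

# Structure is preserved: same number of elements, in the same order.
assert len(list(map(f, xs))) == len(xs)
```

The laws hold regardless of what f and g do to the individual elements, which is exactly the "respects the relationships, not the things" point above.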
Given any set with a partial ordering you can construct a category whose objects are the elements of that set and that has an arrow from a to b if and only if a <= b.
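A quick Python sketch of that construction (names are mine): arrows are just witnessed pairs (a, b) with a <= b, and composition is transitivity of the ordering:

```python
# Sketch: a partially ordered set as a category. There is (at most)
# one arrow a -> b, existing exactly when a <= b. Composing arrows
# is just transitivity: a <= b and b <= c imply a <= c.

def arrow(a, b):
    """Return the unique arrow a -> b if a <= b, else None."""
    return (a, b) if a <= b else None

def compose(f, g):
    """Compose f: a -> b with g: b -> c to get a -> c."""
    assert f is not None and g is not None and f[1] == g[0]
    return (f[0], g[1])

assert arrow(1, 3) == (1, 3)        # 1 <= 3, so the arrow exists
assert arrow(3, 1) is None          # 3 <= 1 fails: no arrow at all
assert compose(arrow(1, 2), arrow(2, 3)) == (1, 3)  # transitivity
```

Again, the arrows here carry no computational content at all; they only record that a relation holds.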
The word "function" has a sort of precise meaning in math as a rule mapping each NUMBER in some domain to some NUMBER in a codomain. The word "arrow", however, can refer to any directed relation, including "less than" or "divides" or whatnot, as long as the arrows are composable and composition is associative.
Sorry for my harsh comment; here's what I'm thinking of...
In math, you are dealing with many different kinds of objects, not just numbers. In fact, one of the big realizations that led to modern mathematics is that not all mathematical objects can even be coded as numbers!
Your remark about "arrow" being more general than "function" is correct. But functions do not map only between sets of numbers; they map between arbitrary sets, some of which contain elements that are not numbers, or even encodable as such.