I’m a math major. I learned category theory in school.
I think of category theory as an easy way to remember things. If a concept can be expressed in category-theoretic terms, it can often be stated in a very simple way that’s easy to remember. However, if you try to teach math using category theory from the beginning, it feels a little like trying to teach literature to someone who can’t read yet.
Anything directly useful from category theory can be expressed as something more concrete, so you don’t NEED category theory, in that sense. But category theory has the funny ability to generalize results. So you take some theorem you figured out, express it using category theory, and realize that it applies to a ton of different fields that you weren’t even considering.
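To make that concrete, here’s one standard example of the generalizing effect (my example, not the commenter’s): the categorical product, one definition that specializes to familiar constructions in several fields at once.

```latex
% The universal property of a product (standard definition, stated here
% as an illustration of "prove it once, apply it everywhere").
An object $P$ with projections $p_1 : P \to A$ and $p_2 : P \to B$ is
\emph{the product} of $A$ and $B$ if for every object $X$ with maps
$f : X \to A$ and $g : X \to B$ there is a unique $h : X \to P$ such that
$p_1 \circ h = f$ and $p_2 \circ h = g$.
% Specializing the same definition:
%   in Set  -> the Cartesian product,
%   in Grp  -> the direct product of groups,
%   in Top  -> the product topology,
%   in the divisibility poset of positive integers -> gcd.
% Anything proved from the universal property alone holds in all of these.
```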
The effort-to-reward ratio of category theory is not especially high for most people. It did give us some cool things which are slowly filtering into day-to-day life: algebraic data types and software transactional memory are two developments that owe a lot to category theory.
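For anyone curious what those two look like in practice, here is a minimal Haskell sketch (Haskell assumed only because both ideas are most visible there; the comment doesn’t name a language, and the STM part relies on the stm package that ships with GHC):

```haskell
import Control.Concurrent.STM (TVar, atomically, newTVarIO, readTVar, writeTVar)

-- An algebraic data type: a sum ("one of these shapes") of products
-- ("a shape together with its measurements").
data Shape
  = Circle Double            -- radius
  | Rectangle Double Double  -- width, height

area :: Shape -> Double
area (Circle r)      = pi * r * r
area (Rectangle w h) = w * h

-- Software transactional memory: a composed transfer that either happens
-- entirely or not at all, with no manual locking.
transfer :: TVar Int -> TVar Int -> Int -> IO ()
transfer from to amount = atomically $ do
  a <- readTVar from
  b <- readTVar to
  writeTVar from (a - amount)
  writeTVar to   (b + amount)

main :: IO ()
main = do
  print (area (Circle 1), area (Rectangle 2 3))
  acctA <- newTVarIO 100
  acctB <- newTVarIO 0
  transfer acctA acctB 40
  balances <- atomically ((,) <$> readTVar acctA <*> readTVar acctB)
  print balances
```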
Almost every field you can study in mathematics can be thought of as a “category”, and when you take a math class in undergrad, it usually focuses on one or two specific categories. Category theory is the study of categories in general. As in, “What if, instead of studying a specific field of math, you studied what these different fields had in common, and the relationships between them?”
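For reference, the definition being gestured at (the standard one, not from the comment itself):

```latex
% A category $\mathcal{C}$ consists of:
%   - a collection of objects;
%   - for each pair of objects $A, B$, a collection of morphisms
%     (arrows) $f : A \to B$;
%   - an associative composition $g \circ f$ of compatible arrows;
%   - an identity arrow $\mathrm{id}_A : A \to A$ for every object $A$.
% Familiar instances: $\mathbf{Set}$ (sets and functions) and
% $\mathbf{Grp}$ (groups and homomorphisms).
```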
The part where it gets wacky is when you realize that “categories” is, itself, a category.
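Stated a bit more carefully (the usual construction, with the usual size caveat):

```latex
% $\mathbf{Cat}$ is the category whose objects are (small) categories and
% whose morphisms $F : \mathcal{C} \to \mathcal{D}$ are functors, with
% functor composition as composition and the identity functor as identity.
```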