Here is one non-trivial insight: Every formal logic corresponds to a category.
When we do logical deduction, we have facts and rules. We can apply rules to facts to get more facts. More specifically, formal logic is built from situations where we have some fact P, some rule P => Q, and some conclusion Q.
In a category, we have objects and arrows. We can imagine arrows as mapping one object to another, or sending one object to another. We might have objects P and Q, and an arrow f : P -> Q relating them.
The connection continues. We can take multiple rules in logic and apply them sequentially, building a proof tree; we can take multiple arrows in a category and compose them sequentially, building a path. In some logics, there is an idea of ex falso quodlibet, or from the false fact, anything can be proven; similarly, in some categories, there are initial objects, which come with arrows from the initial object to every other object.
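The "rules compose like arrows" point can be made concrete in a few lines of code. This is my own toy sketch, not anything from a real proof assistant: implications are modeled as plain Python functions, and chaining two rules is literally function composition.

```python
# Toy illustration: implications as arrows, rule-chaining as composition.

def compose(g, f):
    """Compose two 'arrows' f : P -> Q and g : Q -> R into g . f : P -> R."""
    return lambda x: g(f(x))

# Two "rules", P => Q and Q => R, modeled as functions on string "facts".
p_to_q = lambda p: f"Q({p})"
q_to_r = lambda q: f"R({q})"

# Applying the rules sequentially is composing the arrows.
p_to_r = compose(q_to_r, p_to_q)
print(p_to_r("P"))  # R(Q(P))
```

The categorical laws (associativity of composition, identity arrows) then correspond to the fact that it doesn't matter how you parenthesize a chain of deductions.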
Just like how logical formalism eventually removed the possibility that logic is non-mathematical, category theory removes the possibility that logic is unstructured. Logic is actually incredibly highly structured, and those structures happen to coincide with structures in other parts of maths and physics.
Again though, how do I use this to solve everyday problems?
I've seen some vaguely cool stuff done with category theory in Haskell, e.g. composable tree parsing and traversal, but I've never seen anyone actually use any of this stuff to make a deliverable piece of software that is notably better than what developers can readily produce with traditional procedural languages.
I mean sure, I suppose it would be nice if the C# team added some sort of category theoretically "pure" tree processing sub-language akin to some bastard child of LINQ and XSLT, but... meh. I don't think it would see a lot of use in practice.
A few months ago, while employed at a shop, I wrote about a hundred lines of Python which performed a basic catamorphism on some XML, outputting a PNG. Nobody else on my team could conceive of this; they were mystified that I could just sit down and do it. It wasn't too hard, though, because I could picture the entire catamorphism in my mind.
The traditional approaches weren't just stymied by this problem, BTW; the other developers wanted to go shopping for off-the-shelf components which transform XML to PNG, as if that would get us the specific catamorphism we desired.
The nature of your everyday work will inform what you can do with category theory. I personally use category theory to design programming languages; it's something for which category theory gives a turn-the-key recipe:
* Pick some objects of interest
* Define a relatively free Cartesian closed category whose objects include your objects of interest
* Enrich the category with interesting arrows
* Extract a combinator basis
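To give a feel for the last step, here is a hypothetical, minimal combinator basis for a Cartesian closed category, modeled with Python functions and pairs. The names (`ident`, `compose`, `pair`, `fst`, `snd`, `curry`, `apply_`) are my own choices for illustration and are not taken from MarshallB.

```python
# A minimal CCC combinator basis, sketched with functions and tuples.

ident = lambda x: x
compose = lambda g, f: (lambda x: g(f(x)))

# Products
pair = lambda f, g: (lambda x: (f(x), g(x)))  # pairing: <f, g>
fst = lambda p: p[0]                          # first projection
snd = lambda p: p[1]                          # second projection

# Exponentials (the "closed" part)
curry = lambda f: (lambda x: (lambda y: f((x, y))))
apply_ = lambda p: p[0](p[1])                 # evaluation arrow

# Example: swap : A x B -> B x A, built only from the basis.
swap = pair(snd, fst)
print(swap((1, 2)))  # (2, 1)
```

The point of extracting such a basis is that every program in the language can then be expressed as a composite of these few combinators, which makes semantics and compilation tractable.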
The MarshallB programming language [0] is a good example of this recipe in action.
But the really interesting part here is the use of exact real arithmetic, not the category-theoretic part. Category theory is just used as scaffolding. I would also say it is dangerous to see things only through the category-theoretic lens, because no doubt, as with any generalisation, there will be plenty of situations where it does not provide the right generalisation.
Better scaffolding leads to bigger, better buildings. Don't knock the auxiliary and ancillary techniques; they're still necessary for this sort of work to be understandable.
Also, from my perspective, the exact real arithmetic isn't super-interesting because it's not new; Turing was doing this sort of thing when he first got access to computers. Legend has it that one of Turing's first recreational programs was computation of zeros of the Riemann zeta function, and ever since, there's been a tradition of exploring exact real arithmetic with computers and programming languages. MarshallB is one more incremental step forward in a big research programme of overcoming the inconvenient fact that we can't test real numbers for interesting properties with computers alone.
At no point have I advocated only category theory. Categories are always composed of some collection of arrows (and objects, but I prefer object-free presentations!), and those arrows are always homogeneous, if not homoousios, composed of a single substance. We can study that substance on its own. Perhaps an analogy to materials science would be appropriate. We ought to study each building material, be it sets, vector spaces, relations, graphs, or type theories, and also the architectural principles used to make buildings, like abstract algebra and category theory.
Yes, nothing against better scaffolding, as long as you recognise it for what it is. Arguably exact real arithmetic is older than inexact real arithmetic ;-) But that's not the point; I think the point is that it now becomes interesting to actually do computations with it. It helped me program solid modelling operations for triangle meshes, after I had despaired of doing it with normal floats. Switching to exact real arithmetic allowed me to see things more clearly, and once I had what I wanted running with exact real arithmetic, I could switch back to floats to make it faster.
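A small illustration of the exactness idea (my own example, not the mesh code from the comment above, and exact rationals rather than full exact real arithmetic): the 2D orientation predicate, which floats can get wrong for nearly collinear points, computed exactly with Python's `fractions.Fraction`.

```python
from fractions import Fraction

def orient2d_exact(a, b, c):
    """Sign of the cross product (b - a) x (c - a), computed exactly.
    Returns 1 for counter-clockwise, -1 for clockwise, 0 for collinear."""
    ax, ay = map(Fraction, a)
    bx, by = map(Fraction, b)
    cx, cy = map(Fraction, c)
    det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    return (det > 0) - (det < 0)

# Three exactly collinear points are reported as collinear (0),
# with no floating-point rounding to worry about.
print(orient2d_exact((0, 0), (1, 1), (2, 2)))  # 0
```

This is exactly the pattern described above: get the geometry right in exact arithmetic first, then swap in floats (with care) for speed.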
Very interesting. Could you describe: 1. what problem you were trying to solve? 2. what a non-category-theoretic solution might have looked like? 3. what a "catamorphism" enabled you to do?
XML files have a tree structure. A catamorphism on a tree, often called a fold in functional programming, is simply a function which recurses through the tree and collapses its data into a single summary value.
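Here is a minimal sketch of what that looks like in Python. The tree shape and the summary chosen are illustrative, not the actual XML-to-PNG program.

```python
# A tree catamorphism (fold): replace each subtree, bottom-up, with a summary.
# Trees are modeled as (tag, [children]) tuples for simplicity.

def cata(leaf, node, tree):
    """Apply `leaf` at leaves and `node` at internal nodes,
    collapsing the whole tree into a single value."""
    tag, children = tree
    if not children:
        return leaf(tag)
    return node(tag, [cata(leaf, node, c) for c in children])

# Example tree, standing in for a parsed XML document.
doc = ("html", [("body", [("p", []), ("p", [])])])

# One possible summary: count every element in the tree.
count = cata(lambda tag: 1,
             lambda tag, kids: 1 + sum(kids),
             doc)
print(count)  # 4
```

Swapping out the `leaf` and `node` functions gives a different summary of the same tree, which is what makes the pattern reusable.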
I can't share the details of the program with you, but hopefully this helps. Wikipedia has more information and an example at [0].
I suspect I share your frustration; the academics who teach math tend not to have a very good grasp on why it might be useful.
The point is (probably, maybe) that nothing in programming is intuitive; all the abstractions have to be learned for the first time somewhere. A lot of math education is about cataloguing common patterns, naming them, and providing an environment where students can check whether they have internalised a given pattern or not.
This allows incremental building of knowledge between different people. So for example, if someone is trying to build a Something with [Property Z] and person B points out "this is a [mathematical object X], therefore [theory Y] says the Something won't have [Property Z]", then the builder can understand the implications immediately and abandon the effort, or identify what needs to change to make the Something not an [object X]. This saves a lot of discussion and enables a lot of coordination. You can ask an expert to check your work for [object X]s if you never quite got confident in the concept.
A bit like how an engineer might look over another's design and say "the tensile stresses on that beam will be huge!" and the designer will be able to make a lot of assessments in an instant that would need a good half hour if the concept of tensile stresses wasn't part of a common language and the mistake-finder had to explain stress & strain from first principles.
So yeah; a lot of it probably is trivial but it saves hours of discussion and explaining out in the real world. And there is benefit in moving the explaining up early where it is dealt with in a 30:1 classroom setting and it doesn't slow down design work.
My view on this is that Category Theory is more of a framework for your brain rather than for your computer. As the founders of the theory had noted, categories were introduced as a basis for explaining what a functor is, and functors were needed to explain the phenomenon of natural equivalence. It is indeed practically useful to program your brain in such a way that you could better discern common patterns and be able to see that what you are trying to do is equivalent to something you’ve done before; being able to express this directly in code when possible (depends on the language) comes as an added benefit.
We're using applied category theory quite heavily in the design and implementation of the categorical query language CQL: https://www.categoricaldata.net . There category theory gave us algorithms that are missing from SQL (such as for left Kan extensions) and the ability to do extended compile time checking of queries against data integrity constraints; for most of CQL internals, it's unclear how you could make it work without category theory.
I have tried to learn category theory a few times, but it always quickly became mind-bogglingly boring with all the stuff that is trivial in any given setting, but which I find harder to digest in the general category-theoretic setting. I would say areas like logic have ample applications of category theory once you have truly digested its concepts. But it is such a boring way to get there :-)
The GP asked for nontrivial insights from applied category theory, not a recipe for use in everyday problems. I do not think every insight immediately leads to basic recipes that are used in day-to-day tasks. I haven't digested enough of the papers in the field yet to have a good answer for the GP, but I think you are not asking a fair question of a field as young as ACT.