A monad is a monoid in the category of endofunctors, what’s the problem? (reddit.com)
203 points by joe_poole on Feb 10, 2019 | 105 comments



Knowing that a ring (in abstract algebra) is equivalently defined as "a monoid object in the category of abelian groups" is not going to help you with your first abstract algebra course. It makes things that are glaringly obvious from the usual perspective totally opaque, and means you have to define things like the tensor product of abelian groups properly before you can even define a ring. I don't know of a professor who would even mention this perspective during a first abstract algebra course. You get a much better intuition for what rings are by just using them.
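(For the curious, roughly what that compact definition unpacks to, and why the tensor product is needed at all: a monoid object in (Ab, ⊗, ℤ) is an abelian group R together with homomorphisms

    \mu : R \otimes_{\mathbb{Z}} R \to R, \qquad \eta : \mathbb{Z} \to R

making the usual associativity and unit diagrams commute. Unwinding the tensor product, μ is exactly a bilinear multiplication and η picks out the element 1, so you only recover the familiar ring axioms after all that machinery is in place.)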

Mathematicians who work in category theory have often seen many, many examples of algebraic structures before they get abstracted away to "a monoid object in the category of blah", and many mathematicians still find it an exercise to unpack these compact definitions back to something more intuitive, or more familiar. And even so, most of the time identifying your favourite object as some abstract category thing doesn't bring anything new or useful to the table for the study of that object. (It can help spot patterns between objects though).

I say this to any aspiring Haskell programmers: don't feel like understanding this statement will make you a better programmer. The best way to get better at using monads is to just use monads more, and play around with them. Tracking down all the definitions in the statement is a neat exercise, though.


Agreed. More often than not, it's just a layer of jargon that obscures the topic at hand rather than clarifying it. Worse, I see a lot of folks using it to belittle and intimidate -- recently I met an instructor at a startup-focused programming school who put his (very novice) students through a lecture which began, "Is a burrito a monoid?" and went downhill from there -- he was actually bragging about how lost his students were. And he told me about all this under the presumption that I didn't know category theory (easy mistake, I guess, I'm a woman after all) and my biggest regret is that I didn't ask him to explain Poincare duality to me. If only he knew that my paralysis was a result of biting my tongue at his pedagogical trashfire...


I doubt your being female had anything to do with it.


> startup-focused programming school

Curious, what are these? Do they place you into a startup after you are done?

Surprised that they are teaching category theory, if that is the case.


That's just the thing. Category theory isn't really a part of their standard curriculum -- it's something that a lone instructor decided to put into his syllabus.

My one-line description of the business doesn't really do it justice -- it's a programming school with a focus on web & app development, and they have fairly good (dev-)community engagement. They also teach some business skills along the way, and students often end up at startups, or founding their own. Also many of their instructors currently work at startups.


That guy is an imbecile and your life is better for not having engaged with him.


> Poincare duality to me

Next time, don't pass up the opportunity.


I'd go a step further. If you're trying to do functional programming in scala/js/ruby/php you don't need to care about any of those words (functor/monoid/monad/category).

You can write highly reusable, generalized, clean functional code without knowing a darn thing about the theory behind it. Moreover, if you do learn these words you probably shouldn't use them around your peers, because they'll serve to socially alienate you and confuse them.


What the hey, I'll pile on more --

Category theory type stuff is a very useful thing to know when designing programs, because it is useful to know things like, "Well, if I give this data structure these properties, then it'll be well-behaved when I map over it."
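To make that concrete, here's a rough sketch of two such "well-behaved map" properties (the functor laws), written out for plain lists; any lawful Functor satisfies the same equations:

    -- Mapping the identity changes nothing, and mapping a composition
    -- is the same as composing two maps; lists satisfy both.
    functorLawsHold :: Bool
    functorLawsHold =
      fmap id [1, 2, 3] == [1, 2, 3 :: Int]
        && fmap ((+ 1) . (* 2)) [1, 2, 3]
             == (fmap (+ 1) . fmap (* 2)) [1, 2, 3 :: Int]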

But you do not need to learn all the technical jargon for the concepts in order to develop a command of them, any more than you need to memorize a natural language's formal grammar rules in order to speak it competently. (Since the grammar needs to be explained using natural language, one could even argue that it isn't really possible to do the one before the other, in a big picture sense.) Also similar to learning a natural language, learning all the technical, formal stuff isn't necessarily all that useful for helping you to achieve practical competence.

To stretch that analogy even further, getting too enamored of the category theory stuff is likely to earn you a certain measure of benevolent disdain from experienced functional programmers, in the same way that getting too enamored of grammar rules is likely to earn you a certain measure of disdain from people who write for a living. It's not that they don't value these things, it's that talking about them too much sends signals that you're more interested in reading the rulebook than playing the game.


In my view, "Functor" and "Monad" are the wrong emphasis, while "map" and "flatMap" are the parts that really matter, and are much easier to teach in comparison.

Most programmers are familiar with "map" today, and "flatMap" is not much of a stretch. But once you understand these two things, you're 90% of the way to understanding monads and why they're useful.
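A tiny Haskell sketch of that 90%, using lists (where flatMap is spelled >>=, or concatMap with its arguments flipped):

    -- map: apply a function inside the structure
    doubled :: [Int]
    doubled = map (* 2) [1, 2, 3]                    -- [2,4,6]

    -- flatMap: each element produces a list, and the results are flattened
    neighbours :: [Int]
    neighbours = [1, 2, 3] >>= \x -> [x - 1, x + 1]  -- [0,2,1,3,2,4]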


Very well said. The practical reality is that the “category theory” used in functional programming paradigms (like Haskell’s) has only a passing resemblance to formal category theory in mathematics. It’s a nice idea to formalize some concepts in functional programming algebraically, but learning formal category theory won’t realistically improve your programming ability. There is some conceptual overlap, but they’re just very different things.


On the one hand I know what you mean: I know practically no category theory (basically I know enough to expect "co" as a prefix to mean "everything the other way around"), but I can still produce good Haskell code.

But on the other hand, I look at some of the stuff Edward Kmett has produced, particularly his lens library which I use all the time, and it's very obvious that the design has been informed by a deep knowledge of category theory.


As a contrasting point of view, the excellent graduate-level introduction to algebra, Algebra: Chapter 0 by Paolo Aluffi, introduces category theory right from the start, and it's a very good book. It just uses the concepts of a diagram and a universal construction, which have a high power-to-weight ratio.

I agree that the danker category theory is more useful for research than for teaching. In particular when you want to try to import definitions and theorems from one context to another, or you want to decompose a proof into its abstract skeleton and concrete particularities.

As an example of how "A monad is a monoid... etc." can be useful for the abstraction (i.e. library) builder, if you import this definition of a monad into the bicategory of endoprofunctors, you get arrows [1].

[1] http://www.riec.tohoku.ac.jp/~asada/papers/arrStrMnd.pdf


I think this is a good point. I find the language of category theory an invaluable tool in my own understanding of abstract algebra, and universal properties are a large part of that. I think that a lot of undergraduate mathematicians could benefit from seeing the universal properties of a product/coproduct, and reflecting on what this means in categories that they're comfortable working in. Universal properties are usually quite useful.

My specific gripe is with expecting that a statement like "a Lie group is a group object in the category of manifolds" is going to lead to any new theorems in the theory of Lie groups. It's just not, and you are not going to become any better at Lie groups by understanding that statement. It's good to understand at some point, because it unifies the definition of group / Lie group / algebraic group, etc, but it will really not give you anything new to work with.


> It's good to understand at some point, because it unifies the definition of group / Lie group / algebraic group, etc, but it will really not give you anything new to work with.

I think that's true at an elementary level, but on the other hand when one doesn't know what concepts they're looking for in a new area, category theory can be a machete cutting a swathe through the jungle. I think Grothendieck was right :)


> "a Lie group is a group object in the category of manifolds"

The point is of course to motivate the definition of a Lie group. Whether such strong motivation counts as "a better understanding of Lie groups" is largely a matter of semantics that there's some room to disagree over. Same goes for the categorical definition of a ring that's been mentioned at the start of this thread, or indeed the definition that OP talks about.


I love Aluffi's Chapter 0 and mentioned it in another comment. But I don't think it's appropriate as a first pass on abstract algebra, even for pure math majors. Maybe for an Honors Intro to Abstract Algebra course...otherwise, I think Aluffi is pretty clear in that while he thinks category theory should be covered early on, it's really a text for graduates and not undergrads. Abstract algebra should be covered before you get to grad school.


What is a good textbook in abstract algebra for the beginner?


Somewhat less of a "textbook", but good at showing applications of category theory - Fong, Spivak: Seven Sketches in Compositionality https://arxiv.org/abs/1803.05316

(Notably, this includes application to a "graphical" treatment of fairly-elementary linear algebra, which a different HN user is mentioning even as I write this, as a part of math where CT cannot possibly be useful and will only ever confuse students!)


It depends what "beginner" means -- what's your level of sophistication? Are you a budding mathematician or just an interested programmer?


Assume I know 0 math and want to learn mathematics for mathematics sake. What is the most gentle intro to algebra book?


True Beauty of Math, Volumes I and II. These build up math from the ground up starting with sets and assume you have zero knowledge of math. What's cool is that you don't use any theorems, structures, or notation that you haven't proven or introduced previously in one of the books. I've found them to be very valuable, both in understanding the topics that they cover and in giving an intuitive understanding of math as a process (which all the classes I took in college tended to gloss over).

The first volume is roughly about set theory and the second volume is roughly about abstract algebra.

https://truebeautyofmath.com/

https://www.amazon.com/True-Beauty-Math-Foundations/dp/15120...

https://www.amazon.com/True-Beauty-Math-Giving-Life/dp/15448...


Bought the first book. Thanks for the recommendation!


If you know 0 math I would start with basic arithmetic (+, -, *, /) before doing anything else.


My undergraduate Abstract Algebra course used Gallian’s “Contemporary Abstract Algebra.” Unfortunately if you are truly looking for an introduction assuming no mathematics, you will need to learn high school algebra and a bit about sets and proofs first.


Pinter's A Book of Abstract Algebra.


A good recent release is Bartosz Milewski's Category Theory for Programmers.

https://bartoszmilewski.com/2014/10/28/category-theory-for-p...


In the order of increasing complexity and/or volume: Pinter, Fraleigh, Rotman, Dummit & Foote.


I like Algebra by Mac Lane and Birkhoff, if by beginner you mean undergraduate.


On the flipside, knowing that a ring is "like a group (which we already covered), with another operation that follows these axioms" is useful, and still mostly rigorous.

It's better than "eh just use it a lot and you'll figure it out."
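As a rough illustration (a hypothetical class, just for intuition), "a group with another operation that follows these axioms" might be sketched in Haskell as:

    -- An abelian group under add/zero/neg, plus an associative mul with
    -- unit one that distributes over add. The laws live in comments only.
    class Ring r where
      zero :: r
      add  :: r -> r -> r
      neg  :: r -> r
      one  :: r
      mul  :: r -> r -> r

    instance Ring Integer where
      zero = 0; add = (+); neg = negate; one = 1; mul = (*)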


Of course I'm not recommending learning ring theory without learning the (usual) definition of a ring. This is absolutely rigorous and useful, and immediately after you learn it you can start playing around with rings, and start proving some interesting theorems.

The analogue of this approach in Haskell is probably something like sections 1 and 2 of "Tackling the awkward squad: monadic input/output, concurrency, exceptions, and foreign-language calls in Haskell" [1], which is quite an old paper (from around 2000 or something?). It does not even contain the word "monoid", but instead shows what an IO action is, and how functions like >> and >>= can be used to sequence and thread them. It also provides a lot of context as to why monadic IO exists. After reading this, a new programmer could start playing around with making new IO actions using these operators. After reading about monoid objects, a new programmer is no closer to being able to write a Haskell program.

[1]: https://www.microsoft.com/en-us/research/wp-content/uploads/...
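For anyone who hasn't read it, a minimal sketch of the kind of thing those sections build up to, sequencing IO actions with >> and >>=:

    -- (>>) runs two actions in order; (>>=) feeds the first result onward.
    main :: IO ()
    main =
      putStrLn "What is your name?"
        >> getLine
        >>= \name -> putStrLn ("Hello, " ++ name)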


True, but that likeness is normally taught in abstract algebra without requiring category theory. For example this: https://i.redd.it/yqnbigi032i11.png doesn’t require any category theory to understand.


Not my point.

For education, there is a happy medium between providing theoretical context and hand-waving over technicalities and learning by example.

Very nice diagram though.


To give credit where it's due, that diagram is from a /r/math thread: https://www.reddit.com/r/math/comments/99y547/my_digram_of_a...


That's definitely a neat diagram, but the examples (the blue ones) almost require already knowing the definitions.


And "these axioms" are indeed the axioms of a monoid operation (for multiplication), plus some sensible notion of "consistency" with the underlying abelian-group structure. Which is all that the "monoid object" characterization is saying, actually. So, I disagree that the perspective is inherently unintuitive.


I think you may have a different definition of intuitive than the top level comment, or else you two are speaking to different things. Category theory is certainly "elegant" and provides a lot of insight in the connection between various algebraic structures. But I think it would be completely asinine to use category theoretic formalisms in a first course covering algebraic structures like groups, rings and fields. That level of abstraction doesn't make the distinctions between each structure more intuitive; if anything it obfuscates them and requires setting up far more machinery to explain what they even are. It would be highly inappropriate for an undergrad's first exposure.

That is my impression of the top level comment's point, and I largely agree with it. If you understand what a group is, I can explain what a ring is to you in a very straightforward way that still shows you the connections between the two things. But setting up category theoretic language would take up a nontrivial portion of the course itself, and at the end you'd have a much less specific understanding of the relevant theorems on rings and groups. You'd be missing all the trees for the forest.


Look, it's 2019. "Setting up category-theoretic language" is something that should begin in the first lesson of the first course in formal, non-K12 math, starting with the use of ETCS or a comparable "structural" set theory as the 'naïve' foundation of choice. It provides countless advantages not just in abstract algebra, but in a wide variety of other subfields of math as well.


To be clear, how much category theory are you envisioning setting up in a first course on abstract algebra? Are you basically just talking about a modern treatment of morphisms in the abstract sense, or do you also mean discussing things like functors?

My perspective on this is basically informed by the following: I like category theory a lot. I've chatted with Baez about this on a few occasions and insofar as I need to use algebra, I like the category theoretic formalisms that come along with it. I think if you had students with a lot of mathematical maturity but no prior exposure to abstract algebra (except maybe a rigorous treatment of linear algebra), you could accomplish what you're proposing.

To your point, one of my favorite math textbooks is Aluffi's Chapter 0. You're probably familiar with it, but if not: it builds up abstract algebra in a rigorous and modern fashion alongside category theory. However, there are a few caveats here:

1. Aluffi explicitly states it's a textbook for upper level undergrads and preferably graduates. In my experience, when authors say this they actually mean it's appropriate as a year one graduate course. That's not the time to learn abstract algebra for the first time!

2. Aluffi does a great job of covering things like morphisms and categories, but the size of the book is massive. He doesn't have any nontrivial coverage of deeper category theoretic concepts like functors until much later in the book; it's probably far enough in that you couldn't reasonably cover it in a single semester course.

3. I really don't think most math majors would benefit from it. Those who are applied math majors will get questionable benefit from a slower, more comprehensive approach to algebra than a faster approach that lets them get to applications. Moreover, not all pure math majors have the mathematical maturity to approach category theory before grad school. Proofs in traditional abstract algebra are a lot less abstract than category theory, and it's easier to motivate them even if they aren't ultimately as elegant.


The problem with "a faster approach that lets [you] get to applications" is that it will depend a lot on what applications you have in mind. And often it's not even faster in any real sense - there's a whole lot of pointless duplication involved in tailoring things to a lower level of abstraction. It may be better to begin with a more effective explanation of what sorts of "mathematical maturity" we're actually seeking here, so that attaining it is easier in the first place for the average student. For all their supposed "unintuitiveness", category-theory-based explanations do this quite well, in a way that I haven't really seen elsewhere - I mean, this very thread is one where we're discussing software developers learning category theory and benefiting from it, so why couldn't this be also a part of mathematical training?


I strongly disagree. From experience teaching category theory to undergraduates, it is totally obtuse before they have a large pool of examples on which to draw. In a very first course, students are still struggling to learn concepts from linear algebra, which will be the bread and butter of the rest of their mathematics. In a second course, they are struggling to learn why the definition of "linear map" is different from the definition of "matrix" they learned in their first course. You could say the same thing about metric spaces or topology.

Perhaps the third course is a nice place to start introducing some category theory, since they will now have actual examples on which to draw, and learn about the unifying concepts of "morphism", "categorical product", and so on. They might even be ready for some interesting functors, such as the fundamental group of a topological space.

If you were to teach category theory as a very first course, what would you teach? What examples of categories do you have? What examples of functors?


The consistency with the underlying abelian group could be restated as "multiplication is bilinear, and has a unit", which is the correct perspective to take if you want to "upgrade" from the usual definition to the "monoid object in the category of abelian groups" definition.

But understanding bilinearity and its connection to tensor products is the key part of changing between those definitions, and also the most useful step to understand. The final description is pretty uninteresting in contrast with universal mapping properties of a tensor product.


Yup, and I think it's actually a completely rigorous definition – all rings are commutative groups under one operation (often referred to as "addition"), plus another operation ("multiplication"), plus rules for how they interact.


The point of the category-theoretical definition is to explain how these "rules for how they interact" might arise, and in what sense these might be understood as 'minimal' rules. These are not "equivalent" definitions in any real sense, and to think of them as such is arguably missing the whole point of how category theory can be useful here.


> knowing that a ring is “like a group...” is useful

Turns out, not so much, though. Ring theory bears surprisingly little resemblance to group theory, to the point that these concepts seem to have nothing to do with each other...


Except that a ring is literally a group under addition?

Do you seriously mean to suggest there's no pedagogical value in using groups to teach rings?


There's actually a big debate among math educators about whether abstract algebra courses should follow a groups-first or a rings-first approach, so it is totally possible to teach rings without appealing to groups. (I'd still argue that both approaches have their advantages and disadvantages and that there can be value in using groups to teach rings, though; I just wanted to say it's not as obvious as you might think.)


I could be wrong here, but my read of that comment is that they're talking about the structural differences between rings and groups at the research level. Which I guess makes sense, since you're in very different territories if you're talking about something like representation theory of finite groups...

Otherwise I agree with you, there's a lot of pedagogical value in combining coverage of algebraic structures.


Yes obviously you need to know what a group is to even begin ring theory but ring theory looks and feels nothing like group theory.


The fact that "A monad is a monoid in a category of endofunctors" explains everything while clarifying nothing is exactly why Haskellers repeat that phrase -- as a joke.


Having studied abstract algebra before using FP, I must say that I instantly made this connection when I came into contact with monads, and that it helped me a lot to understand them.

So I wouldn't be so quick to dismiss the idea of associating these concepts. After all, category theory is all about making that kind of association.


This is a good observation. Though, most programmers don't study, and don't have to study, enough abstract algebra before being exposed to promises, optionals, and even lists (which all exhibit some form of monadic behavior).

What worked best for me was playing with code, reading the definitions in code, and then looking up some CT to get to the mathematical roots and see where they fit in the intuitive picture. (Then you can continue with arrows, monad transformers, etc.)

Since my primary purpose for learning monads (and other algebraic constructs) was and is their use in programming, I concentrate on "practical" intuitions, and periodically check if I can go down to the algebraic base of them in a particular case. If I still can, I suppose I know enough math to get by in the particular area. If not, I open a book and clarify my understanding, and maybe glean something new.


If you already know category theory, sure, that's fine. (Or maybe if you already know abstract algebra really well.)

I'm going to go out on a limb, though, and speculate that the average person starting to use FP does not already know category theory.


For me category theory is more concrete. The basic reason is simple: categories provide type information. If all you want to do is grind out some calculations to pass an exam, then yeah, memorize the ring axioms and have at it. Anything more than that, and you need direction, or context, i.e. types, categories. Once you de-categorify, that's when things become mysterious and confusing, and the impressive calculations abound. But it's just because you forgot the concrete thing that you are calculating.

It's all a bit paradoxical, and I'm not explaining it very well. (And I have no clue about the Haskell stuff.)


I fail to see how a category theory perspective is going to help with "grinding out" any interesting theorems about commutative rings, such as:

- Every commutative ring has a maximal ideal.

- The quotient of a commutative ring by a maximal ideal is a field.

- The quotient of a commutative ring by a prime ideal is an integral domain.

- The classification of finitely generated modules over a principal ideal domain.

There are some universal properties which define what it means to be a quotient, which can help you work with them. But for example, the only "categorical" definition of prime ideals that I know is based on that third theorem.

I think it is easy to forget just how much underlying theory you need to know before you can check that the categorical analogues of definitions are even the same as the original ones.


>and means you have to define things like the tensor product of abelian groups properly before you can even define a ring

Hmm, someone managed to do it at Wikipedia:

"In mathematics, a ring is one of the fundamental algebraic structures used in abstract algebra. It consists of a set equipped with two binary operations that generalize the arithmetic operations of addition and multiplication. Through this generalization, theorems from arithmetic are extended to non-numerical objects such as polynomials, series, matrices and functions."[0]

[0] https://en.wikipedia.org/wiki/Ring_%28mathematics%29


You missed the point of the parent comment.

They were saying that the definition you found, on wikipedia, is better than the definition of "a monoid object in the category of abelian groups" because the definition on wikipedia doesn't require knowing about tensor products etc.

You're proving their point. Wikipedia doesn't define it with that language for exactly the reasons the comment pointed out.


Yes, exactly. That's not category theoretic language, which is why that definition is so approachable. There are far fewer prerequisites to understanding what's going on in that definition.

That's precisely what the parent commenter is getting at.


Seems like all language would be "category theoretic".


No, not at all. Here are two equivalent definitions of a ring:

1. A monoid object in Ab, the category of abelian groups.

2. A set R equipped with addition and multiplication operations satisfying a short list of axioms.

The first definition is category theoretic, and requires you to know what abelian groups are and what it means for something to be 1) a category, 2) a monoid, and 3) a monoid in the category of abelian groups.

The second definition is straightforward if you can follow a few axioms and know naive set theory. It is helpful but unnecessary to understand that a ring is an abelian group which also supports multiplication in order to get the second definition. But even if you know this about rings, you'll still need to understand all the heavy lifting behind what the category of abelian groups actually means.

I'm not saying it's not useful. But I am saying one is clearly more accessible than the other, with fewer prerequisites.


All those words are in various categories.


All mathematics is abstraction.

Numbers are abstractions.

Variables are higher-level abstractions.

You can have properties that you abstract and then see what you have. Banach spaces. Metric spaces. Three different ways to define compactness.

I liked to teach calculus by teaching the limit of a sequence first and then defining limits of functions by considering all sequences x_n that converge to x, and then checking whether the y_n converge. Building up definitions from simpler definitions, instead of the STUPID handwavy definition they use in Calculus 1, which can't even handle limits of functions like cos(1/x) or answer questions about limits along certain subsequences of x_n that approach 0.
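Spelled out, the sequential definition being described is roughly:

    \lim_{x \to a} f(x) = L
      \iff
    \text{for every sequence } (x_n) \text{ with } x_n \neq a \text{ and } x_n \to a,
    \text{ we have } f(x_n) \to L.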

I started a course about mathematics this way, building up from the very basics:

https://m.youtube.com/channel/UCuge8p-oYsKSU0rDMy7jJlA


My rule of thumb for explaining things to people is "start specific and generalize."

This goes the other direction. Which is sometimes useful, but can be hard to follow unless every step is meticulously explained and its necessity is justified.


I think this is very personal and you shouldn't assume this for everyone. I personally find general -> specific MUCH better; in fact this is how I learn mathematics. I first learn groups in the abstract, then I study dihedral groups, symmetry groups, etc. If done the other way around, I'm always confused as to what the rules of the game are.


Same. I generally prefer to see the forest first, then understand how all the different trees fit into it. Similarly when looking at a map, I very quickly want to zoom out and see where in the bigger scheme of things is the small thing I’m looking at situated.

Same with more complex concepts in math or CS - it helps me understand and retain them better if I know where they fit into a bigger scheme. Category theory is particularly interesting in this regard, as it serves to connect different branches of math into a mesh of Categories and Functors describing their interrelationships.


Learning groups in the abstract, followed quickly by many examples of groups, is in my own opinion the best way to learn about groups. But this isn't quite general -> specific, this is general -> example, example, example.

I think this is where a lot of the Haskell explanations fall down, since they start with (definition of functor) -> example, example, example; then go into (definition of natural transformation) -> (no examples), and the problems multiply from there.


Most people learn inductively, not deductively.

They know how to map an object into an array or an array into another array. Many even know how to flatMap promises that resolve to promises.

But they don't know that they follow general rules that make their behavior predictable.


On another note, I suspect that the main problem everyone faces when explaining monads to someone who is not familiar with the term is "caused" by the dominance of C++-like programming languages.

Monads and functors are not really all that complicated per se, but because C++ templates and Java-style generics are not able to operate at their level of abstraction, people that are only used to thinking in these terms fail to map them into their "comfort zone" and usually give up. And instead of pointing out that monads are on a higher abstraction level, you get statements like "monads are like burritos"[1] which only confuse people further.

The explanation that finally made sense to me was when someone didn't just explain things in the abstract and then link them to the next ridiculous non-programming thing, but instead laid out the basic operations and then pointed out that options, lists and asynchronous tasks are monads because they support those operations. I was basically in the exact situation you describe.

Of course, that people can't agree on a single terminology to save their lives doesn't help either (return vs. lift, bind vs. flatMap vs. SelectMany). From the outside it's like one of those math papers where you spend a day figuring out which non-standard notation they used in order to understand the equations in it.

[1]: https://blog.plover.com/prog/burritos.html
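For reference, a hedged sketch of those basic operations in Haskell's spelling (the naming correspondences in the comments are my rough mapping, not an official one), plus one definition that works across options, lists, and IO-style tasks alike:

    -- Haskell's names for the two core operations; other ecosystems call
    -- return "pure"/"lift", and (>>=) "bind"/"flatMap"/"SelectMany":
    --
    --   return :: Monad m => a -> m a
    --   (>>=)  :: Monad m => m a -> (a -> m b) -> m b
    --
    -- One definition, usable with any of them:
    pairUp :: Monad m => m a -> m b -> m (a, b)
    pairUp ma mb = ma >>= \a -> mb >>= \b -> return (a, b)
    -- pairUp (Just 1) (Just 'x') == Just (1,'x')
    -- pairUp [1,2] "ab"          == [(1,'a'),(1,'b'),(2,'a'),(2,'b')]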


> because C++ templates and Java-style generics are not able to operate at their level of abstraction, people that are only used to thinking in these terms fail to map them into their "comfort zone" and usually give up.

That's what "patterns" are for, in C++/Java and any low-level languages that don't provide the level of abstraction you're after. Functors and monads are patterns in these languages, just like subroutines are a pattern in assembly language but not in C, and "objects" or "closures" are patterns in C but not in C++/Java. Monoids are usually called the "Composite pattern", hence saying that monads are monoid objects is just stating that they share analogies with the Composite pattern at a higher level of abstraction. So what's the problem?


> That's what "patterns" are for, in C++/Java and any low-level languages that don't provide the level of abstraction you're after.

Keen observation. If you read through most explications for monads that float around the internet, though, you will find that design patterns are usually introduced to the reader with much more didactic care. Texts explaining, say, the Command Pattern will usually try to lead the "average" (or even junior) Developer to understanding by providing motivation ("let's write an Undo feature!"), introducing new concepts in the context of some simple, but usually tangible scenario ("let's say we have this editor…") and explain things in terms of the stuff the reader can be assumed to know already.

Even if you look for explanations of monads aimed at non-FP developers, usually they start with something like "For some monad `M`, …", followed by either mathematical expressions or some very abstract function signatures in Haskell (which a majority of readers has probably never seen before). At that point, you already have lost 90% of the audience.

I have yet to try to explain the concept to an unsuspecting colleague, but if I were to try, my approach would be to start with familiar things. For a C# dev, this could be LINQ, for example. In this case, you could even get people pretty far, since the inline query syntax is pretty much just a do notation that was bludgeoned with an SQL hammer. If you then point out how other things than IEnumerables and IQueryables (IObservables or Tasks, for example) work in similar ways and could be treated the same way, I suspect you should be able to nudge the average C# dev along far enough to give them a basic understanding of the concept and some of the upsides without instantly resorting to single letter type signatures.

I'll really have to try that sometime.
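To make the do-notation point concrete on the Haskell side (the LINQ half would be the familiar from ... from ... select query), here's a sketch of the sugar and its desugaring:

    -- do notation is sugar over (>>=), much as LINQ query syntax is sugar
    -- over SelectMany/Select. These two definitions are equivalent:
    sumsDo :: [Int]
    sumsDo = do
      x <- [1, 2, 3]
      y <- [10, 20]
      return (x + y)

    sumsDesugared :: [Int]
    sumsDesugared = [1, 2, 3] >>= \x -> [10, 20] >>= \y -> return (x + y)
    -- both are [11,21,12,22,13,23]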


You aren't wrong.

But some patterns are easier to understand than others.

Most are created through experience gathered while programming and are rather practical solutions to frequent problems.

Mathematical concepts are patterns too, but they were often "developed" by people who had nothing to do with day-to-day programming challenges, and they seem rather foreign to the average developer.


Same here.

Someone said "Functors are objects with a map method, monads are objects with a flatMap method" and then explained some different types of objects that have these methods and how they work in those specific cases.

After 2-4 cases I saw the principles and rules behind them.
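A sketch of one such case, Maybe, where "map" is fmap and "flatMap" is >>=:

    import Data.Char (toUpper)

    safeHead :: [a] -> Maybe a
    safeHead []      = Nothing
    safeHead (x : _) = Just x

    -- map: transform the value if one is there
    shouted :: Maybe String
    shouted = fmap (map toUpper) (safeHead ["hi", "there"])  -- Just "HI"

    -- flatMap: chain another step that may itself fail
    firstChar :: Maybe Char
    firstChar = safeHead ["hi", "there"] >>= safeHead        -- Just 'h'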


Related tangent, in hopes it might be helpful to others:

my mnemonic for distinguishing "inductive" vs "deductive" is "PIG":

Particular -> Inductive -> General

See also "infer" vs "deduce".


I wrote The Monad Challenges with roughly this idea in mind.

http://mightybyte.github.io/monad-challenges/

The idea was to guide people through various important particulars of monads. It still needs some work, but I think the basic approach is promising.


(The problem, in fact, is that the significance of monads in Haskell in terms of the side-effect model has nothing to do with category theory. The "side-effects" quite literally occur outside of these theoretical constructs.)


The laws are important though. They ensure that the "statement-like" things we build out of expressions do behave like statements.

For example, it would be weird if we had three statements A B C that wrote to console, and sequencing A with B and then sequencing the result with C resulted in a different program than sequencing B with C and then sequencing A with the result.

Or, for non-IO things: it would be weird if we had three levels of nested lists, and concatenating two times "from the outside" resulted in a different list than concatenating each inner list first, and then the outer list.
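Written as code, the list version of that law (a hedged sketch) is just "concatenate outside-in vs inside-out":

    -- Associativity of (>>=):  (m >>= f) >>= g  ==  m >>= (\x -> f x >>= g)
    -- For nested lists this specializes to:
    nested :: [[[Int]]]
    nested = [[[1], [2, 3]], [[4]]]

    outsideFirst, insideFirst :: [Int]
    outsideFirst = concat (concat nested)      -- [1,2,3,4]
    insideFirst  = concat (map concat nested)  -- [1,2,3,4]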


On the contrary, the significance has everything to do with the category-theoretical pattern, albeit the reasoning for that is not immediately obvious and has to do with a further category-theoretical tool known as the Kleisli category. A monad with no enforced "outside" structure is known as a free monad, and is basically the equivalent of what most people outside FP call the Command pattern (or rather, a well-behaved version thereof). And "Commands" have everything to do with specifying side effects!


The historical context helps here. Monads came to Haskell to deal with IO. Once there, people realised that a whole host of things are monads under the hood, most of which are pure. The word "effectful", somewhat confusingly, is often used for applicative/monadic code that is in fact pure under the hood. A good example is the State monad which does not mutate state in any way, it's completely pure computation that happens to model state.
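A minimal hand-rolled sketch of that point (written out here rather than importing Control.Monad.State, so it's clear there is no mutation anywhere, just a function s -> (a, s) being threaded):

    -- "Effectful"-looking code that is a pure value under the hood.
    newtype State s a = State { runState :: s -> (a, s) }

    instance Functor (State s) where
      fmap f (State g) = State $ \s -> let (a, s') = g s in (f a, s')

    instance Applicative (State s) where
      pure a = State $ \s -> (a, s)
      State mf <*> State ma = State $ \s ->
        let (f, s')  = mf s
            (a, s'') = ma s'
        in (f a, s'')

    instance Monad (State s) where
      State ma >>= k = State $ \s -> let (a, s') = ma s in runState (k a) s'

    -- "reads and bumps a counter", yet everything is pure:
    tick :: State Int Int
    tick = State $ \n -> (n, n + 1)

    twoTicks :: State Int (Int, Int)
    twoTicks = tick >>= \a -> tick >>= \b -> pure (a, b)
    -- runState twoTicks 0 == ((0,1), 2)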


In other words, IO Monad Considered Harmful. https://blog.jle.im/entry/io-monad-considered-harmful.html


Further reading (at a much slower pace): https://bartoszmilewski.com/2014/10/28/category-theory-for-p... Has examples in C++ and Haskell.


His videos on YouTube are also very informative.


I thought I was smart before reading this.


Don't let new jargon and lack of experience with a paradigm influence your notion of self. You are as smart as you were before, and now you see the universe just a mite clearer than before.


Them failing to teach you Category theory in a single reddit post has little bearing on your intelligence.


I'm sure this would be something you could learn for yourself as well. Category theory/logic/universal algebra/etc don't require any magic; they do require you banging your head against them for some time though...


I got it now. It's burritos all the way down.


It's off-putting to me, trying to learn these things, when I'm lost at the very first sentence of the explanation.


This is a great exposition on some of the relationships between the entities of the theory and Haskell, which a lot of papers that exploit CT don't always make clear. Bravo on that score.

I wish some of the category theory driven Haskell tutorials would swap out their "prove the laws" exercises with implementation exercises. While it's great to be able to prove the laws yourself, part of the benefit of using an existing theory as an underlying model is that you get to take advantage of the laws (and the long history of proofs and developments it took to ascertain said laws) to create implementations or gain particular insights. It seems like it'd be a more useful exercise for programmers to implement some programs utilizing the ideas rather than proving the laws yet again; this seems like a more expedient way to develop insights as to how you would actually use these structures.


A similar "joke" in mathematics is that "a sheaf of groups is just a group of sheaves" (meaning that a sheaf of groups on a topological space X is a group object in the category of sheaves of sets on X)


Here's a fun puzzle: concretely, what is a group of groups?


An Abelian group (by an Eckmann-Hilton-style argument: the two group operations have to agree and be commutative).


I suspect that constructing your models with these logical rules and running them and their interactions through a proof checking compiler like Haskell's could help with concurrency, security, large code bases, refactoring and more. And yet what we get are solutions to problems that we don't really have.


What are the steps to learning category theory? Preferably something with solutions so I can check my work?


I don't want to discourage you from learning things. If you think category theory might enrich your life, go for it. That said:

I have a PhD in mathematics in a field that uses quite a few category theoretic notions, so I've had extensive hardcore contact with the theory. After my PhD I converted to being a developer, and have by now quite a bit of experience doing that.

I have not once found category theory useful, beyond the absolute basics of the idea of a universal property, either in my work as a mathematician or as a developer, despite going on many expeditions to find some use for it.

I'm not saying category theory is not useful. I'm just saying you might do just fine without it.

Also, in an attempt to maybe say something useful, "Categories for the working mathematician" by MacLane is a good introduction if you know some maths already.


With some consistent dedicated effort you could maybe go through Category Theory for Programmers https://github.com/hmemcpy/milewski-ctfp-pdf


I would highly recommend the book by Lawvere and Schanuel: "Conceptual Mathematics: A First Introduction to Categories". It is a very down-to-earth introduction to category theory, written by two masters of the field.


I second this recommendation! I agree with the sentiment expressed by some in these comments that intermediate category theory probably isn't going to help you too much as a programmer. However, the introduction to Category Theory in “Conceptual Mathematics” caused me to understand sum types, product types, algebraic data types, finite state automata, logic, and functions as objects in a new way. Additionally, “Conceptual Mathematics” isn't arcane or inaccessible; they mention in the introduction (or maybe it was a review of the book) that a motivated high school student could use the book for self-study.


You shouldn’t bother if your goal is to be a better functional programmer. Haskell’s use of category theory isn’t really the same as the formal category theory in mathematics. Some ideas have some conceptual overlap but that’s about it.

As a more practical point, trying to learn category theory without a good understanding of abstract algebra and linear algebra would be difficult. The material would seem very unmotivated and unintuitive. Math undergrads don’t typically encounter the subject except in an upper level topics course; more usually it’s a grad-level math course which requires a lot of exposure to abstract algebra.


Why do you want to learn category theory?


Basically what's going on:

with monoids, you can combine things, and then monads result from choosing a clever combining operation in certain contexts


That makes them sound like union types to me, but I'm guessing they're not like union types, otherwise that's what they would be called.


Union/sum types don't combine things though? You can only ever have one variant at a time, so it's a misnomer. Monads and monoids combine things in the sense of joining them, e.g. lists are a monoid because you can concatenate (they're also a monad for the same reason).
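A tiny sketch of the list case:

    -- Lists combine as a monoid: concatenation, with [] as the unit.
    combined :: [Int]
    combined = [1, 2] <> [3] <> mempty   -- [1,2,3]

    -- And as a monad, the "joining" is flattening nested lists:
    flattened :: [Int]
    flattened = [[1, 2], [3]] >>= id     -- [1,2,3]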


Funny, I was recently looking at a very smart algebra (Guibas and Stolfi) that made it really easy to express transformations I absolutely did not care about, but not so easy to express what I wanted.


I notice that's the same Stolfi who runs the r/buttcoin subreddit; as a cryptocurrency guy I've had arguments with him before, LOL.


It is.



