For those of us who don't know what a Y-Combinator is or why it's useful, I found a SO answer [1] that makes sense to me:
A Y-combinator is a "functional" (a function that
operates on other functions) that enables recursion,
when you can't refer to the function from within
itself. In computer-science theory, it generalizes
recursion, abstracting its implementation, and
thereby separating it from the actual work of the
function in question. The benefit of not needing a
compile-time name for the recursive function is
sort of a bonus. =)
I've used something very similar in practice. Haskell's fixed-point operator is:
fix f = let x = f x in x
This allows you to turn a non-recursive function into a recursive one; it factors the recursion out. It turns out that you can write a similar combinator (usually called mfix or memofix, as I recall) that will result in a memoized version of the same recursive function.
So it allows you to abstract away the memoization, write it once and then use it on any recursive function. That way you can write quite natural recursive functions and just memoize them in one wrapper call.
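For illustration, a rough sketch of the same idea in JavaScript (the article's language); fix, memofix and fibDef are my own names here, not a library:

// An ordinary fixed point: the recursion is factored out of the definition.
const fix = f => x => f(fix(f))(x);

// The same shape, but results are cached, so any definition written against it
// gets memoized "for free".
const memofix = f => {
  const cache = new Map();
  const go = x => {
    if (!cache.has(x)) cache.set(x, f(go)(x));
    return cache.get(x);
  };
  return go;
};

// The recursive definition is written once, without referring to itself by name:
const fibDef = recur => n => (n < 2 ? n : recur(n - 1) + recur(n - 2));

const fib = fix(fibDef);         // exponential time
const memoFib = memofix(fibDef); // linear time, same definition
memoFib(40);                     // 102334155

(This fix cheats by naming itself, just as Haskell's let-binding does; the point is only that the memoization lives in one wrapper.)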
I used this in competitive programming for a while; it was nice for certain types of dynamic programming problems where the interesting part is the recursive definition, so once I had found that, there wasn't anything more to write. Quite satisfying when it applied.
Functional programming: Making you feel clever by allowing you to solve problems that nobody else even knew existed, in order to let you do what everyone else could do from the start.
I’m not sure about it actually being used in compilers. However, as I understand it, the simply typed lambda calculus does not support recursive functions. If you want a typed functional language, you have to introduce a recursion combinator like fix or the Y combinator as a primitive.
Useful when you have a lambda that needs to call itself, but the language doesn't bind the name until after the lambda is parsed (so you can't call by name). This happens in a few languages, I think; C++ and C# for sure.
I’ve used it for recursion in anonymous functions in C#. It is straightforward enough to implement. Was it the best choice? Probably not, but I took the opportunity when it presented itself.
I’ve used it in a semi-professional way (internal tooling in Clojure). I’ve also written about it here (Practical applications of the ycombinator in clojure) [1].
There was a good practical Ruby example for auto-vivification of hashes using the applicative Y combinator many years back, but I'm having trouble finding it.
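Something in this spirit, sketched in JavaScript with a Proxy rather than Ruby's Hash (autoviv and Y are my own names, and this is only a guess at the flavour of that example, not the original):

// An applicative (strict) fixed point for a nullary function.
const Y = f => (x => x(x))(x => f(() => x(x)()));

// Each missing key springs into existence as another auto-vivifying "hash".
const autoviv = Y(self => () =>
  new Proxy({}, {
    get: (target, key) =>
      key in target ? target[key] : (target[key] = self()),
  }));

const h = autoviv();
h.a.b.c = 1; // intermediate levels are created on demand
h.a.b.c;     // 1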
Nope, but I felt good when I finally managed to write one from scratch, because I knew for sure there was a time when I couldn't. So maybe the same would happen for you too.
This SO answer is posted every time this topic comes up, so I've seen it many times (and, in fact, I provided one of the answers there, too) and it's still the one and only place I've ever seen "functional" used as a noun like that. I'm always tempted to edit it, but maybe I'm missing something?
I slightly wish the exponentiation function from the original article weren't the more complex "faster" version -- it's not really relevant to the article, and it makes playing through the calls in your head just a little bit harder with a topic that already stretches your mind the first several times around. Regular exponentiation is a fine example without the added fun of working out why and how exactly this alternative form works.
That said, as always raganwald's stuff is a pleasant and informative read.
I disagree. If you are writing shippable code in an article, it's because the article is about how to write shippable code. This article is not about that, so writing shippable code risks detracting from the main purpose of the article.
If you like esoteric, I learned this algorithm for raising a 2x2 matrix to a power, which amusingly enough is part of an interesting Fibonacci algorithm.
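Presumably that's exponentiation by squaring applied to the matrix [[1, 1], [1, 0]]; a small JavaScript sketch of the idea (my own code, not from the article):

// [[1,1],[1,0]]^n = [[F(n+1), F(n)], [F(n), F(n-1)]], so raising the matrix to
// the nth power by repeated squaring yields Fibonacci in O(log n) multiplications.
const mul = ([a, b, c, d], [e, f, g, h]) =>
  [a * e + b * g, a * f + b * h, c * e + d * g, c * f + d * h];

const matpow = (m, n) =>
  n === 1 ? m
  : n % 2 === 0 ? matpow(mul(m, m), n / 2)
  : mul(m, matpow(m, n - 1));

const fib = n => (n === 0 ? 0 : matpow([1, 1, 1, 0], n)[1]);
fib(10); // 55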
It's nice to see an article like this every once in a while to remind myself that there is so much about programming still to learn. I particularly liked the JavaScript examples so I could play with them in my scratch pad. I am still not sure I could describe what a Y Combinator is and where to use it, but I'm much closer than I was 15 minutes ago.
It's over 30 years since I was taught about the Y combinator and I still find it amazing - it allows you to define recursion in a language which has no concept of recursion or named functions - such as λ-calculus.
The fact that you can then define this in terms of amazingly simple functions like the S and K combinators just adds, in my opinion, to how wonderful it is.
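For anyone who hasn't met them, S and K really are that small; a JavaScript rendering, purely for illustration:

const K = x => y => x;               // K keeps its first argument and discards the second
const S = f => g => x => f(x)(g(x)); // S applies f and g to the same argument, then combines

// Everything else can be built from these two; for example, the identity
// combinator is S K K:
const I = S(K)(K);
I(42); // 42, because S(K)(K)(x) = K(x)(K(x)) = x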
Of course not - I haven't practically applied Turing Machines either. You'd have to be mad to implement a functional programming language purely in terms of S and K - which of course is what I did for my final year project and got a 1st and a prize!
What my encounter with the fundamentals of Computer Science did give me (apart from the necessity of implementing things like garbage collectors) was an abiding interest in, and love of, the mathematical foundations of computation, and of maths as something interesting rather than something to be endured.
That interest did lead to me doing postgrad research in a control engineering group, which did lead to me co-founding a startup.
So did I ever apply S & K in a practical circumstance - no. Am I glad I took a turn down the more mathematical side of CS - absolutely.
The Y combinator isn't there out of the box and can't be defined directly, since the self-application in its definition doesn't type-check in Haskell. However, there's a nice solution using a non-monotonic (negatively recursive) data type[1]:
newtype Mu a = Mu (Mu a -> a)
y f = (\h -> h $ Mu h) (\x -> f . (\(Mu g) -> g) x $ x)
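-- Reading it with unMu (Mu g) = g:  y f = h (Mu h)  where  h x = f (unMu x x),
-- so h (Mu h) reduces to f (h (Mu h)), i.e. y f = f (y f), the fixed-point equation.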
It's clear, minimalist, insightful, and even has an in-page live code demo. Bonus points for including fixed-points as a key concept, and memoization as an extremely useful technique.
While I enjoyed the subject matter, I can't help but feel that Y Combinator is to functional programming as AbstractFactoryFactory is to object-oriented programming. My inner geek revels at these abstractions, but my inner engineer and team manager wants something simpler and more robust.
The difference is that the Y combinator has a deep meaning in theoretical CS, making recursion essentially out of thin air in the untyped lambda calculus, which had far-reaching consequences. In contrast, an AbstractFactoryFactory is pretty mundane.
But it's not really meant to be used as a pattern for writing actual code in an actual programming language. It exists as a way to implement recursion when we can't refer to named functions.
Edit: unless you are specifically talking about the use case of this article.
Simpler in understanding and mental overhead. I want code in my projects to be written in a way that is maintainable by developers this business can reasonably hire. Which means that they haven't read SICP or even Knuth, don't have experience with functional languages, and overall are not the 10x developers everybody pretends to have.
I'm old enough to remember when we said the exact same things about OOP, then about Ruby, and now we say them about Elixir, ClojureScript, and so forth.
I very much doubt anybody needs to have SICP at their fingertips, but it seems like many techniques that we deride as being "too far out there" turn out to be quite manageable by all sorts of programmers; it's just a matter of there being enough resources to learn from and a little motivation.
This article definitely does not prescribe hurling combinators at every problem, but I will go on record and say that if there's somebody on your team you think cannot understand this code, perhaps you should be a little more optimistic and give them a chance.
Anybody who can figure out WebPack, Gulp, Babel, closures, promises, async/await, generators, &c. can figure this out.
Oh, they can figure it out if they wanted to, definitely. And in about 10 years, they will. That's not my point.
My point is that today, when developers encounter this code while debugging something that needed to be done yesterday, they will not spend the time educating themselves and will just ignore it or, even worse, assume they understand it. I don't think that I'm better than them; I just acknowledge my own stupidity. And in my stupidity, in the heat of working on a real project, I want to encounter familiar technologies and abstractions that by themselves are almost boring, rather than research new things.
At least when it's outside the scope of our technology focus: the core of the project that actually helps it be unique and earn a profit.
I tested the theory myself. I went to Boston recently and followed my GPS. It took me over the Tobin bridge through solid bumper-to-bumper gridlock. When I arrived at the gridlock my GPS said 30 minutes left. 45 minutes later I had travelled about .5 miles and it still said 25 minutes left. Google was completely wrong, but I swear it had me right where it wanted me the entire time.
After getting fed up with traffic, I eventually turned my GPS off and got into the right lane to head towards Cambridge. My logic was that GPS could navigate from anywhere, to anywhere. I would simply go somewhere with less traffic and try the GPS again.
Sure enough I made it to Edward Land blvd and turned my GPS back on. I had a clear 15 minute drive with average traffic for the rest of my trip.
Google saw the traffic I was dealing with, and queued me right in behind it all. It could have recalculated around it, but it legitimately "thought" I only had 30 minutes to go. I guarantee it told all 500 other cars in front of me the same thing.
So we were all following our GPS's, and our GPS's were taking all of us the fastest route. It created a traffic jam and then optimistically pretended there was no traffic jam.
If street lights were truly "smart", with cameras and machine learning to maximize throughput, and GPS's programmed to load balance their own congestion, this problem wouldn't be a problem anymore.
Given I’ve never had reason to use a Y combinator, mockingbird, eta, or any of the other colourful terms from the article, this post about GPS seems vaguely appropriate.
Which means that in around 14 days I’ll suddenly need to revisit this article and learn all about it because it’s suddenly useful.
Life is strange like that. I do like the term “why bird” though.
It's interesting to see how the Y combinator is derived, but it seems like it's being oversold here?
I like the mockingbird better since it's less magical. Passing "self" as the first argument seems idiomatic from an object-oriented point of view; this is sort of like how Python has an explicit "self" parameter. We can take the next step and pass "self" explicitly as well.
The ability to change how a self-call is handled seems quite like overriding a method.
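A quick JavaScript sketch of that style, purely for illustration (the names are mine):

// The mockingbird: apply a function to itself.
const mockingbird = fn => fn(fn);

// The definition receives "myself" explicitly, Python-self style, and must also
// pass it along explicitly on each recursive call.
const factorial = mockingbird(
  myself => n => (n <= 1 ? 1 : n * myself(myself)(n - 1))
);
factorial(5); // 120

Changing how myself is supplied is exactly the hook that lets you decorate or "override" the self-call.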
AFAIU the problem, from a theoretical point of view, is not that you need to pass the self function, but how to define it. In pure lambda calculus, there are no named functions, just lambdas, so you actually need the Y combinator for recursion.
Nice. Very thorough article, that touches more concepts of functional programming than just Y like tail recursion.
For someone who never had contact with functional programming, the concept of the Y combinator can seem a bit... strange. But trying to figure out what it is and does can be a real eye opener.
This is actually the Z combinator, which is the eta expansion of the Y combinator. The Y combinator only works in languages with non-strict semantics, but JavaScript is call-by-value.
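To make the distinction concrete, a sketch (my own, not from the article) of why the eta expansion matters under call-by-value:

// The classic Y would evaluate x(x) eagerly and recurse forever in JavaScript:
//   const Y = f => (x => f(x(x)))(x => f(x(x)));
// The Z combinator eta-expands x(x) into v => x(x)(v), so the self-application
// is only evaluated when the recursive call actually happens:
const Z = f => (x => f(v => x(x)(v)))(x => f(v => x(x)(v)));

const fact = Z(self => n => (n <= 1 ? 1 : n * self(n - 1)));
fact(5); // 120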
Is there something like the Y combinator for self types in F-bounded OO generic type systems? I just realized it’s the same kind of problem at the type level. E.g.
I followed it entirely with one exception: I can see that `maker(maker)` has the right type, but why is it the right value? How do we know, as we fill in the blanks, that this is the implementation that yields the right result?
I wondered this myself. I mean, I can test it and see that it appears to give the correct result, but how do I prove that it gives the correct result?
I didn't include it, but you can do reductions, successively replacing the names with their expansions, and you see that it works out as we want.
And in Combinatory Logic, there really is nothing else. If an expression has the correct "type," then it works. There's nothing except rearranging terms, duplicating terms, and erasing terms.
But that being said, I don't know if that's a "proof," and I especially don't claim that a proof about Combinatory Logic would say anything at all about JavaScript.
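For a flavour of what such a reduction looks like, here is my own working (in JavaScript comments, using the strict Z form from the earlier comment rather than the article's maker; a sketch, not a formal proof):

// Let A = x => f(v => x(x)(v)), so Z(f) = A(A). Then, step by step:
//   Z(f) = A(A)                 apply the outer function to A
//        = f(v => A(A)(v))      substitute A for x in A's body
//        = f(v => Z(f)(v))      A(A) is just Z(f) again
// So Z(f) is f applied to (a delayed copy of) Z(f): the fixed-point equation
// fix f = f (fix f), which is exactly the behaviour recursion needs.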
So we start out with an attempt to build an anonymous recursive function. That's cool, I suppose. But doing it in a language like JavaScript (or Java)? It turns out that tail recursion == stack overflow. Not daunted, the plucky author decides that we can solve this particular issue by creating a closure over stack variables and putting them on the heap instead!
You know, if you want to use Haskell, use Haskell. Trying to shoehorn this stuff into languages designed for loops instead of recursion just means impressionable JS developers are going to see this and rush out to implement it in some project to prove how smart they are. Then everyone suffers. The implementation is slower because it goes to heap. The memory usage explodes because it goes to heap.
Purely for illustration of a Y combinator, this is great. But there should be a big red warning somewhere on the page as a reminder of the downsides of this approach.
We actually don’t start out with an attempt to build anonymous recursion.
That is the goal of the Y Combinator in the Lambda Calculus and Combinatory Logic, but here the initial goal was to decouple an anonymous function from itself so that we could decorate it.
Going out to main memory is orders of magnitude slower than hitting cache. Even a stack overflow would be preferable to debug compared with an out-of-memory error. At least with the stack overflow, the stack trace pinpoints the problem.
Like I said, I have no problem with this article as an illustration of a Y combinator for JS devs. The problem is it does not warn them sufficiently of the very major drawbacks of using it. This is why we have Gates' law: developers using inefficient constructs because they're neato.
Well, given that the post links to two different discussion forums, I attest that it’s not my responsibility to outline every single consideration of every single technique, nor is it my responsibility to include the entire syllabus of computer science to explain the background.
Instead, I outsource that to people like you. You voice your concerns here, and people can read them and make up their own minds.
Speaking of making up their own minds, please explain in more detail how the trampoline’s memory consumption grows unboundedly, to the point where it could cause an out-of-memory error. That would be most interesting for the community to ponder.
A loop is not just fewer LoC and easier; it is orders of magnitude faster than the Y combinator decoupled trampoline. The first alert is nearly instant. The second one takes almost a minute. My Chrome dev tools are hard locked waiting for the Y combinator to finish.
This is not something I would encourage JS devs to use in practice.
Well, refactoring isEven or exponentiation to a loop is faster. But wait, you know what's even faster? Using built-in math primitives.
I wrote an entire article about refactoring tail-recursion to iteration and linked to it in TFA. I've also written a Scheme interpreter that can perform this optimization for certain functions on-the-fly. But arguing that the code is slow is missing the point by a country mile, and then some. The article is not arguing that the code is fast, or that you should always do this. I find it amazing that every time a programming technique is discussed, people want to worry about whether it belongs in production.
Is there no recreational programming any more? Or is it all about shipping the next app to deliver food by electric scooter? My world would be depressingly boring if the only things I thought about were the things I use at PagerDuty every day.
Now as to what I asked you about:
What you said was that this implementation leaks memory in such a way that the trampoline trades a stack overflow for an out-of-memory error.
I wish to understand that problem. Specifically, how does the decoupled trampoline leak memory?
That incivility ("You get a trophy, smart person.") is the inevitable end of this sort of pedantic flamewar.
Since you've frequently broken the site guidelines and we've already given you numerous warnings and requests to stop, I've banned this account. If you don't want to be banned, you're welcome to email hn@ycombinator.com and give us reason to believe that you'll follow the rules in the future.
> There are plenty of people who will read it, then rush right out to duplicate it. That will bring bugs and chaos all over the javascript world.
Citation needed.
I seriously doubt that there's a horde of developers who--at this very moment--are pushing Y Combinator JS apps into production all thanks to this thought exercise.
I have the commit bit to the repo backing my blog. You do not. If you have some serious suggestions, feel free to submit a pull request.
I'll review your PR and decide whether it matches the tone and style of my blog. If it doesn't, regardless of what you see as its merits, it will be rejected.
That state of affairs is not acceptable to most people, and fortunately, there is an alternative route forward: Instead of trying to lecture me about what to write in my own blog, and instead of lecturing me about what I should consider important to mention, and why, YOU CAN WRITE YOUR OWN BLOG POST REFUTING MY POST.
I am not being sarcastic. Great things have transpired in history when two or more people engage in a conversation via essays.
Words are so vague. I don't even know what "hard locked the dev tools" means!
But if you write your own essay, with your own code illustrating your own ideas. Ah! That clarifies things for me and for everyone else reading.
It has never been easier to publish words. Avail yourself of this opportunity.
---
As for bugs and chaos all over the javascript world...
I have been publishing essays on the Internet for fourteen years. Most have been of a highly impractical nature: I save my practical thoughts for my work, and I scratch my "boy this is interesting" itch in my writing.
In all that time, I haven't broken the Internet. No hordes of people rushed out to implement macros in Ruby when I released Ruby.Rewrite.Ruby.
The JavaScript world was not taken over by decorators when I released Method Combinators. My book has an entire chapter on implementing traits in JavaScript. Where are all the traits?
(tumbleweeds)
I write to provoke thought. I think most people grasp that. You should have a little more faith in people.
Some people find these things interesting, even provocative. They may wonder if such a thing makes sense. But few people rush to implement an idea just because of one essay on Hacker News. Typically, they need to see a preponderance of momentum.
An essay from me. A library from someone else. A major framework baking it in. Only then will people bet their projects on it.
Until then, it's nothing more than one person, me, saying "Here's something interesting," and a bunch of other people saying, "indeed," over their coffee breaks.
Then we all go back to work and the world carries on just fine.
Honestly, with the exception of your statement that my trampolines will cause an out-of-memory error, you are not wrong about performance and the fact that practical applications for greenspunning your own trampolines are rare to the point of statistical non-existence.
I sense that everyone agrees with you, they just don't worry that hordes of Eternal September JavaScript n00bs will start swapping iteration for anonymous recursion powered by recursive combinators.