> Heh... completely wrong, but I suppose it's the best you can expect a non-techie/math nerd readership to get. Heck, it's probably close to the most you can expect the average programmer to get.
How would you explain the Y Combinator to the Hacker News readership?
The Y combinator is a programming construct that lets you write recursive functions that never explicitly call themselves by name (unnamed recursive functions).
The difference is that the whole point of the Y combinator is that it's a fixed-point combinator that doesn't use explicit recursion. Imagine that you didn't have "var", and couldn't name anything (which is the situation in, say, the lambda calculus). Using the Y combinator you could still define recursive functions.
Like others have mentioned it's rather academic, but also pretty cool. It's a way of introducing recursive functions into the untyped lambda calculus. I think the easiest way to see it is to use it in Scheme to define an anonymous recursive function:
You'll notice that the second definition is a procedure that takes a factorial function, performs one level of the recursion, and then calls that factorial function to compute the rest. The Y combinator, defined first, basically allows us to start the recursion. A step-by-step derivation can be found at http://www.ece.uc.edu/~franco/C511/html/Scheme/ycomb.html
The really neat thing about this is that all that was required to create that factorial function was lambda, some arithmetic functions, and if. Arithmetic can be implemented using only lambdas (google church numerals) and so can if. Using these techniques you can create a factorial function using only lambda and function application (which is exactly the lambda calculus).
If you replace "program" in the Forbes article with "function" I don't think they're terribly far off.
I'm a coder/writer at Forbes. If I had to guess, that's exactly what an editor might say. Also, we're always crammed for word space so shorter (even if perhaps less nuanced) always wins.
Wow, hmm.. I worked at MEDITECH a long time ago, and they actually used a programming language, MAGIC, that you couldn't recurse in. It used macro code expansion to translate MAGIC code into a lower-level language, and recursion would make that expansion loop infinitely. Maybe at MEDITECH it wouldn't have been so academic.
Edit: Not sure why this got downvoted, but in their internal programming course they talked about the recursion problem explicitly--I'm not making this up, as crazy as it sounds.
If you don't believe me, here is a link to a blog post with someone else mentioning it in the comments:
That's like saying integers are an optional thing, and it's no big deal if your programming language doesn't have them. True, they aren't necessary (look at Peano arithmetic or the pure lambda calculus) - but they sure make things a lot easier. It's much more concise and intuitive to write some search or divide and conquer algos recursively than imperatively.
Possibly it makes things easier for some algorithms, but as I said, it is optional. It's not rocket science to rewrite any algorithm that relies on recursion to not need it.
The "meat" of it is really that (Y F) = (F (Y F)). In other words Y applied to a function F expands to F applied the original expression, so it builds up nested calls of F.
I can't do it justice, but it's definitely much more interesting and even beautiful than "meh". I guess there are always "practical" vs "academic" arguments to be made but I find stuff like the lambda calculus, Y, Church numerals, all really cool.
Yeah ok, it's sort of interesting as a curiosity. But it's going to be terribly inefficient if used, and doesn't really have any practical application.
I've always been more of a practical "why not just use a loop" kinda guy, so it's probably wasted on me.
If you told me (before I knew about the Y combinator) that there was a function Y such that Y(F) = F(Y(F)) for every other function F, I doubt I would have believed you.
Loops are just a special case of recursion, and only necessary for languages that don't support recursion very well. As an example, Scheme and Haskell have no use for loops.
Except there is no difference at the machine-instruction level between a recursive and an iterative approach. They are isomorphic to each other. That is why you can implement all recursive algorithms using an explicit stack and a loop, and all iterative algorithms as recursive functions.
There's a lot of difference in the unoptimized form of recursion - namely stack usage. Yes, you can hack together tail-call optimization to try and make recursion perform as well as iteration does, but what really is the point?
The concept of recursion is independent of computers. Recursion is just the mathematician's idea of how to get arbitrarily big objects out of limited definitions.
Loops are just a special case, and not really closer to the hardware. Branches and jumps are closer to the hardware. "Yes you can hack together [a compiler] to try and make [structured programming] perform as well as [goto] does, but what really is the point?"