Same feeling here about recursion. I find it really cool that some algorithms can be described concisely and often more simply using recursion than with an iterative solution to the same problem.
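A classic illustration is quicksort, which falls out of the recursive description almost directly. A rough sketch in OCaml (not production code, pivot choice is naive):

    (* Recursive quicksort over a list. *)
    let rec quicksort = function
      | [] -> []                                   (* base case: an empty list is already sorted *)
      | pivot :: rest ->
          let smaller = List.filter (fun x -> x < pivot) rest in
          let larger  = List.filter (fun x -> x >= pivot) rest in
          quicksort smaller @ [pivot] @ quicksort larger

    (* quicksort [3; 1; 2] evaluates to [1; 2; 3] *)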
The same goes for some data structures, a simple example being a list, as often defined in functional programming languages. E.g.:
A list is either:
- empty, i.e. []
- a head (an element) followed by a tail (which is also a list), i.e. h | t
Lists of any size fit that definition, because the tail in the second case is itself a list.
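Spelled out as a type, say in OCaml (the constructor names here are just illustrative), the definition and the functions over it mirror each other:

    type 'a mylist =
      | Empty                      (* the empty list, []                *)
      | Cons of 'a * 'a mylist     (* a head followed by a tail, h | t  *)

    (* Functions over the type follow the same recursive shape. *)
    let rec length = function
      | Empty -> 0
      | Cons (_, tail) -> 1 + length tail

    (* length (Cons (1, Cons (2, Empty))) evaluates to 2 *)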
Similarly for a binary tree, which is either just a leaf node, or a node with a left and a right child, each of which is itself a binary tree.
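The same kind of sketch for that tree definition, again with illustrative names:

    type 'a tree =
      | Leaf of 'a                          (* just a node *)
      | Node of 'a * 'a tree * 'a tree      (* a node with left and right children *)

    let rec size = function
      | Leaf _ -> 1
      | Node (_, left, right) -> 1 + size left + size right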
Check out a book called "Gödel, Escher, Bach". It's an incredible book about recursion in life, and in some sense it makes the argument that consciousness arises fundamentally from recursive loops (a system examining itself).
You can build an entire working structure from base operations, exit conditions, and logic that scales up to larger instances of the problem.
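One way to read that: an exit condition (the base case) plus a recursive step that composes the larger result. A minimal sketch:

    let rec sum = function
      | [] -> 0                       (* exit condition *)
      | x :: rest -> x + sum rest     (* recursive step builds on the smaller problem *)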
Extending the same idea, tail-call optimization (TCO) is another very elegant concept in CS.
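For example, the sum above can be rewritten so the recursive call is the last thing the function does; OCaml's compiler then eliminates the call and the function runs in constant stack space. A sketch, with the helper name just for illustration:

    let sum_tco xs =
      let rec go acc = function
        | [] -> acc
        | x :: rest -> go (acc + x) rest   (* tail call: nothing left to do after it returns *)
      in
      go 0 xs

The accumulator carries the partial result forward, which is exactly what makes the call a tail call.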