One thing that stuck with me from a good Clojure book(1) is that using < is greatly preferable to using >. Coming from an OO background and reading left-to-right, this made sense to me: list the args from smallest to largest.
(< 1 2 3) ;; true
is a lot more intuitive to me than
(> 3 2 1) ;; true
even if one or more of those literals is replaced with an argument. It’s especially apparent with the two-arg use of the function. This is one of the few places where I still feel like infix notation has an advantage.
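The variadic form is where this reads best to me. A minimal sketch (the name `in-range?` is just mine, for illustration): with `<`, a bounds check lays the arguments out in the same order as the number line.

```clojure
;; Illustrative sketch: variadic < checks that the arguments are in
;; strictly increasing order, so a range check reads left-to-right.
(defn in-range? [lo x hi]
  (< lo x hi))        ;; true when lo < x < hi

(in-range? 0 5 10)    ;; => true
(in-range? 0 15 10)   ;; => false
```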
This isn’t even a nitpick about the article, just a remark about some cognitive dissonance seeing > used here.
(1) Maybe Elements of Clojure? Not sure that’s it; it was a book referenced on HN which has a whole chapter on naming things.

Both phrasings feel the same to me, without context: subsequent numbers are greater than the previous ones vs. subsequent numbers are less than the previous ones.
What should really matter is how you're "talking" in the code. If you've defined some relationship as y being greater than x, then that's how you should phrase the comparison as well: (> y x). (< x y) would be the same thing logically, but we don't read logically: it requires another mental step to understand that (> y x) and (< x y) are equivalent. It's avoiding those extra steps that gives code a really big readability boost.
The historical preference for < over > seems to be widespread across programming languages; here's a credible-sounding explanation of its origins: https://stackoverflow.com/a/12146965
Regardless of the origins, < is just a more natural operation than >. I kind of mean that both figuratively and literally: the most "natural" numbers are the "natural numbers", which follow 1 < 2 < 3... and not 1 > 2 > 3...
I see what you mean. But if we consider endianness and reading direction to be arbitrary conventions, does it still hold? Maybe it does, since in your latter example the reading direction is inverted in any case, and endianness applies only intra-number.
Yeah, I was just talking about the mathematical less-than operator. How we choose to represent anything computationally or visually is orthogonal to it.
I think this is a fun article, really fun little drawing.
But "imagine an SR latch!" is always funny to me in these kinds of explanations. The base circuitry of electronics has really powerful primitives, but if you're coming from first principles, it's like... where did you get this from? In itself it's a really non-intuitive device, and it only starts making sense in a larger context.
It's of course also what we actually do; it's just not something I think anyone would ever come up with in a vacuum if they were thinking "how do I build computers".
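The non-intuitive part is that the latch's "memory" is just feedback: two cross-coupled NOR gates whose outputs feed each other's inputs. A minimal simulation sketch (the names `nor`, `step`, and `settle` are mine, not from the article) makes the behavior visible by iterating the feedback to a fixed point:

```clojure
;; Illustrative sketch of a NOR-based SR latch:
;;   Q    = NOR(R, Q-bar)
;;   Q-bar = NOR(S, Q)
;; Memory emerges from the feedback loop between the two gates.
(defn nor [a b] (not (or a b)))

(defn step [{:keys [q qbar]} s r]
  {:q (nor r qbar) :qbar (nor s q)})

(defn settle
  "Iterate until the outputs stop changing. Valid inputs converge in a
  few steps; the counter guards against oscillation (e.g. the
  ill-defined case of releasing S=R=1 simultaneously)."
  [state s r]
  (loop [st state, n 0]
    (let [nxt (step st s r)]
      (cond (= nxt st) st
            (> n 10)   nxt
            :else      (recur nxt (inc n))))))

(def set-state (settle {:q false :qbar true} true false))
;; set-state is {:q true :qbar false} — S=1 sets Q

(settle set-state false false)
;; with S=R=0 the latch holds: still {:q true :qbar false}
```

The "hold" case is the point: with both inputs low, the output depends on which input was pulsed last, which is exactly one bit of memory.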
> If you were dropped in a forest…you could create your own computer.
Could you really? I've heard of fun things like water-based logic gates, but even with primitive elements like that it doesn't seem at all easy to come up with anything even mildly useful.
Did you do the entire book in Clojure or just this part? Thinking of forking SICP Distilled and was wondering if there's a reason why it stopped other than the author losing interest.