Ultimately I concluded that map/filter/reduce really need to be paired with a concise lambda syntax in order to be ergonomic. Writing out a full func() literal doesn't feel much cleaner than just writing a traditional for loop. However, adding a new lambda syntax would be a much larger undertaking than I can manage right now.
The other thing I concluded is that you can highly optimize map/filter/reduce chains if they're first-class citizens of the language, rather than living in a userspace library. I was able to compile such chains into for loops that are more efficient than what a novice might write, and no worse than what an expert would write. The only way to improve would be to inline more aggressively, i.e. inline "x += 2" instead of calling out to "addTwo()" on each iteration of the loop.
I think there's an unfilled niche in programming languages that's basically "Go plus some FP niceties." I love Go's simplicity, tooling, and performance, but writing endless for loops and error-handling blocks gets tiring and can obscure the operation you're actually trying to achieve. At the same time, jumping to full-blown generics, macros, etc. like Rust introduces too much "magic," hurts readability, and bloats compile times.
Rust actually has a lot of Go's spirit in terms of being practical and consistently engineered, but it also embraces FP. The issue is that Go is philosophically tied to concreteness: you solve your specific problem from scratch each time. FP is about raising your problem to a higher level of abstraction and solving it there. In Haskell, this means lifting your domain into a problem of category theory, using everything we know about transformations in category theory to solve the abstract problem, and then lowering the solution back down to earth, so you can take advantage of a general, mathematical solution. So in a sense, Go + FP is like combining two diametrically opposed philosophies.
I get paid to write Haskell and I don't use category theory. No one on my team does, and neither does a significant share of the people in the industry.
I don't know what outsiders think about us, but 90% of the value derives from the simple things: constructing simple datatypes easily, a world-class type checker, and some great libraries that require little knowledge to use, like Servant or Aeson.
Sure, there are fancier things if you want to go beyond that, but you can be productive with the bare minimum of Haskell, as Elm shows.
> FP is about raising your problem to a higher level of abstraction and solving it there
Haskell and Lisp are like that for sure, but more blue-collar FP languages like Erlang are basically "FP so that you get memory safety and easy multithreading".
F#? It runs on .NET, interops with the entire .NET ecosystem, and has a high-performance garbage collector; if you need high performance you can use stack-allocated data structures.
Nim is much better than "Go plus some FP niceties": faster compile and run times, clean Python-like syntax, Lisp-like macros, no runtime required, and it can compile to JS.