Beautiful, compact summary going to the heart of language philosophy,
what we can and should build and for what purposes. To hardened
pragmatists this probably looks like ideology and pointless tinkering
at its worst, but I think the slow, eventual path to sucklessness
lies here in the vibrancy. Lisp will have its day again :)
The German aspect puzzles me; I'm not sure the implied German School exists.
Other German Lisp (and related) implementations: CLISP (a Common Lisp written in C+Lisp by Haible/Stoll), CLICC (a CL to C compiler by Goerigk/Hoffmann/Knutzen), Eu2C (an EuLisp to C compiler, from a Fraunhofer Institute), Chicken Scheme (Winkelmann), ...
My guess is that this has nothing to do with a Lisp school located somewhere in Germany, or with the Lisps having German authors/roots, as you quoted.
No, I was associating the listed properties (focus, spartan, controversial) of the "Fluchtpunkt" Lisps at the end with typical peculiarities or prejudices or stereotypes of the Germans themselves. The question then remains, how does "fun" fit into the picture here?
Wonder where hylang would fit in this classification: exo-lisp, practical-lisp or fluchtpunkt-lisp? I intend to learn Lisp, but keep delaying the time investment, waiting for hylang to get a more beginner-friendly tooling setup. So far it seems step 0 in learning hylang (and most Lisps) is to learn Emacs first, and how to configure it, which causes unnecessary friction for would-be apprentices, as this recently asked Stack Overflow question shows: https://stackoverflow.com/q/74409462
One thing I enjoyed in the Smalltalks is that they share a lot of Lisp's power, but the tooling is more self-contained and integral to most of them, which makes the first steps really easy to take. As Seymour Papert stated the ideal: "low threshold, no ceiling."
If you don't know Emacs, see other editors: https://lispcookbook.github.io/cl-cookbook/editor-support.ht... If you want the more Smalltalk-like experience, I'd go with the free LispWorks version: it has many GUI panes that let you watch and inspect the state of the program.
I personally couldn't stay long with Hylang. You won't get the CL niceties: more language features, performance, standalone binaries, an interactive debugger (all the niceties of image-based development)…
I would call it an exolisp. Its very pointed use case is to enhance Python with better metaprogramming facilities.
It's often compared to Clojure, and Hy's own documentation uses Clojure as a contrasting example, which reinforces this perspective. Clojure is built on top of Java, but the integration is somewhat loose in that Java semantics don't always map cleanly to Clojure semantics. Hy, on the other hand, deliberately stays very, very close to Python's semantics, even when that means making significant departures from how things are typically done in a Lisp.
No need for emacs, Vim and VS Code support many lisps well (or well enough). You can also use DrRacket for Racket and Scheme learning (https://racket-lang.org/).
It's my plan B. But as I already use Python at work, hylang would allow me to tap into the same ecosystem, making it immediately useful, at least in my particular circumstances.
Interesting that Michael wrote that 11 years ago and that it is still a relevant partitioning and categorization of Lisp languages.
I love the Lisp ecosystem, and my only real regret is that PicoLisp does not run on macOS, because as a tiny Lisp it is practical for some applications, and there are several very interesting programs written in it.
The easiest way to do this on an M1/arm64 macOS system that I found was: 'brew install lima', start Lima, 'apt install' picolisp and emacs in the guest, then set up plisp-mode.
This is a fast setup to run Emacs+picolisp in a Mac terminal, and file sharing between Linux and macOS is OK also.
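For reference, the steps above look roughly like this; this is a sketch assuming Homebrew, the default Ubuntu-based Lima template, and that picolisp is packaged in the guest's apt repositories (package names may vary by release):

```shell
# Install and start the Lima VM manager on macOS (assumes Homebrew)
brew install lima
limactl start default

# Install PicoLisp and Emacs inside the Linux guest
# ('lima CMD' is shorthand for running CMD in the default guest)
lima sudo apt-get update
lima sudo apt-get install -y picolisp emacs

# Then, inside the guest's Emacs, install plisp-mode (e.g. from MELPA)
# and point it at the picolisp binary.
```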
I've also only played around with PicoLisp, but thought it was pretty cool. The design ideas definitely appeal to me, though I can't quite put my finger on why.
REBOL might scratch that itch,
it is "not dead yet", kinda like how the Amiga is not dead yet
I gave up on it with the switch to 64-bit machines ...
but it still holds a warm and fuzzy place in my memories
as how I "got" lisp before I even knew what a lisp was.
I think homoiconicity and the resulting macro system necessitate s-expressions. I mean, how else could you devise a "programmable" programming language with ASCII-based syntax?
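To make the "code is data" point concrete, here is a hypothetical sketch in Python: an s-expression is just a nested list, and a "macro" is an ordinary function from code to code. The `unless` form and the function name are made up for illustration:

```python
def macroexpand(form):
    """Rewrite (unless test body) into (if test None body).

    Because the program is plain nested lists, the 'macro' is just
    list surgery -- no parser or compiler hooks needed.
    """
    if isinstance(form, list) and form and form[0] == "unless":
        _, test, body = form
        return ["if", macroexpand(test), None, macroexpand(body)]
    if isinstance(form, list):
        return [macroexpand(f) if isinstance(f, list) else f for f in form]
    return form

# (unless (< x 0) (print x))  ->  (if (< x 0) None (print x))
expanded = macroexpand(["unless", ["<", "x", 0], ["print", "x"]])
print(expanded)
```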
"Ur" is a German prefix meaning "old", "original", "predecessor".
So Ur-Lisp would be the old original Lisp that is predecessor to all other Lisps.
Other examples include Urrind (Rind == Bovine) which would be the predecessor of all species of bovine. Urzeit (Zeit == time), which is a generic very ancient time period before all humans (where the Urrind or even the dinosaurs might have lived). Ur-Computer might be one of the first computers, like the original Zuse or ENIAC.
Ur-Lisp means “original” Lisp, in the sense of “origin”. “ur-“ is sometimes used as a prefix (Ur-Fascism of Umberto Eco comes to mind).
In this case I guess it refers to the first Lisp, or maybe just the first prototypical Lisp, the founding father of all modern Lisps, I haven’t read the article yet.
You are joking, but the Proto-Germanic language (the ancestor of all current Germanic languages (English, German, Dutch, Icelandic, Faroese, etc. (sorry for the nested parens, but this is a Lisp thread))) is actually called “urgermansk” in Scandinavia.
I still don't get Lisp. Why does there need to be an endless amount of dialects? Doesn't this completely needlessly fracture the ecosystem into hundreds of pieces making it impossible to effectively support?
Isn't the point of Lisps to be endlessly customizable, making the invention of a new dialect less necessary?
The only Lisp I'd consider for a real project is probably Clojure and I'd even be hesitant about that. Sure, you can make a simple web project with Arc (HN), but it seems like you get 0 libraries, 0 interfaces with other systems and 0 tooling. Why would I bother with that?
For the same reason as there are many Algol-like languages.
The production-ready lisps are:
- Common Lisp: this one is standardized and the ecosystem is NOT fractured. Many implementations exist and work in parallel: SBCL, CCL, LispWorks, ECL, ABCL…
- Clojure (though there is also ABCL, and I keep hearing good things about LispWorks' Java interface)
- Schemes: some coming from university, with a fractured ecosystem.
Of course, nobody uses Arc for serious stuff (but we can have its syntax in CL and probably in Racket too).
> Why does there need to be an endless amount of dialects?
Because any time someone makes an (operator arg1 arg2 ...) language, they either put Lisp in the name, or else state elsewhere that it's a Lisp dialect.
Why does there need to be an endless progression of C-like dialects? They don't get called C, or rarely so.
The semantic differences between things called Lisp can be as large as between C++, Java, Javascript and Awk.
Many programming languages have been modeled on Algol; why are there so many Algols and why aren't they called that way? A lot of it is because of syntactic differences.
In the Lisp world, there are some mainstream languages that have the most users; if that's important to someone, they can stick with those and pretend the rest don't exist.
It's another example of the ~800 volume work titled "Where did the indirection go?" A solid 90% of the variation you see in the programming world is moving indirection higher or lower, inside or outside, like moving food around on a plate. When you compound these choices, you get a combinatorial explosion that's both intimidating and absurd. And it's the clearest indication yet that, despite our pretensions, software engineering is, in fact, software alchemy awaiting systematization. We need Lavoisier and Mendeleev for software, stat!
There is an endless amount of Lisps because the core to get one working is one page of code. So many people make one as a pet project. Aside from the small unused versions, the rest of the dialects have a lot of differences between them, some of which would be difficult to implement in an existing implementation. Common Lisp is a fairly complex specification to achieve, so few even approach it, and of those that do most aren't fully compliant.
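To give a rough sense of scale for "one page of code": here is a toy evaluator in Python, in the spirit of Peter Norvig's "lispy" essay. It is a sketch, not any particular dialect, and omits almost everything (macros, define, proper error handling):

```python
import operator as op

def tokenize(src):
    """Split source into tokens; parens become standalone tokens."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Read one s-expression from the token list into nested Python lists."""
    tok = tokens.pop(0)
    if tok == "(":
        form = []
        while tokens[0] != ")":
            form.append(parse(tokens))
        tokens.pop(0)  # discard ")"
        return form
    try:
        return int(tok)   # numbers become Python ints
    except ValueError:
        return tok        # everything else is a symbol (a string)

ENV = {"+": op.add, "-": op.sub, "*": op.mul, "<": op.lt}

def evaluate(x, env=ENV):
    if isinstance(x, str):            # symbol: look it up
        return env[x]
    if not isinstance(x, list):       # number: self-evaluating
        return x
    if x[0] == "if":                  # (if test conseq alt)
        _, test, conseq, alt = x
        return evaluate(conseq if evaluate(test, env) else alt, env)
    if x[0] == "lambda":              # (lambda (params...) body)
        _, params, body = x
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    f = evaluate(x[0], env)           # application: evaluate head, then args
    return f(*(evaluate(arg, env) for arg in x[1:]))

print(evaluate(parse(tokenize("((lambda (n) (* n n)) 7)"))))  # 49
```

Getting from this toy to something production-ready (tail calls, a condition system, a compiler) is where the real work, and the real divergence between dialects, begins.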
Getting a Lisp to run efficiently and effectively is more difficult than just writing an interpreter. I consider work on this problem to still be incomplete. For example, it should be possible to implement generic functions even more efficiently in Common Lisp implementations than is currently the case, in a way that preserves the ability to redefine the functions (and the classes they work with) dynamically. Robert Strandh has been working on general mechanisms supporting this and other improvements.
There are a lot of dialects because its core is one of the easiest programming languages to understand and implement, and once implemented, it's one of the easiest languages to extend. That's all there is to it, pretty much.
Anyone who is interested in the design of programming languages will find it easy to make their own Lisp and try out any ideas that seem congenial to them. The natural and obvious consequence is a lot of Lisps.
Why are there multiple dialects of real-world languages?
LISP is pretty minimal and simple. It's less a language than a set of constructs for computation. Computers are made of abstractions on top of abstractions, so it makes sense for Lisp-based languages to evolve continuously. None is better than another; each targets its own niche. Just like other programming languages, or real-world dialects.
Dozens. The standard language has been through numerous revisions (C++ 2.0, 98, 03, 11, 14, 17, 23), with no doubt many more to come - each version is a dialect. And there have been dozens of compilers, all of which define extensions to the standard - some of those extensions are copied by other compilers (and may eventually end up in the standard), others are unique to that implementation - so each compiler (even each of its successive major releases) can be viewed as a dialect. And then there are dialects defined, not by the core language features, but by which features are used, by which external libraries are used, etc. One C++ programmer is addicted to esoteric template metaprogramming, another avoids templates and treats C++ as a slightly improved version of C. One C++ programmer uses every Boost library they possibly can, another refuses to use any third party dependencies. One uses exceptions and RTTI heavily, the other always turns them off. Aren’t those all effectively different dialects, even if they are both using the exact same version of the same tools?
And then there are entire languages defined as extensions of C++, such as Apple’s Objective C++, or Microsoft’s Managed Extensions for C++ and then C++/CLI and C++/CX - and also more modest extensions such as OpenMP
Considering C++ is an object-oriented extension to C, it belongs alongside other such extensions - most notably Objective C and D (and even, to a lesser degree, Java and C#) - and there have been other attempts at that which weren’t successful - aren’t they all (in a sense) C dialects?
A large factor in the push for C# was that MS was sued by Sun for their not-totally-compliant Java dialect, Visual J++, and had to drop it as part of a settlement.
They also had Visual J# which ran on .NET, allowing Visual J++ and Java programs to target that platform.
There is also a non-Microsoft open source implementation of Java on top of .NET, IKVM.NET - I was sad to see it die when its original developer lost interest (although I just learnt it has since been revived by others)
> Isn't the point of Lisps to be endlessly customizable, making the invention of a new dialect less necessary?
A great question. Maybe everything should have been built on Scheme once . and . for . all. But people and the world aren't like that. We are restless souls.
When they sort out the spec issue (R7RS) we will reach programming nirvana and world peace. But I doubt humanity will live long enough to see that day.