Why we need Lisp machines (fultonsramblings.substack.com)
221 points by otacust on March 26, 2022 | 215 comments



I'm as big of a Lisp fan as can be. I'm a proud owner of Symbolics and TI hardware: a MicroExplorer, a MacIvory, two 3650s, and two 3620s. Not to mention an AlphaServer running OpenGenera.

Today, we have computers that run Lisp orders of magnitude faster than any of those Lisp machines. And we have about 3–4 orders of magnitude more memory, with 64 bits of integer and floating-point goodness. And Lisp is touted as having remained one of the most powerful programming languages (I think it's true, but don't read into it too much).

Yet, it appears the median and mean Lisp programmer is producing Yet Another (TM) test framework, anaphoric macro library, utility library, syntactic quirk, or half-baked binding library to scratch an itch. Our Lisp programming environments are less than what they were in the 80s because everybody feels the current situation with SLIME and Emacs is good enough.

We don't "need" Lisp machines. We "need" Lisp software. What made Lisp machines extraordinary wasn't the hardware, it was the software. Nothing today is impeding one from writing such software, except time, energy, interest, willpower, and/or money.

Don't get me wrong, there are some Lisp programmers today developing superlative libraries and applications [1], but the Lisp population is thin on them. I'd guess that the number of publicly known, interesting (by some metric), and maintained applications or libraries that have sprung up in the past decade probably fits on one side of a 3"x5" index card. [2]

Though I won't accuse the article's author of such, sometimes, I find, in a strange way, that pining for the Lisp machines of yore is actually a sort of mental gymnastic to absolve one for not having written anything interesting in Lisp, and to excuse one from ever being able to do so.

[1] Just to cherry-pick a recent example, Kandria is a neat platformer developed entirely in Common Lisp by an indie game studio, with a demo shipping on Steam: https://store.steampowered.com/app/1261430/Kandria/

[2] This doesn't mean there aren't enough foundational libraries, or "batteries", in Lisp. Though imperfect, this is by and large not an issue in 2022.


> We don't "need" Lisp machines. We "need" Lisp software. What made Lisp machines extraordinary wasn't the hardware, it was the software. Nothing today is impeding one from writing such software, except time, energy, willpower, and/or money.

Discussed here: https://news.ycombinator.com/item?id=30800520 The main issue is that Lisp, for all its inherent "power", has very limited tools for enforcing modularity boundaries in code and "programming in the large". So everything ends up being a bespoke solo-programmer project; there is no real shared development. You can see the modern GC-based/"managed" languages, perhaps most notably Java, as Lisps that avoided this significant pitfall. This might explain much of their ongoing success.


I think many people have conjectures, such as this one, but I don't think it's a tech problem, or a "Lisp is too powerful for its own good" problem. It's a "people aren't writing software" problem. History has demonstrated umpteen times that developing large, sophisticated, maintained, and maintainable projects in Lisp is entirely possible. Modern Common Lisp practice gravitates toward modular, reusable libraries, structured with ASDF definitions ("systems") and Common Lisp namespaces ("packages").
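
To make that concrete, here's a minimal sketch of that machinery (system, file, and package names are made up):

  ;; my-app.asd -- the ASDF system definition: components and dependencies
  (asdf:defsystem "my-app"
    :depends-on ("alexandria")
    :components ((:file "package")
                 (:file "core" :depends-on ("package"))))

  ;; package.lisp -- the namespace; only exported symbols form the public API
  (defpackage #:my-app
    (:use #:cl)
    (:export #:run))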


The obvious answer as to why people aren't writing software is that almost all of the people able to write good software don't like the language and are writing software in some other language or for some other platform.

I know Lisp enough to have written programs of a few thousand lines in it. I'm not even slightly fazed by functional programming and (tail-)recursion instead of loops. I've read Steele's Common Lisp book from cover to cover. Someone even tried to get me to interview for a job writing Lisp (I politely told them I thought their system could not practically be implemented in Lisp and was, several years and tens of millions of dollars later, eventually proven right).

And I don't think the language has any redeeming features other than garbage collection and documentation, neither of which is notable in 2022. I'm someone familiar with the language who could quickly become productive in any decent Lisp, and that's what I think of Lisp. Can you imagine what a person new to the forest of parentheses, weird identifiers and rejection of 500 years of operator precedence notation thinks?
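
For the unfamiliar, the notational difference in question, sketched:

  ;; infix with precedence:  2 * 3 + 10 / 5
  ;; Lisp prefix notation, with precedence made explicit by parentheses:
  (+ (* 2 3) (/ 10 5)) ; => 8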


> I politely told them I thought their system could not practically be implemented in Lisp.

I'm curious as to what that was; it sounds like a challenge more than anything.

I'm of a younger generation of programmers, and I've recently found lisp after coming from python. I still adore python and I can understand python's success, but I also adore lisp and its syntax, and I feel like I get the "powerful" argument for it.

For me, for what it's worth, I think the difficulty keeping lisp out of the mainstream is really the thousands of little differences and gotchas, and the fact that there's no single standards body looking after the users of the language. Princ instead of print, mapcars everywhere, etc. Trivial examples, sure, but there's a lot that kept me from writing even a few small programs without going through a book.
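
A minimal sketch of the kind of gotcha I mean:

  (print "hi")            ; fresh line, then "hi" -- quoted, readable form
  (princ "hi")            ; hi -- unquoted, aesthetic form
  (mapcar #'1+ '(1 2 3))  ; => (2 3 4) -- "mapcar" where others say "map"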

I'm not saying it's wrong, or it should be such and such a way. I am just simply comparing it to python and pointing to the first big elephant I come across.

Fwiw I also think that clojure is a lang capable of capturing more love because it can tie into the Java ecosystem... but Java is seen as old and slow; jobs in Java are slowing down, and clojure isn't filling in the gap; kotlin might be.

I take common lisp with me to holidays, because I like to learn it and all the funny words for things, but by far the lisp I use the most in day to day stuff is fennel, which is a lisp that compiles to Lua, and allows me to write plugins for neovim as well as a few other things that uses Lua.

TLDR: yes, lisp needs software to be popular, but it's not just a case of "make more applications", nor is the problem parens or operator precedence. People like sensibly named keywords.

I personally think things would be a little better if there were a CL-to-JavaScript transpiler; then you'd start to get some of the js crowd, same with python.


> History has demonstrated umpteen times that developing large, sophisticated, maintained, and maintainable projects in Lisp is entirely and demonstrably possible.

The thing is, developing maintained, maintainable projects is work, and everything I see about Lisp seems to have an undertone of being done, to some degree, for fun. It’s a language that scratches a specific itch; people feel clever about writing elegant code in it.

But the sad truth is that 99% of real application code doesn’t need to be clever or elegant. It needs to be simple and cheap and maintainable by anyone.


the 1% is extremely important though, and the tools for it look very different.

I do algorithm design, a lot of which is seeing what's possible/feasible before committing. I prototyped a classical AI planning system for orchestrating maintenance. there's no battle-tested standard library to reach for here. I needed prolog, I chose picat because it had a good planning engine.

before this, I worked on an NP-hard graph problem that involved tons of topological transformations for heuristics. I wrote the damn thing in C#. it was a nightmare. it's so hard to iterate and experiment in a language like that. even using something like Python/Jupyter, nested loops and horrible messes of bidirectional maps and indices bog you down. I should have done functional. nobody touched my code for two years because it was pure math. I think using C# made it much harder to understand - too much scaffolding obscured the semantics.

now I'm doing another NP-hard graph problem. this time I'm using MiniZinc. because it's concise, because it's powerful enough that I can at least figure out how to specify my problem and see where scale falls down.

so I'm in this position of using a menagerie of bizarre languages not because I'm a hipster obsessed with pretty code, but because my problem domains are really weird and maintainability doesn't matter when you're not sure whether your goal is even possible. something like Lisp - a lingua franca for weird DSLs - would be great, for my very niche use case.


Absolutely true, and it sounds like you're doing really interesting stuff where the language and the way it maps to the problem are more important than the factors I listed above.

Do you stick with the same (maybe-)esoteric language for the production implementation? And if so, does this then drag the rest of the production implementation into that language as well, or do you do the hard bit in one language and the routine stuff in another more common language?


I do the hard bit in the esoteric language, then the glue in a traditional language. for instance, I made a tool that exhaustively steps an implementation through every possible state of a TLA+ model. I wrote "serializers" to parse TLA+ expressions into C# classes and vice-versa, and used this to build a bridge. Parser combinators work great for this stuff, though that's another reason I'd be happier with a Lisp representation for everything.


> people aren't writing software

Maybe not OSS, but somebody is keeping Lispworks and Franz in business.


I'd love to know who. The example apps on the LispWorks site aren't commercial successes, as far as I know.


Looks like it's just a small unit in an office park in the tech center in Cambridge: http://www.lispworks.com/contact.html

It doesn't really take many contracts to scrape together office park rent and a staff of a few people. We're talking like < $1 million/year in revenue.

Good on them. Not every company needs to try to be the next Amazon.


Already in the 80s the Lisp Machine OS had more than a million lines of code, with support for programming in the large. The development environment was networked from the start. A server kept the definitions of things on the network: machines, users, file systems, networks, gateways, printers, databases, mailers, ... The source code and documentation were usually shared in the team and versioned via a common file server.

Nowadays, there are large applications written by groups/teams, some of which have been worked on for three or more decades. For example, the ACL2 theorem prover has its roots in the 70s and is used and maintained to this day by users in the chip industry.

Common Lisp was especially designed for the development and production use of complex programs. The military at that time wanted a single Lisp dialect for those: applications often came with their own Lisp, which made deployment difficult, so a standard language for projects was required.

These were usually developed by teams in the range of 5 to 100 people. Larger teams were rare.


"Designed for complex programs in the 1980s" is not really up to current standards. Moore's law means that complexity of overall systems can grow by multiple orders of magnitude in the 01980 -to- 02022 timeframe.


That was not the point. Lisp applications were already a team sport back then. That all or even most Lisp software is written by single person teams is just wrong.


Modern agile development is far more complex than even that: you need entire loosely-coupled teams of developers to be able to seamlessly cooperate with one another, so there are multiple scales of cooperation. Tightening the semantics of interface boundaries is a necessary support for that kind of development and is what's largely meant today by programming "in the large". Highly dynamic "scripting" languages get in the way of this, and that's essentially what Lisp is too.


Lisp isn't essentially a scripting language. Common Lisp was designed as an application language, and in rare cases it's also a systems language. On a Lisp Machine, for example, the whole networking stack was written in it.


Never heard of Javascript?


> The main issue is that Lisp, for all its inherent "power", has very limited tools for enforcing modularity boundaries in code and "programming in the large"

I don't see any mention of "modular" or "boundaries" in the post you linked, so I'm assuming that it doesn't add extra context to your point.

You say "very limited tools for enforcing modularity boundaries", which I'm going to assume means that you believe that Lisps have constructs for creating modularity boundaries (e.g. unexported symbols in Common Lisp), and just don't enforce them (e.g. I can still use an unexported symbol in CL by writing foo::bar), in which case - I don't think that this is actually an issue.

Programmers are capable of doing all kinds of bad things with their code, and shooting themselves in the foot, yet I've never seen an indication that the ability to shoot yourself in the foot with a language noticeably contributes to its popularity (see: C, C++, Perl, Ruby).

Moreover, specializing to Common Lisp, it's not like CL allows you to accidentally access an unexported symbol in a package - you have to either deliberately use :: (which is bad style, and takes more effort than typing :) or you get a hard crash. This is opposed to the above listed languages, which allow you to shoot yourself in the foot in a large number of extremely diverse and interesting manners, often without giving you advance warning - and yet are still far more successful.
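
A minimal sketch of that boundary (package and function names are made up):

  (defpackage #:geometry
    (:use #:cl)
    (:export #:area))          ; AREA is public; HELPER is not

  (in-package #:geometry)
  (defun helper (r) (* pi r r))
  (defun area (r) (helper r))

  (in-package #:cl-user)
  (geometry:area 2)            ; fine: exported symbol
  ;; (geometry:helper 2)       ; error: HELPER is not external
  (geometry::helper 2)         ; works, but the :: makes the violation visible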

------------

I don't believe that the lack of success of Lisps is due to technical deficiencies.


Lisp is clever and also a bit inhumane. It only ever appeals to the small subset of the population that likes that. Lisp has some niceties that are fundamentally inherent to its design, but it's just not enough to overcome how awful it is for people.

But this is not a technical deficiency of Lisp, it's more of a technical deficiency of humans.


What? How is Lisp (either the family of design decisions or even a specific language/implementation) "inhumane"? What does that even mean?


Inhumane is the wrong word; brutalist is a better word. Brutalist architecture is characterized by minimalist constructions that showcase the bare building materials and structural elements over decorative design. Humans are not logical/mathematical machines, and all the irregular and decorative parts of programming languages are often for our benefit, even if they're inconsistent, verbose, and limiting.


That's not the experience a Lisp programmer has.

One is not looking at dead code, instead one is working with it interactively.

Lisp is not brutalist concrete, Lisp is wet clay in the programmer's hands.

The main way to program software in Lisp is extending it while it is running and with the development tools as much integrated as possible. Thus a Lisp program has maximum information about itself, there is no separate debug mode, no long build times, no static code.
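
A minimal sketch of that workflow at the REPL (global function calls in Common Lisp are late-bound, assuming no inline declarations):

  (defun greet () "hello")
  (defun caller () (greet))
  (caller)                   ; => "hello"
  ;; redefine GREET in the running image; CALLER picks up the new
  ;; definition immediately, with no rebuild and no restart
  (defun greet () "bonjour")
  (caller)                   ; => "bonjour"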


Lisps are fairly "decorative". It just doesn't combine a pin-striped blazer over a plaid shirt with leopard patterned pants, like many languages.

Here are some traditional notations found in Common Lisp:

  "string"
  symbol
  :keyword-sym
  #:gensym
  package:sym
  `(static content ,inserted ,@spliced)
  (l i s t)
  (im pro per li . st)
  #(v e c t o r)
  6.18e+23
  13/17
  ;; comment
  #*100111101101
  #1=(1 2 3 #1#) ;; circular structure
  #S(point :x 35 :y 44)
Various dialects have various notations. There is syntax highlighting support in editors, too.

Lisp is not just (this (and that)).


And you imagine this is a contradiction of what I said?


Well, not exactly. What you said doesn't quite have the form which supports that sort of activity.


> Brutalist architecture is characterized by minimalist constructions that showcase the bare building materials

This definitely doesn't describe the semantics of most Lisps, which operate at the higher levels of abstraction and conceal many implementation and hardware details from you by default.

It also doesn't describe the syntax of most Lisps, including all of the major ones. Common Lisp, in particular, has extensive syntactic sugar, as shown in kazinator's post.

> structural elements over decorative design

I mean, Lisps don't usually go out of their way to look "pretty", but neither do they actively attempt to look ugly. Most Lisps' syntax has had design effort put into it - the authors strive for consistency and ergonomics.

Taste is subjective, and I personally strongly prefer the syntax of Common Lisp to C or Python - both in terms of aesthetic appearance, and in terms of ergonomics.

Furthermore, very few programming languages are intentionally designed to be "decorative", and most Lisps and all the major languages are not among them. Whitespace, for instance, is added to languages for readability and parsing determinism, not beauty or decorative appeal.

I don't think that this metaphor is accurate at all. There are many criticisms that you can make of both the Lisp family and specifically of Common Lisp, but I don't think that "brutalist" is a valid one.


Why do you think Lisp never gained significant popularity?

I think I understand why certain people really like Lisp -- I have enough experience with enough programming languages to understand the appeal. But I also believe it's unappealing to the majority, and I don't think that's a controversial stance. It's not a simple matter of syntax -- I don't think it can fundamentally still be Lisp and be broadly appealing.


> It's not a simple matter of syntax -- I don't think it can fundamentally still be Lisp and be broadly appealing.

Python is, semantically, pretty similar to Common Lisp - the two have far more in common with each other than either does with C++, for instance. So whatever the problem is, it's clearly not the semantics or language behavior.

I'm pretty sure it's the syntax, though. You used the word "unappealing", which often refers to a surface attraction or draw. I think that this is absolutely correct - Lisps are unpopular in large part due to their very weird syntax, and indeed, one of the most common comments that you'll hear when you bring up Lisp is "Is that the weird one with all of the parentheses?"

This is further reinforced by the fact that most programmers have never even tried to run a single line of a Lisp - and the vast majority haven't invested enough time to learn enough to write a non-trivial program with it, and therefore haven't had enough exposure to even judge whether they like the language or not. Therefore, any reason for unpopularity must be superficial.


It's very easy to shallowly dismiss something if

- none of your friends or coworkers use it.

- there are few jobs requiring it.

Basically, you can mock it all you want with practically zero social risk.

Jokes that target people of some specific nationality or regional group are not as tolerated any more nowadays, perhaps depending on the social company, but this is like those jokes. For instance "dumb Polack" jokes.

You've heard the one: "Frederic Chopin and Nicolaus Copernicus walk into a bar, immediately bespying Marie Curie sitting by herself. Stanislaw Lem is tending bar, pouring her a drink. ..."


I think the majority of developers simply don't know whether they would like Lisp or not. It's just not that well known.

Mainly, Lisp was developed on institutional hardware in the era of big iron. It became larger and more complicated, keeping pace with the capabilities of those machines, which Lisp could easily strain.

Suddenly, microcomputers showed up in the 1970's. The industrial-strength Lisps just wouldn't shrink from the hundreds of kilobytes or megabytes of RAM they required down to two-digit kilobytes.

The situation didn't start to improve until probably the late 1980's.

Some developers used Lisp for prototyping, and had the expensive hardware; then to create a product to ship to users on consumer hardware, they rewrote the prototype. That story was exemplified by the CLIPS expert system.

An entire generation of programmers who started on microcomputers in the 1970's and 1980's wouldn't have even had opportunities to use Lisp.

When I started programming in 1982 or so, I inherited a stack of Creative Computing magazine back issues, so I read about all sorts of interesting stuff, including Lisp; but I wouldn't have been able to get my hands on it anywhere: and not for the lack of knowing about it or wanting.

Microcomputers steam-rolled a vast amount of software that preceded them. Software that didn't make the jump from big iron to microcomputers either died or was relegated to niches. Fortran, Cobol, operating systems like Multics, you name it.

We use Unix-like systems today because Unix started on minicomputers that were close in capability to microcomputers. There was only a small lag between Unix being "bigger iron only" and running on microcomputers. It used pretty modest resources; e.g. in 1980, /vmunix was maybe 50-something kilobytes. But even that was a little bit large, and the microcomputer jump did hurt Unix; it allowed disk operating systems like CP/M, PC-DOS and MS-DOS to gain a foothold. Unix never quite recovered; we now have free versions that are quite popular, but mostly in server and embedded roles.

Speaking of Unix, I read a book on using Unix in the mid 1980's. As a kid, I had nowhere to actually access it for hands-on experience, but it was exciting to read about. You had to be a university student, or else a grown-up working at some company or institution. (Or else, dial into something and break in.)

Anyways, the time lag between the start of the microcomputer revolution and Lisp being able to run as well as it had on 1970s big iron was quite long! Long enough to see people retire. There is nobody out there to do mentoring. If millions of programmers became interested in Lisp, there would be nobody available to mentor them. Everyone would have to rely on books and such. You can hardly find a course to take or anything like that. Some universities use some Lisp or Scheme dialect in a few courses, and that's about it.

Even if every great Lisp hacker who ever lived was still alive and interested in mentoring, it wouldn't be enough; there simply weren't that many people in computing 30, 40, 50 years ago.


Lisp was invented over 60 years ago and we're still talking about it today, right now. I don't think the issue is that it isn't well known. Plenty of programming languages have sprung to life in that time from nothing and become more popular.

Lisp is more capable now than at any time in its history, and if you want to learn about it, there's also no better time in history to do that than right now.


Lisp is absolutely not well known. Some programmers may know the word Lisp and that it refers to some kind of programming language, the same way that they might know about Fortran, and possibly Algol, PL/I or Snobol. People talking about it today are not average programmers.

I agree that there is no better time than now, but at the same time, the ratio of Lisp experts to total developers may also be at an all-time low.


But why do you need experts? Plenty of languages have been created out of nothing in the last few decades and have more users than Lisp. Obviously a new language starts with zero experts.

And Lisp has experts! You're literally posting this to a massively popular forum for developers and it's written in a Lisp dialect!

Reddit, arguably one of the largest sites on the Internet, was written in Common Lisp before being re-written in Python.


What do you think it is?


Marketing. Too focused on AI, followed by more than a decade of breathless hype for C++ and Java.


Nailed it. More generally: "non-technical cultural factors".

I would also add some minor incidental technical factors (e.g. early Lisp implementations were far less efficient than C implementations during the time when efficiency really mattered, and then cultural/marketing factors kept C as the dominant language long after the technical balance tilted in favor of Lisp).


agreed, and well said

in broad strokes, imo, Lisp's endless plasticity of infinitely nested parentheses is both its greatest strength and... its greatest weakness

I love it at Write time. hate it at Read/Maintain time. which is why I've avoided it for serious work. Python, Java, Go, Rust etc are all easier for my brain & eyes to quickly parse on the screen

which is unfortunate. because I still drool at Lisp's linguistic power


Lisp at "read/maintain" time in a good environment is probably the most enjoyable time for me working with Lisp, assuming it's on a code base that followed some sort of coding standards. Introspection, debugging, and navigation in a Lisp environment are incredible.


Still, the average function written by a smart lisp programmer is going to make me sigh, take three deep breaths, and set aside some time to focus, while the average function by a smart Go programmer I can just read.

This readability loss does counteract the ability to reload forms into a running core. Especially now, when I am trying so hard to make everything running be immutable and derived from various git repos.


I don't think this is a Lisp thing, though - you get this same problem with Haskell and C++ and Perl as well, because the cultures of all of those languages normalize and encourage writing clever, difficult-to-read code.

Similarly, if you're working by yourself, or in a team with shared norms, you care far less about average function written by a random smart Lisp/Haskell/C++/Perl programmer, and more about what (a) you/your team writes and (b) the ergonomics of the language itself.

In fact, I argue that Lisp is slightly better than the above languages because much of its fanciness comes from macros, which you can trivially expand inline to simplify the code - which you can't do with fancy category theory constructs in Haskell, and is only barely possible with boost:: templates in C++, for instance.

(that isn't to say that I disagree with your point "the average function written by a smart lisp programmer..." - I completely agree with your assessment, I just don't think it's either as big of a problem as it could be, nor is it unique to Lisps)
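
To illustrate the macro-expansion point, a minimal sketch with a made-up anaphoric macro:

  (defmacro awhen (test &body body)
    "Like WHEN, but binds the result of TEST to IT."
    `(let ((it ,test))
       (when it ,@body)))

  ;; one form shows exactly what the "clever" code turns into:
  (macroexpand-1 '(awhen (find 3 '(1 2 3)) (print it)))
  ;; => (LET ((IT (FIND 3 '(1 2 3)))) (WHEN IT (PRINT IT)))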


> You can see the modern GC-based/"managed" languages, perhaps most notably Java, as Lisps that avoided this significant pitfall.

An interesting perspective. From my POV, it's hard to think of a less Lisp-like language than Java. COBOL, maybe.


The greatness of Lisp, at least when it comes to end-user empowerment, and (I think) the only differentiating factor that most other languages have still not caught up to, is the cybernetic philosophy with its roots in Licklider (man-computer symbiosis) and Engelbart.

Building an environment that strongly and uncompromisingly expresses this philosophy at its core is a serious undertaking in terms of time investment. Emacs has been in continuous development for 37 years and while it is still not as good as Genera, it's certainly "good enough" for lots of people and definitely captures the spark of this philosophy.

In the Common Lisp world, we've had plenty of tries (MCL, Hemlock, McCLIM) but they've all failed to get people to coalesce and generate progress.

Maybe the fact that Emacs exists is a curse, in that people realize the barrier they'd have to clear to make something substantially better and decide to devote their energies to more easily realizable efforts.


Anyone interested in the computing holes that can be filled by lisp machines should check out Urbit. There is a vibrant and growing community of people building a network of personal servers running a lisp-like functional OS. It uses an identity system secured by the Ethereum blockchain, and it has created a bottom-up economic incentive for developers to participate. They are starting to solve unique problems that couldn't be addressed on currently prevalent platforms. Urbit is an affirmation; we can ditch the nihilism.


i associate lisp machines with power, simplicity, and a delightfully shallow learning curve.

urbit to me is exactly the opposite


And it's 4x more expensive than it was supposed to be to buy a planet.

Ethereum was a mistake.


Since they moved to Layer 2 it has been amazing. No eth fees anymore.

They should just move to their own chain entirely. Maybe this is a baby step towards that. Each Galaxy/Star is already basically a staker in a Proof of Stake network.


Always be griftin


if they were on their own PoS chain, they wouldn't need to charge anything. it is just in the galaxies'/stars' best interest to maintain the ledger of activity.

galaxies/stars can just charge for services provided, like an ISP basically


It's $3. www.azimuth.shop


> They are starting to solve unique problems that couldn't be addressed on currently prevalent platforms.

Got any examples?


> Yet, it appears the median and mean Lisp programmer is producing Yet Another (TM) test framework, anaphoric macro library, utility library, syntactic quirk, or half-baked binding library to scratch an itch. Our Lisp programming environments are less than what they were in the 80s because everybody feels the current situation with SLIME and Emacs is good enough.

I don't think this is true. Not anywhere close. Most such examples are small, and probably only took a small number of hours to produce, while "superlative" stuff takes very many man-hours to create. So just by seeing that there are many throwaway testing frameworks or whatever, you cannot tell where most of the work hours actually go. A half-baked binding library takes 20 minutes to make, while a proper high-quality rendering engine takes hundreds if not thousands of hours.

The Lisp population is thin on people making cool shit because the Lisp population in general is thin.


I'd say it's both. It seems most lisping is done high on the stack. Some are doing assembler level lisp (or scheme) but less bare metal / OS / system oriented lisp.

I wonder what the lisp os guys are thinking about OS / UI these days.


Personally, I think the problem is that Common Lisp is just another programming language, whereas Lisp really shines when it provides a full-fledged programming environment. Nowadays, it would seem best to create such an environment on top of commodity hardware as a "virtual machine" that abstracts away from the hardware in a general, portable way. However, a good environment (1) needs a purpose, and (2) somebody needs to write it. Lisp currently fails on both points. The purpose used to be symbolic AI and NLP, among other things. Nowadays it could be the same, or a web framework with integrated distributed computing and database, or scientific computing, or a server for certain business services, etc. There are many application domains for which a virtual "Lisp machine" would be beneficial, but it needs to be developed for one of those real-world applications, not just as a toy like existing attempts at building Lisp machines. And in my opinion the problem really is (2), developer power / size of the community. If you exclude super-expensive legacy Lisps, the current Lisp community doesn't even have a fully supported Lisp-native editor (portable Hemlock is not good enough) and also doesn't have good enough object persistence / native Lisp-based databases. Both are the foundations of any integrated Lisp machine.

People sometimes claim CL+Emacs+SLIME provides the full interactive experience. I just can't agree with that at all. I have tried, and the experience was not substantially different from Go development or development in any other fast-compiling language with Emacs. In some respects, it's even worse than with various modern languages, even though most of those languages are strictly inferior to CL from a pure language perspective. If editing files in Emacs is all there is to the allegedly great Lisp experience, and developers at the same time have to deal with all those idiosyncrasies of CL such as CL's filesystem path handling, ASDF, and tons of poorly documented libraries, then I can't really see the advantages of CL. The language is powerful, don't get me wrong, but a truly interactive experience is totally different. Smalltalk managed to keep this experience, but for some reason the Lisp community seems to have lost this vision. I guess the developer community is just not large enough.

Anyway, before someone tries to build another "close to the metal" Lisp machine or tries to revive any old Lisp machine, I'd rather the community focused on creating truly integrated development environments that abstract away from the host system and are fully hackable and editable from the ground up while maintaining security and multi-user access. A "virtual Lisp machine" with a virtual OS, so to speak. If that's developed for a certain purpose, like building and deploying extremely portable web applications, I believe it can have a great future.

Sorry for the long rant. This is just my impression after having programmed in various Lisp dialects for the past three decades.


> the experience was not substantially different from Go

I think the (a?) reason for that is a (otherwise good) shift to production being immutable. When you aren't allowed to interact and change code in a running system, you lose a massive advantage of CL over simpler languages. When the answer to every error is "send a 4xx or 5xx to the client" then having a powerful error-handling system is similarly pointless. When you only write CRUD programs like everyone else, you're just plugging together other's libraries and not creating novel techniques or fancy math. In this world all CL's advantages are negated.
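
For readers who haven't seen it, a minimal sketch of that error-handling system: low-level code offers restarts, higher-level code picks one, and the stack need not unwind (restart and function names here are made up):

  (defun parse-entry (s)
    (restart-case (parse-integer s)
      (use-zero ()
        :report "Use 0 instead."
        0)))

  (defun parse-all (strings)
    (handler-bind ((parse-error
                     (lambda (c)
                       (declare (ignore c))
                       (invoke-restart 'use-zero))))
      (mapcar #'parse-entry strings)))

  (parse-all '("1" "oops" "3")) ; => (1 0 3)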


Common Lisp on Emacs via SLIME is not competitive with Smalltalk re: "interactive experience", since Emacs is not the native substrate of CL but essentially an out-of-process experience. If you want to experience CL at its best, you need to run Symbolics Genera.

Emacs with Emacs Lisp on the other hand offers a great interactive experience that also manages to easily surpass every modern Smalltalk incarnation in practicality and size of development community. So if running Genera isn't easily doable, this will give you a taste of what Lisp interactivity is all about.


To me development with SLIME is much better than with a fast-compiling language.

- Debugger is always ON.

- I can inspect the data I'm working with.

- I can redefine things without starting everything all over, avoid losing current context. Fast restart is not the same.

- I can evaluate pieces of code without the need of a REPL. Point to an s-expression and evaluate that piece of code, inspect the result.

I don't see how Smalltalk is much more interactive. It is more powerful at graphics and tools integration, but SLIME provides an interactive enough experience IMO, and it is significantly better to any fast compiling + restart language.


I prefer Factor for that.


> "We don't "need" Lisp machines. We "need" Lisp software."

Nobody goes into Java because their self-identity is "a Java programmer", to gather a team of people to create a Java machine running Java software to unleash the power of Java for the masses by enabling them to do everything in Java for the sake of doing everything in Java, By Java, With Java, For Java. And if they did talk like that, they would be a Sun Microsystems marketing pamphlet from 1999, or a joking reference to Zombo.com, or suspected of having zero interesting ideas and defaulting to Ouroboros navel-gazing.

Adobe Photoshop Lightroom is C++ and Lua. Blender is C++ and Python. Excel is C++ and Visual Basic for Applications. LibreOffice Calc is C++ and Python. These are large, popular, programmable systems which exist today and are good enough; good enough for people to spend lots of money on them, good enough for people to spend years skilling up in them, good enough that once they existed people wanted them to keep existing and they haven't faded into the past.

The added allure of an imaginary rebuild of them like "if you took the lid off Excel you'd see VBA inside so you could rework the way it handles multiple sheets using only a skill you have and software design skills and Excel-internals knowledge you don't have" would get a hearty side-eye and a slow backing-away from most Excel users. "Everything inside looks the same" is as attractive as "if you open your car trunk you'll see leather seats and plastic dashboard components making it move" or "if you sit in this car you're sitting on engine parts because the top priority is that a welder in a scrapyard can build the entire car without leaving their comfort zone". There are certainly people who want that, but the way the world hasn't developed that way suggests it isn't particularly desirable. Even when such things have been built (people can today use a Smalltalk or an APL, save their running work in a memory-dump, reload it, and rewrite parts of it in itself), people flocked to Jupyter notebooks instead.

> "[1] Kandria is a neat platformer developed entirely in Common Lisp"

https://cdn.akamai.steamstatic.com/steam/apps/1261430/ss_a3f...

Without mocking a team who has built, developed, polished and planned to release a project, because that is respectable, it also looks like computer gaming of the early to mid 1990s Intel 386/486 era, reminiscent of Prince of Persia, Gods, Lemmings, Monkey Island. But it needs an Intel i5, 4GB RAM and 1GB storage. It's not even released yet and has no reviews, but you describe it as 'superlative' ("Of the highest order, quality, or degree; surpassing or superior to all others") - are you rating it so highly based on it being written in Lisp, or what?


I don't know how to respond to the whole "Lisp programmer identity" stuff; it doesn't seem relevant to anything I said. I also didn't suggest anybody rewrite anything in it. The success of Lisp doesn't depend on the existence of fancy machines, it depends on people choosing to write software in it. That's basically all I meant to say.

As for Kandria, did you play the demo, or did you just look at screenshots and system requirements and make your brazen judgment? I don't think Kandria is at all amateur or sloppy, regardless of to which aesthetic era you think it belongs. Many have claimed that it's not even possible to write a game that doesn't blow because Lisp is dynamically typed and garbage collected. Now the goalposts have moved to, "well, it takes 1GB of memory and doesn't even look like it's from 2022."

I commend anybody who ships.


It's relevant in the sense of being a reply to "we need software written in Lisp" and how, if you substitute Java and say "we need software written in Java", people would just shrug and ask "why do we?". People are saying "we need software written in Rust" and other people are asking "why?" and one answer is "to avoid the memory and race condition problems we have from C and C++ code". Maybe correct or not, maybe compelling or not, but it's a practical, outward-looking concrete reason. The answer for Lisp from your comment is "The success of Lisp doesn't depend on the existence of fancy machines, it depends on people choosing to write software in it" and that's the kind of self-referential Ouroboros loop I mentioned. "OK, but why?". The success of COBOL depends on people choosing to code in COBOL, but nobody uses that as a reason to support COBOL. People who identify as "Lisp programmers" are going to care about that, because if it dies, their identity dies (which is daft because, as you say, anyone can choose to write a Lisp environment at any time).

> "Many have claimed that it's not even possible to write a game that doesn't blow because Lisp is dynamically typed and garbage collected. Now the goalposts have moved to, "well, it takes 1GB of memory and doesn't even look like it's from 2022.""

JavaScript appeared, took over the world, demonstrated good games in a dynamically typed and garbage collected language at least a decade ago. Goalposts move, time moves on. Some AI people complain "once you wanted AI to beat chess, now that's not good enough? Stop moving goalposts!". Software today can recognise people, generate images from text descriptions, complete sentences, describe photos, self-drive cars over constrained environments, walk robots over rough terrain and jump onto ledges, steady cameras on flying drones following a person. DeepBlue beating Kasparov was impressive in 1996, it's not impressive now. There are AI experts today who were born after that.

Especially contrasted with "it's the best language" and "superlative applications", linking something which looks like software of 25 years ago, which ran on 10,000x less powerful computers, is a big difference in expectations. (It may actually be an amazing game, hence me asking what it was that made you say it is, before it's released). Years ago a company writing Transport Tycoon in assembler was very impressive. Now a single person can write a math animation video generator in Python as a hobby side-project while being a double-major student.[1] Expectations ramp up, year on year, and "coding on the libraries of giants" is a real effect. Pythagoras calculating the length of a hypotenuse was impressive. A school student doing it today isn't.

Hacker News is written in Arc. It's impressive to build a language and build a forum in that language, even though forums existed years ago. But if someone claimed it was the best language which needs to be preserved because it can do the best things, and then used HN as the example, anyone who has used a modern forum with all the trimmings would do the "yes, Grandpa, everything was better in the old days" polite smiling and nodding.

Electron isn't bad because it lacks Lisp, it's bad because it's sluggish and ramps up fans and drains battery life. WhatsApp isn't great because it was written in Erlang, it's great because it connected hundreds of millions of people on all kinds of featurephones and early smartphones. Visual Studio Code isn't very customisable because it's written in JavaScript, it's customisable because they built it to have lots of extension points. The answer to why we need software written in Lisp is like the answer to why we need software written in APL or Prolog: we don't. We also don't need software written in Java or C# or Python or Ruby. We may need software written in C or x64 assembler, because of hardware lock-in. Tools are for doing things with, not for falling in love with.

[1] https://www.3blue1brown.com/about


I think we are talking past each other, and I think it's because you're fixated on and aggravated by a premise I did not stipulate, which is that Lisp is superior to other languages, and as such, you're interpreting "superlative" as "the best software ever written", instead of how you're supposed to interpret it, which is "the best software written in Lisp in the past 10 years", which is precisely what I referred to.

In isolation and without context: We do not need software written in Lisp. Nothing compels us to choose Lisp as a language to express programs that solve problems.

If we want a non-UNIX, non-C ecosystem, then what we need is software, not hardware. Lisp is one type of previously proven ecosystem that works. So it is reasonable to discuss that as a potential option, which is partly the topic of the article we both are posting comments to. I argue that if we want a Lisp ecosystem, we need better and more comprehensive software written in Lisp.

If we don't want a non-UNIX, non-C ecosystem, then we needn't discuss writing software in Lisp (or Smalltalk or ...) as a possible solution to that (non-)problem.

In any case, I simply argue that old hardware or even obsolete operating systems aren't really a productive thing to talk about in this context, except on a case-by-case basis.

Again, regarding "superlative", the word is in reference to last-decade Lisp software alone, which I contend describes Kandria. I don't even attempt to compare software written in Lisp to the entire universe and history of software, which it seems you're doing, but doesn't seem pertinent to the discussion. (Though, to be clear, even when comparing to all of that, I still think you're wrong. But we can argue to no end about our own subjective opinions.)


>anyone who has used a modern forum with all the trimmings would do the "yes, Grandpa, everything was better in the old days" polite smiling and nodding.

HN could have indeed been written in many languages, but IMO the simplicity is a strength. I'd rather be here than twitter, reddit, or any other forum. There's no dark patterns, no recommendation engines or other annoying nonsense trying to game "engagement", no ads, basically none of the things that make the modern Internet an awful user experience.


Regarding Kandria:

You might have heard of this thing called "Art", and that it has styles; not only is one of them called "pixel art", celebrating exactly that kind of limitation, but Art as a whole is often discussed in terms of the self-imposed limits used in the creation of a work.

That said, a game can deliberately target such style, and yet hide considerable richness of implementation (consider: Noita, Dwarf Fortress).

Another thing with Shinmera and his team is that they produce not only interesting stories and interesting games, but also code I'd argue is art too.


> "That said, a game can deliberately target such style, and yet hide considerable richness of implementation (consider: Noita, Dwarf Fortress)."

It definitely can, and Monkey Island and other SCUMM games are examples of having lots of gameplay and comedy despite constrained graphics. As are card games and board games, for that matter. Fancy 3D isn't the be-all, end-all. In more recent years, SpeedRunner[1] isn't pixel art but it's quite simply styled, and is fun for leaning so heavily on a single game mechanic.

I'm not meaning to diss Kandria, which might very well be a great game. I meant to call out the gulf: pairing Lisp as "the best language" and "superlative applications" with a link to Kandria might set one up to think of the "best games" of recent years, by popularity or profitability or ambitiousness or multiplayability, or replayability, or storyline, or VR support, e.g. Pokemon Go, Grand Theft Auto series, Dark Souls series, Fortnite, Fifa, Half Life Alyx, Tony Hawk's Pro Skater remakes, Spiderman PS5, Elite Dangerous, Roblox and Minecraft, Final Fantasy series. Which all seem to have done alright without Lisp, and no game development houses have done the Paul Graham "Lisp as secret weapon" thing to make a game nobody else can make with other tools.

[1] https://steamuserimages-a.akamaihd.net/ugc/58020036161797293...


Well, the end effect was that your post seemed to concentrate on dissing Kandria, not the way the article went about it (which wasn't the best honestly).

As for games where Lisp was the secret sauce, the Uncharted series would count, I think ;) It was a return to GOOL-style programming (like in Crash Bandicoot) instead of a full engine written in Lisp (like GOAL was for Jak & Daxter), but for a narrative-heavy game, the tooling used for making that narrative should count as a secret weapon.

BTW, a huge underappreciated reason why Kandria can't fit into less RAM is the simple fact that today games generally render at much higher resolutions, and I believe Kandria does have multiple buffers and passes involved (like most game rendering engines these days, though I will have to read Shinmera's paper about the shader composition system to verify).


Meh, the problem is "which Lisp?" There are dozens of incompatible Lisps. Even this site is written in a Lisp dialect created by its author (Arc).

In fact I conjecture that this is the reason Unix is more popular than Lisp -- because Lisps don't interoperate well. They haven't built up a big ecosystem of reusable code.

Whereas Python, JavaScript, R, C, C++, and Rust programmers can reuse each others' code via Unix-style coarse-grained composition. (Not just pipes -- think about a web server running behind nginx, or git reusing SSH and HTTP as transports.)

You can also use link time composition. It takes some work but it's better than rewriting your Common Lisp code from scratch in Clojure.

-----

Honest question: how do you communicate between two Lisp processes on two different machines? I know Clojure has EDN (which is to Clojure roughly what JSON is to JavaScript), but I haven't heard of the solutions for other Lisps.

I wrote about this problem here: A Sketch of the Biggest Idea in Software Architecture http://www.oilshell.org/blog/2022/03/backlog-arch.html

> The lowest common denominator between a Common Lisp, Clojure, and Racket program is a Bourne shell script (and eventually an Oil script).

I'll definitely update it if there's something I'm missing.

I would say the design of Unix is "rotting", but the answer is to IMPROVE Unix. Not dream of clean slate designs that will never be deployed. Plus this post doesn't actually propose anything. If you actually start trying to build your Lisp machine, I believe you will run into dozens of reasons why it's not a good idea.


You are aware that Lisp machines understood several different flavors of Lisp? The Symbolics ones understood Zetalisp and Common Lisp at least. Were they on the market today, they could be convinced to run Clojure and Scheme as well. There are a few old-timers still using Symbolics hardware to develop Common Lisp applications that run on modern hardware.

In fact, Symbolics shipped compilers for other non-Lisp programming languages, including C and Ada. These interoperated smoothly with Lisp code, much more so than they do under Unix. In this demo, Kalman Reti compiles a JPEG decoder written in C, and replaces the Lisp JPEG decoder that came with the system with it, yielding a performance boost:

https://www.youtube.com/watch?v=o4-YnLpLgtk


OK interesting, I will check out the link.

I still think various Lisps don't interoperate enough today, but I'm not very familiar with the Lisp machines of the past. If it can interoperate with C and Ada that's interesting. But I also wonder about interop with JavaScript :) i.e. not just existing languages but FUTURE languages.

These are the M x N problems and extensibility problems I'm talking about on the blog.


If you’re fine with ES3, there’s https://marijnhaverbeke.nl/cl-javascript/ (I’ve been intending to use Babel to compile modern JS to ES3 and then load it into CL, but no time)

Or parenscript: https://parenscript.common-lisp.dev/

Or CLOG: https://github.com/rabbibotton/clog

And, there’s always the option of writing an HTTP or websocket server and serving JSON.

I personally use the SLIME repl in emacs for a ton of things I used to use bash one-liners for and you can push it pretty far (this is an experiment, I’ve wrapped it up more nicely into my .sbclrc since):

https://twitter.com/fwoaroof/status/1502881957012668417?s=21...


I'm not sure what you mean by "The lowest common denominator between a Common Lisp, Clojure, and Racket program is a Bourne shell script (and eventually an Oil script)." You can get two Lisp programs (or two Python programs, or two mixed-language programs, etc.) to intercommunicate without involving the shell at all.

It'd be more accurate to say "The lowest common denominator between a Common Lisp, Clojure, and Racket program is sexpr notation." By using sexprs over a pipe, named pipe, or network socket, you can very easily get any of those Lisps to intercommunicate deeply structured data with any other. This is how SLIME and its SWANK protocol work. I don't even think the shell is involved; Emacs handles spawning the inferior Lisp and setting up the communication channel itself.
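
A minimal sketch of that, assuming STREAM is already connected by whatever means (sockets are implementation- or library-specific, e.g. usocket):

  (defun send-message (stream message)
    (with-standard-io-syntax
      (prin1 message stream)    ; print a READable sexpr
      (terpri stream)
      (force-output stream)))

  (defun receive-message (stream)
    (with-standard-io-syntax
      (let ((*read-eval* nil))  ; never evaluate #. coming off the wire
        (read stream))))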

The thing the Lisp machines had was a very robust ABI. Lisp has a specific notion about what a function is in memory. This is largely because Lisp is always "on" on a Lisp machine, there is no point at which you cannot take advantage of the entire Lisp runtime, including the compiler, in your own programs. Accordingly the Lisp machine C compiler output Lisp functions containing code compiled from C, that could be called by Lisp functions directly (and vice versa). Presumably a JavaScript runtime for Lisp machines would be able to do the same thing.

By contrast, C has no notion of what a function is; and the C ABI used by many operating systems presents a function as simply a pointer in memory that gets called after some parameters are pushed to the stack or left in registers. (How many parameters, their type, and their size, is unspecified and simply agreed upon by caller and callee.) Nothing about memory management is provided by the runtime either. All that has to be provided by user programs. All this adds friction to interoperation by function call, and makes IPC the most straightforward way to interoperate.

But oh well. Our computers are all slowly turning into JavaScript machines anyway, so maybe those Lisp/Smalltalk happy days can return again soon.


What I mean is "coarse-grained composition with text/bytes" (described in the blog post in my first reply)

If you're thinking that's tautological (how else would Common Lisp and Clojure communicate?), then this subthread might help with the context:

https://news.ycombinator.com/item?id=30814716

I don't think Common Lisp, Clojure, and Racket are compatible with Emacs' s-expression format. Lots of people here are saying they use JSON or the like with Lisp, not s-expressions.

Emacs can use its own format to communicate with itself, because it controls what process is on the other side of the wire.

(In any case, any variety of s-expressions IS TEXT; there are non-trivial issues to get it back in memory -- like what to do about graphs and sharing, node cycles, integer sizes, string encodings, etc.)
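
For what it's worth, Common Lisp's printer and reader do cover the sharing/cycles part of this natively, via #n= labels; a minimal sketch:

  (let ((l (list 1 2 3)))
    (setf (cdr (cddr l)) l)    ; make the list circular
    (let ((*print-circle* t))
      (prin1-to-string l)))    ; => "#1=(1 2 3 . #1#)"

The other issues, like integer sizes and string encodings, remain.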

But the point of the slogan is that when you have even a small bit of heterogeneity (among Lisps) then what you're left with is "Unix". A Lisp machine isn't a big win for this reason.

It is cool that Lisp machines had a robust ABI. That could solve the IPC problem. But then you have the problem of a Lisp machine communicating with a Unix machine, which I'm sure was a very real thing back in the day. So now you're left with the lowest common denominator of Unix. Again that is coarse-grained composition over wires, which is analogous to shell. The shell doesn't have to be involved strictly speaking, but the syscalls that are made and parsing that is done is pretty much the same.


> In fact I conjecture that this is the reason Unix is more popular than Lisp -- because Lisps don't interoperate well. They haven't built up a big ecosystem of reusable code.

And why are there so many? IMO the language is too flexible for its own good. It promotes this curious intellectual solo-competition where you try to prove you are worthy of the elite who get all the fancy FP stuff and make all this bespoke functionality.

It's almost impossible for Lisp to be popular. To be popular means a lot of people use it, but that means it can't be much more complicated than what the median programmer will take on. But because it lets individual programmers push intellectual boundaries, it self-selects itself out of this pool. Any big collaborative project will attract this type of developer, and soon the lesser developers (who don't dare object for fear of appearing dumb) are not able to contribute.

Just my opinion, if a little dramatic.


Haskell has its share of fancy FP stuff, and people manage to develop workable things in it. I still think Lisp really is too dynamic to be useful beyond a small scale of development.


> Honest question: how do you communicate between two Lisp processes on two different machines? I know Clojure has EDN (which is sort of like JSON : JavaScript), but I haven't heard of the solutions for other Lisps.

Probably TCP or UDP based protocols like essentially every cross-network communication in every language today.

EDIT: Also, it should be noted that JSON does not, itself, allow you to communicate across a network. It's just a serialization format. You still have to select some protocol for actually doing the communication. If your answer to the question "How do you communicate between two JavaScript processes on two different machines?" is "JSON", you've failed at actually answering the question.


Right, so that is what I'm getting at. If you have two Lisp programs on different machines, or two programs written in different Lisp dialects, the way you compose them is basically "Unix" -- serialize to some common format and send over a pipe / socket.


What alternatives are you possibly allowing for, if you're taking IP and wire protocols off of the table?


I agree there aren't many practical alternatives to serialize -> wire -> parse, because that's how all storage and networking hardware works.

But I keep having this conversation with people who are mad at Unix (and Unix shell) and searching for an alternative. Some of them are just fantasizing, as with this post, but some of them are actively working in that direction.

One possible alternative is if the kernel grew a mechanism to pass linked data structures directly, instead of requiring serialization and parsing (a little like Go channels perhaps). I suppose you can kind of do that with shared memory -- but not really because copying is roughly equivalent to serialization.

This might sound like an unrealistic idea, but people are seriously arguing for it. And of course they always argue in favor of their favorite language's data structures, not realizing that those come with a boatload of design choices specific to a problem domain, and that other problems require different data structures.

This post is arguing for something even more radical: an entire machine in Lisp, with a single address space (really bad idea).

----

The other more practical thing I'm arguing against in the post is a certain design of alternative Unix shell [1] that I've been seeing a lot.

The design is basically that all the tools like 'ls' and 'rm', and even networking tools, are expected to be statically linked into the shell binary.

And many of them pass structured data directly, specifically to get rid of serialization and parsing. There are at least 3 or 4 shells with this design.

My point is that this creates a "two-tiered" shell design and that composition is inhibited. In my mind the point of the shell, and the power of it, is to be situated within or "woven into" the OS. It interacts with everything on the system. It's not a "world" unto its own. You should be able to whip up a script in your favorite language and call it from your shell, and compose it with other tools you didn't write. You shouldn't be forced to "escape" the shell to call outside tools, or write algorithms in shell like you would write algorithms in Python or C. The shell is for coarse-grained composition and coordination.

[1] list here: https://github.com/oilshell/oil/wiki/Alternative-Shells

----

Another possible alternative is something like https://www.unison-lang.org/ where source code is not text; it's structured data. This has big consequences that I hinted at in the prior post, and have in my notes for upcoming posts:

They are writing their own version control system and editor because they don't use text. (I think experiments like this are interesting; it's definitely worth some portion of our efforts.) It is true that they have to use byte streams at the end of the day because they have to use storage and networking hardware, but the user is supposedly insulated from any of that. That is, as long as you stay in this hypothetical universe. Again I will bring up interop as a big problem.

A lot of people seem to imagine "monoglot" worlds where they wrote everything from scratch and no other code exists. Everything done in the past must be thrown out because we know better now. :)

Elm is a more widely used language with similar interop problems.


The canonical Lisps still widely used today are Common Lisp, Scheme and Emacs Lisp. They all belong in the same family, and syntax / semantics are close. Porting code from Scheme to Common Lisp can be a lot easier than going from Python 2 to Python 3.

Clojure is something else entirely which is why a lot of people don't consider it a Lisp.

> Honest question: how do you communicate between two Lisp processes on two different machines?

If you want to use built-in object serialization, there is print and read.
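A minimal round trip through a string stream (the same READable text works over a file or a socket):

    ;; PRIN1 writes a READable representation; READ revives it.
    (with-input-from-string
        (in (with-output-to-string (out)
              (prin1 '(hello (world . 42) #(1 2 3)) out)))
      (read in))
    ;; => (HELLO (WORLD . 42) #(1 2 3))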


> Common Lisp, Scheme and Emacs Lisp... all belong in the same family

Could you say more about what you mean by this? Is there another family of Lisps that excludes these three? I've met people who make a big deal about lisp-1 vs lisp-2 (https://en.wikipedia.org/wiki/Lisp-1_vs._Lisp-2), and which is the right way to be a Lisp, but I think maybe those people just enjoy being pedantic.


I was referring to Clojure, which, going by syntax and semantics, does not belong in the same family as Common Lisp, Scheme, and Emacs Lisp.


Makes sense. I think many people would put Scheme in its own branch as well.

E.g. https://gist.github.com/Aethaeryn/3036597


An unappreciated means of code reuse under *nix is the static and dynamic library. This seems to be the go-to whenever you need something more involved than simply reusing a full binary via pipes.


C's greatest feature is that it trivially maps onto .so files. Linking to and creating .so files isn't just cheap, it's effectively free.

Most higher level languages I've worked with seem to focus on using .so files rather than producing them.

This means the lowest common denominator for the Unix ecosystem is what C can provide. Otherwise stated, Unix is marching forward at the pace of C.
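For illustration, here is what consuming a .so from Common Lisp can look like. This assumes the CFFI library, and the libm soname is a Linux-specific assumption:

    ;; Load a shared library and bind one of its functions.
    (cffi:load-foreign-library "libm.so.6")   ; assumed soname

    (cffi:defcfun ("cos" libm-cos) :double
      (x :double))

    (libm-cos 0.0d0)   ; => 1.0d0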


This is one of the things I like so much about Lua, LuaJIT in particular: it's designed around being just another .so file. From another C library's perspective, Lua is a library for manipulating a call stack, and the LuaJIT FFI makes short work of adapting a header file so that structures and functions can be used from within Lua.


No wonder: given that C was created to turn UNIX from an OS originally written in assembly into a portable one, UNIX is effectively a C machine.


> Honest question: how do you communicate between two Lisp processes on two different machines?

Depends what level of integration you want: custom tcp/udp protocols, HTTP and websockets are all pretty easy. But, you can also use something like this over a trusted network/vpn: https://github.com/brown/swank-client
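Roughly, remote evaluation with it looks something like this (symbol names recalled from its README -- treat them as assumptions):

    ;; Evaluate a form in a remote Lisp image running a Swank server.
    (swank-client:with-slime-connection (conn "10.0.0.2" 4005)
      (swank-client:slime-eval '(lisp-implementation-type) conn))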


> how do you communicate between two Lisp processes on two different machines?

Same as every other language: you pick a protocol and use it on both sides. Many of us already have enough JSON in play that it makes sense to start there.
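As a sketch, assuming the cl-json library (any of the JSON libraries looks similar):

    ;; Decode JSON text into Lisp data (an alist with keyword keys).
    (json:decode-json-from-string "{\"name\":\"job-42\",\"n\":3}")
    ;; => ((:NAME . "job-42") (:N . 3))

    ;; Encode a hash table back out as a JSON object.
    (let ((h (make-hash-table :test #'equal)))
      (setf (gethash "name" h) "job-42")
      (json:encode-json-to-string h))
    ;; => "{\"name\":\"job-42\"}"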


> Honest question: how do you communicate between two Lisp processes on two different machines?

using continuations with sexps is insanely powerful

send (+ 1 3 (lambda (x) `(handle-response ,x)))


It's all going to be datalisp, mark my words :)

Although you have no idea what I am talking about yet, just wait a bit more :))


pretty sure hn was ported away from arc at some point.


> Meh the problem is "Which Lisp?"

Not a problem - you don't need (or want) a single Lisp. A hypothetical Lisp OS would support a standard runtime and typed IPC system that host programs can use (a la Windows and COM, or dbus), and nothing prevents you from using your own custom communication protocol over pipes/sockets instead.

I don't agree with your implicit assertion that there is a single reason why Unix is more popular than Lisp (machines) - I think that there are a multitude of reasons. Certainly, the most impactful one isn't a lack of interoperability between Lisps - it would be something like the bandwagon effect, or the fact that Bell Labs gave away Unix to universities, or the inefficient implementations of early Lisps.

I'm also fairly confident that the lack of a Lisp ecosystem is not because of a particular lack of interoperability (after all, Python has a massive ecosystem that is built almost exclusively on (a) C FFI and (b) other Python code), but for cultural reasons - Lispers just don't like collaborating, and enjoy reinventing the wheel. These tendencies have been documented in [1] and many other places, and are highly consistent with my own experience in the Common Lisp community.

> Honest question: how do you communicate between two Lisp processes on two different machines?

Use PRINT to serialize an object on one machine, ferry it over to another one through whatever means you wish, and then READ it on the other. Or: "in the same way that two processes in most other languages on two different machines communicate". Serialize, transport, deserialize. Yes, the ecosystem is far less mature, and you'll have to do more legwork, but fundamentally the process is the same as in, say, Python. (Erlang might be an exception and have language-level support for this, I'm not sure)

This method works for different implementations of the same Lisp, and even for different Lisps, under some constraints (e.g. while a list with the integers 1, 2, and 3 is represented as the s-expression (1 2 3) in almost every Lisp you can find, CL represents "true" as T, while Scheme represents it as #true, so you'll have to work around that). If you will, you can just use JSON or XML as a serialization format instead - every non-trivial Lisp has libraries for those formats, and some have libraries for ASN.1, too.
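A small sketch of that caveat: pinning the printer to standard settings makes the bytes predictable across CL implementations, but the dialect mismatch remains:

    ;; WITH-STANDARD-IO-SYNTAX fixes the printer variables so the
    ;; wire format doesn't depend on the sending image's settings.
    (with-standard-io-syntax
      (prin1-to-string (list 1 2 3 t)))
    ;; => "(1 2 3 T)"
    ;; A Scheme reader parses T as an ordinary symbol, not a
    ;; boolean -- the mismatch described above.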

>> The lowest common denominator between a Common Lisp, Clojure, and Racket program is a Bourne shell script (and eventually an Oil script).

All of those languages share basic s-expression syntax described above, which is rather higher-level than a Bourne shell script. Why do you say that the latter is the "lowest common denominator"?

For that matter, why do you exclude the idea that JSON or XML aren't "common denominators" between Lisp programs, or even between Lisps, Python, and C++?

----------------

Your article says "Text Is The Only Thing You Can Agree On", but "text" isn't even a single thing. The sets of bytes allowed by ASCII, UTF-8, and UTF-16 aren't the same. Even if they were, plain text is purely sequential and flat. Program interoperability requires structure. If the structure isn't directly encoded in a given "substrate" (such as text), and you need another layer on top, then that substrate isn't actually the one providing the interoperability. You say "Text is the most structured format they all agree on" but text isn't even structured at all, and nobody completely agrees on it - "bytes" is the only thing that fits into place here (which is equivalent to saying that there's no structured communication method that all programs agree on, which is true).

Put another way - programs do not communicate using plain text. They communicate with either an ad-hoc protocol built on top of text (that still has a set of constraints that make it incompatible with other protocols built on top of text), or they use a standardized format like JSON or XML, or even something like ASN.1 that isn't a subset of text.

Communication using text does not make programs interoperable. Bytes might be a "narrow waist", but text is factually not - if it was, you wouldn't need to use sed/awk/perl one-liners to connect various Unix utilities to each other, because the very fact that they were all using text input/output would make them interoperable.

You say "Tables and documents are essential structures in software, and expressing them in JSON is awkward." but you can express them in JSON. You cannot express those structures in "plain text", because "plain text" is not capable of expressing structure at all, and the best you can do is build a protocol that is a subset of plain text that can express structure (JSON, XML, CSV, etc.)

------------

If anything, Lisps are more interoperable than Unix, because the "lowest common denominator" of Unix utilities is "plain text" (which by definition cannot encode structure), while the lowest common denominator of Lisps is some common subset of their s-expression formats, which can encode structure.

------------

> I would say the design of Unix is "rotting", but the answer is to IMPROVE Unix.

You say this, but you haven't given a reason for why that's the answer.

Here's a reason why improving Unix is not the answer: because "everything is text" is a fundamentally flawed paradigm that introduces needless complexity and fragility into the whole system design.

> Not dream of clean slate designs that will never be deployed.

You're mixing the normative and the positive - "Unix should be improved" with "clean-slate designs won't be deployed". Are you making an argument about what should happen, or what will happen? (there's no guarantee that any improvements to Unix will be deployed, either)

> Plus this post doesn't actually propose anything. If you actually start trying to build your Lisp machine, I believe you will run into dozens of reasons why it's not a good idea.

Oh, yes, and I can start with a few: first, there's good reason to believe that high-level CPUs are a bad idea (as suggested by the fact that nobody has been able to make an effective one [2]); second, the security-less design of older Lisp machines is a terrible idea (in a vacuum); third, a machine that only runs Lisp is going to be unpopular: there's no single Lisp, and Lisps are not the final stage of PL evolution.

...but the arguments made by the article as to why Unix is inadequate are still solid, even if the author is suggesting a suboptimal solution due to nostalgia.

[1] https://www.lambdassociates.org/blog/bipolar.htm [2] http://yosefk.com/blog/the-high-level-cpu-challenge.html


This feels like a whole bunch of misunderstandings about what I'm saying ... Almost everything here was directly addressed in the article.

> For that matter, why do you exclude the idea that JSON or XML aren't "common denominators" between Lisp programs, or even between Lisps, Python, and C++?

JSON is mentioned in the post as A narrow waist, not THE narrow waist (of an operating system or of the Internet). I also mention CSV and HTML as "on the same level".

Likewise, bytes and text have a similar hierarchical relationship. I mention that in the post and also show some diagrams in the previous post.

> If anything, Lisps are more interoperable than Unix, because the "lowest common denominator" of Unix utilities is "plain text" (which by definition cannot encode structure), while the lowest common denominator of Lisps is some common subset of their s-expression formats, which can encode structure.

JSON, CSV, and HTML are all built on top of plain text. You can store them in source control, you can use grep on them, and you can build more specialized tools for them (which has been done multiple times).

What I'm contrasting this with is people who say that we should build s-expressions into the kernel -- i.e. passing linked data structures directly, rather than bytes/text.

See the threads linked in the posts -- a variant of this same argument came up. This is very related to the idea of building Lisp into hardware, which I view as a bad idea.

> You say this, but you haven't given a reason for why that's the answer.

The article is analyzing why empirically Unix, the web, and the Internet have worked in multiple respects -- interoperability, generality, scalability in multiple dimensions and multiple orders of magnitude, extensibility over decades, the polyglot nature, etc.

These are obviously successful systems, and I think it is pretty crazy to hold the opinion that, because a design makes you parse things, a design that tries to eliminate parsing must be better along many/all dimensions!

These kinds of (IMO naive) arguments are exactly why I wrote the last 2 posts. They were popular and highly commented upon, and most people got it, or at least appreciated the tradeoffs I highlighted.

So I don't need to convince everybody -- as I said in the intro to the post, the more important task is to build something that embodies these ideas. I look forward to your code and blog posts; I do think there is something to the top criticism in the thread: https://news.ycombinator.com/item?id=30812626


> This feels like a whole bunch of misunderstandings about what I'm saying ... Almost everything here was directly addressed in the article.

I would definitely appreciate you pointing out exactly where the misunderstandings are! (also, ironically, most of the points that you make here were already addressed in my comment, including refutations of some of the ideas in your articles)

> JSON is mentioned in the post as A narrow waist, not THE narrow waist (of an operating system or of the Internet). I also mention CSV and HTML as "on the same level".

I saw that JSON was described as a narrow waist; however, the specific context of my question was about the use of those technologies applied to Lisps - if you believed that those were narrow waists, then you wouldn't have written both of the following:

> The lowest common denominator between a Common Lisp, Clojure, and Racket program is a Bourne shell script (and eventually an Oil script).

> What Is a Narrow Waist? [...] Small, simple mechanisms like the Internet Protocol, UTF-8, and JSON.

That is, there's an inconsistency between two statements both made by you, not a misunderstanding on my part.

> JSON, CSV, and HTML are all built on top of plain text. You can store them in source control and you can use grep on them, and you can build more specialized tools for them (which has been done multile times.)

Yes, and as I said in my initial post, that's both incidental and irrelevant. You missed or ignored several paragraphs of my comment you're responding to (starting with "Your article says"). I'll re-state for the sake of concision: the fact that those technologies are built on top of plain text does not mean that plain text is a narrow waist! All of those formats require the same level of parsing effort as they would as a byte-oriented protocol that didn't adhere to the subset of "plain text". This was, again, already addressed in my comment.

> See the threads linked in the posts -- a variant of this same argument came up.

You say "the posts" but there are at least four different "zones" that you could be referring to (the other posts on the Lisp machine article, links in your blog posts, or the Reddit or HN posts on your blog posts), and dozens-to-hundreds of individual links/comments in each of those "zones". "the posts" means nothing - please either link to a specific web page (or part of it, if it's long), or re-state the argument in your own words, so I can see what your argument is.

> This is very related to the idea of building Lisp into hardware, which I view as a bad idea.

I'm not defending that idea - I also think it's a bad idea! I'm arguing against the specific idea of "text" being a "narrow waist".

> The article is analyzing why empirically Unix, the web, and the Internet have worked in multiple respects

You say this, but I can find zero evidence in either of your posts to support the assertion that the success of those things is a direct result of their use of plain text as a narrow waist. Things like HTML don't count, because, again, the fact that HTML builds on top of plain text is incidental and irrelevant.

> These are obviously successful systems

...and there are dozens of reasons why a system can be successful, many of them social/cultural and not technical. Why is Windows so popular? Gmail? C++? Perl? SQL? Java? x86? DVD/Bluray? JavaScript? Are you going to tell me that the popularity of all of these systems is mainly due to technical reasons, and not business, social, cultural, and incidental ones?

In the specific case of Java, I can tell you that the main reason for its success was because Sun bribed universities to teach their CS courses in Java in exchange for free Sun workstations, and then the students went out and spread Java around. This is a perfect counterexample to the extremely naive idea that "technical superiority leads to market dominance", and there are dozens more out there.

Actually, given that we're on HN, any successful startup founder will tell you that technical superiority will not make your company successful (and of course we can then translate that to success in the open-source world).

> I think it is pretty crazy to hold the opinion that, because a design makes you parse things, a design that tries to eliminate parsing must be better along many/all dimensions!

Then I'm going to say that I think that it is pretty crazy to have the opinion that a system that requires you to completely unnecessarily add serialization and deserialization for virtually no benefit whatsoever (and many significant drawbacks) is somehow better overall than a system that...doesn't.

> These kinds of (IMO naive) arguments are exactly why I wrote the last 2 posts.

I'm coming up with concrete counterarguments to every single point that you concretely state. Calling them "naive" instead of addressing their content is...not an indicator of sound reasoning, to say the least.

> They were popular and highly commented upon, and most people got it, or at least appreciated the tradeoffs I highlighted.

And, of course, none of those things are indicators of sound reasoning, either - appeal to authority and popularity fallacies, and all that.

> So I don't need to convince everybody

Well, unless you make concrete and valid arguments, I won't be convinced - although you can always find people who are convinced without those...

> as I said in the intro to the post, the more important task is to build something that embodies these ideas.

I am building things that embody these ideas. They're likely never going to be open-source, though, so I'm not surprised if you don't believe me.

> I look forward to your code and blog posts

The idea that blog posts are somehow more authoritative or substantial than any other kind of writing is...something, for sure.

Unless you're saying that a design paradigm isn't good unless it's implemented in software, in which case that's clearly false, given the number of completely terrible design paradigms that have been used to design "working" software.


Some of this needs checking -- you could not run Unix on Symbolics hardware. LMI did have machines that ran both OSes -- but Unix was running on a separate 68000 processor; see, e.g. http://www.bitsavers.org/pdf/lmi/LMI_lambdaOverview_1982.pdf

(3600-series Symbolics machines also had a 68k "front end processor", but no Unix port was provided for it; they also ultimately had a C compiler that could generate code for the "Lisp processor", but the code it generated was intended to run in the Lisp environment.)

It's also worth noting that systems-level code for Symbolics machines (and, I presume, LMI as well) made frequent use of "unsafe subprimitives", misuse of which could easily crash the machine. And, unfortunately, if you needed to, say, get anything close to hardware bandwidth out of the disk drives, some of this became well-nigh unavoidable, due to poor performance of the OS-level file system (LMFS).


What one could do was run hardware Lisp Machines from Symbolics on VME boards inside a Sun: the UX400 and UX1200.

Later Open Genera was sold as a Virtual Lisp Machine running on a DEC Alpha / UNIX system.


Apparently Open Genera now even runs under macOS on Apple M1s: https://twitter.com/gmpalter/status/1359360886415233029

I think the big problem with Genera is the licensing. Although it comes with source code, it is proprietary software, and buying a license is expensive. I think the owners of the Symbolics IP have prioritised squeezing the maximum revenue out of a declining user base over trying to grow that user base.

I'm surprised "Open Source LispOS" projects have largely failed to gain traction. Writing your own OS is (at least in some ways) easier than it used to be (especially if you target virtualisation rather than bare metal). There seem to be a lot more people saying "LispOS is what we need!" than actually writing one or contributing to an existing effort to write one.


This post would benefit from further expanding some of these statements.

> UNIX isn’t good enough anymore and it’s getting worse

Why exactly?

> A new operating system means we can explore new ideas in new ways.

LISP machines were not only OSes but also hardware. Is the author also proposing running this OS on optimized hardware or simply using our x86-64/AMD/M1 CPUs?

> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.

Sure, but it also requires rewriting a lot of these things, introducing and fixing new bugs... It feels like the good ol' "let's rewrite this program" that quite frequently doesn't live up to the expectations [1].

[1] https://vibratingmelon.com/2011/06/10/why-you-should-almost-...


>> UNIX isn’t good enough anymore and it’s getting worse

>Why exactly?

Personally? We're in a bit of a transition point, and a lot of the technologies aren't working together like they used to.

An example, on my laptop I want to run android apps. The way to do this that actually works well (waydroid) only supports wayland. Unfortunately I use x2x to control another display remotely, and x2x doesn't work properly under wayland, and never will due to wayland's security choices.

So like, what am I supposed to do here? Not run android apps? Not use tools like barrier/synergy/x2x?

This is one of many many frustrations I've had from this new generation of wayland/systemd/etc. Hopefully it gets better eventually but it does feel a lot like the rug is constantly being pulled out from under me for no good reason...

Now I don't think a lisp machine is going to fix that mind you, but it is a concern.


I am finding that everything is just becoming more and more fragmented. Programming languages, ecosystems, frameworks, blah.

I have had ideas for some applications, but can’t do them because the libraries I need are written in different languages, which don’t interoperate well (and one of which I am not familiar with).

Every post here looking for recommendations has many responses with different packages/ecosystems doing the same thing.

Sometimes I feel like there are too many developers and not enough of them really interested in the actual hard problems. So they just make another python package manager or webapp framework.


I think they are too invested in hard problems, but not invested enough in tedious problems.

I don't want a new language, or a new preprocesser, or a new way of thinking about programs... I just want a one click way to take a folder of HTML that looks like a static site, and package it up into cross platform apps with all the proper API access.

I don't care about cryptocurrency and global decentralized databases, I just want to be able to run a site by buying a NAS appliance, putting files on it, and sharing the QR code on it without any signups and cloud accounts.

The hard problems are mostly solved. There's probably some long dead random github repo that does anything you want. They're just not packaged for all platforms and maintained professionally.

I don't need a phone without any binary blobs at all... I just want a mesh network feature and a fair trade stamp.

Because... it sucks to maintain something novel even if it's all just pieced together from npm. It sucks to be the one responding to issues. It sucks to actually run a project and keep track of devops stuff.

All the tech has been invented dozens of times over, but never polished, and always ruined with some unnecessary immutable log feature or performance destroying random access low latency routing thing like IPFS has.


There's a gap between developer culture and the giant corporate bureaucracies.

The former is focussed on tinkering, nostalgia, tool-building, wheel reinventions, and half-finished build-and-forget spare time projects.

The latter has successfully applied high-friction chokeholds around startup culture which makes the cost of entry for new business ideas far higher than it really needs to be.

There is almost no blue sky development equivalent to the original Internet, the Mother of All Demos, and the desktop computing model - and associated projects - created at PARC in the 70s and early 80s.

It's all very "mature", incremental, slow, and me-too.

And so expensive, both financially and in terms of developer time.

I'd love to see a renewed interest in attacking this from both ends - a movement to really push the imaginative limits of what's possible, while also being implacably humane and user-oriented and indifferent to ad tech, tinker tech, and pre-commercialised cloud tech.


The giant bureaucracies are doing some pretty amazing innovation. Look at Google's web platform features, Bluetooth LE, etc.

There are just certain things they can't do, because their incentive is stuff you constantly pay for.

I'd love to be part of something like old 90s revolutionary software.

But I've noticed close to zero interest from the dev community in anything like that.

Everyone mostly just wants to play with sorting algorithms or write their own small language, unless they're getting paid a whole lot. It's become a sport more than an engineering discipline.

It's hard to imagine Excel or BitTorrent being written now.

Heck, it's hard to even imagine web browsers with JavaScript being written now.

What kind of projects would you like to see/work on?


a revolutionary project moving forward would be a quality gopher browser with a gopher extension for forms, along with a gopherd.

the basic problem is fragmentation and assembling a community of interest.


Wouldn't gopher just be more fragmentation? What can it do that can't already be done on top of web? It's still pretty much just client/server.


"New business ideas" are really easy. VC-style hypergrowth towards unicorn status is what's hard, but make no mistake it's always been very hard. Unicorns are rare, by definition.


This is why I like Golang. I think it’s the first time in my professional experience where if I see a package that hasn’t been updated in 3 years, that doesn’t mean it’s abandoned, it means it’s stable.


Clojure is like that. I love it.


common lisp is like that.


Isn't it nice? Back to the 8- and 16-bit home computers, when we had plenty of choice and real fights on the school playground about which was better.


I’m curious what you would classify as a hard problem. Personally, I think writing a good, easy to use Python package manager that both gains adoption and addresses the myriad of corner cases is a hard problem.


I work in scientific software. I tend to view really hard problems as problems where you don't know the solution, and don't even have a blueprint for how it could be solved.

Writing a package manager is hard in some sense, but you have an idea of what it needs to do, what features it should have, and what success ultimately looks like.

But there are problems, large and small, which require new thinking and leaps of faith. There are people working on these, of course, but sometimes I feel they are being neglected.

(and people working on those problems probably make 10-25% of what many FANG developers make...)


> I work in scientific software

I envy you!


So a perfect use-case for the obligatory XKCD post about "too many standards, invent one more to try to solve the issue => too many standards plus one".


I don't think standards are the problem.

I think a wave of new standards causes issues like the above, and eventually this gets worked out.

Not all "standards" in practice become standard. We learn a lot of "standards" were bad ideas, and they eventually get tossed out and most people forget anyone was ever trying to make that a thing.


"Standards" in this case just means "things that I/we developed that we think should be adopted by lots of other people because it's awesome-sauce"


You can actually start a Wayland compositor/session in an X window. That plus existing solutions for Wayland network transparency should be enough.


That's what I'm doing now, but it's a pretty bad user experience and definitely isn't seamless. It also completely breaks any integration with my app menu, meaning I have to rewrite all the desktop files to launch the app in cage. I imagine someone will eventually make a seamless wayland compositor for x though.

>That plus existing solutions for Wayland network transparency should be enough.

If you're saying it will fix my x2x use case, well it won't. Wayland's security model fundamentally prevents this use case. Maybe someone will add extensions to it eventually but right now the way compositors handle mouse capture seriously prevents this use case, and I'm skeptical that all the different compositors will agree on a solution any time in the next 10 years...

So I'm stuck using X on both displays for the foreseeable future if I want to use x2x/synergy like functionality, and I'm certain that it's going to become harder and harder to keep using X over time...


Wow, and here I was hopeful that Wayland would actually make this kind of thing easier. I'm a big fan of synergy and all its descendants. Apple just released a similar feature for iOS and OSX. Really surprised this is getting more difficult on Linux instead of easier.


Xerox PARC workstations could run Interlisp-D, Smalltalk, Mesa/XDE, Mesa/Cedar, thanks to this little thing RISC failed to kill, microcoded CPUs.


> > UNIX isn’t good enough anymore and it’s getting worse

> Why exactly?

Two reasons:

1 - systemd (which is moving linux towards becoming a systemd OS)

2 - developers and companies moving ever more towards web apps (which will eventually make the underlying OS irrelevant, as the browser becomes the OS) (incidentally, web assembly seems to herald the end of the open/transparent web too, as we'll eventually all be running opaque binary blobs in our web browsers)


> 1 - systemd (which is moving linux towards becoming a systemd OS)

Or, a microservices OS.


There's an old book all about just that, which included and popularized Richard P. Gabriel's paper, "The Rise of Worse Is Better":

https://en.wikipedia.org/wiki/The_UNIX-HATERS_Handbook

https://web.mit.edu/~simsong/www/ugh.pdf

>The year was 1987, and Michael Travers, a graduate student at the MIT Media Laboratory, was taking his first steps into the future. For years Travers had written large and beautiful programs at the console of his Symbolics Lisp Machine (affectionately known as a LispM), one of two state-of-the-art AI workstations at the Lab. But it was all coming to an end. In the interest of cost and efficiency, the Media Lab had decided to purge its LispMs. If Travers wanted to continue doing research at MIT, he discovered, he would have to use the Lab’s VAX mainframe.

>The VAX ran Unix.

>MIT has a long tradition of mailing lists devoted to particular operating systems. These are lists for systems hackers, such as ITS-LOVERS, which was organized for programmers and users of the MIT Artificial Intelligence Laboratory’s Incompatible Timesharing System. These lists are for experts, for people who can—and have—written their own operating systems. Michael Travers decided to create a new list. He called it UNIXHATERS:

    Date: Thu, 1 Oct 87 13:13:41 EDT
    From: Michael Travers <mt>
    To: UNIX-HATERS
    Subject: Welcome to UNIX-HATERS

    In the tradition of TWENEX-HATERS, a mailing list for surly folk
    who have difficulty accepting the latest in operating system technology.
    If you are not in fact a Unix hater, let me know and I’ll remove you.
    Please add other people you think need emotional outlets for their
    frustration.
https://www.amazon.com/UNIX-Haters-Handbook-UNIX-Haters-line...

https://www.goodreads.com/en/book/show/174904.The_UNIX_Hater...

https://wiki.c2.com/?TheUnixHatersHandbook

>I'm a UnixLover, but I love this book because I thought it was hysterically funny. Many of the war stories are similar to experiences I've had myself, even if they're often flawed as a critique of Unix itself for one reason or another. But other UnixLovers I've loaned the book to found it annoying rather than funny, so YMMV.

>BTW the core group of contributors to this book were more Symbolics Lisp Machine fans than ITS or Windows fans. ITS had certain technical features superior to Unix, such as PCLSRing as mentioned in WorseIsBetter, but having used it a bit myself, I can't see that ITS was superior to Unix across the board. The Lisp Machine on the other hand, although I never used it, was by all accounts a very sophisticated environment for programmers. -- DougMerritt

https://news.ycombinator.com/item?id=13781815

https://news.ycombinator.com/item?id=19416485

>mtraven on March 18, 2019 | next [–]

>I founded the mailing list the book was based on. These days I say, Unix went from being the worst operating system available, to being the best operating system available, without getting appreciably better. (which may not be entirely accurate, but don't flame me).

>And still miss my Lisp Machine. It's not that Unix is really that bad, it's that it has a certain model of computer use which has crowded out the more ambitious visions which were still alive in the 70s and 80s.

>Much as the web (the Unix of hypertext) crowded out the more ambitious visions of what computational media for intellectual work could be (see the work of Doug Engelbart and Ted Nelson). That's a bigger tragedy IMO. Unix, eh, it's good enough, but the shittiness of the web makes humanity stupider than we should be, at a time when we can ill afford it.

https://medium.com/@donhopkins/the-x-windows-disaster-128d39...


I really appreciate your writing this reply (esp. the links). Thanks a lot, mate!


>> UNIX isn’t good enough anymore and it’s getting worse

> Why exactly?

Besides the defects well stated in the Unix-Haters Handbook, Unix has been violating its own principles for many years. The original Unix idea was: desktops like Xerox Smalltalk workstations are too expensive and complex for most needs, so instead of a real revolution with an extraordinary outcome, we limit ourselves to the most common needs in exchange for far lower costs. No GUIs, no touchscreens, no videoconferencing and screen sharing [1], just a good-enough CLI with a "user language" (shell scripts) for small-potatoes automation and a bit of IPC for more... Well... For more there is a "system language" (C) that's easy enough for most really complex tasks.

That was a success because no one really likes revolutions and long-term goals, especially if they demand big money, while many like quick-and-done improvements at a small price.

However, within a few years Unix started to feel the need for something more than a CLI, and GUIs started to appear. Unfortunately, unlike the original Xerox-style desktops, those UIs were not "part of the system, fully integrated into it" but just hackish additions with no interoperability: single apps that at most offered cut & paste.

> Sure, but it also requires rewriting a lot of these things, introducing and fixing new bugs... It feels like the good ol' "let's rewrite this program" that quite frequently doesn't live up to the expectations

We need desktops again, which means not just "endpoints" or "modern dumb terminals for the modern mainframes named clouds", but desktop computing. Desktop development has been essentially abandoned for many years, and even back then it was in bad shape, so we need to restart from the classic desktops. The LispM was ancient and hackish, but it is still the best desktop we have had in human history, so it's a good starting point. And we do have some kind of LispM OS/environment here: Emacs, still alive and kicking, so there is something to work with. Emacs is already a WM (EXWM), has countless features, and is already "plugged" into modern bootloader OSes for hardware, drivers, and services. It just needs to evolve.

[1] yes, you are reading correctly and no, I'm not wrong, I'm talking about the famous NLS "Mother of all the Demos" from 1968 https://youtu.be/yJDv-zdhzMY


I agree, especially with the statement that Unix isn’t good enough and getting worse.

I feel like that was one of the core assumptions and point of the article, but it didn’t have any explanation beyond “multiple programming languages.” Feels a bit flat to me.


What ramblings.

Optane is the best performing SSD but the worst performing RAM you ever had. It is too expensive at any speed, even if Intel is losing money on it. HP memristors are vaporware.

LISP machines, Java machines, and similar architectures specialized for complex language runtimes are a notorious dead end. They just can’t keep up with performance-optimized RISC, pipelined, superscalar, SIMD, etc. architectures paired with compilers and runtimes that implement efficient abstractions (e.g. garbage collection, hotspot compilers) on top of those very fast primitives.


Before Lisp Machines were killed in the market, it was clear that new architectures were needed, and a few were under development, even RISC-like CPUs. They weren't released.

But Lisp at that time was already fast enough on standard RISC chips (MIPS, SPARC, ALPHA, POWER, ...). Later the 64bit RISC chips also provided enough memory space. SPARC also had some tricks for Lisp implementors.

Currently the assembler coded Ivory emulator is 80 times faster on Apple's M1 than the last Ivory hardware (the Ivory Microprocessor from Symbolics was released end 80s).


Speed is relevant for some use cases, sure, but not at all for a ton of others. Memory, disk and CPU are almost free in this new world, so why are we computing like it's 1990 still? It's time for some different abstractions than file -> process -> file.

The vast productivity gains of Smalltalk and Lisp came from discarding those abstractions, freeing programmers for others.

Presumably OP posted this after noticing Phantom came up a few days ago. https://news.ycombinator.com/item?id=30807668


> Memory, disk and CPU are almost free in this new world, so why are we computing like it's 1990 still?

Elsewhere on this very site you'll find no ends of complaints about, say, Electron apps.


For general-purpose computing, applications expand to fill the performance available (that includes real value and bloat!)

I dabble in microcontrollers for fun and there it's different. I am an AVR-8 fanatic and sometimes I think "this is so fast" and "2K of RAM is plenty" and "I can fit CRC-32 tables in 32k of flash because that's what counts as an 'operating system' for me"

Then there are the applications where it just doesn't have the power and I am so glad to have a box of RP2040's because in 2022 the most important attribute of a microcontroller is that it is available.


The RISC-V folks are working on additions for special support of "complex language runtimes". Pipelined, SIMD and superscalar are all well and good, but what kills pure software-side support is always heavy branching and dispatching. These operations are genuinely much faster and more power-efficient when implemented in hardware.


How is the ARM not a "JavaScript Machine"?

https://stackoverflow.com/questions/50966676/why-do-arm-chip...

>Why do ARM chips have an instruction with Javascript in the name (FJCVTZS)?

https://community.arm.com/arm-community-blogs/b/architecture...


That instruction is a very small hack that uses just a few transistors to speed up a bit of data conversion that JS runtimes do frequently. That’s a far cry from a specialized chip.


> You could open up system functions in the editor, modify and compile them while the machine was running.

Why would you want to do that other than hot patching a system that can't go down? Testing new changes requires more time than rebooting. If you just want to test simple changes, most debuggers can do that.

> Everything worked in a single address space, programs could talk to each other in ways operating systems of today couldn’t dream of.

And with a single address space you have win9x security.

> A modern UNIX system isn’t self-contained. I have 4 UNIX systems on my desk (Desktop, laptop, iPhone, iPad) I’m continuously using the cloud (iCloud for photos, GitHub for text files, Dropbox for everything else) to sync files between these machines. The cloud is just a workaround for UNIX’s self-contained nature

This is just your use habits. Nothing is stopping you from using NFS or SSHFS. Someone who feels the need to use iCloud for whatever trivial convenience it provides is unlikely to benefit from a Lisp machine's ability to edit code on the live system.

> Then we add a gazillion programming languages, VMs, Containers, and a million other things, UNIX is a bloated mess of workaround for its own problems. We need a replacement, something that can be built for the modern world using technologies that are clean, secure, and extendable

The same thing will happen with any OS given enough time. Lisp is also not secure. It's prone to side channel and eval bugs.

> eliminate memory leaks and questions of type safety,

Lisp is not type safe.


> Lisp is not type safe.

It is type safe. While Lisp is not statically typed, its typing discipline is strong: operations performed on incompatible types signal recoverable errors.
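For instance (in a safe implementation; SBCL behaves this way):

    ;; The bad call signals a TYPE-ERROR instead of reading or
    ;; corrupting memory; HANDLER-CASE recovers from it.
    (handler-case (+ 1 "oops")
      (type-error (e)
        (format nil "rejected: ~s" (type-error-datum e))))
    ;; => "rejected: \"oops\""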


Crashing at runtime, recoverable or not, is usually not what people mean when they say type safe. Spare me the static vs strong academia. Type safe when spoken, in practical every day terms, normally means enforced at compile time with IDE autocompletion support, usually implying static typing.


It's not crashing at runtime, it's crashing at compile time. Or rather, a purely REPL-focused language like Lisp dispenses with the phase separation between compile- and run-time altogether. But then this applies just as much to dependently-typed languages, which come from the "compile time type safety" line of research. You can't be okay with those while dismissing Lisp.


Recoverable error handling at runtime is usually not what people mean when they say crashing.


> Lisp is not type safe.

Typed Racket is, and that is why I love it


> Testing new changes requires more time than rebooting.

no it doesn't

> Lisp is not type safe.

yes it is


> And with a single address space you have win9x security.

Address space != protection boundaries. These are nearly orthogonal concerns. Where single address spaces might become less useful today is in dealing with Spectre vulnerabilities, though formalizing more explicit requirements about information domains (as in multilevel security, which is a well-established field of OS research) might help address those.


I don’t really agree. I had a Xerox 1108 Lisp Machine in the 1980s and loved it, but special purpose Lisp hardware seems like a waste of effort. I set up an emulator for the 1108 last weekend, and yes, I really did enjoy the memories, and things ran an order of magnitude faster than on the 1108 in the 1980s.

Then, I appreciated my M1 MacBook Pro running SBCL, LispWorks, Haskell, Clojure, and various Scheme languages - all with nice Emacs based dev setups. Life is really good on modern hardware.


The 1108 wasn't really special purpose Lisp hardware. One could run other operating systems on it. What made it special purpose was the loaded microcode for the CPU.

> Life is really good on modern hardware.

Agreed: On modern CPUs.

More support for the additional hardware features like GPUs, media processing engines and the neural network engines (see the M1 Pro/Max/Ultra) would be welcome.


For the best bet at getting GPU deep learning support, I use Anaconda/conda with the Apple M1 channel. That said, I usually use my Linux GPU rig or Colab for deep learning.


I feel like a lot of posts like this are pining for the complete lisp machine -user environment- and overestimating how necessary/important the hardware architecture would be to getting back to that today.

I can manage to context-switch between different Lisps fine, but I do sometimes wonder, in e.g. a SLIME+SBCL setup, how much that context switching is costing me.


I think that, while the idea is solid (Unix is poorly-designed and we should have better) some of the specific ideas mentioned are lacking:

> Everything worked in a single address space, programs could talk to each other in ways operating systems of today couldn’t dream of.

No! Bad! We have enough problems securing software on separate VMs running on the same metal, single address spaces are completely out of the question until someone manages to build a feasible trusted compiler system.

> Then we add a gazillion programming languages, VMs, Containers, and a million other things, UNIX is a bloated mess of workaround for its own problems.

A lot of these problems could happen with a Lisp machine - you could have a billion different Lisps, for instance (although, to be fair, with better (i.e. non-Unix) OS design you wouldn't need containers).

> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.

This is partially true, but a lot of the complexity in modern software doesn't come from Unix, but just...bad design decisions. Webtech doesn't really care whether it's running on Windows or Unix, after all.

Also, high-level CPUs are a bad idea: http://yosefk.com/blog/the-high-level-cpu-challenge.html

I think the good in this post is along the lines of: text bad, typed IPC good, runtime-aware OS good, standardized VMs good, interactive systems (Lispy stuff, Jupyter) > batch-processing systems (Unix, C).


I've started using Mathematica recently. I quite like it: I've used Sympy before, which was good, but nowhere near as "good" as Mathematica. How does it compare to the Lisp Machine operating systems? There's some vague resemblance to Lisp in treating symbols as a basic type of object. In the Mathematica use-case, these symbolic values are used to stand for algebraic variables or unknowns. Undeclared variables by default have symbolic type, with their own names being their values. (I know that other CASes do similar things here). Also, algebraic manipulations produce expressions which double as Mathematica code, which resembles the meta-programming features of Lisp. There's even glimpses of reactive programming in the way you construct interactive plots.

I know this is "uncouth" because it's commercial software, but Mathematica is one of the most interesting programs I've ever used. [edit] Might something like this be the future?


> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.

Oh man, wat?

I love lisp as much as the next guy.

But you absolutely can have library mess, memory leaks, and millions of lines of code using a Lisp. You arguably can have a "multi-language" mess, too, because Lisp gives you wonderful tools to create DSLs; I'd say creating a language that fits your needs, and then using it, is the right way to use Lisp.
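For example, the whole "language that fits your needs" can be a few lines of macro -- a toy sketch:

    ;; A tiny rule DSL that compiles down to ordinary functions.
    (defmacro defrule (name (&rest args) condition action)
      `(defun ,name ,args
         (when ,condition ,action)))

    (defrule flat-tire-p (pressure)
      (< pressure 10)
      :change-the-tire)

    (flat-tire-p 5)   ; => :CHANGE-THE-TIRE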

I use Emacs daily, and see how an all-Lisp environment can make for a good, productive interactive experience. More efforts in this area would be quite welcome, but this is a shell, not a kernel.

I still suppose that systems software, and especially the key parts of an OS, need a language more like Rust than like Lisp, with a good affinity to raw hardware, and a ton of static guarantees.


The main problem with Unix right now is security / permission control. Unix was built from the perspective that users potentially don't trust each other, but users all magically trust the applications that are run. In the age of the internet, this doesn't hold anymore, and we need strong permission control.


Who cares if Lisp is popular?

The "lisp epiphany" is real. Either you "get it" or "you don't".

I've been writing lisp programs for 50 years. I've been paid to program in 60 different languages but nothing compares with lisp.

There is an intellectual "distance" between a problem and its machine solution. I call this the "impedance problem". Lisp lets you think at the most abstract and write to the most specific.

Writing changed the world. But if you give most people a blank piece of paper they don't know what to do with such freedom. Lisp is the "blank piece of paper" of programming languages. Everything, literally everything, comes from you.

I loved my Symbolics machine. It was the closest expression of a "thinking platform" I've ever used. IDEs are horrible for thinking, ever interrupting at every keystroke.

Lisp isn't "popular" because it provides a "thinking platform" you can shape to your thoughts.

Lisp will never be popular. The reason should be obvious.


A number of Lisp fans seems to care. That’s why we regularly see articles on HN trying to convince other developers to use Lisp. It has been going on for years. Article after article written by frustrated Lisp fans, not understanding why the language they love is not mainstream. Often making bizarre claims about non-Lisp developers not being smart enough to “get” Lisp or whatever. Not having a clue that most developers care about a lot more than just the programming language. The best way to show the “power” of Lisp is to develop commercially successful software using it. That will do way more to convince smart developers to try out Lisp than writing yet another Lisp article trying to “sell” Lisp.


For the record, I made (and make) no claims about "non-Lisp developers not being smart enough".

My claim is that Lisp is perfect for thinking about a new idea or a new approach.

For example, I implemented a program that merged Expert Systems and Knowledge Representation into a single system (KROPS) that allowed a domain expert to express their knowledge as rules or as facts. Anything the system learned by either method could be expressed in either representation. Thus, the two representations were "unified".

Another effort involved Human-Robot Cooperation to change a car tire (TIRES). The system could interact with the human through pseudo-natural language, learn rules dynamically, and expand its knowledge base of the current situation in real time. So the system self-modifies and learns through human interaction on the task.

Both of these systems required self-modifying code which is rather more difficult to do in other languages. In Lisp this is trivial.
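A minimal sketch of what "trivial" means here (hypothetical names, not the actual KROPS/TIRES code):

    ;; Build a definition as data at run time, compile it, and
    ;; install it -- the running program has learned a new rule.
    (defun learn-rule (name params body)
      (setf (symbol-function name)
            (compile nil `(lambda ,params ,body))))

    (learn-rule 'double '(x) '(* 2 x))
    (double 21)   ; => 42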

As for commercial sales witness:

Axiom, a 1.2 million line Computer Algebra program written in Common Lisp, was sold commercially by the Numerical Algorithms Group.

YESOPS, an IBM Expert System program implemented in Common Lisp, was sold commercially.


> For the record, I made (and make) no claims about "non-Lisp developers not being smart enough".

I am glad to hear that.


It’s great that there is commercially successful software written in Lisp. However a lot more needs to be written in Lisp to compete with the hundreds of thousands of commercially successful applications written in other languages. I don’t think that will ever happen. But that’s fine of course. Pick the language that makes you happy and work with that. It doesn’t have to be a popular language to make you happy.


> frustrated Lisp fans

Why be frustrated? For example I'm using the latest Apple Silicon laptop and there are a dozen Lisp (and related) systems already ported to it. Years ago it took a lot longer to move to a new platform, especially for open source software implementations with native code compilers.

Happy times.


I agree. Nothing stops Lisp fans from using Lisp for their own projects. The fact that Lisp isn’t popular, and probably never will be, really shouldn’t matter.


For a somewhat complete history of LISP machines, I recommend reading "Hackers: Heroes of the Computer Revolution" [1].

[1] https://www.goodreads.com/book/show/8260364-hackers


Thanks for the recommendation.

I've been learning/using CL, on the side, for about a year, in fits and starts, and I'm also picking up a lot of its history and evolution along the way.

I find the history of the thing is as fascinating as the language/tools.

There's so much written on this but it's hard to aggregate and put together. Links lead to other links that lead to other links and oftentimes a lot of them are dead.

So I'm glad that there are books like this that can preserve some pieces of the story.


The problem with language wars is that people who cannot be trusted with pointers also cannot be trusted with lambdas or recursion.

The name of the game has always been to avoid directly insulting the bad programmers by making fun of the languages they use. Bad programmers have clustered in several languages over the course of my long career. Mostly they are the easiest, most expressive languages, which would superficially seem to be an advantage; however, they make it easy to express amazingly bad ideas. The harder languages require more work to express a bad idea, thus somewhat filtering such ideas out of that language's pool.

People who don't understand the game think the game will be won if the bad programmers have access to better languages for the first time in history. Despite the idea being decades old, it's always presented as a new idea.

There's an authoritarian streak where, if only we could remove access to the inferior languages, then they'd have to use the better languages and we'd have better code. However, you can't force people not to use inferior tools, and you can't force them to learn to use better tools.

The graph of ease of writing versus code goodness is interesting and nonlinear. You can express very complicated ideas in Lisp more easily than in VB6 or Perl or interpreted BASIC or spaghetti Fortran. However, you can express very bad ideas more easily in a "bad" language, so it accumulates interesting authors. There is always a crossover point where an idea of intermediate complexity is equally hard to express in a simple language or a complex one. Frankly, most IT needs are, and always will be, below that point. So it's counterproductive to demand a difficult language for simple tasks; everyone laughs at the "Enterprise Java Hello World" that is 100K lines of enterprise patterns.

Expressing ideas in computer languages is much like expressing ideas in everyday language. Some esoteric philosophy texts require a VERY large, precise, complicated, hard-to-use, and hard-to-learn vocabulary. Road signs do not. For in-between jobs, minimizing vocabulary and lexical complexity would be wise. It would be a fool's errand to try to cut half the vocabulary from street signs to "make driving safer", and an equally bad idea to force all road signs to be expressed as Shakespearean sonnets. The real-world counterpart of "Enterprise Java Hello World" would be forcing the "No U Turn" street sign to take the form of a Shakespearean sonnet.

The difficulty of tasks usually follows a power law, so it's nice that we have Lisp, but it's usually a bad idea to use Lisp.


Part of the hype and hope of the "OLPC" laptops was that they would generate a "Python machine" userland, if not an entire OS.


When the author talks about too many languages and libraries, I personally think he means the proliferation of Unix DSLs (some of which are Posix and some are not) like Make, Awk, Sed, BC, DC, shell-script dialects, Gnuplot (I know it's not Posix but it was widely used once), xargs, Vimscript, ELisp. Each of these has its own quirks and random limitations. Each of these makes Unix unnecessarily complicated and difficult to learn, and they don't even accomplish much that's impressive. I also think they make the editing experience worse, because they prevent the use of IDEs or highly featured REPLs like the ones general-purpose languages have.

I imagine that a lot of this can be replaced with libraries for a general-purpose language like Python. In the case of Awk, BC and DC, Python's standard library does everything these do without their strange quirks. Gnuplot can be replaced with umpteen plotting libraries like Matplotlib. Shell-scripting can be done in a Python dialect like Xonsh. I don't know of a Python alternative to Make, but Make is a fairly perverse and ad-hoc language that looks ripe for being replaced by a library.

I don't think a new operating system (however you define that) is necessary. I think you just swap out a lot of the crazy DSLs with one consistent general-purpose language. People will switch over when they want to accomplish things more easily (like me!). This is especially likely if you're not a SWE but you want to do file system automation anyway. The DSL-heavy approach is too Byzantine for such people (like me!). And it's mostly possible today.


I wouldn't even say any of those make UNIX difficult to learn. 99% of people can be users and developers just fine without learning awk or sed or even make, if they don't do low level work.

It can all be incrementally replaced over time, without starting over, just like how systemd and pipewire didn't need to totally start over.

Make will probably need something more than just a library though, because it's gotta stay declarative. But the fact that make exists at all is kind of an issue. I think build and package management should be done in the language itself like non-C languages do.

The other use of make is almost as a pseudo UI, just a standard place to put a list of actions you can do.

Something like Ansible could replace that, in theory, or we could have some new "project control center" file with menus and inputs and settings.


> I don't know of a Python alternative to Make, but Make is a fairly perverse and ad-hoc language that looks ripe for being replaced by a library

I think GNU Guile can be integrated with Make


Maybe scons or pydoit?


If we're talking about wild dreams, I would like to see a modern Plan9-like operating system written in lisp.

While the Plan9 mouse chording is cool (and could be kept), I would have everything also accessible via keyboard commands. 3 button mouse chording on a laptop trackpad is not fun.


Executable-images-and-bytestreams (Research Unix, Plan 9) and everything-is-in-$LANGUAGE (Lisp machines, Emacs, Smalltalk, Oberon, Forth) environments seem largely contradictory to me, because much of the flexibility in Unix seems to come from the freedom to ignore as much structure in the data as you want to, while programming-language environments seem to derive their advantages from expressing the structure in as detailed a way as possible. (In particular, they really want to invent their own storage formats for everything.) I don’t have much of an idea about Inferno, but my superficial impression is it also mostly ends up as a single-language island.

Which is annoying, because both of these approaches produce some really attractive results, so I’d very much like to learn about any attempts to reconcile them.


"Expressing structure" is just a higher layer on simple bytestreams. Some historical operating systems only supported special-cased "file types" with hard-coded structure, but the *IX folks found out that the simple bytestream is enough.


It used Limbo and was called Inferno.


Hardware LISP machines didn't survive the 1980s because you could emulate LISP on a general-purpose CPU faster than on a special machine. That is because faster new general-purpose CPUs came out every year or so, while it took 3-5 years for the next special-purpose CPU.


The Lisp machine of today is MirageOS: https://mirage.io/

A unikernel that throws out the legacy of Unix and starts fresh to build a library operating system, it's exactly what OP describes:

> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.

Even better, Mirage is programmed in OCaml, which is basically a statically-typed facade over Lisp (or Scheme). That's the modern Lisp machine of today. It even takes care of security nightmares like this:

> Everything worked in a single address space, programs could talk to each other in ways operating systems of today couldn’t dream of.

Because in the Mirage model each program is a separate OS image, and programs can communicate only over defined service interfaces.


Leaving the specific idea of the Lisp Machine aside, today we have a tremendous advantage over the past when it comes to creating the kind of full, self-consistent systems Lisp Machines -- and systems like Smalltalk or Oberon, etc. -- represent. Today we have widely accepted communications and data-format standards. This is something that really isolated system diversity in the past, where you were "stuck" in some given computing system and had few ways of interacting with other types of computing systems. We have figured all of that out.

We should hope to see a flourishing of new and diverse systems again, since now all one needs to do to interact with everyone else is merely (and I know it's a slog) implement tried-and-true standards.


One can implement these standards. Problem: it's work.

They supported things like TCP/IP, UDP, SMTP, X11, RPC, NFS, DNS, HTTP, ... The machines had C compilers, too.


Some of these new systems should definitely be lisp machines of a kind, though not direct clones of Genera. We need something more "of the times." In reality, there are only a few things people expect of their computing systems, but those things are huge: decent graphics and components for the UI, and a web browser.

So, yeah, standards and protocols are there. And it would take a s*t-ton of work to implement them in a bespoke environment. But they are not "difficult" in the classic sense. If we had a completely different type of economy it might even be possible!


If I'd like to try the emulated Lisp Machine linked to (https://tumbleweed.nu/lm-3/), is there a straightforward getting-started doc? The page lists multiple links for each of the simulator, the bootstrap, the documentation, and umbrella projects ("to make it easier setting up", though for this one it's only a binary choice). This is not counting the multiple system sources, since only one is recommended right now.

This suggests it's too much work for now if you're just curious, but it'd be great to be wrong.


I agree that we need more operating systems, but...

> These machines used specialized hardware and microcode to optimize for the lisp environments (Because of microcode you could run UNIX and the Lisp OS at the same time).

This is a dead end in the history of CPU design.

Processors are all vaguely similar these days. Your CPU is built around ALUs which typically take one or two inputs and produce one output. Around that, you build some logic for shuttling these inputs and outputs to and from registers, or in some cases, to and from memory.

The core here, the ALUs, have gotten more and more sophisticated over the years, but the wiring on the outside has remained fairly modest relative to its excesses back in the day. I'd say that the lesson here is simple: rather than add complicated operations to make your high-level language faster, do the complicated stuff in software... which gives you a lot more flexibility, and the combined hardware-software stack ends up being faster and cheaper anyway.

> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack, eliminate memory leaks and questions of type safety, binary exploits, and millions of lines of sheer complexity that clog up modern computers.

I can understand where this notion is coming from... but practically speaking, switching to Lisp doesn't eliminate memory leaks or questions of type safety or binary exploits. Even with an idealized version of Lisp, I don't think these problems could possibly go away. Neither garbage collection nor systems like Rust really "solve" memory leaks, they just provide strategies to make memory leaks less common.

The same thing applies to type safety. You'd have to define "type safety" very narrowly to say that Lisp solves all type safety problems. Again, I can understand where the author comes from--it's kind of an intuitive notion of type safety, that you don't end up operating on an incorrectly typed pointer, or something like that. But the notion of type safety is much more broad than that these days.

And the C/Unix strategy is actually pretty good, too, when it works--contain memory leaks within a process, then terminate the process.


Just run Emacs as your Lisp Virtual Machine. All it really needs is a good editor, but evil-mode is kinda serviceable.


I have had this thought. Jokes about needing a good editor aside, I wondered about making a unikernel that bootstraps just enough to start Emacs.

The issue I see with the idea of lisp-all-the-way-down is that most people don’t want to write filesystems and device drivers. I know I don’t. I mean I find them fascinating, but I usually have a specific app I want to write. I don’t want to write device drivers on my way there.


You could do this with Nanos... but I don't know how that would benefit you.


> They were programmed in lisp the whole way down and could run code interpreted for convenience or compiled to microcode for efficiency. You could open up system functions in the editor, modify and compile them while the machine was running. Everything worked in a single address space, programs could talk to each other in ways operating systems of today couldn’t dream of.

So this is basically expanding Emacs to actually be the operating system. There's definitely some allure to it. A while back, I was hacking on the Lem editor for Common Lisp, using the Lem editor itself. It was great until I made a mistake in some code that restricted my ability to edit the code, sort of like performing brain surgery on myself and snipping a nerve that controlled my arms. It was amazing to have that immediate feedback loop, but in a world that is striving to find new ways to minimize human error, I'm just not sure it'd hold up.


Pervasive rollback, in a bit of the UI sufficiently compartmentalised that it's really hard to break, would seem like the ideal theoretical solution, but in practice it's really quite tricky to introduce to an environment retroactively.

(I've been pondering this problem recently for something I'm working on, and introducing it from the ground up is bending my brain hard enough; others may be better at it though ;)


> With lisp machines, we can cut out the complicated multi-language, multi library mess from the stack

This seems like the kind of goal that's only palatable to a very few people nowadays. Specifically, the people who want to use that language, toolchain, and libraries, and nothing else.

These days, I don't think that's ever going to allow for enough of a community to support more than a relatively self-contained hobbyist scene. There's absolutely nothing wrong with that; personally, I wish there were more compelling tinkering-oriented platforms; I'm a little meh on Unix too. But the article seems to be advocating rather loftier ambitions.


I have had the same thoughts before and come to a similar conclusion. I believe it's not about the language. It's about the features: runtime code editing, single address space, program interoperability. Lisp could be replaced with C or anything else. Unfortunately this is the hard part. No one is about to design completely new hardware or architecture for this, as it makes no economic sense. So we just get a shiny new language every few years, and nothing really changes.


IMO the biggest difference is that there is no longer a difference between binaries and libraries. Everything is a library. In UNIX there are a bunch of utilities which you just can't use from C. Also, the idea of just passing data structures around, instead of using a pipe to pass streams of characters around, is an improvement, since there is no longer the need to serialize/deserialize.
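As a rough sketch of the contrast, in Common Lisp (or any single-image environment) the producer hands the consumer live objects; fetch-stats below is a hypothetical producer, not a real API:

    ;; Hypothetical producer: returns live objects, not a byte stream.
    (defun fetch-stats ()
      '(("disk" . 91) ("cpu" . 40) ("mem" . 73)))

    ;; The consumer works on the structure directly -- no parsing step,
    ;; unlike `producer | sort ...`-style text plumbing.
    (defun hottest (stats)
      (first (sort (copy-list stats) #'> :key #'cdr)))

    (hottest (fetch-stats))   ; => ("disk" . 91)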


The future is already here, but unfortunately it's programmed in elisp and still lacks a decent editor.


Oh, come on; I love Emacs, and I admire RMS a lot.

And I mean it, Emacs is as close to a Lisp Machine as you get in today's world.


Emacs has a great editor...

The evil package.


So it's interesting to note that modern hardware is only 60x faster than Lisp machines when it should be 1000x. The weirdness of Lisp aside, it turns out hardware isn't some random thing. And Lisp machines were harder to compromise than modern stuff, by a huge amount: a real, actual security improvement.


Not saying you're wrong, but do you have a source for the 60/1000x number? Curious to read more.


Yeah, where did I read it, I think John McCarthy checked it out along with...who could it have been...definitely him.


Today one Lisp Machine CPU emulator written in assembler is 80 times faster than the original Lisp hardware. This could be made faster by a JIT compiler. Native code Lisp compilers for x86-64 or Apple's M1 are roughly 800 times faster than the original Lisp hardware (the third generation Ivory processor).


Wow! Really?! So the M1 really is faster in a non-gimmicky way? I could cut out this x86-64 assembly bullshit; at 800x I could be happy just with Lisp. Assembly is all based on C by this point anyway...


The key thing about those type of systems was the ability to reach down into the system and edit the code of the system currently in operation.

Here's a demonstration of Symbolics Open Genera (TM) 2.0 running in a virtual machine. The author of the video notes in the first minute or so that, even in emulation, it is much faster than the original machines. - https://www.youtube.com/watch?v=o4-YnLpLgtk

Oberon also had a similar attribute, in that it kept the names of the functions that operate on objects visible.

The same was true of the Canon Cat, Hypercard, Ted Nelson's Xanadu project, Smalltalk, and a number of other early computing systems.

The main feature common to all of these systems is that they all preserve context. In Genera, Oberon, Canon Cat, Hypercard, and Smalltalk, the source was always available (as far as I know). In Xanadu, the main functionality of the web was present, but it wouldn't allow the broken links (and lost context) that now plague the web.

I think a future platform could take code in a number of languages, compile it to an abstract syntax tree, but preserve the context required to recreate the source. In fact, it's reasonable that you could import a routine in a language you aren't familiar with, and reverse the compilation to get it expressed in an equivalent (but less elegant) form in your language of choice, along with the original comments.

There's nothing stopping an open source project from taking elements of these existing systems and moving forward from that basis. It might be profitable to include ideas such as coloring the source text to label intent, as in ColorForth.

Also, consider "Literate Programming" - Literate programs are written as an uninterrupted exposition of logic in an ordinary human language, much like the text of an essay, in which macros are included to hide abstractions and traditional source code.

You could also add the ability to store graphics and other data along with the source code.

Of course, if you are required to run code you didn't write and don't trust, your operating system must provide means to run it only against the files or folders you wish to let it operate on. The principle of least privilege needs to be supported at a fundamental level. This is one of the big shortcomings of the Unix model.

Sorry it was a bit of a ramble, but this seemed to be a call for ideas, so I ran with it.

PS: In the past, getting your vision of Computing required building a machine, and then getting it manufactured. Now it just requires that you make it work in a VM, Raspberry Pi, or web browser window.

It is MUCH easier to try out and/or create alternative systems now than it has ever been in the past.


You can get a bit of that Oberon experience on Windows via PowerShell, .NET and COM.

Now if Windows team would get over their COM worship, it would be even better.


Why would you reach down into the system and edit the code of the system currently in operation? Because it doesn't already do what you want. Isn't it a bit pathological to start by building a system which doesn't do what you want, then building in the tools to let you fix it, so you can use those to make it do what you want? Why not skip all the middlemen and make what you want in the first place?

If you arrive at "we can't do that because everyone wants different things", you're in this weird place where you can't build one system to suit everybody so as a fix for that you will ... make one dev system to suit everybody? Why is that going to work out any better? If people are going to have to build the system they want anyway, why not skip all the middlemen and leave people to build the system they want from scratch their own way?

"Well we can't do that, having to build everything from scratch is hard, people don't want that. And we don't know what people want (and can't be bothered to ask) and won't try to build it for them, but we can give everyone Our Blessed System and Our Blessed Tooling which we presume they will want to use, while we abandon them to have to build what they actually want using it". It's patriarchal and preachy and still unhelpful. The kind of person willing to put up with your system and language and its edge-cases and limitations instead of making their own, is quite likely the same person who would have put up with your spreadsheet and its edge-cases and limitations if only you'd built it.

It's the "all progress depends on the unreasonable person" meme; you need the person who can't tolerate a system which isn't perfect and demands to be able to tune it to their perfection, but who is simultaneously fine with a customizable system built by someone else around someone else's ideas. Then you say "ah, but they can rebuild as much of the system as they want!", and they're now having to build it themselves from scratch, but based on what you built, which is even more work for them, not less.

And the whole thing says little about working with other people/organizations; everyone building their own thing isn't great for that. One could argue that Microsoft did a lot of good for the world by strongarming companies into .doc and .xls formats for interchange, in a similar way that HTTP/HTML did (but more tyrannically). Microsoft-Word-documents-over-email should probably go down in history as one of the big enablers of distributed human organization, like the telephone. Moreso than SMTP alone or TCP/IP alone which enabled computers to connect, not people. More than HTTP/HTML which ordinary people can't use without setting up a webserver and publishing first.

> "I think a future platform could take code in a number of languages, compile it to an abstract syntax tree, but preserve the context required to recreate the source. In fact, it's reasonable that you could import a routine in a language you aren't familiar with, and reverse the compilation to get it expressed in an equivalent (but less elegant) form in your language of choice, along with the original comments."

Would that even work? What about "code is primarily for people to read"; loss of indentation, of alignment, of variable names if you're changing language with different variable naming rules, loss of separate files/folder/module-context of how it was built in the first place, and loss of idiomatic patterns from changing one language to another, or to languages with different scoping behaviour. Look at the image in The Article for the square root code[1], can you decompile the syntax tree for a language which doesn't have a complex number type? What about one which doesn't have a short-float for the result, or has different floating point precision assumptions? What does "F and I2 need to be regular-heap-consed to avoid the extra-pdl lossage" mean for JavaScript or Python or Clojure? What's the magic 0.4826004 for, without any comments next to it?

And that square root function alone is a screenful of code, the very idea of being able to rebuild the system to your desires falls out the window if it's going to take you a lifetime to do that, and if not then we're back to my original paragraph where you're building a massive programmable system so someone can tweak one or two tiny things occasionally, which seems hella overkill.

> "Of course, if you are required to run code you didn't write and don't trust, your operating system must provide means to run it only against the files or folders you wish to let it operate on. The principle of least privilege needs to be supported at a fundamental level. This is one of the big shortcomings of the Unix model."

The XKCD comment that malicious code can ransomware all my documents and upload them to the cloud, but at least it can't add a new printer. One of the big shortcomings of the principle of least privilege is the effort of carefully and precisely describing the privileges needed - see the complexity and quantity of SELinux rules for a whole system. Even then you get immediately to the point where "my editor can edit all my text files" and that's now a large blast radius.

[1] https://cdn.substack.com/image/fetch/w_1456,c_limit,f_auto,q...


> Why would you reach down into the system and edit the code of the system currently in operation? Because it doesn't already do what you want. Isn't it a bit pathological to start by building a system which doesn't do what you want, then building in the tools to let you fix it, so you can use those to make it do what you want? Why not skip all the middlemen and make what you want in the first place?

Because what you want is not set in stone for all time. Needs and desires change, and often what you think you want is different from what you actually want, but discovering that can take time.

I spend so much time trying to work out things like "when I click this button in the UI, what does that actually end up doing in the backend?" I wish I could right-click on a UI element and it would take me directly to the backend code which actually implements it. And a system like that should be much quicker to change – it should be much quicker for a new developer to get up to speed with it and start implementing things, rather than spending days (even weeks) trying to understand how it all fits together.


>One of the big shortcomings of the principle of least privilege is the effort of carefully and precisely describing the privileges needed - see the complexity and quantity of SELinux rules for a whole system.

SELinux was a terrible thing to inflict upon the world. Capabilities are more like cash in your wallet.... you pick what you want to use in a transaction, and hand it over... never worrying that the rest of your money would somehow get siphoned off later.

Static rules aren't what capabilities are about. Run this program; oh, it wants a file? Here, it can have this. The default access would be nothing (or, in a more modern context, its config file). The program wouldn't directly access files, but would use dialog boxes to do I/O, or you could drop them in via batch, etc.


Reposting this from the 2014 HN discussion of "Ergonomics of the Symbolics Lisp Machine":

https://news.ycombinator.com/item?id=7878679

http://lispm.de/symbolics-lisp-machine-ergonomics

https://news.ycombinator.com/item?id=7879364

eudox on June 11, 2014

Related: A huge collection of images showing the Symbolics UI and the software written for it: http://lispm.de/symbolics-ui-examples/symbolics-ui-examples....

agumonkey on June 11, 2014

Nice, but I wouldn't confuse static images with the underlying semantic graph of live objects that's not visible in pictures.

DonHopkins on June 14, 2014

Precisely! When Lisp Machine programmers look at a screen dump, they see a lot more going on behind the scenes than meets the eye.

I'll attempt to explain the deep implications of what the article said about "Everything on the screen is an object, mouse-sensitive and reusable":

There's a legendary story about Gyro hacking away on a Lisp Machine, when he accidentally trashed the function cell of an important primitive like AREF (or something like that -- I can't remember the details -- do you, Scott? Or does Devon just make this stuff up? ;), and that totally crashed the operating system.

It dumped him into a "cold load stream" where he could poke around at the memory image, so he clambered around the display list, a graph of live objects (currently in suspended animation) behind the windows on the screen, and found an instance where the original value of the function pointer had been printed out in hex (which of course was a numeric object that let you click up a menu to change its presentation, etc).

He grabbed the value of the function pointer out of that numeric object, poked it back into the function cell where it belonged, pressed the "Please proceed, Governor" button, and was immediately back up and running where he left off before the crash, like nothing had ever happened!
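For anyone wondering what a "function cell" is: in Common Lisp, a symbol's function binding is itself first-class data that can be read and written at runtime, which is what makes a repair like that even conceivable. A minimal sketch of the mechanism (don't actually clobber AREF):

    ;; Save the current function binding of a symbol...
    (defparameter *saved-aref* (symbol-function 'aref))

    ;; ...the accident would be something like:
    ;; (setf (symbol-function 'aref) #'some-broken-function)

    ;; ...and the repair is just writing the saved value back:
    (setf (symbol-function 'aref) *saved-aref*)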

Here's another example of someone pulling themselves back up by their bootstraps without actually cold rebooting, thanks to the real time help of the networked Lisp Machine user community:

ftp://ftp.ai.sri.com/pub/mailing-lists/slug/900531/msg00339.html



The trend seems to be toward statically-typed languages, with the exception of Python's adoption for data work. I also prefer having more problems found before deploying and running on production data.

Dynamic typing is great for prototyping, early development, and small teams. If your definition of success is bigger than that, I would choose differently, or port at a good time early on.

And if someone were to ask me to join a company using Lisp, I would hope that it's Clojure, which, though arguably not a Lisp, is better in that it will vary less between usages. This gives it a better chance of a growing ecosystem.


You could do some powerful things with LISP machines but I think the underlying assumption is that everything is written in LISP, or compiles to LISP (or its underlying bytecode). That places restrictions on what you can do. For example, does it do high-performance multithreading and SIMD right?

Also not sure LISP machines solved security as well as modern-day Linux does. I think it was just less of a concern back then, because you knew most of the people who were on the network.


The answer to that question lies with the Connection Machine and Star-Lisp.

https://en.m.wikipedia.org/wiki/Connection_Machine


I used to start writing Star-Lisp code for my company's Connection Machine (version 1, the SIMD one) using Coral Common Lisp on my little Macintosh. Then I would get on an airplane and fly to the city where our CM was installed.


> genera

Eh, if its GUI is anything like CLIM, which was based off of it, no thank you big time.


The JVM is my lisp machine.


UNIX is fine. The UNIX philosophy is an issue, along with C, and the fact that everything now is mobile and web-based and the tools aren't well suited to offline/non-SaaS stuff yet.

Linux is slowly becoming a standardized, integrated platform. I don't see why it can't be evolved to have all the main advantages of a LISP machine.

I also don't see how that solves dependency management. No matter what, if you build against something and it changes, stuff breaks. That's the main issue with Linux.

It also doesn't solve microservices being kinda hard and needing manual configuration specifically for the setup, rather than the one size fits all style of monolithic desktop software. That's an application architecture challenge.

Nor does it solve cross-platform.

Linux does have problems and could learn from LISP machines (although I'd rather we have TypeScript machines or Python or something; LISP is pretty far from what I want a language to be, and is meant for creativity and expressiveness rather than Ada-like safety and boring, hacker-repelling, Java-like standardization).

But a lot of issues go away if you pretend everything other than Debian and Red Hat don't exist.


UNIX and the UNIX philosophy are one and the same; there is no way around it with word games.

C was created to make UNIX, originally written in straight assembly, portable, a process finalized by the UNIX V6 release.

The UNIX philosophy grew out of Bell Labs into all the universities and businesses that took those source tapes and took it from there.

It's hardly possible to be selective about what UNIX is all about.


Modern Linux is so far from old school UNIX though. They are even talking about using Rust!

It's evolved so much, and the remaining unixy bits get more and more suppressed and hidden.

BSD still has more of it, but Linux seems to be moving on and is very far from the days of everyone piecing together software from small parts.


Linux became a kernel for the GNU project, whose founders were raised on ITS and the MIT AI Laboratory, and the originally planned kernel for the project didn't have anything related to UNIX.

Most projects using the Linux kernel nowadays care more about having a free beer kernel than anything UNIX related.

BSD has always been a proper UNIX, hence the AT&T lawsuit when AT&T was finally allowed to monetize UNIX.


From a semi-shallow position, what annoys me with Linux is the lack of genericity above the "file" abstraction (which is not even real enough).

I remember seeing the GNU ls code: 30% argument parsing, 30% formatting... all of this seems brittle and redundant.

The bazaar is fine for allowing freeform, innovative evolution, but it's also too messy.


Linux has a lot going on above the file level, it's just not in the kernel.

Most people using Linux also have DBus and Pulse/Pipewire going, and a ton of other stuff.

Network devices are technically files, but in practice they are names. I have no idea what them being a file even means at the byte level. They're just names that you deal with through NetworkManager's DBus API, if you're not targeting any of the minimal or embedded distros.


This feels like an odd critique. I would expect that more programs should devote more code to the parts that interact with a user. That is, the size alone isn't much of a signal. Is it?


In the sense of the PowerShell engine doing the argument parsing and display formatting for all cmdlets, which is something I think the designers took from the VMS shell. That makes for "Don't Repeat Yourself" (one place for all the argument parsing code), and, from the user-experience side, all commands which subscribe to this take parameters in the same way, handle arguments in the same way, share the same common parameters, and support parameter introspection and tab completion, etc., so it can be a more uniform experience.

(Not always great, though; I don't know the VMS shell, but PowerShell cmdlets can still have similar tools taking different parameter names for the same concept, or not implementing what to do with common parameters, or taking DSLs in strings like WMI Query Language, or taking rather opaque and unique hashtable blobs.)


I didn't mention it, but PowerShell was on my mind; Format-Table in particular. Also, there are some folks patching the Linux user space to emit JSON output (and the jc wrapper to be able to leverage jq further down the pipe).


The formatting is redundant and should be externalizable. The argument parsing also deals with the output formatting, hence redundant. Only file-spec selection is core to ls.


Redundancy isn't, by itself, a problem. In the case of core utilities, it affords them a level of stability that is just not even attempted in many areas.



