SICL: A New Common Lisp Implementation (github.com/robert-strandh)
147 points by tosh on May 26, 2021 | 57 comments


Does the Common Lisp standard need to be revamped? I remember seeing stuff about CL21 a while back...

I keep my eyes peeled for anything Scheme on GitHub, I love reading Scheme code, and seeing R7RS march eternally forward has brought me a lot of joy over the past few years.

But try as I might, and I HAVE tried, I just don't see Common Lisp as dealing with modern computing that well. It looks crufty and tired where Scheme continues to refine and sharpen. And then there's Clojure, which has become what I always kinda thought Common Lisp should've tried to become.

This isn't a very good comment, I guess it just always amazes me how much innovation is still happening from simple, unfancy s-expressions. Thank God.


> I just don't see Common Lisp as dealing with modern computing that well. It looks crufty and tired where Scheme continues to refine and sharpen.

I don't think that they are aimed at the same thing. Lisp is stable, reliable and suitable for large, durable systems; Scheme is conceptually pure, but its refinement and sharpness lead to fragility. You applaud the success of R7RS, but it is a retreat from specifying a general-purpose programming language (something R6RS tried to do and was castigated for), choosing simply to provide a toolkit for building a general-purpose programming language. That's not bad, but it means Scheme is not suited for the things Lisp is suited for. I would hate for the RTOS monitoring my car to be written in Scheme, but I think it could in principle be written in Lisp.

Racket, arguably the most successful and influential Scheme out there, isn't even a Scheme anymore! There's nothing wrong with that, of course, but it says something when the best Scheme had to evolve until it was no longer Scheme.

Meanwhile, systems written in Common Lisp decades ago keep running, and new systems keep being written. It has some warts, some big ones, but it has some real virtues, too.


You've made some very good points, all of which I agree with.

I suppose I somewhat despise the Common Lisp "way" of things so much so that it distorts my perception of it. A lot of old JPL stuff was written in Lisp, and what you said about resilience strikes a chord with me.

I think the things that really stimulate me the most are things that a small, refined core language can provide. I've always considered it a benefit that R7RS has become smaller, and I think history will not look favorably on R6RS; it was the wrong move. The projects that matter the most to me and to my sensibilities for Scheme are abstract and academic. Check out namin's staged-minikanren repo on GitHub. (On mobile, else I'd link you) That is one of the most fascinating pieces of technology ever produced by humankind, imo, and it's written in a simple, common Scheme. I think it actually evaluates on Racket, but you can see most of the code tastes like regular Scheme.

If I said that Common Lisp wasn't good, I surely regret it and can admit I was wrong.

I wouldn't call Racket the best Scheme, but certainly the most well known, as you say. It evolved in parallel to its vision of meta-languages, and that's something I would argue was entirely catalyzed by having a small, simple, functional core of Scheme. Racket could just as well have been Lisp, and it'd look and feel different, but the same magic would be lurking beneath the surface. It'd probably execute a lot faster these days too!

Anyhow, thanks a lot for your insights.


> Does the Common Lisp standard need to be revamped?

Yes. Will it happen? Almost certainly not, though I would like nothing better than to be proven wrong about this.

> it just always amazes me how much innovation is still happening from simple, unfancy s-expressions

Turns out S-expressions are just a Really Good Idea (tm). That's why people keep reinventing them over and over.


> Yes. Will it happen? Almost certainly not, though I would like nothing better than to be proven wrong about this.

I have a sense that any new standards process would have to work hard to defend against a number of distractions, such as forcing character encoding into the standard, the lure of JVM-style compile-once-run-anywhere (which has destroyed the ability for Clojure to bootstrap without Maven), or attempts to reduce or remove certain warts that some users find annoying.

Many who don't know the history don't realize that the standard as it exists encoded the separate practices of many disparate groups into a single standard. This can be confusing for newcomers because they may encounter ways of doing things that are archaic but still in the language. Part of the challenge thus is not getting rid of things, but somehow figuring out how to communicate what have come to be understood as the better ways of doing things --- all without changing those parts of the standard.

I'm sure that the old lispers have fairly extensive lists of fundamental issues and improvements.

The short list that I have been able to compile from chance encounters with mentions of shortcomings includes implementation details, such as lexical O(1) jump tables, delimited continuations, proper tail call elimination, threads, extending dispatch macro syntax, and default precision for reading floats; and issues that almost certainly need more research and exploration, such as finding a solution to the CLOS action-at-a-distance issue, CLOS dispatch on parametric types, something about static types, ways to alloc without gc, and many more things I am completely unaware of.

The Racket ecosystem has done quite a bit of research and implementation in many of these areas, but Racket lies at the far end of the lisp family from Common Lisp. Finding a way to take the more static and text-based ideas from the ML/Racket end of the spectrum and make them accessible at the dynamic, image-based Smalltalk/Interlisp end seems like something that could happen within the Common Lisp space more readily than elsewhere. There seem to be some fundamental questions about how late binding interacts with many other language features that do not have satisfactory answers.

All of this being said, I don't see the motivation for anyone in the space to spend the time on another standards process while so many ideas haven't been explored.


> I'm sure that the old lispers have fairly extensive lists of fundamental issues and improvements.

There are two public lists where specific proposals are collected:

https://cliki.net/Proposed%20ANSI%20Changes

https://cliki.net/Proposed%20Extensions%20to%20ANSI

And a page for more speculative proposals:

https://cliki.net/Lisp%20-%20Next%20Generation

Please feel free to add your ideas there.

> ways to alloc without gc

Do you mean something between DYNAMIC-EXTENT and https://gitlab.common-lisp.net/vsedach/Thinlisp-1.1 ?


> Will it happen? Almost certainly not, though I would like nothing better than to be proven wrong about this.

At this point it is about consensus among implementations. phoe got package local nicknames into all implementations, and that is a very major revision to the standard. So in practice it is happening.
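For anyone unfamiliar with the feature, package-local nicknames look roughly like this in the implementations that support them (MY-APP and the Alexandria dependency here are just illustrative):

  (defpackage #:my-app
    (:use #:cl)
    (:local-nicknames (#:a #:alexandria)))

  ;; Inside MY-APP, A:FLATTEN refers to ALEXANDRIA:FLATTEN,
  ;; without registering a global nickname visible to other packages.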


> phoe got package local nicknames into all implementations

Unfortunately it's not yet in Clisp. I submitted a merge request[1] a year ago, but it's been silent since then.

[1]: https://gitlab.com/gnu-clisp/clisp/-/merge_requests/3


ECL and ABCL (and of course SBCL) adopted PLNs before that, so this statement is not technically correct.


> Yes. Will it happen? Almost certainly not, though I would like nothing better than to be proven wrong about this.

Of course it did, was Clojure not released? — “And then there's Clojure, which has become what I always kinda thought Common Lisp should've tried to become.”.

I can't help but think that this is purely a game of names, not of existence and features.


> Turns out S-expressions are just a Really Good Idea (tm). That's why people keep reinventing them over and over.

In the age of multicore, the perception that everything is a list, and that lists lend themselves to sequential processing, actually makes them less appealing.


Except that most supercomputers have seen experiments with Lisp and ML dialects, because their hardware abstractions allow for easier scalability across cluster nodes, either on the same hardware or across the network.


Lisp represents source code as lists. It’s a huge myth that modern Lisp just uses lists as the only data structure.


Even older lisps had interesting data structures. Star Lisp (*Lisp) on the Connection Machine had vectors. https://en.m.wikipedia.org/wiki/*Lisp


Every Common Lisp already has vectors.

STARLISP is an extension of Common Lisp. Pvars in STARLISP are basically similar to vectors with elements spread over different processors.



> Does the Common Lisp standard need to be revamped?

Probably. It has been over twenty-five years after all!

> I remember seeing stuff about CL21 a while back...

That was mostly one guy’s thoughts, and didn’t meet with universal approval.

Here are this guy’s thoughts:

- Retain case sensitivity. Remove case-folding. Use lower-case for all symbols in the common-lisp package. Basically, Allegro’s modern mode is the right mode.

- Improve, don’t eliminate, pathnames. Standardise their semantics for Unix and Unix-like systems. Bonus points to integrate URLs and pathnames, although … that way may lie madness.

- Expand the standard. In 1994 Common Lisp was derided for being a huge language, but compared to languages like Go or Python the library portion of the standard is pretty small. It is probably time to add some TCP & UDP primitives. It is time to settle on some concurrency primitives. And then there are the higher-level things 21st-century programmers take for granted: HTTP clients and servers, Base64, that sort of thing. Maybe specify them in separate optional packages for smaller environments?

- Consider CLOSing all the things. Caution: this could be a terrifically bad idea, but …

- Consider breaking backward compatibility and settling on one argument order to rule them all. The pair (nth 1 list) and (elt list 1) is a wart. Switching to lower-case raises the possibility of retaining the entire COMMON-LISP package for backwards compatibility here …
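To make the wart concrete, a sketch of the inconsistent argument orders in the existing standard (TABLE below is assumed to be a bound hash table):

  (nth 1 '(a b c))          ; => B   (index first, list second)
  (elt '(a b c) 1)          ; => B   (sequence first, index second)
  (getf '(:x 1 :y 2) :y)    ; => 2   (plist first, key second)
  (gethash :y table)        ; key first, table second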


I too love Scheme. I prefer Racket Scheme but to each his/her own.

I think it may be the whole running-VM model that Common Lisp requires the programmer to keep in mind: what is defined/expanded at run time vs. the compiled nature of Scheme/Clojure. I may be wrong about my wording here, but there have been times when playing around with real Common Lisp that I just couldn't understand why something wouldn't work when it did in Scheme (I don't remember the specifics).


I consider Serapeum to be a revamp of the Common Lisp standard: https://github.com/ruricolist/serapeum/blob/master/REFERENCE.... This provides a bunch of new features and idioms including ideas borrowed from newer languages like Clojure.

Great example of "growing a language" as a long-term evolutionary process that doesn't require changing earlier specifications in incompatible ways.


It doesn't help that most people who try Common Lisp via the FOSS offerings are completely unaware of what LispWorks and Allegro Common Lisp are capable of.


Agreed. I just wrote this LispWorks IDE review and will add it to the CL Cookbook (when slightly enhanced): https://lisp-journey.gitlab.io/blog/discovering-the-lispwork... It should shed some light on their offerings and help newcomers get a bigger picture of what CL is.


Wow, very nicely written.


What’s a Scheme that 1) has fast startup binaries (so no Racket) and 2) has a good emacs experience à la SLIME/SLY or CIDER?


I would guess that the new Racket implementation based on Chez has 'fast startup binaries'. Doesn't it have that?


And Chez itself seems like a good candidate.


I've been messing with Cyclone Scheme lately, but I don't use emacs. Might be worth a look?

Guile is what I mainly use for hobby projects, it's pretty dang reliable and fast, Andy Wingo is a compiler genius and there's a lot of good internals walkthroughs on his blog.

Other than Racket, have you tried a lot of implementations?


Chicken Scheme compiles to C, and you can build binaries as you would with any C compiler. Geiser is the Scheme equivalent in emacs, but I don’t know how well it competes with SLIME/SLY/CIDER.


SICL is interesting because it aims to be written 100% in Common Lisp. This means no kernels or stubs written in C, and also means that (in theory) any conforming Common Lisp compiler should be enough to build it. (In practice, various compilers end up showing various limitations when building SICL at the moment - building it is a pretty intensive and interesting process.)


what kind of limitations?


IIRC a recent example is the amount of memory that an implementation allows for generated code to take.


There was a very nice paper at this year's ELS on a call site optimization technique that they plan to put into SICL.

The idea solves the problem of optimizing function calls while allowing various things (functions, classes, methods) to be dynamically redefined, even while calls involving those are on the stack.

https://european-lisp-symposium.org/static/proceedings/2021.... (pages 72-78)


Ah, related to #clasp:

https://www.youtube.com/watch?v=8X69_42Mj-g Clasp: Common Lisp using LLVM and C++ for Molecular Metaprogramming

https://www.youtube.com/watch?v=mbdXeRBbgDM 2018 LLVM Developers Meeting: C. Schafmeister lessons Learned Implementing Common Lisp with LLVM

https://www.youtube.com/watch?v=0rSMt1pAlbE Clasp: Common Lisp using LLVM and C++ for Designing Molecules

https://www.youtube.com/watch?v=9HXfvT85EFE Modular Polymer Catalysts - Chris Schafmeister

https://www.youtube.com/watch?v=PQDwvatwjD8 10S6P4 Christian Schafmeister 2010 UNTT 3rd Annual Conference Session 6 Presentation 4


In that they're both implementations of Common Lisp, sure, but is there a deeper connection between the two than that? Not that directing more attention to Clasp is a bad thing!


Clasp author here - Clasp uses SICL's Cleavir compiler.


SICL is being developed in a modular fashion: I believe Clasp uses SICL’s compiler framework Cleavir for its own compiler. (I’ve also heard that ABCL and ECL have considered reimplementing their compilers with Cleavir.)


A couple past threads:

SICL – A fresh implementation of Common Lisp - https://news.ycombinator.com/item?id=8284835 - Sept 2014 (49 comments)

SICL – A new Common Lisp implementation - https://news.ycombinator.com/item?id=26795066 - April 2021 (3 comments)


In case anyone is wondering what step 3 in the readme means, this is what I used when I was exploring SICL back in March.

  (asdf:initialize-source-registry
   '(:source-registry
     (:tree #p"/path/to/SICL/")
     :inherit-configuration))


I never understood how to use ASDF properly. It seems like it expects me to put all my projects under a central location, like what Go requires (used to require?), which is extremely annoying.


Looking at my clrc file I have a variant of the above which might allow you to have folders in different places, but by accident I have them all sharing a common parent folder.

  (asdf:initialize-source-registry
   '(:source-registry
     (:tree #p"/path/to/Second-Climacs/")
     (:tree #p"/path/to/SICL/")
     (:tree #p"/path/to/OntoLisp/")
     :inherit-configuration))
The ASDF best practices seem to be documented, but it is hard to find them through all the noise, and I'm still not sure I'm following them.


You do have to put all your projects under directories that ASDF knows about, but you can freely add directories to that list.


One of the most annoying things about CL is that I can't get access to compiler type inference information inside compiler macros. This lack of type information cuts off lots of optimization opportunities. Maybe a new CL implementation can expose the type inference engine to macros and enable much better compile time code generation.
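For example (a sketch; MY-SUM and MY-FAST-VECTOR-SUM are hypothetical names): a compiler macro can only pattern-match on the source form, so unless the caller writes an explicit THE, whatever type the compiler has inferred for the argument is invisible to it:

  (define-compiler-macro my-sum (&whole form seq)
    ;; We can only inspect the literal source form...
    (if (and (consp seq)
             (eq (first seq) 'the)
             (eq (second seq) 'simple-vector))
        `(my-fast-vector-sum ,(third seq))
        ;; ...so a SIMPLE-VECTOR type the compiler *inferred*, with
        ;; no THE written in the source, is out of reach portably.
        form))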


Is there anything new since, say, 2014? Anyone familiar with the project care to comment on how viable SICL is today as a Common Lisp?


SICL is definitely not "viable" today, in that we are still not at the stage where we can generate a native executable. But we are making good progress. Register allocation is close to being done. Code generation is next. We recently programmed ASDF so that it can participate in the bootstrapping procedure, so now, it will be easier to load additional libraries into the target environment, like Eclector, Clostrum, Trucler, Cleavir, and the systems they depend on.

I am not going to be a regular participant here, so if you have any questions, it would be better to show up on the #sicl IRC channel on libera.chat.


I don't know anything about this specific implementation, but I find it incredible that the main author seems to be committing to the repo on a daily basis. Considering how old the project is, that's quite a commitment for any open source project, much less a CL implementation!

But I agree it would be nice to get some information on how complete it is.


It's at least maintained enough for their README to be up to date as of the recent Freenode brouhaha:

> All these channels are on the libera.chat network.


Some of the core ideas of SICL have been written up in a digestible form in a 2019 paper, 'Bootstrapping Common Lisp using Common Lisp'

https://hal.archives-ouvertes.fr/hal-02417646/document


I’ve been watching the project for a while, and have dropped into the irc channel a few times. From what I’ve gathered, there’s slow but steady progress. Last I checked SICL was in a usable state, but I’m not sure how production ready it is or if there are benchmarks available to compare it to, say, sbcl.

The folks working on it are always super friendly and helpful, so if you’re curious to learn more I’d recommend dropping by their irc channel and asking.


I’m not aware of any “full” uses of SICL. Its compiler framework, Cleavir, is used by Clasp (a somewhat new C++/LLVM/Boehm/ECL-based implementation).

Long ago, I tried to build a CL->JavaScript compiler based on Cleavir. It started OK and I had the definitions of some annoying things (e.g. lambda list parsing code gen), but I was caught by the lack of any interpreter [1] with any kind of acceptable performance, and I eventually gave up. I think there weren’t any JS-based implementations at the time but that might not be the case anymore (I haven’t followed CL closely in a while). My ultimate goal was to have the ability to write web applications where the server and client could share code in the best way possible, including things like throwing closures from server code to be executed on the clients[2]. However, that would in fact require also having a compiler to native code [3], and the design of Cleavir was well suited to handling the sort of “dual compilation” that was required.

[1] you need an interpreter to evaluate macros. You cannot portably use the host lisp’s macro expansion system because you cannot use e.g. symbol types from the host lisp (due to issues with packages, interning and namespaces) and you cannot sufficiently control the global or compiler environment. The interpreter must be used for evaluating the bootstrapping code to execute macros in the target environment to create that environment. SICL did have a simple interpreter at the time (removed around then too) but it used much more memory than I could give it.

[2] I also wanted to support compiling some weird code which is not required by the standard like:

  '#.(let ((a 0))
       `(,(lambda () (incf a))
         ,(lambda () (decf a))))
(Here the source code itself contains a list of two objects which are functions closing over a common environment.)

[3] the above is one example of why a second native compiler was needed. Making bootstrapping/compilation adequately fast was another. In general marshalling objects is hard to do portably hence why I wanted a native target. Another complication is that for consistent semantics you need to only use (IEEE) double-precision floats in code that might be compiled to JavaScript but there isn’t even a portable way to get such a float!


You do not need an interpreter to evaluate macros. You can compile the code of the macro, same as you would any regular function, and call into it that way.


I think you’re missing something: if I compile the macro to JavaScript, I don’t have a way to run it because I am cross-compiling. I can’t simply use the host Lisp’s compiler because of the reasons mentioned above.


Macro expansion is done in the host environment, not the target environment. You can compile the macro definition to a function taking a form and an environment. This would be something you arrange for within your compiler.

CLTL2 specifies a function `parse-macro` which is available in most implementations. Portably you can call it using the `trivial-cltl2` package:

https://github.com/Zulu-Inuoe/trivial-cltl2

Robert Strandh has written both papers and code for handling environments:

http://metamodular.com/SICL/environments.pdf

http://metamodular.com/clostrum.pdf


Either you are describing something that doesn’t work for cross-compilation or you are missing the point. Common Lisp is a hard language to compile because of macros. Consider the following code you might see in bootstrapping:

  (defmacro defun ...)
  (defun ...)
To macroexpand the second line you must have an environment with the defun macro definition from the previous line, which means that the code to modify the environment with that definition must have been evaluated too. This must be evaluated in the (emulated) target environment because (1) the host environment would otherwise clash with it, and (2) you need to write down the target environment as a build artefact. Evaluating these environment-modifying forms requires an interpreter you can run on the host. If you want a good programming experience for your standard library then this needs to be a reasonably capable interpreter.

But you also need an interpreter to evaluate code during macroexpansion. Three reasons:

1. You must emulate the target environment for correctness, eg JavaScript only has one type of float.

2. You cannot sufficiently interact with the host environment to use its own macroexpander (if you want to be portable). This is because compiling a macrolet requires extending the lexical environment with the macro definition so that calls (from e.g. the expander for setf) to macroexpand can use the definition. The only portable thing on the host that can evaluate a macrolet is the compiler/interpreter.

3. You probably can’t use native types for things like symbols so the built in macro expansion is not sufficient.
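To make point 2 concrete (a sketch): compiling the form below requires a lexical environment object carrying the local HEAD definition, so that the SETF expander's call to MACROEXPAND can find it, and there is no portable way to hand-build such a host environment:

  (macrolet ((head (x) `(car ,x)))
    (setf (head some-list) 42))
  ;; SETF must macroexpand (HEAD SOME-LIST) into (CAR SOME-LIST)
  ;; using the *lexical* definition of HEAD, which only the host
  ;; compiler's own environment representation can carry.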


That the runtime environment and the compilation environment, insofar as they are distinct, are not able to communicate seems like a deficiency in the implementation. Lisp is meant to be interactive.

Do you not expect to be able to implement 'eval'?


To a large extent, they can communicate: the compile time environment when compiling one form depends on the runtime environment modifications made by evaluating the previous form. This is what makes cross compiling lisp hard.


The acronym immediately put me in mind of the book "Structure and Interpretation of Computer Programs". I wonder if it's deliberate; doubt it. But it's a fortunate association.


Can anyone explain what SICL stands for? It doesn't seem to explain the name on this repo.

(I'm guessing the similarity in names to SBCL - Steel Bank Common Lisp is intentional?)


It doesn't mean anything. However, someone has suggested the recursive backronym "SICL Implements Common Lisp".



