Do any editors have a "translator" kind of tooltip or panel that explains what each symbol does in languages like APL or KamilaLisp?
I was thinking about how I'd go about learning the various symbols involved in such a language, and I learn best by reading and modifying existing code, so something that helps me understand what I'm looking at would be nice.
Language-specific web editors often do. I think Uiua (https://www.uiua.org/) is the state of the art here, with the minor issue that most of the symbols aren't used in any other language. Several APL-likes have language bars at the top, with names shown on hover.
Dyalog APL's RIDE interface doesn't even need you to find the symbol on a language bar; just hover the mouse over it, and a tooltip appears: https://i.imgur.com/XtzRGUs.png
Encode.su discussion on bzip3 [1] was quite interesting to me back then. Tl;dr, she combined some unpopular ideas into one package, but those ideas themselves were well known among data compression folks, so it was hardly novel to them. Nor were those ideas independently reimplemented or optimized.
Are they related? I mean, same author, but they don't seem related beyond the Lisp part. Both are pretty impressive feats for someone who is only 19, though.
"The day that someone writes, in Malbolge, a program that simply copies its input to it's output, is the day my hair spontaneously turns green. It's the day that elephants are purple and camels fly, and a cow can fit through a needle's eye."
"There's a discussion about whether one can implement sensible loops in Malbolge—it took many years before the first non-terminating one was introduced. A correct 99 Bottles of Beer program, which deals with non-trivial loops and conditions, was not announced for seven years; the first correct one was by Hisashi Iizawa in 2005."
Ah! I stand corrected, thanks. I opened the GitHub and saw APL, Haskell, Lisp. Interesting, as this new effort looks pretty serious, while the Malbolge one, though an incredible feat, is more of a joke I guess.
You're totally new to Lisp, right? There is no such clear boundary in Lisp languages.
> This can't be relevant when the objects referenced by these collections are mutable.
Why not? Explain your reasoning. E.g. why is it not relevant that we can make a longer list by adding an element to the front of a shorter list, without mutating that shorter list, if the elements are objects that support mutation?
> Tail-call optimization would be a language feature.
Most lisps do it this way (except maybe emacs lisp?), but there's not really a requirement for it to be a language feature. TCO is really just AST manipulation, and lisp macros are more than capable of that, although you might want to hook into the reader as well if Kamila supports it (I didn't see anything about that in the github readme).
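For the simplest case (a function whose only self calls are direct tail calls), a macro along these lines would do it. This is just a sketch: DEFUN/TCO is a made-up name, it handles only required parameters, and it blindly rewrites every self call, so it's only correct when those calls really are in tail position.

(defmacro defun/tco (name (&rest args) &body body)
  ;; Sketch: rewrite direct self tail calls into "rebind the parameters and
  ;; jump back to the top", so no stack frame is pushed per call.
  (let ((top (gensym "TOP"))
        (blk (gensym "BLOCK")))
    `(defun ,name ,args
       (block ,blk
         (tagbody
            ,top
            (macrolet ((,name (&rest new-args)
                         ;; A self call becomes a PSETQ of the parameters
                         ;; followed by a GO to the top of the body.
                         (list* 'progn
                                (cons 'psetq (mapcan #'list ',args new-args))
                                (list (list 'go ',top)))))
              (return-from ,blk (progn ,@body))))))))

;; Usage:
(defun/tco count-down (n)
  (if (zerop n)
      :done
      (count-down (1- n))))

(count-down 1000000)   ; => :DONE, without growing the stack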
While it is true that tail call relationships can always be represented using goto, all of the related code has to be known to make this conversion. For instance:
(defun f (x) (a x) (g x))
(defun g (x) (b x) (h x))
(defun h (x) (c x) (f x))
You cannot remove the tail calls by altering the AST of F, G, or H alone. Instead, a third function must be introduced that contains the bodies of all the functions in the loop.
(defun fgh-loop (start-at x)
  (tagbody
     (case start-at
       ((f) (go f))
       ((g) (go g))
       ((h) (go h)))
   f (a x) (go g)
   g (b x) (go h)
   h (c x) (go f)))
(defun f (x) (fgh-loop 'f x))
(defun g (x) (fgh-loop 'g x))
(defun h (x) (fgh-loop 'h x))
You can only apply such optimisations in retrospect, but a macro would only have access to the body of F at the point of definition for F. And even then, you cannot TCO any lambdas that are passed around. A simple example is the Y combinator:
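For reference, here is a sketch of what that looks like in Common Lisp (not necessarily the exact form the comment has in mind). The recursive step goes through SELF, a closure supplied at run time, so a macro that only sees this source has nothing it could turn into a GO.

;; Applicative-order Y combinator.
(defun y (f)
  (flet ((wrap (x)
           (funcall f (lambda (&rest args)
                        (apply (funcall x x) args)))))
    (wrap #'wrap)))

;; The tail call (funcall self ...) goes through a closure that is invisible
;; to any macro that only saw this lambda's definition.
(funcall (y (lambda (self)
              (lambda (n)
                (if (zerop n)
                    :done
                    (funcall self (1- n))))))
         5)
;; => :DONE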
> You can only apply such optimisations in retrospect, but a macro would only have access to the body of F at the point of definition for F.
There's no actual reason for this: you can get the symbol tree for the functions in question, assuming the runtime allows it (several do), re-compile them, and blow over the old definitions. You can only apply the optimization once all of f, g, and h are defined, but there's no rule saying that a macro can only modify the function being defined, or that it can't redefine other functions (CLOS would be impossible if that were the case).
Both this and the lambda example should be doable with function-lambda-expression, although what it returns isn't exactly pinned down by the standard (implementations may return nil). It should be possible in principle with the right implementation.
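Roughly like this; the exact result is implementation-dependent, since the standard allows the first value to be NIL:

(defun square (x) (* x x))

;; FUNCTION-LAMBDA-EXPRESSION returns (values lambda-expression closure-p name).
(function-lambda-expression #'square)
;; e.g. => (LAMBDA (X) (BLOCK SQUARE (* X X))), NIL, SQUARE
;; ... or NIL for the first value on implementations that discard source.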
> It should be possible in principle with the right implementation
Most implementations that do this also support TCO, and I think the process itself goes a little beyond AST manipulation. It's basically just manually combining the entire program into one function to get a version of unconstrained goto. It isn't a macro in the traditional lisp-sense of the word, more of a compiler for a new programming language that uses lisp as a host. Practically speaking, it doesn't have many of the properties that are important in macros. It is not defined in a composable way, and it needs deep integration with the language runtime. For instance, to handle definitions that mutually recur between different packages, it would need to be installed globally in the implementation rather than locally in a given package. So if two packages tried to use different recursion macros, they would likely be incompatible. While it could hypothetically be implemented on top of another lisp, it is really only possible to do so properly as a language feature.
> Most implementations that do this also support TCO
Sure, I acknowledged that in my original post. I just said that it didn't have to be done that way.
> It's basically just manually combining the entire program into one function to get a version of unconstrained goto.
You really only need to combine the functions in question, and Common Lisp already has a pretty unconstrained goto. Worst case, if you're willing to deal with the code bloat, you could copy the function definitions of the functions in question into each function as it's compiled (you'd have to defer macro expansion for n-1 of the functions until runtime, or conditionally trigger re-compilation as each one is compiled). Probably would be a big pain though.
> It isn't a macro in the traditional lisp-sense of the word, more of a compiler for a new programming language that uses lisp as a host.
I mean, isn't making a dsl out of lisp macros rather traditional? That's how the loop construct began. CLOS and the Metaobject Protocol also started out as third-party packages, and they're also absolute beasts.
But, like, that's kinda the point of lisp. You can add your own syntactic abstractions to the language. Adding things to the language is part of the language. Is the argument here that those dsls aren't really lisp? How about the dsls that made it into some of the standards?
> it doesn't have many of the properties that are important in macros. It is not defined in a composable way, and it needs deep integration with the language runtime.
I'm not entirely sure where you got these requirements for macros, but I've never heard them. Admittedly, I've spent most of my time in Common Lisp and playing around with some of its ancestors. Maybe this is the case in some of the more Scheme-derived languages with hygienic macro-systems? I haven't really played around with them that much aside from a semester of Scheme way back in the day in college.
Also, you might be able to get around the language runtime integration by hooking into the reader, but that would probably require making a meta-compiler (or at least keeping track of the symbol trees of everything that you've compiled so far). Would be a lot more portable than the thing I'm thinking of though.
> So if two packages tried to use different recursion macros, they would likely be incompatible.
With the implementation I have in my head, I think you could rewrite the ones in the other package to use whichever recursion method the last-compiled one uses, but this would probably open up even more of a can of worms.
I agree that this is a good reason to do it as a language feature instead of as a macro, but not that it's impossible to do as a macro.
> isn't making a dsl out of lisp macros rather traditional
The idea of a DSL is that it coexists with the regular code. Eg:
(some-function (sql-query :select * :from ...))
The language mixes freely with other lisp forms and even with other DSLs. You can combine them like Lego bricks. When the entire program is written in a DSL, it is no longer a program in the host language; it is a program in a new language which uses the host language as an interpreter. Otherwise we would say Python files are written in C, because the Python interpreter is written in C. Python would only become a DSL in a C program if Python expressions were embedded into and cooperated with the surrounding C code. Both languages must be present simultaneously rather than one just interpreting the other.
> You can add your own syntactic abstractions to the language
The idea of an abstraction is that it's closed in the formal sense. It affects only a small localised region of code. This makes it composable with other abstractions.
> I'm not entirely sure where you got these requirements for macros
It's what makes lisp macros different from the C pre-processor. You could, for example, run your C program through PHP as an additional pre-processing step and write arbitrarily complicated macros that way. The special thing about lisp is that it is homoiconic, and hence its macros can be composed in the same way other language features can. Eg:
#define TEST 1)
(1 + TEST == 2
It is valid to do this in C because macros are not closed. They can spill out and affect the surrounding code. While you technically could do something similar with reader macros in lisp, you notice that in practice, these macros are all written to be closed so that they can fit in with the code around them in a predictable way.
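By contrast, a throwaway sketch of an ordinary "closed" Lisp macro (DOUBLED is made up): the expansion is a complete form, so it nests inside surrounding code without surprises.

(defmacro doubled (x)
  `(* 2 ,x))

(+ 1 (doubled (+ 3 4)))   ; => 15; expands to (+ 1 (* 2 (+ 3 4)))
;; The macro can't leak half a form into its surroundings the way the
;; C example above does.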
> hygienic macro-systems
While this is similar in the sense that it aims to have macros act more predictably, I am talking about a different aspect of macros. Specifically, I am saying that there are certain aspects of lisp macros which make them far more useful than simple text substitution. Certainly, most lisp dialects allow for macros which violate these principles, but I would say those macros are malformed. It is bad form to write them in lisp code, hence I don't consider them lisp macros.
Hmm, ok, I think I might get what you're saying; it's not exactly what I thought you were getting at originally, although I'm still not quite following. I did bring up hooking into the reader, but that was for something more than just the basic TCO with macros. I'm reasonably sure that what I was talking about can be done with normal macros themselves.
Just to check my understanding:
So, going back to your example, we have 3 functions f, g, and h. These will be re-written as one big function with gotos and three helper functions that call into the right point of that function.
So, is your objection to this that the other functions are re-written, or that a new function is defined in the namespace you happen to be in?
To elaborate further, let's say f, g, and h are defined in order. When f and g are compiled, nothing happens, but when h is compiled, noting that f and g are already defined, h is compiled into something like:
(defun h (x)
  (flet ((fgh-loop (start-at x)
           (tagbody
              (case start-at
                ((f) (go f))
                ((g) (go g))
                ((h) (go h)))
            f (a x) (go g)
            g (b x) (go h)
            h (c x) (go f))))
    (fgh-loop 'h x)))
Obtaining the definitions for f and g by examining the environment and copying them over. That should be alright, in your opinion, right?
There's not as much benefit, as f and g don't get the optimization, but in exchange we don't get spooky action at a distance.
How about if we used a macro to define functions instead of the traditional defun? Each one registers the function as willing to be re-written once everything it depends on is defined. That should give the user of the code a warning that these functions are likely to be re-written. Functions that aren't defined using this macro aren't re-written, but don't gain the benefit.
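As a very rough sketch of that registration idea (every name below is made up, and the actual rewrite pass is left as a stub):

(defvar *tco-registry* (make-hash-table :test #'eq))

(defun rewrite-mutual-tail-calls (names)
  ;; Stub: a real version would splice the registered bodies of NAMES into
  ;; one TAGBODY and redefine each function as a trampoline into it, like
  ;; the FGH-LOOP example above.
  (format t "~&;; would now rewrite ~S as one loop~%" names))

(defmacro define/tco (name (&rest args) (&rest tail-calls) &body body)
  `(progn
     (defun ,name ,args ,@body)   ; ordinary definition either way
     (setf (gethash ',name *tco-registry*)
           (list :args ',args :tail-calls ',tail-calls :body ',body))
     ;; Once the declared dependencies are all registered, fire the rewrite.
     ;; (A real version would chase the whole mutually recursive group,
     ;;  not just the direct dependencies.)
     (when (every (lambda (dep) (gethash dep *tco-registry*)) ',tail-calls)
       (rewrite-mutual-tail-calls (list* ',name ',tail-calls)))
     ',name))

;; Usage, declaring the tail-call targets explicitly for simplicity:
;;   (define/tco f (x) (g) (a x) (g x))
;;   (define/tco g (x) (h) (b x) (h x))
;;   (define/tco h (x) (f) (c x) (f x))   ; rewrite fires here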
Edit: I thought about it a little more, though. The more interesting question is regarding the standard library, rather than the compiler. Applying different licenses to the two is not unusual (gcc is GPL, glibc is LGPL). From what I can see, the standard library is embedded entirely into the compiler, and thus yeah, I do think programs that use this would actually have to be GPL. Not that that's a bad thing though :)
More important than the standard library (which isn't so standard, just conventional, see musl) is the GCC Runtime Library, which is GPLv3, but with the "GCC Runtime Library Exception".
There is also an interesting question about license compatibility between GPLv3 and GPLv3 with RLE, but that is mostly ignored by everybody.
Unless KamilaLisp specifies exceptions to the GPL, it means that if you ship your own KamilaLisp program in a way that is combined with KamilaLisp (for instance compiled into one big executable that includes the KamilaLisp run-time), then the entire combination has to be distributed under the GPL, meaning that your program has to have a GPL-compatible open source license.
Programs that are not combined with KamilaLisp, only requiring an installation of KamilaLisp for their execution, almost certainly don't have to be GPLed.
Beg to differ. Maybe you can decipher all this license stuff but I can't and I'm sure OP was asking people who are more savvy about it to get an answer, not to be lectured.
I don't understand the legal speak on these pages. What does your comment mean? That I'm supposed to reach a certain level of understanding of open source terms to be allowed on here? Sounds extremely narrow minded
I told you I have read the terms of these things and don't understand them. You have like 5 comments on here and you're lecturing people on the culture of HN. This is why your comment was flagged. Get over yourself
I said this in another thread too, but the problem with Lisp is that it's sorta bundled with Emacs, so if you want to use LISP's powerful REPL you really have no choice other than learning Emacs. Essentially, Lisp is not just a "language"; it's a whole system designed to explore programming ideas. It includes the IDE, the minimal syntax, REPL, compiler, etc. All of this together makes "Lisp" the powerful and enlightening tool that people talk about.
I think the other "inconveniences" of Lisp could be more tolerable for beginners if learning the language didn't require learning a new IDE (or OS, depending on how you define Emacs!). But at that point you'd have to forego a major benefit of using Lisp (its REPL); you'd be back to writing "dead" programs, not image-based "live" ones.
Another problem I've faced with Lisp is the lack of good documentation (except for Racket, but then again, Racket doesn't have Common Lisp's powerful REPL). Every website that teaches Lisp is in an ugly HTML+CSS-only style; compare that to the more user-friendly websites of other languages.
Then there's the issue of up-to-date learning material. Aside from the fact that there are very few resources to learn Lisp, the ones that are available are too old, too: "Practical Common Lisp" (2005), "Common Lisp Recipes" (2015), "ANSI Common Lisp" (1995), etc.
I like the philosophy of (s-exp) but modern lisps have ruined its simplicity for me by introducing additional bracket notations [like this]. It's confusing for me as a beginner to distinguish between (this) and [that], and honestly it goes against the whole "code and data look the same" motto.
> I said this in another thread too, but the problem with Lisp is that it's sorta bundled with Emacs, so if you want to use LISP's powerful REPL you really have no choice other than learning Emacs.
You can use Medley, which is Interlisp-D, a different, parallel strain of Lisp, a descendant of MACLISP, itself an ur-lisp ancestor of Interlisp, emacs lisp (older than EMACS, which didn’t start out as a lisp program at all), CommonLisp, Multics MACLISP, and that even begat Scheme.
Interlisp didn’t even keep its code as text files (though I wrote an eMacs for it at PARC back in the early 80s) so you may find it more accessible.
What about clojure? IntelliJ & VS Code are also popular IDEs for it.
It also supports data structure literals, e.g. vectors [1 2 3] and maps {:a 1 :b 2}, which are mostly considered helpful for beginners since they're closer to what they're used to in other languages.
While I use Emacs and I'm not too familiar with other editors, I'd question how true this is today; Clojure in particular is often written with VS Code, IntelliJ, and Vim, and all of them seem to have good REPL support.
> Lisp is that it's sorta bundled with Emacs ... Essentially, Lisp is not just a "language" ... All of this together makes "Lisp" the powerful and enlightening tool that people talk about
Here we have a conflict: Lisp survives because it is different and groundbreaking, like Bach, Shakespeare, Einstein... But it is also old. If you study Bach, you'll find that he composed for instruments which are out of fashion, like the harpsichord. When did we last hear one of his compositions on an original instrument from the time he was living? His music has been reused and re-interpreted, but the original impression, played live on historic instruments, is rare.
Lisp is also not "modern"; it doesn't fit into the "fast fashion" world of current software with ever faster hype cycles, where JavaScript spawns a hundred new framework variants every day and your software from five years ago is no longer supported. Just like our phones don't get any software updates after five years (if not earlier).
In that world, a book like "Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp" by Peter Norvig is outdated. At the same time, it might be a timeless classic.
The tooling for it has been developed and accumulated over decades and can't be reimplemented every other year. It's not powered by Microsoft, a trillion-dollar company, currently fueling the AI hype. Lisp is not in the hype cycle industry.
Using GNU Emacs plus extensions like SLIME or SLY as a dev environment is just an effect of the low amount of resources and of the concentration on a tool which is itself programmable in Lisp. None of the other IDEs (IntelliJ, Visual Studio, ...) is easily extensible in Lisp.
> not image-based "live" ones.
to have "live programs" doesn't need "images". Image-based development is something different. For example ECL has the usual Lisp tools embedded, but can't create or start images. SBCL can save and load images, but doesn't use it much, beyond being able to deliver applications with it.
Few people have ever seen or used the real image-based development tools like Interlisp/Medley (-> https://interlisp.org ), Symbolics Genera, LispWorks ( https://lispworks.com ), etc. None of those use GNU Emacs as a dev environment.
> learn Lisp, the ones that are available are too old
Lisp is old, too.
The newer versions are no longer Lisp. They are nicely on the web, but they are no longer Lisp, as SICP has shown.
Featured on Arraycast:
https://www.arraycast.com/episodes/episode74-kamilalisp