The tricky thing about if, and, and or --- the reason you can't implement them as functions in most languages --- is that they need to not evaluate all their arguments immediately. Otherwise:
// Would print!
if(false, print("oops!"))
// Would throw an error if the key is not present
and(my_hashmap.has_key("key"), my_hashmap["key"])
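To see that eagerness concretely in a conventional language, here's a quick Kotlin sketch (naiveIf is a made-up name for illustration, nothing from Rye):

    // Ordinary arguments are evaluated before the call, so the "unreachable"
    // branch's side effect happens anyway.
    fun naiveIf(cond: Boolean, branch: Unit) {
        if (cond) { /* the branch's side effects already happened by now */ }
    }

    fun main() {
        naiveIf(false, println("oops!"))   // still prints "oops!"
    }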
The way that ryelang gets around this is that you pass the arguments in a "code block" surrounded by "{}", which delays its evaluation. So you write:
// Does not print, because *if* never runs its code block arg
if 0 { print("oops!") }
// There's no example of *and* anywhere but my guess is you'd write this:
and { my_hashmap.has_key("key") } { my_hashmap["key"] }
In the Maxima computer algebra system[1], which was ancestrally based on Lisp, there is a single-quote operator[2] that delays evaluation of an expression, and a "double quote" operator[3] (actually two single quotes rather than a true double-quote character) that asks Maxima to evaluate an expression immediately rather than leaving it in symbolic form.[4]
Is this correct though? Lisp's quote would need some eval or something to evaluate later afaik. More fitting might be a (lambda () ...), a.k.a. lazy evaluation.
I would imagine it is closer to lambda than quote (though also a special form), since the implementation of if would require that the bindings in the arguments evaluate to their values in the caller's environment.
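A minimal Kotlin sketch of that reading, with the branches wrapped in zero-argument lambdas (thunks); myIf is an illustrative name, not Rye's actual mechanism:

    // Each lambda closes over the caller's bindings, so names resolve in the
    // caller's environment, and only the chosen branch ever runs.
    fun <T> myIf(cond: Boolean, then: () -> T, otherwise: () -> T): T =
        if (cond) then() else otherwise()

    fun main() {
        val x = 10
        println(myIf(x > 5, { "big" }, { "small" }))   // prints "big"
        myIf(false, { println("oops!") }, { })         // prints nothing
    }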
Smalltalk may be influential, but is now rarely used.
The code block approach is widely applied in two massively used industrial languages though: Ruby and Kotlin. In Kotlin specifically it's one of the very central features.
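For anyone who hasn't used it: in Kotlin, a call whose last parameter is a function can move that argument outside the parentheses, so user-defined control flow reads much like the Rye examples above (unless here is a made-up function, not part of the standard library):

    fun unless(cond: Boolean, block: () -> Unit) {
        if (!cond) block()
    }

    fun main() {
        unless(false) { println("runs, because the condition is false") }
        unless(true) { println("never evaluated") }
    }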
Likewise in Tcl where blocks can either be sent quoted (using {}) or unquoted (using “”). In the latter, variables and procs will have a round of substitution performed before being passed to the proc.
TCL is kinda similar to Rebol in some ways but in other ways it's the opposite of Rebol, because in TCL everything is a string (although it can ALSO have another type, thanks to clever shenanigans). (You probably knew this!)
I heard this "everything is a string" line many times about Tcl and it sounded a little unusual, but I haven't delved deep enough into Tcl to see what it really meant and brought. I will.
Everything has a string rep available. It used to be that everything was also represented literally by a string. So, for pedagogical purposes, a value 1 would be "1", and to do math, Tcl would do a strtol(val_storage), with the obvious performance implications.
The way things are done now (and have been for a long time) is that values are stored in a
    struct Tcl_Obj {
        int refCount;       // objs can be shared
        int myType;         // indicates whether currently a long, double, etc.
        long longVal;
        double dblVal;
        [...]
        char *stringRep;
        int len;
    };
...in fact, the Tcl_Obj is more sophisticated than this, but for demonstration purposes this is fine.
So "native" (eg: longVal) values are used when appropriate, no marshalling back/forth between strings, but the string rep is always available (can be generated), because that's what Tcl promises: everything is representable as a string. This is what brings the homoiconicity to Tcl - logically it's just passing text tokens around, and emitting text tokens. Internally, again, more sophisticated, but you get the point.
Yes, as samatman said, the main reason is that blocks (of code or data; there is no difference) don't evaluate by default, so they are passed as function arguments and the function can potentially evaluate them. So if is a function that accepts two arguments: a boolean and a block of code.
loop is also a function that accepts two arguments: an integer for the number of loops and, again, a block of code.
Even fn, which creates functions, is a function that accepts two blocks: the first is a list of arguments and the second is a block of code. There is no big difference between the function fn and the function print; they are both builtin functions defined in the same manner, there are multiple fns for special cases, and you can create your own at the "library" level.
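As a hedged analogy in ordinary Kotlin (not Rye itself), loop really can be just a function whose second argument happens to be a block:

    fun loop(times: Int, block: () -> Unit) = repeat(times) { block() }

    fun main() {
        loop(3) { println("hello") }   // the block only ever runs inside `loop`
    }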
"In REBOL, contrary to Lisps, blocks or lists don’t evaluate by default. For better or for worse, this little difference is what makes REBOL - REBOL." https://ryelang.org/meet_rye/basics/doing_blocks/
It's difficult for a language with this semantics to be made efficient, but efficiency isn't everything.
They're special forms in a language where a function is something that eagerly evaluates its arguments.
In a language where functions don't do that, they're just functions. Rye appears to be one of those languages. One could quibble about whether that's the right thing to call them, but if we say they're vau or whatever, "everything in Rye is a vau" is still true. I think calling them functions is reasonable though.
In Kernel, we could argue that operatives are the more fundamental type of combiner, and applicatives (aka functions) are the sole special-form, constructed by calling `wrap` on another combiner.
It's difficult to optimize because evaluation of expressions depends on the dynamic environment, which you don't have ahead-of-time.
You can't even assume `+` means "add the arguments", because `+` may have been bound to something completely different prior to evaluating the expression containing `+`.
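A tiny Kotlin sketch of why that defeats ahead-of-time optimization (the environment and names here are invented for illustration): the same expression gives different results once `+` has been rebound.

    fun main() {
        // `+` is just an entry in a mutable environment, looked up at evaluation time.
        val env = mutableMapOf<String, (Long, Long) -> Long>(
            "+" to { a, b -> a + b }
        )

        // "Evaluating" the expression (+ 1 2) means looking `+` up right now.
        fun evalPlus(): Long = env.getValue("+")(1, 2)

        println(evalPlus())            // 3
        env["+"] = { a, b -> a * b }   // rebind `+` before the next evaluation
        println(evalPlus())            // 2 -- same expression, different meaning
    }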
Late to the party here. What would be your take on dynamic-by-default, but with the ability to fix the env a la Dreams (http://elilabs.com/~rj/dreams/dreams-rep.html) or Zig (comptime)? You can obv. do this in Rye via a static context.
Yes, but with supposedly no special forms there cannot be a lambda operator.
Turns out the language does have special forms, which is OK; it's just weird to say there aren't (though it's an understandable goal).
When I worked on 3Lisp (so many decades ago) it became clear to me how many special forms there are (a small number, but more than I thought) and, honestly, how few there really are, so the "benefit" of a 3Lisp turns out to be negligible in practice. Oddly enough I didn't really notice that when writing interpreters because I thought of most special forms as simply compiler hacks ("eventually we can get rid of this").
I would say Rye does not have special forms; in LISP aiui, lists are evaluated by looking up the definition of the first symbol, then evaluating the rest of the list, then apply the rest of the list to the definition. Except when the first symbol is one of a small number; then the list is evaluated differently.
Rye's evaluator seems more complicated, but the forms are regular. A block is always evaluated the same way; it doesn't change how it's evaluated based on what the first element is.
You can have lambdas without special forms in Rye because blocks aren't evaluated eagerly.
Of course I could be way off; I've been having fun this morning poking at Rye for the first time and my Lisp / Scheme exposure is limited to Uni classes eons ago and a resulting allergy to parentheses.
Seeing the meta-circular Rye would tell us for sure :)
> then evaluating the rest of the list, then apply the rest of the list to the definition
True for function calls. But not for the zillions of macros. The "small number" you mention is the small number of special operators. But there are many more macros. Those get the arg source unevaluated and return a Lisp form, which is then checked again (either by the compiler or at runtime by a source interpreter).
> it became clear to me how many special forms there are … and … how few there really are …
I'm having trouble parsing this. The two parts there seem to be saying opposite things. Was that an accident, or were you saying that from one point of view it seems to be a lot while from another point of view it doesn't, or something else?
The latter. When writing a somewhat standard implementation people expect redundant special forms (like both if and cond), so there are more than you think. OTOH you can implement some in terms of the others, so maybe there aren't as many as one might think.
Also, of course, in 3Lisp you can run code in your interpreter and so define new control structures and such. Turns out there aren’t many interesting ones and they have mostly already been thought of.
One new control structure that didn't need to modify its own interpreter was method combinators. Turns out they're mainly useful for unpredictable behavior, except in very simple cases like :before and :after.
In most languages, if `cond1` evaluates to true, you would not evaluate `cond2`. If `cond1` and `cond2` evaluate to false, you would not evaluate `cond3`. If all conds evaluate to false, you would not evaluate `effect1()`, and if `cond3` and either `cond1` or `cond2` are true, you would not evaluate `effect2()`.
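For concreteness, here is one expression consistent with that description (an assumption on my part; the original snippet isn't quoted here), with `&&` and `||` short-circuiting exactly as listed:

    fun effect1() = println("effect1")
    fun effect2() = println("effect2")

    // cond3 is a function here only so we can observe whether it gets evaluated.
    fun demo(cond1: Boolean, cond2: Boolean, cond3: () -> Boolean) =
        if ((cond1 || cond2) && cond3()) effect1() else effect2()

    fun main() {
        demo(false, false) { println("cond3 evaluated"); true }   // prints only "effect2"
        demo(true, false) { true }                                // prints "effect1"
    }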
They're not functions because they don't evaluate their arguments before evaluating their body. Their operands are passed verbatim and evaluated explicitly by the body on demand.
In Kernel, for example, we can define these as operatives, which don't evaluate their operands. Assuming we have some primitive operatives `$cond`, `$define!` and `$vau` (the constructor of operatives), and an applicative `eval`:
These aren't the definitions Kernel uses in its standard environment. It uses recursive definitions of `$and?` and `$or?` which take an arbitrary number of operands, and `$cond` is defined in terms of `$if`, which is primitive:
It's doing if(cond, effect1, effect2) where effect1 and effect2 are functions, and only evaluating the matching effect function. But everything is functions.
Technically, effect1 and effect2 are so-called "blocks of code" in rebol/rye/red/...
Everything being a function is trying to say that every "active word" (a word that does something ... print, first, if, fn, context, extends, ...) is just a function.
How does it implement and, or, or if?