There's something very satisfying about how this style seems to "climb the abstraction ladder" very quickly, yet none of the abstractions he creates is wasted; each is immediately put to use. I think much of the amazement and beauty is that there isn't much code at all, and yet it does so much. It's the complete opposite of the bloated, lazy, lowest-common-denominator trend that's been spreading in many other languages' communities.
I was surprised at how verbose and commented the code was. Then I read the note saying that Whitney's original code was in `ref/` and the two files in the root were annotated by other kparc members.
I know you're joking but people who write C like him would argue it's not obfuscated. It's certainly not intentionally written to be more difficult to read, and those who are practiced in the art often say that it's _easier_ to read.
You know it's in jest, of course. However, he doesn't write C either - and you know that better than us. It's a DSL with its own idioms and quirks that just happens to be written in C. Having often heard about the "evils" of the preprocessor (which I both do and don't agree with), I wonder if he ever considered anything else, aside from C, that's also low level. Heck, even asm can be macro'd away.
I recall a chap called Geocar demoing kOS, in a very impressive fashion. (I also recall that someone had to awkwardly hold the mic for the whole presentation.)
I am interested, but one thing I have still not quite understood is where to use this family of languages.
As far as I can tell they are best for data analysis but not for heavy numerical computation, because most or all of them lack GPU support. Is that right or have I got it wrong?
q/kdb+ is used in finance (banking + funds) for heavy numerical computation every day. high-volume realtime data straight from markets, and petabyte/trillion-row historical DBs. it runs on CPU but computation easily parallelizes over cores/clusters.
Thanks, that gives me a better feel for it. Mostly analytics, good with large datasets, but probably not great for things where you get a big gain from GPU?
q is good with bulk operations on compact arrays; these are cache-friendly and the interpreter can utilize cache-level parallelism. and with q it's convenient to go from idea -> MVP in short time. it's a high-level language with functional features so expressing algos and complex logic is natural.
but it's interpreted and optimized for array ops. so really latency-critical (e.g. high-freq trading) or highly scalar logic will be done with C++. the trade-off is convenience of development.
I have a longstanding fascination with K and other "modern" APL derivatives.
There are a few intersecting truisms about coding that I believe. One is that people's working memory varies: some have an immense amount, some less. Humans also definitely process spatially better than serially over time (e.g. comparing things side by side rather than turning a page back and forth).
This implies you should prefer succinct code and languages, because they impose less memory load on the engineers working with them.
At the same time, a corollary is that a smaller standard library / language is generally better, in that less needs to be learned by an engineer for full coverage of the language.
Another truism is that some people's processing speed is higher than others, and in general I think of the combination of working memory + speed as roughly equivalent to "g", general intelligence.
K occupies this weirdo place though, because it's absolutely succinct, a very small language as counted by number of atoms supported by the interpreter, and also incredibly hard to scan.
One of the K intros I read mentioned that the language is designed to be something that takes down your thinking; essentially the idea is that the workflow is "drink coffee with fellow PhDs, annotate on the chalkboard, and then when ready, capture it directly." This seems about right to me with my own K/J/Q experiences -- the bulk of the time is spent thinking about structuring a problem solution.
I compare this to Go, a language I love for its long-term readability and maintainability, where I spend a lot of time writing boilerplate and dealing with errors in situ.
At any rate, somehow there's a sort of event horizon of terse solution making where you come out the other side and need a 170 IQ to feel comfortable, and Mr. Whitney lives where he lives, and I live where I live. :)
People complaining about how ugly the C code is here are definitely missing the point: he has bent C's preprocessor to his will in order to encapsulate how he thinks about coding: essentially functional, vectorized. It's using C to write a DSL for solving programming problems interesting to Arthur Whitney.
I think it's fascinating on those terms. In a world where you have to read 10,000 lines of code from 100 developers, the C is terrible, and hard to parse. In a world where you will mostly write code to a style you've honed over 40+ years, it's super expressive, minimal, pared down to what matters, and probably fits his brain perfectly.
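To make that concrete, here is a minimal sketch in the same spirit; the macros and names are my own toy inventions, not Whitney's actual definitions:

    /* toy macros in the spirit being described, not the repo's real ones */
    #include <stdio.h>
    #include <stdlib.h>
    typedef long I;                                    /* one short integer type */
    typedef I* V;                                      /* a "vector" of longs    */
    #define DO(n,x) {I i=0,_n=(n);for(;i<_n;++i){x;}}  /* vectorized loop        */
    V vnew(I n){return malloc(n*sizeof(I));}
    V vadd(V a,V b,I n){V r=vnew(n);DO(n,r[i]=a[i]+b[i])return r;}  /* a+b */
    I vsum(V a,I n){I s=0;DO(n,s+=a[i])return s;}                   /* +/a */
    int main(){I n=5;V a=vnew(n),b=vnew(n);
     DO(n,a[i]=i;b[i]=i*i)
     printf("%ld\n",vsum(vadd(a,b,n),n));return 0;}    /* prints 40 */

Once DO and the one-letter type names are in your head, each definition reads as a single vectorized thought rather than a page of loop scaffolding.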
One analogy I like to tell people who are overcome with shock and horror at APL-family languages is to compare it to someone used to Latin-family human languages looking at something like Chinese for the first time --- it's likewise totally "unreadable" at first glance, but then you realise that over a billion people read and write that language fluently every day, many of whom may also struggle with a Latin-family language, as they've never seen one before.
I'm not convinced that APL is "incredibly hard to scan" for someone who is familiar with it; it's just a matter of experience. While I'm by no means experienced in APL either, a visually similar thing I did frequently in my younger days was reading x86 instructions not in a disassembler nor hexdump, but displayed as CP437. It was not hard, and I can still remember ┤, PQRS, ═!, and ├ as "MOV AH", "PUSH AX; PUSH CX; PUSH DX; PUSH BX", "INT 21", and "RET" respectively. Here's an example:
ò║•═!├Hello world!$
Edit: looks like HN swallowed a byte, but you can sort of see what I mean.
DIVIDE X BY 5 GIVING Y.
y=x/5;
The first is COBOL (designed to make code easier for "normal" people to read). The second is C/Java/Python/JavaScript (which looks more like the math that we learn in grade school).
k/APL/J simply moves further in the direction of the algebraic notation you already know. The difference is that it packs more operations/algorithms into each symbol.
When you read the one-character symbols in K as algorithms versus characters, it makes much more sense. In addition, you can read "faster" in k than in other languages, relative to the functionality being expressed.
When I review C/Java/C++, I print out the source code and write the equivalent k code in the margin. The compression ratio is typically 10-20X. Doing so speeds up my work significantly when I go back over the reviewed code.
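For a concrete (if toy) example of what that margin note looks like, here's a routine I made up for illustration, with the k I'd scribble next to it in the comment:

    /* sum of the positive elements of a vector; the margin note in k would be
       roughly  +/x@&x>0  (filter with @&, then sum with +/) */
    #include <stddef.h>
    long sum_positive(const long *x, size_t n) {
        long s = 0;
        for (size_t i = 0; i < n; ++i)   /* walk the array element by element */
            if (x[i] > 0)
                s += x[i];               /* keep only the positives */
        return s;
    }

The C isn't bad, but the k fragment states the whole intent in one glance.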
As a moderately experienced K programmer, I find K easy enough to "scan". Common idioms immediately stand out as recognizable "words" that are suggestive of what a routine does before you fully parse it:
@& (filtering)
@< or @> (sorting)
@\: (folding)
,/ (flattening)
+\ (a running total)
etc.
It's also easy to notice, e.g. in K3, a reserved name like "_f" and immediately know you're looking at a recursive procedure.
In what sense is @\: folding? Doesn't it take a list of functions on the left and a thing on the right, and returns the list of values obtained by applying each function to the thing?
Another is regular expressions. They look entirely incomprehensible if you see one with no prior knowledge, but once you get used to them they are fairly easy to understand.
Oh absolutely. Pg talks at some point about how LISP macros can create a sense of godlike power / mania, and I think there's some of that in APL-land too. The true titans don't need to debug in the traditional sense - a whole program fits on a single screen, they can load it in their head, reason about it, and see what happened.
Non-titanic engineers accidentally generate bugs in these languages with high "floor" IQ requirements, and debugging someone else's K code is terrible. Debugging your own K code is terrible too, not least because when you get help you will feel very, very stupid indeed :)
In my opinion, the language that comes closest to K's functionality and is also understandable by mere mortals is K itself. It is obviously extremely close to K's functionality and is a very simple language; the only reason it doesn't seem simple is that most people are used to verbose languages. A couple of days of practice is enough to make K readable, in my experience.
Also, I am constantly amazed at how concise K is, easily rivaling not only conventional languages but also much larger array languages like APL or J. Arthur Whitney's taste in selecting primitives is out of this world.
I'm extremely biased in recommending it, but Lil is semantically very similar to Q, entirely free, and intended to be beginner-friendly: https://beyondloom.com/tools/trylil.html
It's not as powerful or concise as K, but it gives you some of the flavor of an array language tucked inside what resembles an ordinary imperative/functional scripting language.
I recommend checking out uiua.org for fun. The docs are well written and the concepts, while foreign to most, are ultimately accessible and interesting.
k and uiua are in different branches of the APL family.
I recommend checking out BQN at https://mlochbaum.github.io/BQN/ (it is well documented) and the YouTube channel code_report by Conor Hoekstra (and also "Composition Intuition by Conor Hoekstra | Lambda Days 2023").
This is super accurate, and interesting in the context of the repo. It’s clearly not the most accessible language or style - I wonder if someone who has hyper optimized for themselves in the way Arthur has can write an effective teaching language. It requires a theory of mind for minds that are probably quite different in the general case.
I don't know about Golang but there are languages where this sort of highly domain-optimized, super terse syntax could be embedded as a domain-specific sublanguage, in a significantly less hackish way than what C allows.
I'm not that knowledgeable, and def not a C apologist, but what language are you thinking of? To my eyes, that is some heavy abuse of the preprocessor in a way that I don't think almost any modern "safe" language could possibly countenance.
Reading that code, it looks like he got the expressivity benefits of a Lisp macro system at close-to-the-metal C speeds in something like 100 lines of code. I'm curious what else could do this.
Rust’s macro system is safe and hygienic; people have implemented Lisps in it. I just did a Google search to find an example, so I have no idea how well supported this is: https://github.com/JunSuzukiJapan/macro-lisp
The K language seems to be limited to CPU only. Yes, it can call out to GPU (https://code.kx.com/q/interfaces/gpus/) but K only runs on the CPU. This strikes me as odd for an array language.
It's around, and they recently stopped providing free download links for recent versions, from which I take it that they have a reasonable enterprise sales program rolling now.
The feature split between the free and enterprise editions holds a lot back; my vibe on the free version was that it was just enough to validate that shakti is performant, and then they want you to pay.
Heavy use of the C preprocessor and C defaults to embed a functional programming language. A language with a small number of core functions and the ability to apply functions to lists of atoms. An aesthetic favoring short identifiers and minimal whitespace to create high semantic density. Comments eschewed.
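A rough, made-up sketch of the "apply functions to lists of atoms" part in that aesthetic (the comments are only there to explain; the style itself would omit them):

    #include <stdio.h>
    typedef long I; typedef I(*F)(I);                 /* atom type, monadic fn */
    #define DO(n,x) {I i=0,_n=(n);for(;i<_n;++i){x;}} /* loop over a vector    */
    I sq(I x){return x*x;}                            /* one tiny "verb"       */
    void each(F f,I*v,I n){DO(n,v[i]=f(v[i]))}        /* map a verb over atoms */
    int main(){I v[]={1,2,3,4};each(sq,v,4);
     DO(4,printf("%ld ",v[i]))printf("\n");return 0;} /* prints 1 4 9 16       */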
You know how when you first start learning to code, the kids who really "get it" right away start off thinking shorter code = smarter code = better code?
k always seemed like a bunch of those kids managed to become highly accomplished and brilliant engineers without ever breaking that terrible habit. Is there actually a reason to write these array languages (and interpreters for them, apparently) this way, or is it just a cultural difference?
> k always seemed like a bunch of those kids managed to become highly accomplished and brilliant engineers without ever breaking that terrible habit. Is there actually a reason to write these array languages (and interpreters for them, apparently) this way, or is it just a cultural difference?
They're more readable and less buggy that way. But unfortunately most programmers would rather spend 10 days reading 100,000 lines than 4 days reading 1,000 lines.
Whitney and I both worked in the 1970s for I.P. Sharp Associates, which used an email system written beautifully in APL by Leslie Goldsmith. Most user names were simply our initials. Ian Sharp was IPS, Leslie LHG, and Arthur ATW. (Middle name Taylor.) I’ve been SJT ever since (including to two wives) but was not smart enough to grab the domain: see 5jt.com.
Arthur Whitney is showing how to build an array language using C written in array-language style.
AW is known for kdb+, which is often used in finance due to its extreme performance and the ability it gives quants to quickly explore ideas.
Personally, to get a better handle on how array languages worked, I implemented KlongPy, a Python implementation of Klong, which descends from K (which AW wrote).
You have to play with this stuff to understand it intuitively.
Ha! For those of us who were in the Morgan Stanley Fixed Income group for many years in the 1990s and 2000s, AW is known for A+, which was used extensively for modeling and application work.
There were a lot of different and kind of weird attempts at this in the industry. The basic problem is that recalculating DAGs and Monte Carlo simulations using a ton of floating-point math was extremely expensive. Excel was way too slow; C++ required expert programmers and much better organization of programming teams to figure out how to coordinate cached intermediate values. MS went with APL, which reduced a lot of the boilerplate and allowed for relatively easy memoization. GS went with SecDB, which treated computation as memoized DAGs. Not sure what the others did.
FWIW, kdb+ is not that extremely performant - there are a lot of things that could be faster, and a lot of limitations that mean you would often be better off not using a DB at all (or using another DB and just pulling everything you might need into memory). There is/was a tradeoff in that many things that would make it faster would require more code, and a cool thing about q/kdb+ is that it takes so little code you don't have I$ issues, but I think that's a tradeoff that doesn't make as much sense anymore in 2023.
Where it really shines is how neatly it's integrated into the q language, which is great for exploratory programming, and it's fast enough not to get in the way.
I've encountered this idea that k's terseness somehow improves instruction cache use before. Can you explain further? It seems nonsensical, since instruction caching is about machine code, not source code. Why should it use the instruction cache better than any other JIT? Or is it interpreted, in which case "the terseness of the language improves cache use" might seem more of an admission than a boast... :-)
Thanks for the insights. Not to overdo self-promotion, but aside from learning, the main reason I made KlongPy was to allow for optionality with the Python ecosystem: use Klong for array operations and other libraries for standard stuff.
Basically it is an APL-like language that is used (or was, I think it has moved on to a successor language called q now) in the proprietary kdb+ system used by some financial companies. Even if you aren't into finance it is a fun language to play around with.
Array languages are very fascinating and I'd love to spend more time learning about them.
I read that they are (K in particular) used in finance... would you reckon it would be easier to find a job in that field with that on the CV? I know I can't compete on the C++/Python side, but maybe as a skilled K developer I could sneak in.
Might not, since for this kind of thing most likely you either give up in a couple of minutes (which is not so bad, since it doesn't waste much of your time anyway) or you just read it.
Ha! The guy who wrote this is the same guy who invented ~~APL~~ a number of APL inspired languages (Edit: He did not invent APL. Thanks for the corrections!), so I suspect he may just be built different.
APL was invented by Kenneth Iverson, the other person mentioned on the page, not Arthur Whitney.
I was not convinced of the readability of APL, but compared to its successors which tried to stick to ASCII, I've learned to appreciate the merits of an extended character set.
https://www.jsoftware.com/ioj/iojATW.htm
(Discussed previously: https://news.ycombinator.com/item?id=25902615 )