
I know. It’s a feature, but it makes it harder to share code (and Chuck Moore believes that sharing code is rarely worth it). Maybe he is right. His productivity is unbelievable :)



The more time I spend dealing with blowback from excess complexity being imported in the form of 3rd-party libraries that offer complicated solutions to simple problems, the more I think that Chuck Moore was very, very right on that point.


I agree with you, but I wonder what his answer to stuff like GUIs would be. There’s a tremendous amount of complexity and domain knowledge in stuff like drawing fonts, and in cryptography, and so forth — and very very few of us have the time to become competent in even one of those, let alone all of them. Then consider the amount of work necessary to have a modern browser: text parsing of not one but three languages, language interpretation, more graphics, more cryptography.

It would be awesome to get back to first principles, but modern systems try to do so much that I wonder how practical it would be to reinvent them — and I wonder how practical it is to say, ‘well, don’t do that then.’


I don't know what Moore would say. Personally, I've retreated to the back end - used to be full stack, but I'm just sick to death of how overcomplicated front-end work has become.

I'm inclined to say that, e.g., the modern browser is a cautionary tale that complements the Chuck Moore approach to things: By forever piling thing on top of thing in an evolutionary way, you end up with a system that ultimately feels more and more cobbled together, and less and less like it ever had any sort of an intelligent designer. Perhaps the lesson is that it can be worthwhile to occasionally stop, take a real look at what things you really do need, aggressively discard the ones you don't, and properly re-engineer and re-build the system.

Obviously there are issues of interoperating with the rest of the world to consider there, and Moore has made a career of scrupulously avoiding such encumbrances. But a nerd can dream.


Also consider that all we know is a world that has become more global, open, and relatively peaceful since the 1970s. If collaboration were to slow or decline, open source would be harmed. And if Google and Facebook lose their dynamism to politics, regulation, and maturity, corporate-sponsored open source could be shaken. Google could become like AT&T, and Facebook like Ericsson, in some way.

Once-unstoppable sectors, like aerospace (to mix comparisons), began to reverse and decline in the early 70s. No one really saw it coming. I can't think of one publicly known or credible person who called it in 1969 shortly after the moon landing, at least on record. An oversupply of engineers in the US and the West became a thing, and engineering still suffers here because of aerospace's decline. Forth began to lose steam around then, right? Forth, hardware, and Cold War (barriers) politics are inextricably linked, perhaps. And then GNU/Linux and BSD saw their high-collaboration paradigm birthed around that time. The Nixon/Kissinger talks with a closed China began around then too, and now relations are breaking down with a more open China today.

Look how Lua scripting came about not so terribly long ago. Some parallels: Brazilian trade barriers. Now half believe Huawei is evil — the cross-hardware story may be cracking. Many believe Google is evil — open software may be cracking. And there are rifts between the US, EU, and China on how to regulate the internet. A new Cold War may be brewing. It's a nerd's nightmare.

If anyone can tie in distributed ledger and specialized AI coder productivity tools, or something to counter this argument or round it out, that would be awesome.

EDIT: I was mistaken. Forth caught on with personal computer hobbyists in the 1980s, per Wikipedia. However, as a career or industry, slowdowns in NASA and Cold War spending seemed to take some wind out of Forth's sails. I've noted that a lot of that type of work was what paid people to write Forth. And the open-source paradigm with C/C++ and GNU/Linux was even more limiting, I believe.


“I agree with you, but I wonder what his answer to stuff like GUIs would be.”

Couldn’t say exactly, but it’d probably look something like this:

https://en.wikipedia.org/wiki/Display_PostScript

:)


As far as I recall, Display PostScript was display-only; what you really want is NeWS, which used PostScript both for display and for building applications:

https://en.wikipedia.org/wiki/NeWS


Potayto, pohtato… it’s all Polish to me. ;)


Ehh... the reductio of this argument is writing everything in assembler (libc? giant hunk of third party code right there). I surmise that, by comparison, the blowback you encountered was relatively minor.


No, not writing everything in assembler; this isn't about high level or low level. It's about writing things yourself, for what you actually need.

Because most of the complexity comes from code (esp. libraries and drivers) trying to solve a larger problem than you actually have.

That's the same reason why, when you follow that logic, you eventually write your own Forth. Not because it's fun. Not because you want to learn about Forth or compilers. But because my Forth solves my problems the way I see fit, her Forth solves her problems the way she wants, and your Forth is going to solve yours the way you want.
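
To give a flavor of how small "your own Forth" can start out, here's a toy sketch in Python (the word set here is whatever I happen to need, which is exactly the point — yours would differ):

    # Toy Forth-style interpreter: a data stack plus a dictionary of "words".
    def forth(source, words=None):
        stack = []
        words = words or {
            "+":   lambda s: s.append(s.pop() + s.pop()),
            "*":   lambda s: s.append(s.pop() * s.pop()),
            "dup": lambda s: s.append(s[-1]),
            ".":   lambda s: print(s.pop()),
        }
        for token in source.split():
            if token in words:
                words[token](stack)       # execute a known word
            else:
                stack.append(int(token))  # anything else is an integer literal
        return stack

    forth("3 dup * .")   # prints 9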


It is entirely and completely about high level vs low level.

"High level" means details abstracted away and solved so you don't have to think about them. Our CPUs understand only the most primitive of instructions; the purpose of all software is to climb the ladder of abstraction, from a multiplication routine abstracting over repeated addition, to "Alexa, set an alarm for 8 AM." To write things yourself is the very essence of descending to a lower level.

Abstraction comes at the price of loss of fidelity, yes - Alexa might not ask you to specify exactly what form your alarm will take - but the benefits are a vastly increased power/effort ratio. It's worth it, because most of the time you don't care exactly how a task is done - you just care that it IS done. And - mostly - your needs are not that special.

Frankly, sharing information on how to do things so that others can build upon them is the only reason we have technology at all. Perhaps you've read "I, Pencil"? With a lifetime of effort and study, you would struggle to create a single pencil drawing from "scratch". Chuck Moore's supposedly astonishing productivity notwithstanding, I notice that all of the software I actually use is a heavily layered tower of abstraction (and, curiously, none of it is written by Chuck Moore). It appears that by and large the choice is between layered, multi-author code - and no code at all.

https://fee.org/resources/i-pencil/


> Chuck Moore's supposedly astonishing productivity notwithstanding, I notice that all of the software I actually use is a heavily layered tower of abstraction (and, curiously, none of it is written by Chuck Moore)

Perhaps you never saw the images from the Philae space probe? Because it's powered by an RTX2010, one of Chuck Moore's designs.

Maybe you don't use Moore's software directly, but you never know when it has been used for you [1].

[1] https://wiki.forth-ev.de/doku.php/events:ef2018:forth-in-tha...


There's a significant practical difference between importing the complexity at build time versus as part of the running application. Building on top of a compiler is not the same thing as importing external code.


Software today is developed by teams, not individuals. Systems custom-fit to an individual programmer are next to useless. You need libraries of common code in order to collaborate effectively without duplicating effort.

See also: Emacs, the ultimate customizer's editor, easily shapeable to your particular needs -- and currently losing badly to Visual Studio Code, which is only readily customized with configuration options and third-party packages. When you need to pair or mob, having a common toolset and vocabulary beats having a special-snowflake environment.


At least we can get rid of the bloated web apps once everyone begins to do so... but would it be possible, if libraries were written in a way that makes it easy to integrate just a portion into existing code?


The problem there is that most "libraries" are actually frameworks.

The difference I'm drawing is that libraries just provide a mess of utility functions. Theoretically, even if your compiler won't strip the library stuff you don't need, you'd be able to take just the bits you need by copy/pasting a relatively small volume of code. And dropping the library would be a small change that just requires finding replacements for the functions and classes you were using.

Frameworks tend to involve some Grand Unifying Abstraction that you need to inherit from, and that gets imposed on your own code. Things tend to be so tangled together at a conceptual level that it's not really possible to use them in an a la carte manner. Migrating off of a framework tends to require more-or-less a rewrite of all the code that interacts with it.

To take some Web examples: jQuery's more on the library side of things. D3 is more of a framework. React is very much a framework.
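
A sketch of the distinction (hypothetical names, Python just for brevity): a library hands you free-standing functions you could copy out; a framework hands you an abstraction that shapes your own code.

    # Library style: a utility function you call -- and could copy out
    # wholesale if you ever drop the library.
    def slugify(title):
        return "-".join(title.lower().split())

    # Framework style: a Grand Unifying Abstraction you inherit from.
    # The framework calls *you*, so your code is molded to its contract.
    class Component:                      # stand-in for a framework base class
        def render(self):
            raise NotImplementedError

    class TitleView(Component):
        def __init__(self, title):
            self.title = title
        def render(self):                 # must honor the framework's contract
            return "<h1>" + self.title + "</h1>"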


Wow, that got me thinking. What if specialized AI code recommenders could sniff out solutions? Get away from libraries with objects or structs whose methods mutate. As more people realize that composing functions (Forth has a concept of composing words, correct?) with fewer side effects is a good thing, I wonder if it's possible. There is some amount of my workflow where I'm looking at StackOverflow, my git project history or others', examples on blogs (at least when I was new), or my little code-snippet journal for stuff already solved. Automate getting idiomatic solutions from a StackOverflow or GitHub commits of sorts, or something. I know we're nowhere near, but FB's Aroma and others have the first-gen AI recommenders in the pipeline that, at a high level, do this. That way we are just dealing with code snippets.

I've only read Forth code and introductions to it, but it seems all about composition, as in the sketch below. However, this is hard to conceive with today's coding forums and repos, because most are gluing mutating library APIs (turtles all the way down) together. So a code recommender paradigm of this sort is chicken vs. egg.
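
For what I mean by composition over mutation, a toy sketch (plain Python; the compose helper is my own illustration, not from Forth or any AI tool):

    # Small pure functions, composed -- "words" in spirit:
    # each takes a value and returns a new one, no mutation in between.
    def compose(*fns):
        def composed(x):
            for fn in fns:
                x = fn(x)
            return x
        return composed

    clean = compose(str.strip, str.lower)
    clean("  Hello World  ")   # 'hello world'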



