“Coding is basically just ifs and for loops.” (twitter.com/id_aa_carmack)
496 points by x43b on Dec 4, 2021 | 393 comments



"First a punch is just a punch, Then a punch is not only a punch, And finally a punch is just a punch" (heard from Bruce Lee)

Basically it means that in the beginning we punch however we can. Then a martial arts student learns the proper and different ways to punch, so a punch is no longer just a punch: it's a lot of instructions. Then, through sheer practice and repetition, the right way to punch becomes second nature; the right way is the only way, so it's just a punch again.

So coding is just ifs and loops... Then it's OOP and Functional and Logic... But once you transcend all those paradigms, you understand it's all ifs and loops again.


Before one studies Zen, mountains are mountains and waters are waters; after a first glimpse into the truth of Zen, mountains are no longer mountains and waters are no longer waters; after enlightenment, mountains are once again mountains and waters once again waters.

Translation from http://www.livinglifefully.com/zensayings2.htm


"Before enlightenment: Ifs and loops. After enlightenment: Ifs and loops."

:)


heh.

"Before enlightenment - chop wood, carry water. After enlightenment - chop wood, carry water."


Before enlightenment coding is logic.

After a first glimpse it’s all about structure and architecture.

After enlightenment, coding is logic again.


I started off writing a lot of 68000 machine code. I was always amazed what you could accomplish with a lot of loops and branches. I never lost that sense that at whatever higher level I was at later on, it was all loops and branches.


Have you seen this? https://news.ycombinator.com/item?id=25788317

Seems like we lose a lot of good technology and progress for random reasons, as if the "RAM" of society were finite.


Thank you GhettoComputers for the link! And thank you Zhyl for the summary.

Sounds like the 2 hard things in Computer Science returning! Cache invalidation, naming things, and off-by-one errors. [1]

* "Knowledge is ephemeral": Cache invalidation

* "tower of abstraction": naming things, indirection [2]

Recommendations:

* "Reduce complexity, reduce dependencies": minimise entropy, maximise connectedness

* Learn: welcome newcomers; recurse to the next generation

[1] https://martinfowler.com/bliki/TwoHardThings.html

[2] "All problems in computer science can be solved by another level of indirection... except for the problem of too many layers of indirection."

https://en.wikipedia.org/wiki/Indirection#Overview


The redirection he called an abstraction: it makes things easier to build, at some cost in performance, and in many ways that's good. But there's a point where it isn't, and solutions like needing an AI to explain everything to us shouldn't be required.

I like how he talked about running games on bare metal without needing an OS, relying only on the ISA, and how games were easier to write at one point, before excessive abstractions.

One very memorable comment he made was when he was asked what he thought about the M1 chip running software faster: his response was that we could make current hardware run 100x faster if the software were well written. A huge part of the reason I switched to Linux. ;)


Yes, it's a good talk. Think of all the extremely talented internal combustion engine engineers that will be obsolete in 20 - 30 years.


Obsolete or repurposed? People studying physics like string theory may be completely wrong, but their skills can still be useful.


They’re obsolete now, except on YouTube.


ICEs have thousands of uses; they aren't just for cars.

An example? Up above the treeline, there are communities relying on diesel for power.

Solar is a no go, with months of darkness. Wind turbines are hard to maintain at -50C, and snow and ice can impede them.

Even electric cars can only cover some of car use-cases, both due to range and refuel time issues.

Expect ICE in 2050 still.

And in most places, electric cars take petrol out of the car, and instead, see things like coal or natural gas burned, to make the electricity to charge them.


> And in most places, electric cars take petrol out of the car, and instead, see things like coal or natural gas burned, to make the electricity to charge them.

This is true, but also a good thing, while you present it as at best a wash and possibly a negative.

Coal-powered EVs are better than gasoline ICEs in most situations, and nobody has a 100% coal grid anymore, so the benefits are even greater. There are lots of reasons why, but the main one, which is relevant to this conversation as well, is the inefficiency of ICE compared with electric motors.


What is more green? Coal or natural gas plants, AC lines that lose some power, building transformers, using AC to DC converters for all our power supplies, refining rare earth minerals to make decaying chemical cells, and using brushless motors (I think?) versus oil drilling with huge machines, refineries, transport and risks of oil spills, and building combustion engines? ICE looks like it can be way more efficient.


We don't need to guess, lots of time and energy has been put into answering this question:

https://evtool.ucsusa.org/


They don’t even have a Toyota Corolla or Camry in the miniature database.


In the list of electric vehicles, I wonder why not?


> Coal powered EVs are better than gasoline ICEs in most situations

Are you taking in account battery disposal?


Yes. It's actually one of the things that makes EVs greener. A lump of valuable metals in one relatively easy to access and pure form is a lot greener thing to have than a decade of combusted fossil fuels.


Hydrogen fuel isn't going away.


Reminds me of biological metabolic systems. All loops and branches…


Stephen Wolfram thinks the entire universe is built from simple rules

https://en.wikipedia.org/wiki/A_New_Kind_of_Science


For the algorithm to work in many dimensions (XML tags), I feel like it must be a simple algorithm.

Rule 110 is known to be Turing-complete. https://en.wikipedia.org/wiki/Rule_110

I have a feeling that Rule 90 is the most efficient algorithm: the Sierpiński Triangle, self-sacrifice, infinite recursion to a perfect pattern. I'm not mathematically talented enough to prove that though.

https://mathworld.wolfram.com/Rule90.html

Please come chat about A New Kind of Science on Facebook! A couple of weeks ago I noticed that the page didn't exist, so I created it. Hopefully Wolfram doesn't mind. We stand on the shoulders of giants.

https://www.facebook.com/A-New-Kind-of-Science-1004052657757...
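
(For anyone who wants to play with these: one step of an elementary CA is itself just an if inside a for loop. A minimal sketch; the rule number and grid width here are arbitrary choices, not anything from Wolfram's book.)

    class ElementaryCA {
        // Each new cell is determined by its 3-cell neighborhood, looked up
        // as a bit in the 8-bit rule number (90 draws the Sierpinski
        // triangle; 110 is the Turing-complete one).
        static boolean[] step(boolean[] cells, int rule) {
            boolean[] next = new boolean[cells.length];
            for (int i = 0; i < cells.length; i++) {
                int n = ((i > 0 && cells[i - 1]) ? 4 : 0)
                      | (cells[i] ? 2 : 0)
                      | ((i < cells.length - 1 && cells[i + 1]) ? 1 : 0);
                next[i] = ((rule >> n) & 1) == 1;
            }
            return next;
        }

        public static void main(String[] args) {
            boolean[] row = new boolean[64];
            row[32] = true; // single live cell in the middle
            for (int gen = 0; gen < 32; gen++) {
                StringBuilder sb = new StringBuilder();
                for (boolean c : row) sb.append(c ? '#' : ' ');
                System.out.println(sb);
                row = step(row, 90); // try 110 as well
            }
        }
    }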


I'm no physicist, but doesn't the Standard Model fit on a few pieces of paper? That the universe has rules that are a lot simpler than the behavior that emerges from those rules doesn't seem like a novel claim. Wolfram's idea to examine the universe by exploring CA-space is the novel part, I think.


With a lot of fuzziness, some state and temporal stuff.


Yea, the fuzziness comes from protons shuttling about, tunneling around, electrons moving in cycles with phosphates. It's all just vibrating, bumping and bumbling about. I remember how blown my mind was when I first learned that all reactions catalyzed by enzymes would have happened anyway given enough time and the enzymes just tune the systems to work in concert in a time frame conducive to life.


Yes, one of my early jobs was writing Z80 code and I have the same sense.


When I eventually learned C, it took me a while to stop optimizing and let the compiler do it.


Ultimately all the interesting stuff happens in loops and branches, the rest is just organization.


The organization is only needed for the inevitable bloat you add later to justify the rewrite.

It's something like the sage who uncovers the boiler, points at it, and announces: Look! It is a beautiful marvel of technology! Why would you want to cover it up? Look! You might learn something!


It’s not about how many loops you have; it’s about what you put in them and how deep.


Coding is not "ifs and loops, then $foo". That's a false premise. If you want to be fundamentalist about it (which I don't advise), a CPU is just a state machine.

assembly instruction + machine state => new machine state

Our ideas of "jumping around code" and "control flow", as in for loops or if statements, are themselves abstractions of control state (an instruction pointer) and data state (perhaps a CPU register).

So coding is really "the execution of a notional interpreter (aka semantics), then $foo." That gets to the absolute bottom of what "code" even is: instructions for an interpreting device (be it physical or mathematical).
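
(To make the state-machine framing concrete, here's a toy sketch. The three-instruction ISA is invented for illustration, not any real machine.)

    // One transition: instruction + machine state => new machine state.
    class ToyMachine {
        int pc;  // control state: instruction pointer
        int acc; // data state: a single accumulator register

        void step(int[] program) {
            int insn = program[pc];
            switch (insn >> 8) {                                    // opcode in the high byte
                case 0 -> { acc += insn & 0xFF; pc++; }             // ADD imm
                case 1 -> { acc -= insn & 0xFF; pc++; }             // SUB imm
                case 2 -> pc = (acc != 0) ? (insn & 0xFF) : pc + 1; // JNZ addr
                default -> throw new IllegalStateException("bad opcode");
            }
        }

        public static void main(String[] args) {
            // ADD 3; SUB 1; JNZ 1 -- a "for loop" is nothing but this
            // jump-on-condition transition repeated.
            int[] program = { 0x0003, 0x0101, 0x0201 };
            ToyMachine m = new ToyMachine();
            while (m.pc < program.length) m.step(program);
            System.out.println("acc = " + m.acc); // prints: acc = 0
        }
    }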


Oh, but CPU instructions can be built up from a series of μops which are then executed out of order or in an overlapping fashion, making both "instruction" and "state" boundaries in your equation more fuzzy than it looks. So at the absolute-absolute bottom of "code", it is instructions for an abstract model of an interpretative device.


(I'm not sure if your "but" is a retort, or an addendum. Assuming retort...) We can consider one more level of implementation machinery, but I fail to understand how uops aren't another form of instruction, and CPU internal gobbledygook like register files, caches, etc. aren't another form of state. It doesn't seem so fuzzy.


The difference is that you don't get to access the uops machine, subsets/sequences of that instruction set are offered as (macro-)instructions instead. Programming happens on the idea of a CPU.


> Oh, but CPU instructions ...

And who says your code runs on a CPU? :^)


That's more or less shu-ha-ri: https://martinfowler.com/bliki/ShuHaRi.html. Very useful concept; I often refer to it, for example to argue why "just be agile" probably won't work when there are juniors or when the team is new-ish. That's skipping to ri; start with shu (scrum or whatever set process).


That's what disappoints me in modern Java. There are practically no ifs. It makes the project inaccessible to beginners, and the streams are about twice as slow... just because devs think they are more clever if they pile up 7 ".map()" calls to transform a list.

  list.stream().filter(Objects::nonNull)
      .map(User::getUsername)
      .filter(Objects::nonNull)
      .filter(name -> name.contains(searchString))
      .map(name -> "We have found your user: " + name)
      .orElse("We haven't found");

Basically unreadable.

  for (User user : list)
      if (user != null && user.getUsername() != null
              && user.getUsername().contains(searchString))
          return "Found it!";
  return "Not found";


That’s a failure of Java’s language design, not a failure of the functional/declarative paradigm.

Your for loop can do all kinds of damage to the list, and you have to read it all to find out what it does. Saner languages make the functional version more expressive:

  return first(users.filter(u => contains(u.name, searchString)))
I’m not saying JS is a sane language; it has an anemic standard library without a robust “first()” or “contains()”. But that code gets you the first user where the name matches, without resorting to indexes or null checks or worrying about how a for loop can mutate lists that it should not.

EDIT: I should have used filter(list, fn) here; I was trying to write something plausible in JS instead of a purer functional language and missed that translation.



Same caveat applies here as my other sibling comment. I shouldn’t have to do null checks before calling functions.


I shouldn't have to QA properly written code.

Alas, reality, she is a cruel mistress.


>"Your for loop can do all kinds of damage ..."

So can programming in general


>and you have to read it all to find out what it does.

Which isn't a problem, because as long as it's all loops 'n' branches, the code is easy to understand.


Hard disagree. For loops do not have guarantees about in-place mutation; that’s why they require a full read. Anything that starts with a map/filter is guaranteed to only apply to the thing that it is assigned to. That’s more than worth its weight in grey hairs.

EDIT: “full read” includes “everywhere else this array is used, which can have far-reaching consequences if it was passed into the containing function by reference.”


Most temporary variables are single assignment, and many newer procedural languages do have variable declaration syntax for single-assignment variables. Such as Javascript's `let`.

Another widespread syntax feature is for-loops over collections, like `for item in list`, again being extremely easy to read and not requiring any assignments. So accidental in-place mutation is not an issue in practice. But in-place mutation can be very useful to express logic in an easy-to-understand way.

Coincidentally my main language does not have any of these features, and in-place mutation is like 0.1% of all my problems. (And notably, memory management is < 5% of my problems).

These are all just theoretical problems; what really matters is developer experience. You can write terrible, bug-ridden, convoluted code in any language, ESPECIALLY with many filters and lambdas and callbacks.

The issue with the functional style is that it becomes harder to see the control flow and run-time characteristics. I'm sure functional style is great to express set operations (like DB queries) but IME they tend to optimize for source code brevity and pessimize for runtime performance.


DB queries are a good analogy, especially if we qualify them as atomic operations. I’m hard-pressed to find a situation where procedural mutation-in-place makes more logical sense than an atomic update of the whole list, so why not make atomic updates the default?

It’s different if you’re working in lower-level code. I don’t intend to trivialize performance; the option to “go procedural” should be available. I just think it’s the wrong default. Operating systems and some kinds of Big Data need those perf boosts.

But for your run-of-the-mill CRUD app, map/filter is both clearer and safer than a for loop. Strong language support encourages this both algorithmically and syntactically—unlike what Java did, which is what I originally replied to.


JS "let" is not single-assignment; did you mean "const"?


Oops, guess you're right. I haven't written any Javascript in years.


>For loops do not have guarantees about in-place mutation;

Every modern language has a `for elem in range` construct, giving exactly that guarantee, as long as the element isn't a pointer.

Besides, I neither want nor need such a guarantee. There are many scenarios where in-place mutation is exactly what I want to do in a loop, and I don't want to fight the language, aka my craftsman's tool, to do it, just because it imposes an arbitrary restriction on what I can or cannot do.

Is this a potential source of bugs? Of course it is. That's why I have tests, code reviews and debuggers.

And besides, nothing prevents me from taking the functional approach in languages like Golang, Python or Julia *if I want to*. I simply refuse to use languages that force a paradigm on my code. This is true not only for "pure functional" languages, but also Java, which wants me to use OOP everywhere.

To me, paradigms are tools. They need to be there when I need them, and not get in my way when I don't.


The first, filter, map, etc. functions are also only loops&branches under the hood, no? :)


JS has Array.some() for "contains"


The caveat there is that you cannot call it without a null check on “u.name” first. contains(null, “whatever”) would just return false instead of throwing a runtime exception. Hence my edit: users.filter() has the same problem.


the two snippets you posted do different things

In a list

  John
  Jonhatan
  Joy
searching for "Jo" returns all three of them in the first example, it stops when the first has been found in the second.

Start including the tedious bits about adding found items to the list and the waste of intermediate variables, and your "clear" code is wrapped in a lot of repetition that only adds noise.

It just happens that you are more familiar with the second style, but pipelines are better in many other ways, clarity of intentions being one of them.


The code he posted isn't actually valid because you can't "orElse" a list. That being said, I would presume it was meant to include a "findFirst". Something like

    list.stream().filter(Objects::nonNull)
        .map(User::getUsername).filter(Objects::nonNull)
        .filter(name -> name.contains(searchString))
        .findFirst()
        .map(name -> "We have found your user: " + name)
        .orElse("We haven't found");


exactly, that was the catch.

the point I wanted to make is that the snippets presented seem carefully crafted to make pipelines look bad, but usually that's not the case.


The snippets actually do the same thing, as long as you add a .findFirst() to the first example to make it valid Java code.

Intermediate stream operations like map or filter are always lazy. And .findFirst() is a short-circuiting terminal operation, that does not need to consume all stream elements.

https://docs.oracle.com/en/java/javase/11/docs/api/java.base...


Obviously that was the point: if you don't even care to write correct code, you can make any piece of code look bad.

Truth is, Java streams and, even more, reactive Java are the only two things that make Java bearable; god save whoever invented them.

If I had to program java with for loops and list.add or null checks everywhere, I would probably kill myself on the spot.

But the number of programmers taught to program like it's still 1995 is so damn high.

And I am closer to my 50s than my 40s, I can't understand why people fresh from uni say they can't understand something like

  list.stream()
      .map(this::maybeGetUser)
      .filter(maybeUser -> userIsPresentAndNameStartsWith(maybeUser, prefix))
      .findFirst()
      .orElseThrow(UserNotFoundException::new);

with the benefit of being lazy and only inspecting list items up to the first one that matches (and if you don't mutate any data, easily parallelizable)


You forgot to add null-check ifs for `list` and `searchString`. Oh and check-wrap for exceptions too!


Modern Java looks horrible, to be honest. I create abominations like this in JS pretty often, but it isn't good code. Best if wrapped in a huge try/catch where the error case just prints "didn't work for some reason...".

https://esolangs.org/wiki/Pancake_Stack Pancake stack doesn't need this hipster crap like functional programming or ifs or fors.


I don't write java, but assuming it has a null coalescing operator you can likely do all that in a single map in a way that is (in my opinion) cleaner and easier to read than both of your examples.


Kotlin does but sadly Java’s Optional<T> has no syntactic sugar, so it ends up looking like

  Optional.ofNullable(user).map(User::getUsername).map(name -> name.contains(searchString))


Java has never really liked to do in 10 characters anything that could be done in 30 characters instead, especially if it obfuscated things a bit more at the same time.


I just started using Java's functional side a few weeks ago. Lambdas are really nice and often make the code clearer. I like writing more declaratively.

But the stream interface is just horrible. I had to read a lot before I understood how to just filter a list and collect the results. And even after that, it still somehow didn't scream out what it was doing to me. I guess you get used to it.
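
(For anyone in the same boat, the basic filter-then-collect incantation ends up looking something like this; the toy list here is invented for the example.)

    import java.util.List;
    import java.util.stream.Collectors;

    class FilterCollect {
        public static void main(String[] args) {
            List<String> names = List.of("John", "Jonathan", "Joy", "Alice");
            // Keep only the names starting with "Jo", collect into a new list.
            List<String> js = names.stream()
                    .filter(n -> n.startsWith("Jo"))
                    .collect(Collectors.toList());
            System.out.println(js); // [John, Jonathan, Joy]
        }
    }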


Coding is values, arrays, dicts, ifs, loops and functions. An expressive set-like operations library over arrays (& dicts) gives superpowers.


Groovy is still far better and more readable. And with @CompileStatic it is basically the same speed.


Scala is super readable

   users.find(_.name.contains(str))


Perceived complexity peaks at intermediate skill. There's a concept in learning theory that perceived complexity is diamond shaped.


I wonder whether you could apply this analogy to physics and science in general. If so it seems like we haven't reached the peak/middle yet.


Trying to explain Zen by starting with "basically" is a bit ambitious on your part haha :)


“参禅之初,看山是山,看水是水;禅有悟时,看山不是山,看水不是水;禅中彻悟,看山仍然山,看水仍然是水。”

This is from Zen philosophy; it says:

When starting with Zen, I see the hill as a hill and the water as water. When experienced with Zen, I see the hill as more than a hill, the water as more than water. When finally mature in Zen, I once again see the hill as a hill, and the water as water.


Everything else is only there for the monkeys pressing the keys.


And the monkeys reading the code to see what it does. I don’t recall who it was exactly but some early computing heavyweight, maybe Von Neumann, thought that any language which requires a compiler was a waste of resources.


Computers used to be so expensive that it made more sense to hire a “programmer” to convert an “analyst’s” flowchart into machine code using only a keypunch.


I love this :). How much of mankind is creating things sheerly for their pleasure?


Or a set of composable primitives if you’re using an array language. What, you think if and for aren’t abstractions over jump/goto?


Before enlightenment: chop wood, carry water. After enlightenment: chop wood, carry water.


As with so many other things, through mastery and practice of the "right ways" you will often discover flaws in them, and that is when what you are practicing begins to become a true art form.


Become water, my friend. Don't stick to just one way.


One of my quips about programming was "it's all just ones and zeroes"


That first reply is so funny to me because it hits too close to home

https://twitter.com/nice_byte/status/1466940940229046273

The more I do this, the more I gravitate towards the simple things.


New developers write simple but shortsighted code. It gets the job done, but it will be painful to work with in the future.

Intermediate developers, bitten by their past mistakes, decide to future-proof their work. But they don't just look one or two steps ahead; they try to look five steps ahead and identify problems that do not and may never exist. Consequently, they over-engineer, over-abstract, and over-complicate everything. Carmack's recent "But here we are" speech resonated with me.

Advanced developers identify the right balance for each task between simple and complex, concrete and abstract, and pragmatic and idealistic. In my experience, they favor simple and pragmatic solutions, use abstraction effectively, and can satisfy near-term goals quickly without painting themselves into a corner. “As simple as possible, but not simpler.”

I try to avoid working for tech leads stuck in the second phase, which is not uncommon. If you suggest taking the “ifs and for loops” approach of solving a simple problem with a simple solution, they’ll assume you’re on the wrong side of the bell curve.


Had a boss once who insisted that all if statements should be pushed out to factory classes, and all control logic should be done by constructing a different instance of an interface. It was a pretty rigid system, but at least it did force me to write small focused classes that were easy to test.

Debated for a long time whether that methodology was stuck in the second phase or if it was actually the third. Still don't have an answer, but these days I think having a plan is better than just letting engineers run roughshod, as long as the conventions are easy to follow.
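
(For concreteness, the methodology described looks roughly like this; the payment-handler names are invented for the sketch. The branch happens once, inside the factory, and the rest of the code just calls the interface.)

    import java.util.Map;

    interface PaymentHandler { void handle(int cents); }

    class CardHandler implements PaymentHandler {
        public void handle(int cents) { System.out.println("card: " + cents); }
    }

    class CashHandler implements PaymentHandler {
        public void handle(int cents) { System.out.println("cash: " + cents); }
    }

    class PaymentHandlerFactory {
        private static final Map<String, PaymentHandler> HANDLERS =
            Map.of("card", new CardHandler(), "cash", new CashHandler());

        // Lookup table instead of an if/else chain.
        static PaymentHandler forMethod(String method) {
            return HANDLERS.get(method);
        }
    }

Each tiny class is trivially testable in isolation, which is the upside mentioned above; the cost is that a two-line if becomes several classes.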


That just sounds like enterprisey Java to me, which I firmly believe is closer to 2 than 3.


Programming alone vs programming in a team are very, very different occupations. A lot of what applies to one doesn’t apply to the other. I’m still painfully learning this, after all these years.


Could you elaborate please?


I think it comes down to there being two kinds of codebases...

In the first kind, all relevant developers have deep expertise in the system, or build towards having deep expertise. There's an expectation that flexible abstractions will be used, not abused, unless it's one of those scenarios where the use outweighs the abuse. The abstractions are tomato cages, and they're there to support the system as it grows, provide some structure, but not to strangle it.

In the second kind, the default expectation is that a developer will have little to no familiarity with the system, they will be isolated from it as much as possible, and they will be given such a tightly constrained sandbox that they can't break anything outside it. You will write your little plugin, or whatever, get in and out, and you're done.

These can both be useful kinds of systems/codebases in orgs of any size. The first kind of codebase can enable an experienced team to move really fast and be extremely productive. The second kind of system can help lots of different teams of different skill levels jump in and leverage your system with little required knowledge, and thus be productive that way. So there's really no way to say one of these patterns is good or bad.

But in general, if you churn in and out a bunch of replaceable cog code monkeys, probably low-paid, the second kind of system just ends up working better. Giant "enterprise" software shops like the parent poster alluded to typically end up in this kind of high-turnover scenario after enough finance/MBA people have been brought in, hence their bad rap.


I have much more experience working alone, but: git alone is a breeze. I don't need to impose some arbitrary constraint on myself, because I know (or think I do) that I won't abuse it. I can use the stack I consider best / am most productive on, instead of what's fashionable this year. No linting, coding conventions, etc. Just the pure joy of problem solving.

On the other hand, under the right conditions, the amount you learn on a good team is ridiculous compared to what you'd learn alone. Weeks vs years kind of thing.


The key insight to "at least destroying your architecture makes it easy to unit test" is that being able to unit test is not actually that important. There's other kinds of testing out there!


Q: What's the difference between a novice and an expert?

A: The novice thinks twice before doing something stupid.


I don't understand this saying.


Often doing the stupid thing is the right thing to do, instead of thinking hard to do the smart thing that ultimately won't be needed and will be harder to understand.

So juniors think twice before doing stupid simple stuff. Intermediates think twice and do smart stuff. Seniors only do smart stuff where it is needed, and do the stupid simple stuff without thinking in most places.


Well, actually, I was not familiar with the phrase before, so maybe KineticLensman knows the proper meaning.

That said for that specific meaning I would prefer something like - The novice fears simplicity.

Maybe throw in a - The expert loves it - at the end.


when the expert thinks twice they don't do something stupid?


That's not how I interpreted it at all.


"It's common to become very good at doing it badly"


There is another important difference: the expert thinks twice before doing anything at all.


And that's also why a lot of Architecture Astronauts who looooved Java didn't see Python coming.


Funny, I took over a modern python service and I was pretty shocked at what I inherited. Long gone are the days of "There's one way to do things".

Instead, this thing would give the most "enterprisey" Spring JEE application a run for its money with its endless annotations, dependency injection magic, all sorts of pseudo-types - both the "built-in" Python 3 ones like Set and List, but also the libraries like Pydantic. But unlike Java, these types aren't even really guaranteed by the language at compile time, so even if your IDE plugin can successfully detect them, things will still (silently) slip through at runtime.

The async functionality that's been bolted onto the language is worse than even the old and confusing Java multi-threading primitives, and the funny thing is it still doesn't actually run things in multiple threads. For that, your simple REST API is running on layers of C programs like Uvicorn, which itself is then wrapped by another few processes running Gunicorn, which in turn is probably running behind NGINX. LOL, and we thought the Servlet stuff with Tomcat and JBoss was clunky; this is insane.

To be honest, if there ever was a sweet spot for Python, it would have been for smaller code bases that weren't complex enough for big "enterprisey" langs like .Net or Java, but were more permanent and complex than shell scripts or (back in the day) Perl could handle.

But nowadays, I don't think modern Python fits any use case real well. It's still dynamically typed, slow, single-threaded, and has a poorly managed ecosystem and hodge-podge of tools like virtualenv, pyenv, poetry, etc. that never quite become standardized and stable.

So unless you've got a bunch of Python experts who aren't interested in moving to a better lang, I'd find it hard to go with Python for new projects.


> ...endless annotations, dependency injection magic, all sorts of pseudo-types

This sounds like what happens when a bunch of Java/C# developers jump over to python without learning the "python way" - this is more related to the developers than the project

> But nowadays, I don't think modern Python fits any use case real well

Python has effectively taken over the data science / machine learning space. For most use cases, the algorithms are massively more important than the language.

> poorly managed ecosystem and hodge-podge of tools like virtualenv, pyenv, poetry, etc. that never quite become standardized and

This is true, but Java and C# also have many issues in this respect. The move from Java 8->11 is particularly painful - many fundamental libraries related to security or connection handling were not backwards compatible. Many libraries now require multiple branches for different JDK levels. Maven and Nuget are about as good as pip - they all have weird edge cases.

I use both Java and Python on a daily basis; each has its strengths and weaknesses. Java is great if you need long-running processes with weeks/months of uptime; Python is great for backend data manipulation and analysis.


Long gone are the days when Python was a hacker tool. It’s been an enterprise thing since startups got wind of the fact that it could be used to attract programmers, to attract VCs.


I recently posted this, and it wasn't well received.

https://motherfuckingwebsite.com/

Where does this fall?


In going too basic. We can go to simple and elegant without going all the way back to crappy stone tools.

http://bettermotherfuckingwebsite.com/

https://evenbettermotherfucking.website/


It’s interesting that on mobile the original one is actually nicest (to me).


right, the problem is that on wide displays we really aren't meant to read, say, 27 inch-wide columns of text.

The silly thing is that it's literally one CSS property, max-width, that takes care of that problem.


Thanks for that link! I actually like the other one much more. I really appreciate JS-free sites; Amazon can be used without JS, for example.


That site is a piece of shit, because it uses Google Analytics and thus snitches to Big Brother.


You’re right, I didn’t notice! Gonna link to evenbettermotherfucking.website from now on.


I'm currently working with a team of Advanced developers for the first time in my career. Nobody is padding their CV with fancy shit. Everyone has gotten the "new shiny" out of their system.

Everything is as simple as it can be, no simpler and no more complex. Sometimes a bunch of flat JSON files in an S3 bucket is enough of a database, you don't need a 42 machine Aurora cluster.

All fancy "Machine learning" stuff really is just a bunch of ifs and for loops internally :D


> Carmack’s recent “But here we are” speech resonated with me.

Watching this right now and all I can think about is Microsoft Bob.


Well said, this hits home. Also constant refactoring and maintenance of the codebase without adding any new features.


Bull's eye.


Despite being a joke, I know it's the "Ha Ha Only Serious" [0] sort. I can't help but think this is severely biased by the trends of "enterprise software," where you eventually "give up", and clock your 9–5 writing if+for making a fine living, but erroneously pass that off as a mature, Jedi-like way of thinking about programming, like the meme suggests. (And, consequently, you spend no less time debugging hairy, nested, imperative conditionals with for-loops that are off-by-1.)

I have no beef with if+for, but a large part of the reason they're "goto tools", if you will, is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years.

Simpler building blocks do not necessarily mean a simpler solution. If only!

[0] https://en.m.wiktionary.org/wiki/ha_ha_only_serious


> you eventually "give up", and clock your 9–5 writing if+for making a fine living, but erroneously pass that off as a mature

This comment sure indicates to me where you most likely are on the curve.

In all seriousness, I think this is considerably off the mark. After enough experience you realize that expressivity and convenience are antipatterns and don't actually simplify things but are harbingers of complexity, bugs, tech debt, even the downfall of organizations and products.

Saying it is all ifs and for-loops is completely true. Everything else, all the abstractions and high level features, are just sugar.

I try to keep a healthy and balanced diet, myself.

> industry is slow to assimilate most state-of-the-art ideas, sometimes by as much as 40 years.

Most of those ideas are terrible. The industry is so incredibly young and has experienced so much change over those 40 years that I have a hard time accepting the notion that the industry is slow to adopt. The reason the old building blocks are still popular is because they are a thin abstraction over how computers work, and ultimately that is at the root of everything we do.


Really, there are no if+for, just compare and jump. Why don't we use what the metal uses, instead of these "expressive abstractions"?

If+for have no deeper foundational significance in the construction of programs or computations than, say, a lambda function. But because the latter is unfamiliar, it's spoken about in the manner you present: as if it were some highly abstract, complicating, high-level feature (when truly that take is just baloney).


> Why don't we use what the metal uses, instead of these "expressive abstractions"?

Because the "expressive abstractions" are much easier to reason about and save programmers lots of mental effort. And, as I have commented upthread, ifs and for loops are by no means the only such abstractions.

> because the latter is unfamiliar, it's spoken about in the same manner you present: as if it is some highly abstract, complicating, high-level feature

If expressing your program in the lambda calculus is easy for you to reason about and saves you enough mental effort, go for it. But don't expect your code to be readable or understandable by many other people. The reason why ifs and for loops (and other "expressive abstractions", since as I have said, those are by no means the only ones) are ubiquitous in programs is that they are easy for lots of programmers to reason about. Whereas the lambda calculus is only easy for a very small subset of programmers to reason about.


I'm not suggesting people "express programs in the lambda calculus", but instead that incorporating a philosophy of functions and their composition is not at all a bizarre concept. Loops and ifs work miserably with any hierarchical data, compared to recursion, for example. A lot of the world's data is hierarchical.

We now have a chicken-egg problem. I can freely admit that for+if is easy for programmers to understand solely because of how we are educated, and not due to any Bruce Lee hocus pocus about simplicity or fundamentalism, as so many others here suggest.

A programmer who, say, learned from SICP first would find a for loop awkward and bizarre when you could "just" tail-recurse.


> incorporating a philosophy of functions and their composition is not at all a bizarre concept

Sure, functions and their composition are very often another highly useful abstraction that is easy to reason about and saves the programmer a lot of mental effort.

> We now have a chicken-egg problem.

No, just a recognition, which was not there in the original claim, that ifs and for loops are not the only useful abstractions.

> A programmer who, say, learned from SICP first would find a for loop awkward and bizarre when you could "just" tail-recurse.

Perhaps, but I think such a programmer who needed his code to be readable and understandable by a lot of other people in a large project would end up inventing a "for-loop" macro or something similar that, while it might use tail recursion under the hood, would express the construct in a way that programmers could understand regardless of how they learned programming.


> as if it is some highly abstract, complicating, high-level feature

But symbol calculus is a highly abstract, complicating, high-level system assembled out of more reality-based systems beneath it. If it seems simple to you, you're just under the curse of knowledge.


I'm not sure what a "symbol calculus" is. Do you mean "lambda calculus"? I think that's a lot less complicated and abstract than a fabled machine with an infinite tape that's controlled by a transition graph of symbolic states. :)

And I don't know what a "reality-based system" is.


You're confusing computer science with writing software to run on hardware. Coding requires the latter, but not the former.


Oof, and to think one could helpfully inform the other! :) To be clear, I am a programmer, not a computer scientist, so my opinions are based off writing code and managing teams to write code that works, and less so about abstract machinations of computer scientific thinking.


For the record, I generally think that people exaggerate the complexity of Functional styles.

I think the reason that stateful, imperative code is the default is because that is how we tend to "think". Example, if I wanted to get a list of items for the grocery store, I get a piece of paper (instantiate an array), go through the list of things I might need (loop through a source array) and write down the ones I definitely need (append onto the destination array). If I run out of space, I get a new piece of paper and continue the process.

With that algorithm, I can stop and test each individual statement; I don't need to "think about it" because I can write and execute a tiny part of it (even if I'm only executing it in my mind).

In a functional style, I have to think more declaratively. Given all my items, filter out the ones that need replacing and give me that list. It's much easier to reason about abstractly, but I have no details about how that's actually happening.

I think this kind of thinking is superior most of the time: I would rather read map.filter.reduce more than a for loop with branch statements any day of the week, but I am trusting the implementations of the mapping and filtering and reducing functions. Of course, for any non-trivial algorithm one is still passing a function into the map/filter/reduce anyway so I can still reason about those smaller stateful sections without worrying about the map/filter/reduce piping.

Perhaps I've never worked in a truly pure functional style, so I may not be dealing with the mountains of abstractions others seem to complain about.
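
(The grocery-list example in both styles, for comparison; Item and needsReplacing are invented names, and the declarative version assumes a recent Java with records and Stream.toList.)

    import java.util.ArrayList;
    import java.util.List;

    record Item(String name, boolean needsReplacing) {}

    class GroceryList {
        // Imperative: get paper, go through the pantry, write down what's needed.
        static List<String> imperative(List<Item> pantry) {
            List<String> list = new ArrayList<>();
            for (Item item : pantry) {
                if (item.needsReplacing()) list.add(item.name());
            }
            return list;
        }

        // Declarative: "the names of the items that need replacing".
        static List<String> declarative(List<Item> pantry) {
            return pantry.stream()
                    .filter(Item::needsReplacing)
                    .map(Item::name)
                    .toList();
        }
    }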


Which, without side effects, can be optimized very heavily, probably even better than imperative code. Hell, the pipelines inside GPUs and CPUs are perhaps closer to lambda calculus than to Turing machines.

It's not like the machine code will look much closer to your C code, either. That's also a spell cast by compiler writers and hardware vendors trying to uphold the view that C programmers are close to the hardware and that memory access is flat.


Personally, I disagree. We should be using state machines and pure functions. If+for loops are just what's easiest to express in the major languages of today. They are no more or less computationally expensive, but due to lack of tooling for the alternatives, they are often cheaper to write.

In languages and libraries that allow FSM and pure-functional-kernel based designs, you can get logic that is just as clear, expressible not just to the programmer but also to business personnel. It's counter-intuitive to a certain extent, because so much of programming is built around imperative code, but FSM-based logic is and will continue to be easier to understand long-term, because you can trivially visualise it graphically. This is ultimately what a lot of the functional paradigm is built around: use the mathematical and graphical representations we've used to understand systems for decades. They are well understood, and most people can grasp them with little to no additional education past what they learned in their business or engineering bachelor's degrees.
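
(A minimal sketch of the idea, with invented door states and events: the whole behavior is a single pure transition function, which is exactly what makes it trivial to draw as a diagram.)

    // Pure transition function: state x event -> state. No hidden mutation,
    // so the behavior can be read off, tested, or drawn as a graph.
    enum DoorState { OPEN, CLOSED, LOCKED }
    enum DoorEvent { OPEN_IT, CLOSE_IT, LOCK_IT, UNLOCK_IT }

    class DoorFsm {
        static DoorState next(DoorState s, DoorEvent e) {
            return switch (s) {
                case OPEN   -> e == DoorEvent.CLOSE_IT ? DoorState.CLOSED : s;
                case CLOSED -> switch (e) {
                    case OPEN_IT -> DoorState.OPEN;
                    case LOCK_IT -> DoorState.LOCKED;
                    default      -> s;
                };
                case LOCKED -> e == DoorEvent.UNLOCK_IT ? DoorState.CLOSED : s;
            };
        }
    }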


In a very real way, it's all conditional jumps in assembly, and everything you've learned that makes programming easier by letting you more directly express your high-level intent is just sugar. It might even help some or most of the time. But what you're actually doing is creating a bunch of branches and loops, and as much as the high-level stuff might help, you really shouldn't forget this is the medium you actually work in.

Most professions have a healthy respect for the base materials they work with, no matter how high the abstractions and structures built with them go. Artists know their paints, stone, metal, etc. Engineers know their materials as well. They build by taking the advantages of each material into consideration, not assuming that it's no longer relevant to their job because they get to work in I-beams. Programmers would do well to adopt a healthy respect for their base materials, and it seems like often we don't.


> high level intent is just sugar

I disagree with your use of "just" here. It's common for programmers to dismiss the importance of syntax, but syntax and notation are the interface and UX between the language semantics and your brain. It's no less important to get this right. There's a half-joke that Europe was able to rapidly advance in calculus beyond Britain due to the superiority of Leibniz notation.

> healthy respect for their base materials

What's unique about computers is the theoretical guarantee that the base does not matter. Whether by lambda calculus, register machines or swarms of soldier crabs running from birds in specially designed enclosures, we're fine as long as we appropriately encode our instructions.

> bunch of branches and loops

You could also easily say it's just a bunch of state machines. We outsource tedious complexity and fine details to compiler abstractions. They track things for us that have analogues in logical deduction so that as long we follow their restrictions, we get a few guarantees. When say, writing asynchronous non-deterministic distributed programs, you'll need all the help you can get.

Even designing close to the machine (which most programs will not need) by paying attention to cache use, memory layout, branch divergence or using SIMD remain within the realm of abstractions.


Agree with this a lot. In other words, don’t be too clever. That leads to an unmaintainable codebase. There is value in simplicity and not overly using abstractions that take you farther and farther away from bare metal.


> There is value in simplicity and not overly using abstractions that take you farther and farther away from bare metal.

This is a contradiction. Simplicity is obtained through abstractions. As an example, fifty years ago, 'goto' reigned supreme. Then structured 'if/else/for' took over, as the superior abstraction over 'goto'. Now use of 'goto', while being far simpler to implement and closer to the bare metal, is commonly derided by many programmers.

The long term trend in software is constantly increasing abstraction hiding mountains of complexity, which increases both simplicity and complexity. Writing print('hello world') is simple, but the compiler, OS, silicon, etc. that makes that simplicity possible is extremely complex.


Programming is primarily about managing complexity. Other than perhaps mathematics, there is no other field that must, should and can apply the amount of abstraction on top of abstraction as software engineers do.

It is a must, because decades of business requirements built on top of each other without anyone understanding the whole is complex. Writing a JIT compiler that can routinely switch between interpreting code and executing it natively, a database optimizing queries, a mathematical library using some fancy algorithm: these are all complex, in a way that is not reducible.

Complexity easily outgrows even the whole of our mathematics; we can't prove any non-trivial property of a program, the halting problem, etc.

So all in all, no, we can only respect our "base materials" by finding the proper abstraction for the problem, as our base material is complexity itself. That might be for loops and ifs, but it may very well be a DSL built on top of who knows how many layers, because only at that abstraction level can we even start to map the problem domain to human-consumable ideas.


> Personally I disagree. We should be using state machines and pure functions. If+for loops are just what's easiest to express in the major languages of today. they are no more or less computationally expensive but due to lack of tooling they are often cheaper to write.

In my experience, programming with primitives and basic flow control operations frequently tends to be at least an order of magnitude faster than more complex state management paradigms. Compilers are very good at optimizing that style of code. Loops often get unrolled, and the branch predictor is kept happy. A good compiler may use vector instructions.

In many cases with cold code it flat out doesn't matter, the CPU is plenty fast, but when it does matter, explicit if-and-for code absolutely mops the floor with the alternatives.


Will your manually written imperative code beat an SQL database for the same task? Because it uses a very very high level description on what it has to do and chooses an appropriate algorithm for that for you.

You can optimize one specific query (quite painstakingly, I believe) to beat a general db, but it is very far from clear that “for loops will beat higher level abstractions”, imo.


In designing my search engine, I ended up building a custom index instead of using an off-the-shelf database solution. Frankly, it's so much data a SQL database can't even load it when run on the same server that runs my entire search engine.

With my own set-up I can have my data in pre-calculated immutable tables that are memory mapped off disk. That means I can write lock-free code. My updates are large hour-long batch operations off a written journal.

For the lexicon I'm also rolling a special one-way compression scheme/hash function that guarantees uniqueness but only works due to quirks with the data.

General purpose databases can't do these things, because then they wouldn't be general purpose databases.

SQL databases perform really well on average given your query is relatively optimized, but they do not represent the pinnacle of performance. Any time you want the fastest most efficient code to deal with a particular use case, it pays enormous dividends to roll your own implementation that has domain awareness. It means you can cut corners general solutions simply can't.


for loop goes brrrrrrrrrrrrrrrrrrr!


At least in the C++ space, stuff like boost-sml is able to produce assembly that is often as fast or occasionally faster than handwritten if or switch based FSM logic.


> FSM and pure functional kernel based designs you can get just as clear logic that is expressible not just to the programmer but also to business personnel

I've yet to see a secretary who could "return a new table state in which documents become a right fold binding together a map of the composition of signing and copying routines over documents" instead of "get these documents from the table, copy, sign and put them back in a binder". This is nonsense that some of us want to believe in, but it is not true.


I inherited a piece of code that was designed as a state machine and the state machine design basically made the project become pure technical debt as requirements increased over time and engineers had to make it work with the state machine model that had been chosen when the project started.

If the project had instead been designed to have less unnecessary state and “transitions” it would have been a lot easier to make changes.

All those ideas sound good by themselves but they are really bad for “defensive” coding. Good luck selling a project to refactor something when it’s going to take multiple people-years. Once you’ve made the mistake of committing to an inflexible design it’s either that, replace/ignore the thing, or deal with low productivity as long as the thing exists.


State machines are refactorable if you use state charts. Using one giant FSM is like committing compiled code.


> state machine model that had been chosen when the project started.

so was the chosen model the issue or choosing a state machine model at all?


I've written many SM implementations, from ones used in low-level protocols up to business process middleware, so I have experience and know how incredibly useful and powerful they are when used in the right place. But to use them everywhere, especially in some math algorithms, would be an insanity worse than goto.


>They are well understood and most people can understand them with little to no additional education past what they learned in their business or engineering bachelors degrees.

Imperative code:

    Take A and B.
    Add C ml of Water.
    Stir for 2 minutes.
    If it thickened, add D grams of flour, else go back to stirring.
This is easily understood by everyone, degree or no. Once someone figures out loops and the difference between statement and expression, they can essentially understand imperative code.


Okay, now put boxes around each sentence and connect them with arrows. ;-)

Imperative code quickly devolves to a point where it is no longer understandable by anyone but the developers who wrote it.


> We should be using state machines and pure functions.

For problems where those are the right tools, sure. But they aren't the right tools for all problems any more than ifs and for loops are.


I've always felt like explicit state machines are the sledge hammer you break out when you can't find any good abstraction to encapsulate the logic. As an intermediate step for parsers it's pretty powerful, but it's not something I want in my hand written code if I have any alternatives.


do you know some nice examples of medium complexity that you can show for inspiration?


> Saying it is all ifs and for-loops is completely true. Everything else, all the abstractions and high level features, are just sugar.

You could just as well say that ifs and for loops are just sugar for gotos and all programming is just gotos.

The reason ifs and for loops are used instead of gotos is that they are very useful abstractions that are easy to reason about and save the programmer lots of mental effort. But they are not the only such abstractions.

To the extent that other abstractions can create problems, it's not because they're really just sugar for ifs and for loops, it's because they are not well crafted abstractions so they are not easy to reason about and don't really save the programmer any mental effort. But there are plenty of abstractions other than ifs and for loops that are well crafted and do save the programmer mental effort, in many cases lots of it.


I completely agree with this. A lot of people use the term "too much abstraction" when they mean "bad" or "ineffectual abstraction." For-loops can be a wonderful abstraction, but they're certainly no basis for all abstractions.


> After enough experience you realize that expressivity and convenience are antipatterns and don't actually simplify things but are harbingers of complexity, bugs, tech debt, even the downfall of organizations and products.

Suggesting that experience leads to jettisoning expressivity is at odds with my direct observations of experienced software engineers working in large teams. The more experience, the _better_ the engineer gets at picking the right level of abstraction to write code that can be maintained by others. Picking a single point on the abstraction spectrum (just above goto but not below it!) is far too rigid for the diversity of tasks that software engineers need to solve.


[flagged]


Math today does a lot of things you couldn't do with math back then. It isn't just syntactic sugar; there were simply lots of problems they didn't know how to solve, since the math they had wasn't powerful enough. Calculus, for example, solved a lot of problems they had, and calculus isn't syntactic sugar for anything; it is a new instruction set.

However the code abstractions people used 50 years ago could still implement all programs we have today. We added a lot of sugar on top, but the core of coding remained the same. For example, you could implement Map, Reduce etc in C, they are just functions and not novel features of programming.
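
(That last point is easy to demonstrate. Sketched here with Java generics rather than C, but the idea is the same: map and reduce are just ordinary functions wrapping a for loop, not novel language features.)

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.BinaryOperator;
    import java.util.function.Function;

    class Sugar {
        // "map" is just a for loop that applies a function parameter...
        static <A, B> List<B> map(List<A> xs, Function<A, B> f) {
            List<B> out = new ArrayList<>();
            for (A x : xs) out.add(f.apply(x));
            return out;
        }

        // ...and "reduce" is just a for loop with an accumulator.
        static <A> A reduce(List<A> xs, A init, BinaryOperator<A> op) {
            A acc = init;
            for (A x : xs) acc = op.apply(acc, x);
            return acc;
        }
    }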


It’s just so sad that the lowest common denominator has become the standard now. When I first learnt Clojure it entirely changed the way I think and solve problems. The code really was elegant.

Obviously, it can only be read by someone who can also understand programming beyond ifs and fors. That’s a non-starter in most environments - enterprise or otherwise.

Funny enough, I see most innovations coming from consultants who do the same work for multiple companies and realise the repeating patterns and extract an abstraction.


Ifs and fors are the easiest concepts to explain to non-developers, so it makes sense to start there.

I wouldn't say that they are the standard now, but using and mastering all features in a language is hard.

Add to that design patterns, classes and code layout, and it becomes a full-time job to keep up.

I have been in contact with code most of my professional life, but I'm still not comfortable writing large amounts of it. The simple reason is that I don't do it full-time.

Here are the features in C# just to illustrate how complex a programming language is.

https://www.c-sharpcorner.com/article/c-sharp-versions/


I agree that modern software development for non-full time developers is brutal, several of my data scientist colleagues are remarkably brilliant people and yet they struggle with some more advanced programming concepts.

However, most of those features are relatively standard and are more conceptual than syntactical in nature. Bashing people because they don't know stuff is stupid and counterproductive, but I shouldn't be forced to code in the programming equivalent of roman numerals just because someone else can't be properly fucked to understand lambdas or interfaces or generics, all stuff that's literally covered in your run-of-the-mill CS 101 course.

It all boils down to having enough humility and empathy to understand that other people are not, in fact, the same as us.


That's what I mean. Each language has a different syntax, and it takes a while to gain mastery over it, and that's fine. But there are concepts that are immediately portable to multiple languages.


C# was good enough for me at version 4.0. I use many of the newer features but they seem to be well into the law of diminishing returns now.


>I have no beef with if+for, but a large part of the reason they're "goto tools", if you will, is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years.

For assimilation to happen, the state-of-the-art solution also has to result in a net gain over the existing solution, and the higher the differential in complexity between the two, the bigger that gain has to be.


Functionally, this looks like selling off your client base and closing the doors rather than rewriting internal tools that mostly still work.

There's no "rubber meets the road" in OP's position because there's no cost in their calculations.


And, these days, "net gain" in an industrial context is typically tied to almost no aspect of the quality of the code, but more to the management of large groups of people, as well as stability and growth of the business.


What this really means is that once you get to a certain level of experience and seniority, the actual code you write in the small is pretty much irrelevant. What matters is the overall architecture of what you’re building: the data structures and APIs. The challenge becomes about working together as a team, and with other teams within your ecosystem. Sophisticated language constructs don’t actually help you solve those problems, and imo their benefit is marginal where they do help.


I view map/filter as better abstractions than for loops, and do not consider them to be sophisticated language constructs. They correlate with descriptions of a program’s goals much more naturally than for loops do: get the name of each user, remove everyone under 21, only show accounts with positive balances, etc. Reduce is more arguable, but I think it also applies: show the total amount of money in all of this user’s accounts.
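To make that concrete, here's roughly how those descriptions line up with code (a sketch in Python; the User fields and sample data are made up for illustration):

  from dataclasses import dataclass

  @dataclass
  class User:
      name: str
      age: int
      balance: float

  users = [User("Ada", 36, 120.0), User("Bob", 19, -5.0)]

  names = list(map(lambda u: u.name, users))           # get the name of each user
  of_age = list(filter(lambda u: u.age >= 21, users))  # remove everyone under 21
  in_black = [u for u in users if u.balance > 0]       # only positive balances
  total = sum(u.balance for u in users)                # reduce: total money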

“If” however seems pretty fundamental.


This!

With an additional level of abstraction you could say “gotos and jumps”, but “ifs and loops” gives a commonly understandable logic for everyone; deeper abstractions increase reading complexity, while higher abstraction is achieved via functions and overall architecture.

Scaling up those “ifs and loops” is the challenge, as a team or as an individual, with the common goal being to keep the software under control.


Meh, most business logic really is "if" and "foreach". That doesn't mean it's not complicated, as you say. But all that category theory stuff, at the end of the day, really is just an attempt to manage the complexity of lots of conditional branching and iteration.


Ifs and loops aren't mathematical in the same sense that set theory and set operations are.


> they're "goto tools"

I see what you did there.


> debugging hairy, nested, imperative conditionals with for-loops that are off-by-1

Isn't this just a complicated case of ifs and fors?


Sure, but the word "just" is doing a lot of work. It seems to be what a code base of uncomplicated ifs and fors tends toward asymptotically, because neither of those constructs prohibits you in any way from sneaking "just another line" into either of them.


There are a lot of sophisticated problems in enterprise software, even in higher-level languages, and even in situations where things like performance or resource usage are not a primary concern.

For example, how do you handle authorization, logging, and how do you make the code maintainable? That's a really tough problem that requires a lot of thought about the overall system design.

And of course it's always a lie to say that performance and resource usage aren't a concern -- they're not a concern until they are.


> is because industry is slow to assimilate many state-of-the-art ideas, sometimes by as much as 40 years

How convenient that the software industry is about 40 years old. So these ideas should "break through" this invisible arbitrary corporate waiting area into the limelight any day now, right?


They are breaking through. For instance, Python just got (a very limited form of) pattern matching. It has been A Good Idea since at least the 1970s. Garbage collection has been known since the 1950s but only became "mainstream" in Java.
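For reference, the feature in question (structural pattern matching, PEP 634, Python 3.10+) looks like this; the shape encoding here is just a made-up example:

  def area(shape):
      match shape:
          case ("circle", r):
              return 3.14159 * r * r
          case ("rect", w, h):
              return w * h
          case _:
              raise ValueError("unknown shape")

  print(area(("rect", 3, 4)))  # 12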


The significance of off-by-one errors depends on whether you predictably get a runtime error on the first execution, or not.


- it's okay to use printing instead of a debugger

- you don't need to write classes for everything

- it's okay to write something in a more verbose way to make it clear for other people

- your tools don't need to be perfect to get things done

I need more of these, maybe some that aren't as reductionist as Carmack's original post.


The really useful 'print' debug lines might be kept behind additional 'debug' flag levels. This is particularly useful for tracing the behavior of programs that no longer have debug symbols and are running in real environments.
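A minimal sketch of what that can look like (the DEBUG_LEVEL environment variable and the helper name are assumptions, not a standard):

  import os

  # Level 0 = silent; higher levels get chattier. Read once at startup.
  DEBUG_LEVEL = int(os.environ.get("DEBUG_LEVEL", "0"))

  def debug(level, *args):
      if level <= DEBUG_LEVEL:
          print(f"[debug{level}]", *args)

  debug(1, "coarse progress marker")   # shown when DEBUG_LEVEL >= 1
  debug(2, "chatty per-item tracing")  # shown only when DEBUG_LEVEL >= 2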


This post by Aaron Patterson made me realize it's fine to debug with print statements

https://tenderlovemaking.com/2016/02/05/i-am-a-puts-debugger...

In rare cases I pull out a real debugger, but most of the time the right prints in the right places are just as good. I can also iterate much faster because I'm not jumping between the code and the debugger, or pulling the debugger out of the loop it's stuck in.


I've come to the conclusion that it's a good skill to have, since it (or logging, which is basically the same thing) is the only debugging method that's always guaranteed to be available. For example, there are lots of build and CI tools out there that have no Real Debugger.


I'd never seen that meme before, but there's a Bruce Lee quote (maybe apocryphal) that has had a lot of meaning for me ever since I got over the same hump myself.

“Before I learned the art, a punch was just a punch, and a kick, just a kick. After I learned the art, a punch was no longer a punch, a kick, no longer a kick. Now that I understand the art, a punch is just a punch and a kick is just a kick.”


makes me think of the Buddhist "Before enlightenment: chop wood, carry water. After enlightenment: chop wood, carry water".

not my favorite source, since it doesn't go into the 'scaling the mountain' bit, but every source that talks about that part seems to be... eh: https://buddhism.stackexchange.com/questions/15921/what-is-t...


I always took that to mean something like this:

Q: What is the difference between an enlightened person and an ordinary person?

A: There is no difference, but only the enlightened person knows this.


“Before enlightenment: if then. After enlightenment: if then.”


Before agile: it's not done yet

After agile: it's not done yet


After agile: it's not done


Funny, I posted this on another HN thread [0] recently, but it's perfectly relevant again:

We shall not cease from exploration

And the end of all our exploring

Will be to arrive where we started

And know the place for the first time.

T. S. Eliot - Little Gidding

[0] https://news.ycombinator.com/item?id=29043941


How then am I so different from the first men through this way?

Like them, I left a settled life, I threw it all away

To seek a Northwest Passage at the call of many men

To find there but the road back home again

Stan Rogers, "Northwest Passage"


Sounds good. I still remember my BASIC.


That image[a] is so funny because it is so true:

Anyone who has ever written any software has felt like the unenlightened half-person in the middle of that distribution at least once -- for example, when learning how to code with a different approach in a new language.

I have felt like that more than once. Everyone here has, I suspect.

--

[a] https://twitter.com/nice_byte/status/1466940940229046273/pho...


I may be biased because I spent too much time arguing about this

but you hear those $fancy_principles / fp / hard oop / "clean code" evangelists, and then you go to any repo of real-world software - linux, compilers, kubernetes, git, blablabla - and everywhere you see for loops, gotos, ladders of if statements


I mean, you cherry-picked by quite a criterion there. It’s all C and Go, and they somewhat lack higher-level abstraction capabilities. On the other hand compilers are often written in C++, or are bootstrapped in the very same language though. Also, what about OpenJDK, Elasticsearch, all the web servers running the whole cloud? Linux might be the underlying OS, but that’s not the program actually doing business logic. If anything, it’s just another layer of abstraction.

Also, let’s be honest, C does all this “virtual method” magic on a per-project basis, which will never be understood by any tool (all those function pointers to whole new implementations, passed in from God knows where, with barely any typing). At least FP and OOP knowledge somewhat transfers and is queryable by tools.


Ok, another example: C# Compiler, .NET internals/std/yadayada, same stuff.

My claim is:

Almost all real-world, OSS, big, battle-proven code bases are nowhere even close to the evangelists' "sanity".

They're full of loops, nulls, gotos, ifs and all ""bad stuff""

>all the web servers running the whole cloud

I literally opened the first file that I saw in the nginx repo, and take a look at this:

a shitton of ifs, for loops, gotos, and all of that in one method a few hundred lines long.

https://github.com/nginx/nginx/blob/master/src/http/ngx_http...

>On the other hand compilers are often written in C++

are they?

>or are bootstrapped in the very same language though.

thus what?


I gravitate towards useful abstraction. I've written the same loops (find an element, find a maximum, do something to every element, filter elements, etc.) 10 000s of times by now. It got old after the first 100.


The more experience you have, the more you see through the code to what is underneath.

"...there's way too much information to decode the Matrix. You get used to it, though. Your brain does the translating. I don't even see the code."


This is basically the progression in any field or craft. As one becomes more experienced, one basically figures out the optimized stuff needed to successfully solve the problem at hand, successful in the sense that it both meets current requirements and enables future changes or evolution.

I describe this path of discovery as:

beginner: function over form

intermediate: form over function

transcendence: form is function

However, I will disagree that coding is just about ifs and for loops. To me, coding, programming, software development, or whatever you want to call it is about three things: how to use a computer to do something, communication between people (including your future self), and how to think about a domain. “ifs and for loops” does not capture this.


Except it never can be that simple because many systems have millions of ifs when the entire system is considered. So architecture and parallel evolution of those millions of ifs becomes an entire field of study :)


The problem with this is that most software is absolute garbage and most code is an awful, unmaintainable mess. The middle of that curve are all the people who are trying to improve that situation. The right side of the curve are the people who have given up. But maybe that does make them the smarter ones.


Do you write your code in notepad or pico, or use brainfuck as your primary programming language, or copy your code to a new folder for version control? Those things are all the simplest in their tool category.


I also like "all web development is basically fancy string concatenation", and as a web dev I feel seen.


George Hotz said something once that most modern developer jobs are depressing because you're not doing any actual programming. I.e. you're not given a problem to solve with code, you're just taping together frameworks and pieces of code that someone else wrote to order. It's a bit like studying to be a chef for five years and then having to put together one of five types of burgers.

Like everything Hotz says it's spiced up of course, but there's a kernel of truth to it.


I love George, but he's a bit of a reactionary to a fault and this anecdote is a perfect example. A person with deep knowledge and thoughtfulness will make almost the exact same point with much more nuance, aka Jim Keller: https://www.youtube.com/watch?v=Nb2tebYAaOA&t=1363s

The choice quote in contrast to Hotz is "executing recipes is unbelievably efficient -- if it's what you want to do"


There's definitely an assumption by Hotz that programming and solving "real" problems is what everyone should aspire to, and that anything else is just meaningless. Like anything in life, what's meaningful is of course completely subjective, all the way from some people actually finding it fulfilling to others just not being interested in putting in that much effort into their career and preferring to do other things with their time.


The irony of this point is if you ever watch a livestream from Hotz, he is, at an amazing level, literally taping together frameworks and systems to graduate to a point where he can begin to express solutions to a real problem. It's one of his great strengths -- nothing cannot be accomplished through hours dedicated to a problem with the extant tools we have. If he wants to impugn the flawed systems we have at our disposal it's just because nothing exists that is in congruity with what is happening in his head.

He is, however he may dislike it, good at taping together frameworks and stands as a success case for the systems he might look down on.


Making CRUD apps feels like I'm doing a data entry job


I've for sure felt this. I'm impedance-matching rather than making things that work.


It's almost like "a series of tubes" that do nothing more than squirt text around.


"U+1F346 is an eggplant" and other oddities from the tubes we build modern society on


String concatenation has its own quirks & pitfalls of course.


Pretty much. Many frameworks exist to make it safe.


It may optimize down to string concatenation, or better yet streaming output, but you really shouldn't be doing that concatenation directly. https://glyph.twistedmatrix.com/2008/06/data-in-garbage-out....


and file access


Well, quite: if you add that code must run sequentially, it's the Boehm-Jacopini theorem: https://en.wikipedia.org/wiki/Structured_program_theorem
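The construction behind the theorem is itself a nice illustration: any goto-style flowchart can be rewritten as one loop plus ifs over an explicit "program counter" variable. A toy sketch in Python (the Collatz steps are just a made-up example program):

  # Three "labels" become states; one while plus ifs replaces all the gotos.
  x, state = 5, 0
  while state != -1:
      if state == 0:      # label 0: branch on parity
          state = 2 if x % 2 == 0 else 1
      elif state == 1:    # label 1: x = 3x + 1, "goto" label 0
          x = 3 * x + 1
          state = 0
      else:               # label 2: halve; halt once we reach 1
          x //= 2
          state = -1 if x == 1 else 0
  print(x)  # 1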


Pure functional programming languages take this a step further and say everything is just composition. (From my understanding)


That's why I am seeing DB schemas without indexes lately. At the end of the day, people with this kind of thinking make others fix the broken code they leave behind.

As a senior software engineer I had to spend a lot of time at night fixing code written by junior devs and interns.

The code that the company and the devs (the "just ifs and loops" gang) were proud of was a pain in the ass for me, so I quit the job entirely and do freelancing these days.

I tried to explain what was wrong and why, but no one would listen; they all believed they were 10x developers. Life lesson learned: never, ever try to correct an idiot.

Here are some of the practices they followed:

* No indexes on tables, not even unique ones

* No DRY, no KISS

* Switching to a new shiny framework/library every week

* No tests

* Keeping the entire codebase in a single file (4k LOC)

* No versioning

* Exposing DB credentials to the entire internet and not acting when warned


I think there's a general lesson to be learned here, which is that you should be wary of any colleague within any branch of IT that honestly claims the things you're working on are "simple". I'll take an insecure colleague that wants to do a good job over a confident colleague that refuses to listen any day.

Writing code that works is simple. Writing code that doesn't break is not simple.


Isn’t this just a rant that the senior management/devs failed to set up an environment[1] to stop this sort of thing happening? I don’t really understand how it’s related to the linked tweet or the subtext of it.

[1] either a policy like strict code review or perhaps a cultural change about quality or responsibility or mentorship.


I was brought in to work on a two-year-old project built by junior devs and interns.

The company thought they were saving money, and the devs thought they were born coders, so nobody read a book on software architecture/engineering, and hence I had to deal with the big pile of ....


Yup, when I encounter these massive abstract code bases, with some tyrant "genius" running the team, I just walk away quietly.


4K LoC is small.


And really, aren't loops just ifs and jumps under the hood? So coding is just ifs


Indeed. Jump on zero and integer manipulation are sufficient for turing-completeness. For example: https://en.wikipedia.org/wiki/Counter_machine


Can it compare two values and do one thing or another depending on that comparison? (IF)

Can it do something multiple times? (LOOP)

Can it change a piece of data?

Congrats, it's Turing complete.



All you need is `mov`

`mov`

`mov` is all you need.




It’s a pretty fun and enlightening exercise to sit down and work out how you’d implement all the usual instructions when all you have is Subtract And Branch If Negative.
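For a taste, here's a toy subtract-and-branch machine in Python (a made-up encoding, with code and data kept in separate arrays for clarity, branching when the result is <= 0, which is one common variant): MOV falls out of just four instructions.

  # Each instruction: mem[b] -= mem[a]; jump to target if the result <= 0.
  def run(prog, mem):
      pc = 0
      while 0 <= pc < len(prog):
          a, b, target = prog[pc]
          mem[b] -= mem[a]
          pc = target if mem[b] <= 0 else pc + 1

  A, B, Z = 0, 1, 2         # Z is a scratch cell assumed to start at 0
  mem = [7, 42, 0]          # goal: copy mem[A] into mem[B]
  prog = [
      (B, B, 1),            # B -= B  -> B = 0
      (A, Z, 2),            # Z -= A  -> Z = -A
      (Z, B, 3),            # B -= Z  -> B = A
      (Z, Z, 4),            # Z -= Z  -> Z = 0; pc runs off the end and halts
  ]
  run(prog, mem)
  assert mem == [7, 7, 0]   # MOV built from nothing but subtract-and-branch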


it’s NANDs all the way down


It's NP-junctions all the way down.


all the way up, shurely?


and capacitors


Wires figure in there somewhere.

And lack of wires in the other places.


Until you find your NANDs are made of XORs :)


https://youtu.be/otAcmD6XEEE?t=1965

Sequence, selection, iteration (or recursion if available).

Note that the child overlooked the assumption that it's sequential.


Haha, yeah I was thinking the same.

I started in GW-BASIC when I was a kid, and all I needed was IF and GOTO. You could do anything with that. No loops, no functions, no nothing. IF's and GOTO's!!!


Ach no! It's arrays, too.


iterating and conditionals?


If you want to go even lower than that, coding is basically just saying yes or no in response to a yes or a no.

Sure, that's oversimplifying it, but that's the smallest unit of information being changed during computation.

But yes, once you learn the basics that are shared between most programming languages and don't get distracted by the nuances, it doesn't take that long to pick up a different language. Being proficient is of course another question, but achieving a basic understanding shouldn't take all that long when you just focus on how to if-else, how to set up loops, and how to assign variables.


All chemistry is just sharing electrons.


Well, it's more complicated than that - you see, occasionally the electrons are not shared, but given up entirely.


but that's one-sided sharing.


and electrons are just quarks, and quarks are just a state in a quantum field, and a quantum field is just...


Or positrons.


Not quite. It has to somehow store and later retrieve a few (potentially unboundedly many) “yes-nos” to be general enough.


All machines that satisfy a test of whether they can do yes-no, store the result of that yes-no, and then go to another yes-no are verifiably considered yes-no machines, that is, they are yes-no complete.


yes, at the lowest level it's binary code, 0 and 1


I like to think of myself, actually, as not a code writer, but an author. I just use zeros and ones instead of words 'cause words will let you down.


I like that. Someone on my team referred to us as (data) plumbers and I thought that was a pretty fitting analogy too.



This is funny, but it's like saying "Math is basically pluses and minuses".

I see coding as playing with hardware without having to use soldering iron.


A marathon is just putting one foot in front of the other, after all. What’s the big deal? I mean a two year old can do that, and they can’t even handle logic yet.


As a marathon runner, people sometimes ask "How does a person run for 3 hours O.O" and well it's about the same as running for 5 minutes except you don't stop.

Simple != easy


Strength training is equally simple. Just keep increasing the weight you lift.


Being thin is also easy, just don’t eat too much.


They said simple, not easy.


Being pedantic is simple, you just say things that don't contain detectable errors.

(I'm agreeing with you about "simple, not easy")


Mathematics is "just" set theory and everything else, including arithmetic, can be built on top of that.


Honestly that's not a great example given that you can't understand ZFC until you already know enough set theory to understand the motivations for ZFC.


Well, it is just counting natural numbers and making up placeholders for whenever a subtraction or division wouldn't work out.


Yeah we went beyond Presburger a long while ago


Maybe math is just equations and sets.


It is just sets! Set theories like Zermelo–Fraenkel can be the foundations for all of mathematics.

https://en.wikipedia.org/wiki/Set_theory


I think he knows.


Math is just writing on a blackboard. Equations and sets optional.


Math is just addition and subtraction.

Coding is just copy and pasting boilerplate code and googling how to make it work.


Copilot. FTFY.


UGH. Back in my day the only language was BASIC and we only had IF and GOTO. Dijkstra has made these children SOFT and I'd piss on his grave if I could be arsed to get out of my rocking chair.


What version of Basic were you using that lacked FOR? Even the shitty one crammed into my Sinclair ZX80 had FOR.


Oh, I can't say that it didn't. But I was only 7, and more amused by things like

  10 PRINT "POOP"
  20 GOTO 10
than, what, reading a non-existent manual? My parents weren't programmers, so I learned by stumbling in the dark.

But, you've got me curious. I recall using BASICA, GWBASIC and QBASIC -- reading over their respective histories, I'd have gotten my start on BASICA on a hand-me-down 286. I'm not finding good docs on the language, but a familiar program [1] uses FOR loops -- so they were supported. But I distinctly recall hand-rolling a for-equivalent loop with GOTO and IF, counting down to zero.

[1] https://web.archive.org/web/20130918210121/http://www.coding...


More visually pleasing, and several keystrokes shorter, for rapid input on an unattended store demo machine:

10 print "poop ";:run

FWIW, the very first version of BASIC in 1964 supported FOR (https://en.wikipedia.org/wiki/Dartmouth_BASIC#First_Edition). I never used DOS machines but https://gunkies.org/wiki/Microsoft_BASIC claims that BASICA was based on Microsoft's BASIC, and that definitely had FOR.

Young you must have arrived at the concept of a for loop without knowing there was a keyword in the language designed to support that, which is pretty cool!


In my twenties, I wanted to use all the cool PL techniques: meta-programming, reflection, monads, macros, type-level programming, etc. I'm getting older and now I want my programs to be 90–95% functions, arrays, structs, enums, and loops and only parsimoniously throw in more advanced language features.


Yes. The 'paradigms' have seriously diminishing marginal returns.

Super basic imperative typed language with maybe some kind of nice way of handling nulls and that doesn't use pointers ... is most of what we need.

Everything else is very expensive optimization.


So I hear you want Clojure.


I really don't.


Cool.


There’s a joke in the fp community I can’t find right now that describes the evolution of programs from imperative side-effectful statements to a for comprehension, with exception catching, that looks nearly identical.


Probably not the right one, but "The Evolution of a Haskell Programmer"[0] sounds like a similar idea which goes from a Freshman Haskell programmer's simple factorial to the abstract heights of a Post-doc Haskell programmer, then back down to a Tenured professor's simple factorial.

0 - http://www.willamette.edu/~fruehr/haskell/evolution.html?


Then what the fuck is a monad?!?

Seriously, I still don’t know what a monad is, and apparently it’s just a bunch of ifs and for loops, so I guess I’m pretty stupid.


It's like if you wrapped every statement ending in ; in a C program with the same macro.


I’m a Python programmer. We don’t have macros … or semicolons.


A monad is just a monoid in the category of endofunctors... duh!


It's not "the" monoid in the category of endofunctors though, just "a" monoid. That also describes Applicative (less powerful) and Arrow (more powerful).
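If the category theory doesn't land, here's a deliberately down-to-earth sketch of one particular monad (Maybe/Option) in Python, using None for failure; the bind and safe_div names are made up, since Python has no built-in Maybe:

  # bind chains steps and short-circuits on failure; underneath, it's just an if.
  def bind(value, f):
      return None if value is None else f(value)

  def safe_div(a, b):
      return None if b == 0 else a / b

  result = bind(bind(safe_div(10, 2), lambda x: safe_div(x, 0)),
                lambda y: safe_div(y, 3))
  print(result)  # None -- the division by zero aborted the whole chain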


A monad is like when your cat poops on the carpet, so you wrap your cat in toilet paper. Then your cat has kittens, and the kittens are born with toilet paper wrapped around them!

No here’s a better explanation:

A monad is like when you have a clock, but it’s broken and goes too fast. So you rewind it, and the next morning you wake up and it’s 5000 BC.

Or here’s a better explanation:

A monad is like when Donald Trump won the election, and then stacked the Supreme Court, and now we’re totally fucked!

That’s it! That’s exactly what a monad is!


It's what you add to a search to get more interesting answers on Google.


I don't understand monads either.

They look like concise programming for mathematicians, which, in my opinion, is tending towards set theory notation. https://xkcd.com/2545/

Personally I prefer more names, more XML tags, more comments, more parables, more hyperlinks, more different ways of expressing the same thing, to make it less ambiguous and easier to communicate and agree what we're all talking about.


Actually coding is just literary translation into a language spoken only by pedantic idiot savants

That's why it's pedantic idiot savants who tend to be the best coders


While Carmack is law, I don't think this was worth a share here, as it was just a cute thing a young kid said? Only slightly jokey, whatever.


Good code explains the problem to the reader and implements the solution as a side effect.

You can often express the problem quite well using a combination of if, for, comments, variable names, function names, list/set and map data structures.

Sometimes (<10%) you encounter a problem that's interesting enough to define a custom type, and sometimes you make a tree data structure out of it (<1%). On very rare occasions (<0.1%), you need a more complex data structure.

Code that uses thousands of classes is very hard for a reader to approach, but so is a single file with never-ending ifs and for loops. Always write code for another person. If nothing else, that other person might be you in a few months (or much sooner).

Coding is basically just writing for humans in a language a computer can execute.


All life is just cytosine, guanine, adenine, and thymine.


Those mean nothing without the protein machinery that translates codons into other proteins, though. DNA and RNA without ribosomes is like a language without a compiler.


Umm.. what are those codons and other proteins made of? ...


Codons are an (effectively arbitrary) grouping of 3 consecutive nucleic acid bases that ribosomes can translate into an amino acid when building a protein. Technically any combination of 3 consecutive bases could be considered a codon; however, biologists usually only group base pairs based on how they are normally read by a ribosome. Note that this means it is possible for a single sequence of nucleic acids to code for multiple proteins, with the only difference being the starting offset.

Proteins are made of amino acids, not nucleic acids. Nucleic acids are polymers whose elements contain a phosphorus group. Amino acids contain a nitrogen group (an amino group, to be specific).


They are distinct biopolymers. DNA and RNA are polymers of just 5 different types of nucleotides. Proteins are polymers of some 20 radically different amino acids.

https://en.wikipedia.org/wiki/Proteinogenic_amino_acid


A codon is just a triplet of nucleotides. But proteins are made out of amino acids; they have “nothing” to do with DNA in itself. It just happens that DNA, transcoded via the codon table into proteins, encodes some useful ones that in certain environments can bootstrap the whole process and do some useful work.


Everything, everywhere is just entropy.


Coding is basically just ifs and for loops.. But software engineering (or development) is much more than just coding.


Isn't software engineering pretty much ifs and loops too? Massively complicated ifs, and looping over pretty much the same thing over and over again.


I like the phrasing "software engineering is programming integrated over time." It involves things like rollouts, migrations, deprecations, adapting to new requirements, and designing in such a way as to make all of those easier.


The part that does stuff is just ifs and + signs. But making a computer do stuff isn't hard; that's what they're for. The problem in software engineering is stopping it from doing the wrong things.

(Hot take: the way we do this is wrong; we should be adding superpowerful not-doing-stuff features to languages, vs doing-stuff features.)


coding is just ifs and for loops, computer science is just types, (to me) software engineering is a mix of those two


I like "Coding is to Software Engineering, as building a chair is to running a furniture factory". You need to know a fair amount about the former to excel at the latter - but the latter also requires project management, process and product design, collaboration, and communication skills.


I remember explaining recursion to an aspiring programmer to apply to some tree node walking or something, and at some point it clicked! I saw the second it worked in the reflection in her eyes, they got big and lit up and there was this palpable sense of "a-ha" in the room! It was one of the coolest moments of my professional life. But yeah, my kids (one of whom is picking up programming) would be right behind the "ifs and loops" statement.


Reminds me of a coworker who said that the only data structure you need is an array, since it can be used to mimic every other data type.


Somewhere in the 90s a colleague asked why we needed those pesky types when we had void *. We fought, I won. (But somehow we still ended up with Javascript 20 years later.)


Sounds like the rebirth of the "Expert System". Now with neural networks ;)

Learning from Artificial Intelligence’s Previous Awakenings: The History of Expert Systems

https://ojs.aaai.org/index.php/aimagazine/article/view/2809


Coding is just electrons having fun.


my professor at uni always said that machine code is just 1's and 0's and if you can count to one you can understand computers

(with a smile on his face and shrugging his shoulders)


The way I interpret this is that the 'just ifs and for loops' is like Matrix rain code. In the beginning it looks like gibberish scrolling down the screen. When you master it, it's still gibberish scrolling down the screen, but it's simultaneously something else on another level as well.

I often find myself writing simple things with a compact-but-high-level conceptualization, and when they're edited by someone else, it's clear that person only saw the rain.


Ifs and for loops are trash. Real programmers just write massive linear algebra operations that they can throw on a cluster of GPUs to get 50,000x parallelism. ;)


I wish I could have a son who understands Turing completeness without being explicitly told... I guess that's the power of the Carmack bloodline.


also "coding is basically over glorified plumbing". ~ my cynical coworkers


And thank goodness that it IS glorified!


and ops is custodial engineering for software. we mop up what leaks out after the plumbers are done


you guys have for loops?


This was my first reaction too. Now I work in a (Swift) codebase where I could probably count the number of for loops we have on one hand. Pretty much everything is done through map/reduce/filter, etc.

At first, I thought it wasn't as readable, but now that I'm used to the syntax, I think it's much easier to parse what's happening. I know what transformation is happening, and I know there shouldn't be any side-effects.


We used to. Now we mostly chain map functions. But there are those rare instances where we don't operate on iterables and still need to do something many times. I've seen people so weirded out by this that they prefer to generate an iterable to work on rather than use a for loop.


Lol, no. It is a whole set of skills, from extracting just the right abstractions from the problem domain and defining adequate sets of interfaces, to arriving at a just-right (clear, concise, mostly-functional) modular implementation.

Almost no one nowadays possesses these skills. All we have is node_modules crap and J2EE-like design bullshit.

This is like saying that poetry is just words and commas.

But he is just a kid, I know. ;)


Ahem, and math in between. Quite a lot of it.

Calculating the square root of a number is ifs and loops but wouldn’t be much fun without the math.


I'm one of a small number of people at my FAANG company who can work on a legacy multi-million-LoC Perl project.

Many of them are much smarter than me, but I think my insight, which I remember getting around 6 months in, is that it's just code: in many places hard to read/follow, but it will ultimately just do a logical set of instructions.


Building a house is just lumber and nails.


It really is. Everything complicated is either an interface or handling some edge case. Sound familiar?


His mind will be blown to learn it all runs on protons, neutrons, and electrons.


There's 57 elementary particles in the Standard Model.


Quite literally how I explain coding to people.

It’s basically that, plus resource management.


I describe coding as similar to a game of chess, in that it requires you to think several moves in advance.

Both involve filling up your mental stack.


Ah, keen observation, young grasshopper! But nota bene: just as man cannot live on bread alone, one's understanding does not arrive merely from the consideration of a collection of atoms.


Yeah, it's just branching and iteration. The kid's right.


Just to be clear: a language with only if() conditionals and for() loops with a known number of iterations (so, for instance, all for-each loops) isn't actually Turing complete. This is called primitive recursion, and an easy-to-prove example of what it cannot compute is the Ackermann function.

Also, you can easily see that something is fishy, because all such code must terminate (so where's the halting problem here?)

https://en.m.wikipedia.org/wiki/Primitive_recursive_function

Otherwise I completely agree with the aphorism.
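For the curious, here's the function in question, in Python. Every program built from only bounded loops computes something at some fixed level of the primitive recursive hierarchy, and Ackermann grows past every level, so computing it needs unbounded recursion or a while loop:

  def ackermann(m, n):
      if m == 0:
          return n + 1
      if n == 0:
          return ackermann(m - 1, 1)
      return ackermann(m - 1, ackermann(m, n - 1))

  print(ackermann(2, 3))  # 9; ackermann(4, 2) already has 19,729 digits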


And functional programming is just algebra. Ez Pz.


Thing with loops, they are complicated. Like infinity. I remember when someone taught me about multiple infinities. It is wonderful!


A Turing machine is basically just ifs in a loop.


for-switch, the weird uncle of the Turing machine family.


It's not about the code, it's always been about the data over spacetime. You have a data problem, not a code problem.


Computers are just infinite tape marked out into squares... (Turing machine). A pyramid is just a stack of cubic rocks.


The earth is just a ball of mud


It's a ball bearing covered in dust



Wait till he reads about a Turing machine


I really wish it was as simple as that.


And people nagging you. No, clueless people nagging you with things like Jira, and idiots like Jeff Sutherland.


Wild stab in the dark here, but for some applications it's possible you may need assignments too.


Until he finds out about Promise.then chains. So, programming is basically just ifs, for loops, and then.


It’s all just strings. Make the right strings happen in the right place and the computer does things.


Hey I found a post about my username.


One time I told someone I work on web apps and they said “oh that’s just html and css that’s easy.”


It's nice to learn so much about someone so quickly.


It's just 0s and 1s, really.


Well it is ifs and loops and assignments. Kind of not useful without the assignments.


Until you get to network programming, then it's "just open a socket"


Joke aside: we know a language is Turing-incomplete if it has only ifs and bounded for loops. And if 90% of programmers' tasks can be done using a Turing-incomplete language, are there any benefits we can get from this?


for loops aren't bounded, though; that's for...in loops. With for loops you write your own stopping condition, which was probably a mistake.


Yeah, but a master of coding uses a lot of Mathematics, like his dad does :-)


Add in some form of mutability and you’ve got yourself a Turing machine, yes.


Living is just moving stuff around based on the stimulus received. :-)


All we need are functions!


In fact it is just ifs and jumps. Loops are a higher level construct.


No, coding is just converting some data into other forms of data.


I once told a girlfriend that's what I did. It worked, she wasn't impressed with my job at all after that.


In the eyes of an 85-year-old Italian woman I was a "writer" (or perhaps just "typist" - something could have been lost in translation).

Fair enough.


10 years on and I'm still converting relational data into HTML or JSON


I had a brief time in my career where we used a database that you would query and it just returned the html. It was amazing.


I kinda want to give you a hug and say it's all gonna be okay...


I squint at a screen and push buttons. But which buttons in which order matters.


We're data plumbers


Routing shit around the world…


"construction is basically just concrete and steel" :)


Amateur. We do branchless coding and unroll every loop here.


haha, it is! it is a wonderful experience when you make that realization. That's the point with programming. Easy to learn. Difficult to master.


Coding is basically just ifs and while loops.


Or, coding is basically just folds and filters.


Gastronomy is just fat, sugar and protein.


And alcohol!


With input, output, and read/write to arbitrary memory... then I think that just about satisfies Turing completeness.


And c++ is basically just c with classes.


Coding is basically just zeros and ones.


For loops are basically ifs and gotos.


And print. Don't forget print


Computers are just thinking rocks.


> When we write programs that "learn", it turns out that we do and they don't.


Programmers are just linguists that use and make artificial languages, like Esperanto!


It’s just 0’s and 1’s, baby


just when you think it is, concurrency is a whole new level


And datastructures.


And assignments.


Humans are basically just ifs and for loops


The masses are just loops. The mindful are gets and loops. The ruling class are gets, sets, and loops. The elite are ifs, gets, sets, and loops.


With quantum irrationality and self induced sleep deprivation!


At least 5 ifs


Whys?


And types.


git is just a graph, man


I think it's funny that it's kind of like a blockchain, and as soon as the business people and MBAs realize this, they're going to "evangelize this amazing application of blockchain to all code!" to death.


So Bitcoin is just Git with extra steps!


Github is just programmer Wordpress.


it's a chain. of blocks... hmmm.


Ok


“coding is basically just ifs and for loops.”

Maybe he only knows that much, yet.


This is the most insightful thing I've ever seen on Twitter.



