
> Many interactions are not entirely on/off, and they are influenced by a lot of other factors in the environment of the cell. This is a part that really doesn't match well to how people understand coding.

Eh, I think you're correct that it's a rough and oversimplified analogy, but those same assumptions about how digital tech works break down when you do digital tech at large scale as well (FAANG).

A well-known example, plus cases I've seen repeat over the past few years: I/O latency variations caused by noise (someone yelling at a disk array), bit flips on the network corrupted just right to evade TCP's error checking, ICMP packets bit-flipped just right so they crash network control-plane software, ASICs with die defects such that they brick if you try to load a specific set of subnets into TCAM n times, CPUs that wear out such that specific instructions return erroneous results after n years of use, and so on.
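The "evades TCP error checking" case is easy to demonstrate: TCP's checksum is the 16-bit ones'-complement sum from RFC 1071, and flipping the same bit position 0→1 in one 16-bit word and 1→0 in another leaves the sum unchanged. A minimal sketch (the payload bytes are made-up illustrative values, not from any real capture):

```python
# Sketch: the RFC 1071 Internet checksum (used by TCP) cannot detect
# certain multi-bit corruptions. Flipping a bit position up in one
# 16-bit word and down in another cancels out in the sum.

def internet_checksum(data: bytes) -> int:
    """Ones'-complement sum of 16-bit big-endian words, per RFC 1071."""
    if len(data) % 2:
        data += b"\x00"  # pad odd-length input
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]
        total = (total & 0xFFFF) + (total >> 16)  # fold the carry back in
    return ~total & 0xFFFF

original = bytes([0x00, 0x01,    # word 0: bit 1 is clear
                  0x00, 0x03])   # word 1: bit 1 is set
corrupted = bytes([0x00, 0x03,   # word 0: bit 1 flipped 0 -> 1
                   0x00, 0x01])  # word 1: bit 1 flipped 1 -> 0

# Different payloads, identical checksum: this corruption is invisible
# to TCP, and only an end-to-end check (or link-layer CRC) can catch it.
assert original != corrupted
assert internet_checksum(original) == internet_checksum(corrupted)
```

Link-layer CRCs catch most of these, but a flip that happens inside a router (after the CRC is verified, before it's regenerated) hits exactly this gap.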

Basically, with warehouse-scale computing you encounter the consequences of very unlikely events every day, and many unlikely events happen at the interface of digital to analog (which is defined by... different chemical and physical properties).

At some point you accept that the details will always be fuzzy, but if the model works well enough most of the time, it's good enough for most practitioners. You just need to socialize that there are exceptions and make sure people know where to turn when something doesn't behave as expected.
