MongoDB didn't become a public company through innovations in fundamental distributed database technology, or even through good engineering. They became a public company because once JavaScript became adequate for building client software, there was a strong incentive to build MVPs using the same data structures from client to server to DB; and once you've built an MVP that gets adoption, there's a strong incentive not to switch databases.
That's the sort of shift in the environment that the grandparent is talking about. Fundamental CS tech was arguably better in the 1970s and 1980s, because it moved more slowly and you had time to get the details right. That doesn't matter if you're building, say, a mobile Ethereum wallet in 2018, because you're building for the user expectations of today: users don't care about data integrity or security as long as the product doesn't fail during the period when they're deciding which tech to use, and software that solves the problem (poorly) now is better than software that doesn't exist.
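To make the "same data structures from client to server to DB" point concrete, here is a minimal sketch of the pattern, assuming the standard Node.js mongodb driver (the collection and field names are made up for illustration). The object the browser produces is the object the server stores, with no ORM or schema-translation layer in between:

  import { MongoClient } from "mongodb";

  // The same shape travels from the browser form, over JSON, into the database.
  interface User {
    name: string;
    prefs: { theme: string; newsletter: boolean };
  }

  async function saveUser(uri: string, user: User): Promise<void> {
    const client = new MongoClient(uri);
    try {
      await client.connect();
      // No migration, no object-relational mapping: the document is the object.
      await client.db("app").collection<User>("users").insertOne(user);
    } finally {
      await client.close();
    }
  }

That convenience is exactly the lock-in described above: once an adopted MVP's data model is a pile of these documents, moving to a different database is real work.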
> Fundamental CS tech was arguably better in the 1970s and 1980s, because it moved more slowly and you had time to get the details right. That doesn't matter if you're building, say, a mobile Ethereum wallet in 2018, because you're building for the user expectations of today: users don't care about data integrity or security as long as the product doesn't fail during the period when they're deciding which tech to use, and software that solves the problem (poorly) now is better than software that doesn't exist.
I believe you are a victim of survivorship bias.
There was plenty of shitty software in the 70s and 80s. The difference between then and now is that we haven't had four decades to see which software of 2018 stands the test of time.
I agree; back then there wasn't a mindset of "move fast and break things". It was foundational research, and a lot of the results and lessons from that era are still applicable today.
In the 1980s there was a lot of "foundational research" (poorly re-inventing the wheel) by microcomputer people who did not know about the work done on large computers in the 1960s. "Move fast and break things" was also very much a thing for microcomputer manufacturers and most microcomputer software vendors. Look at how many releases software packages went through, and at what rate.
I think you're agreeing to disagree. His viewpoint, to me at least, is the opposite of yours, or at best parallel to it. He never said that there's no foundational research today.
And the mainframes that run our banks, transportation systems, healthcare, public safety, etc. Use the right tool for the job and price it against what the market will bear. Pacemakers and insulin pumps driven by npm updates (shudder).
It's before my time, but I'm pretty sure people had at least an intuitive understanding of what partitioning does to a datastore before Eric Brewer wrote it down.
I'm not sure why you're getting downvoted; I'll upvote you.
The "modern" database systems are now going back to the exact design principles that the books you refer to solved long time ago. There is tons of research, dissertations,.. that focuses on this from decades ago.
Its just now that the new systems realize that these problems actually exist.
If you dont know the history of a certain field and what came out, you repeat and make the same mistakes again. This seems to also apply to software engineering.
Yeah, distributed database systems from the late '70s and early '80s actually had certain transactional guarantees that some of these "modern big distributed systems" you refer to still don't have.
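For anyone who has not run into it, the guarantee in question is roughly multi-record atomicity. Here is a rough sketch of the failure mode, assuming a document store without multi-document transactions and using made-up account documents via the Node.js mongodb driver:

  import { MongoClient } from "mongodb";

  interface Account {
    _id: string;
    balance: number;
  }

  // Without a multi-document transaction, a crash or partition between the two
  // updates leaves money debited but never credited, which is precisely the
  // invariant those older systems protected with atomic transactions.
  async function transfer(uri: string, from: string, to: string, amount: number) {
    const client = new MongoClient(uri);
    await client.connect();
    const accounts = client.db("bank").collection<Account>("accounts");

    await accounts.updateOne({ _id: from }, { $inc: { balance: -amount } });
    // a failure right here breaks the books
    await accounts.updateOne({ _id: to }, { $inc: { balance: amount } });

    await client.close();
  }

On stores that do support it (MongoDB added multi-document transactions in 4.0, for example), the two updates would typically go inside session.withTransaction() so that both apply or neither does.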