The idea of a unified computer resembles what happened when the iPhone merged the calculator, the MP3 player, and the phone.
With Ubuntu's attempt a decade ago, https://www.indiegogo.com/projects/ubuntu-edge#/, it was obvious there is a market for this. But the ecosystem chain beats it all: everyone will wait for their favorite OS to catch up.
Maybe that's a bad example, as your build can fail because of a breaking change in a dependency regardless of whether you use a statically typed language.
Also, your statement is only partially correct. Breaking changes in dependencies end up in production only if you don't have tests. And I know this is news to many people using static types, but in many Ruby shops, for example, test coverage is in excess of 90%, and at the very least I never approve a PR without happy-path tests.
Refactor a function returning a string into another function returning a string: everything compiles, yet without tests nothing works in production, because it's not the same string.
On top of that, mocking in tests can also hide the string-breaking change you don't yet know about.
Off the top of my head, I've seen this happen with a base64 string padded vs. unpadded, or with emojis making their way through when they didn't before, etc.
So yeah, the compiler tells you which pieces of your jigsaw apparently fit together, but tests show you whether you get the right picture on the jigsaw at the end (or at least in some regions).
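To make the padded-vs-unpadded trap concrete, here's a minimal TypeScript sketch (the function names are hypothetical, and it assumes Node's Buffer): both encoders have the exact same signature, so the compiler is satisfied either way, and only a behavioral test notices the swap.

```typescript
// Two type-identical encoders: the compiler cannot tell them apart.
// (Hypothetical names, for illustration only.)
function encodeTokenV1(payload: string): string {
  return Buffer.from(payload).toString("base64"); // padded: "aGk="
}

function encodeTokenV2(payload: string): string {
  return Buffer.from(payload).toString("base64url"); // unpadded: "aGk"
}

// Both satisfy the same signature, so swapping them compiles cleanly...
const encode: (payload: string) => string = encodeTokenV2;

// ...and only a behavioral check catches the difference:
console.assert(encode("hi") === "aGk=", "consumer expects padded base64");
```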
> Breaking changes in dependencies end up in production only if you don't have tests.
Which are opt-in in dynamically typed languages.
You get the same functionality in statically typed languages and it's not opt-in, AND the developer doesn't have to do the work of type-checking (the compiler does it).
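A quick TypeScript sketch of what "not opt-in" buys you (the parseAmount dependency is hypothetical): when the dependency changes its return type, every call site stops compiling, with zero test code written.

```typescript
// v2 of a hypothetical dependency: parseAmount used to return `number`,
// now it returns a structured result.
type ParsedAmount = { value: number; currency: string };

function parseAmount(input: string): ParsedAmount {
  const [value, currency] = input.split(" ");
  return { value: Number(value), currency };
}

// A call site still written against v1 no longer compiles -- the
// breaking change is caught before anything reaches production:
// const total: number = parseAmount("19.99 EUR");
// ^ error TS2322: Type 'ParsedAmount' is not assignable to type 'number'.
```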
In a utopian scenario, yes, you fix type-checking and you can live happily ever after.
In an interoperable environment, where additional programming languages/teams are in play (which is very frequent nowadays), you get the problem of centralizing types/entities/schemas, plus reused types, partial types, and union types, and the story goes on.
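To illustrate the sprawl (a hypothetical User entity, sketched in TypeScript only for brevity; in a polyglot setup each of these shapes also has to be mirrored in every other language in play):

```typescript
// The "centralized" entity...
interface User {
  id: string;
  email: string;
  displayName: string;
}

// ...immediately spawns derived shapes:
type UserPatch = Partial<User>;                        // partial type for updates
type PublicUser = Omit<User, "email">;                 // reused subset for the API
type UserLookup = { id: string } | { email: string };  // union type for queries

// Every one of these now needs an equivalent in each other
// language/service that touches users.
function findUser(query: UserLookup): PublicUser | undefined {
  // ...lookup elided...
  return undefined;
}
```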
At some point, instead of typing being an invisible tool, it becomes the tool. Then you superset the tool onto other languages, because those languages are stuck in an immutable state. In a few decades another dynamically typed language will emerge, people will take the same spiral, and even more time will go into arguing it all over again.
I mean, sure, if your test suite does fuzzing, then I guess so. Most test suites these days focus on higher-level stuff like behaviours and entire features, instead of testing in isolation technicalities like function signatures and what happens if you pass a null here or there.
The same applies to type systems these days - you don't just express basics like nullable or not, you build high-level business concerns into your types.
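For example, a common branded-types sketch in TypeScript (CustomerId/OrderId are hypothetical names): both are plain strings at runtime, but the type system encodes the business rule that they must never be mixed up.

```typescript
// Branded types: structurally both are strings, but nominally distinct.
type CustomerId = string & { readonly __brand: "CustomerId" };
type OrderId = string & { readonly __brand: "OrderId" };

const customerId = (raw: string): CustomerId => raw as CustomerId;
const orderId = (raw: string): OrderId => raw as OrderId;

function cancelOrder(id: OrderId): void {
  /* ...elided... */
}

const cust = customerId("c-42");
const ord = orderId("o-7");

cancelOrder(ord);     // fine
// cancelOrder(cust); // compile error: CustomerId is not assignable to OrderId
```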
Leaking bugs, I believe, doesn't come down to static vs. dynamic typing; it's mostly about deployment. Your types might match, but you would still leak bugs in dependencies :/
From a CI/CD perspective, you should make sure that updates won't break things. As others suggest, a maintainable project would have test suites.
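As a sketch of what that can look like (assuming a Jest-style runner; the some-money-lib package and its formatMoney function are hypothetical): a contract test pins the behavior you rely on, so a dependency bump that changes it fails in CI instead of in production.

```typescript
// Hypothetical: formatMoney comes from a third-party package.
import { formatMoney } from "some-money-lib";

// Contract test: pin the exact output we rely on, so a dependency
// update that changes it fails in CI, not in production.
test("formatMoney keeps the format our invoices parse", () => {
  expect(formatMoney(1999, "EUR")).toBe("€19.99");
});
```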
Except if you aim for a program that you will never update again: write the code once, compile it, and archive it. If you decide to keep that program available to potential clients, be prepared to back up the dependencies, the OS it runs on, and everything else that makes it operable. If there is a breaking change in the ecosystem of that program, it will break it.
No, we used a tool, and we got addicted. Some fought that addiction; others haven't yet figured out what's going on. Free market and shiny gadgets.
It's a pity: of all the web advancements - JS, CSS, runtimes, web engines - HTML was the most stagnant part, despite the "HTML5" effing hype. My guess is they did not want to empower HTML and threaten SSRs or similar solutions. I believe the biggest reason for not taking that step is the damned backward compatibility. Some just won't budge.
HTML5 hype started strong out of the gate because of the video and audio tags, with canvas following slightly after. Those HTML tags were worth the hype.
Flash's reputation was quite low at the time and people were ready to finally move on from plugins being required on the web. (Though the "battle" then shifted to open vs. closed codecs.)
As far as I understand, this might be the best tool for someone who acquires a SaaS and has no clue how to develop/tweak stuff. Then again, I still have no clue how you enable this, or whether you still run Tilt in production. Marketing is not a big deal for Tilt, as far as I understand.
You bring up an interesting question -- I think Tilt works best for those who find themselves in an environment where the product is delivered using a service-oriented architecture deployed to Kubernetes. It's also easy to get started within a small team in a big company.
To your other point, Tilt is to development as ArgoCD is to deployment. Tilt enables on-demand, reproducible development environments that are sufficiently high-fidelity that you can often replace your shared and/or long-lived testing clusters.
With Tilt, I test my application using the same Kubernetes specs / Kustomizations / Helm charts that I use to deploy into production. When it comes time to deploy my application, I supply these same specs / Kustomizations / charts to ArgoCD.
Because I can reuse the specs for both testing and production, I enjoy far greater testability of my application, improving quality and time to market.