> browsers have various popular language runtimes (and perhaps even popular libraries) preloaded, so that all web pages requiring that runtime can share the same (read-only) copy of that code.
That sounds a lot like the idea from some years past that commonly used JavaScript frameworks would be served from a few common CDNs and would be widely enough used to be almost always in the browser's cache, and therefore wouldn't need to actually be downloaded for most pages (hence, the size of the JS frameworks shouldn't matter so much).
I'm no expert, but from what I understand, that didn't really work out very well: a combination of too many different versions of these libraries (so each individual version was actually not that widely used) and, later, privacy concerns that moved browsers toward partitioning the cache by site or origin. Maybe other reasons too.
Of course, you didn't mention caching and perhaps that's not what you had in mind, but I think it's a tricky problem (a social problem more than a technical one): do you add baseline browser support for an ever-growing number of language runtimes? That raises the bar for new browsers even further, and anyway you'll never support all the libraries and runtimes people want. Do you let people bring their own and rely on caching? Then how do you avoid the problems previously encountered with caching JS libs?
These are good questions and I think there's more than one answer that's worth exploring.
I think that the privacy problems caused by shared caches could be solved, without simply prohibiting them altogether. Like, what if you only use the shared cache after N different web sites have requested the same module?
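A minimal sketch of what such a policy might look like, in TypeScript-flavoured pseudocode. None of these names are real browser internals; PROMOTION_THRESHOLD plays the role of N here:

// Hypothetical sketch of a "promote after N distinct sites" cache policy.
// Nothing here is a real browser API; it only illustrates the idea.

const PROMOTION_THRESHOLD = 3; // the "N" different top-level sites

interface CacheEntry {
  bytes: ArrayBuffer;
  requestingSites: Set<string>; // top-level sites that fetched this module themselves
  shared: boolean;              // once true, any site may get a cache hit
}

const cache = new Map<string, CacheEntry>(); // keyed by content hash

function lookup(hash: string, topLevelSite: string): ArrayBuffer | null {
  const entry = cache.get(hash);
  if (!entry) return null;
  // Before promotion, only sites that already downloaded the module themselves
  // see a hit, so the cache reveals nothing about other sites' history.
  if (!entry.shared && !entry.requestingSites.has(topLevelSite)) return null;
  return entry.bytes;
}

function recordFetch(hash: string, topLevelSite: string, bytes: ArrayBuffer): void {
  const entry = cache.get(hash) ?? { bytes, requestingSites: new Set<string>(), shared: false };
  entry.requestingSites.add(topLevelSite);
  if (entry.requestingSites.size >= PROMOTION_THRESHOLD) entry.shared = true;
  cache.set(hash, entry);
}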
But if we really can't get around that problem, then I think another approach worth exploring is for there to be some sort of curated repository somewhere of Wasm modules that are popular enough that browsers should pre-download them. Then the existence of the module in a user's browser doesn't say anything about what sites they have been to.
Versioning is a problem, yes. If every incremental minor release of a language runtime is considered a separate version, then it may be rare for any two web sites to share the same version. The way the browser solves this for JavaScript is to run all sites on the latest version of the JS runtime and fully commit to backwards compatibility. If particular language runtimes could also commit to backwards compatibility at the ABI level, then you would only need to pre-download one runtime per language. I realize this may be a big cultural change for some of them. It may be more palatable to say that a language is allowed to do occasional major releases with breaking changes, but is expected to keep minor releases backwards-compatible, so that only a couple of different runtime versions are needed. And once a version gets too old, it falls out of the preload set -- websites which can't be bothered to stay up to date get slower, but that's on them.
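To make that concrete, here's a purely hypothetical sketch of how a page might ask for a preloaded runtime by name and major version, falling back to downloading its own copy. navigator.preloadedRuntimes is invented for illustration; nothing like it exists today:

// Purely hypothetical: no browser exposes a "preloaded runtime" registry.
// The idea: ask for a runtime by name and major version, and fall back to
// downloading your own copy if the browser doesn't preload a compatible one.

async function loadRuntime(name: string, major: number, fallbackUrl: string): Promise<WebAssembly.Module> {
  const registry = (navigator as any).preloadedRuntimes; // invented for illustration
  if (registry) {
    const preloaded = await registry.get(name, major); // e.g. get("python", 3)
    if (preloaded) return preloaded;
  }
  // Too old, too obscure, or dropped from the preload set: fetch it ourselves.
  return WebAssembly.compileStreaming(fetch(fallbackUrl));
}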
This is definitely the kind of thing where there's no answer that is technically ideal and people are going to argue a lot about it. But I think if we want to have the web platform really support more than just JavaScript, we need to figure this out.
I think a better model would be for the site itself to provide the modules, but the browser will hash and cache them for the next site that may want to use the same module.
This way, there's no central authority that determines what is common enough.
This model does not allow for versioning, and it would be risky to allow it: one website could provide a malicious module that then infects the next site you visit.
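Something like this sketch, where the cache key is simply a hash of the module bytes, so two sites share an entry only if they ship byte-identical modules. This is illustrative only; the real bookkeeping would live inside the browser, not in page script:

// Sketch of a content-addressed module cache (illustrative; the real
// bookkeeping would live inside the browser, not in page script).
// The key is a hash of the module bytes, so two sites share an entry only if
// they ship byte-identical modules, and no site can poison what another gets.

const moduleCache = new Map<string, WebAssembly.Module>();

async function contentHash(bytes: ArrayBuffer): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest), b => b.toString(16).padStart(2, "0")).join("");
}

async function loadModule(url: string): Promise<WebAssembly.Module> {
  const bytes = await (await fetch(url)).arrayBuffer();
  const key = await contentHash(bytes);
  const hit = moduleCache.get(key);
  if (hit) return hit; // some site already shipped these exact bytes
  const compiled = await WebAssembly.compile(bytes);
  moduleCache.set(key, compiled);
  return compiled;
}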
> Like, what if you only use the shared cache after N different web sites have requested the same module?
That would still let websites perform timing attacks to deanonymise people. There's no way to verify that "N different websites" isn't just the same website with N different names.
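For illustration, the probe being described is just timing a fetch and guessing from a fast response that some other site already populated the shared cache (the 5 ms threshold below is arbitrary):

// If a shared cache existed, a page could time a fetch: a suspiciously fast
// response suggests the resource was already cached, which would reveal that
// the user visited some other site using the same module.

async function probablyCached(url: string): Promise<boolean> {
  const start = performance.now();
  await fetch(url, { cache: "force-cache" });
  const elapsed = performance.now() - start;
  return elapsed < 5; // arbitrary threshold in ms, purely for illustration
}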
Though, we could promote certain domains as CDNs, exempt from the no-shared-cache rules: so long as we added an artificial delay matching how long the resource "would have" taken to download, that'd be just as safe. We already ship browser-maintained lists of special domains (the HSTS preload list), so why not a list of CDNs?
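A rough sketch of that artificial-delay idea, using the Cache API only to illustrate behaviour that would really be implemented inside the browser; estimatedDownloadMs stands in for however the browser would estimate the time the download "would have" taken:

// Serve an exempted CDN resource from the shared cache, but make the hit
// look as slow as a real download, so timing reveals nothing.

async function serveWithArtificialDelay(url: string, estimatedDownloadMs: number): Promise<Response> {
  const cdnCache = await caches.open("cdn-exempt-shared");
  const hit = await cdnCache.match(url);
  if (hit) {
    await new Promise(resolve => setTimeout(resolve, estimatedDownloadMs));
    return hit;
  }
  const response = await fetch(url);
  await cdnCache.put(url, response.clone());
  return response;
}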
Web browser developers seem to labour under the assumption that anyone will use the HTML5 features they've so lovingly hand-crafted. Who wants something as complicated as:
<details>
<summary>Eat me</summary>
<p>Lorem ipsum and so on and so forth…</p>
</details>
when they could write:

<Accordion>
<AccordionSummary id="panel-header" aria-controls="panel-content">
Eat me
</AccordionSummary>
<AccordionDetails>
Lorem ipsum and so on and so forth…
</AccordionDetails>
</Accordion>
Maybe the problem isn't the libraries. Maybe the problem is us.
The problem is the libraries. Browsers are still mostly incapable of delivering usable, workable building blocks, especially in the realm of UI. https://open-ui.org/ is a good start, but it will be a while before we see major payoffs.
Another reason is that the DOM is horrendously bad at building anything UI-related. Laying out static text and images? Sure, barely. Providing actual building blocks for a UI? Emphatically no.
And that's the reason why devs keep reinventing controls. Because while details/summary is good, it's extremely limited, does not provide all the needed features, and is impossible to properly extend.