I don't, actually. Wasm is immature in the browser, too, for the same reasons. Writing a client-side web app entirely in Wasm generally doesn't work well today because you'll force the client to download much more code before the page can run. Lots of work is being done to improve this, like adding built-in garbage collection to the Wasm runtime so that apps written in GC'd languages don't need to ship their own. Dynamic linking will help too, assuming that language runtimes are allowed to be cached across different sites.
In both environments there are certainly use cases where Wasm provides huge advantages. But those use cases are still narrow. Over time it'll grow but there's still tons of work to do.
Yes, if you want to write a client-side webapp you run into limitations. That wasn't one of our main goals when we created wasm, though! It would be great if that materializes - more options are always good - but JavaScript is frankly the right tool for 99% of sites and we never intended wasm to directly compete with JS there.
Wasm is stable and mature for solving the needs of sites like Google Earth, Unity games, Figma, Meet, Zoom, etc. Those require more than what JS can offer and wasm is the perfect fit for the relevant parts of them.
On those websites wasm is often the difference between shipping and not shipping. That's a huge deal, and it's why wasm has been focused there. Other use cases, like replacing JS with wasm, might offer some benefits in speed, but the impact would be smaller -- though it could eventually apply to a wider set of sites.
Right. Same for server. You can do most of what you want to do in JavaScript and it'll be fine. Use Wasm to fill in the gaps where JavaScript doesn't work. That's a good strategy today with Cloudflare Workers, too.
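For concreteness, the "fill the gaps" pattern usually looks like this: the app stays in JS and one hot routine gets compiled to Wasm and exported. A minimal sketch in C, assuming an Emscripten-style build (the function name and the kernel itself are made up for illustration):

    // One hot kernel compiled to Wasm, called from an otherwise plain-JS app.
    #include <stdint.h>
    #include <emscripten/emscripten.h>

    // EMSCRIPTEN_KEEPALIVE keeps the symbol exported so the JS side can call it.
    EMSCRIPTEN_KEEPALIVE
    void darken(uint8_t *pixels, int32_t len) {
        // tight per-pixel loop of the kind that tends to churn the JS GC
        for (int32_t i = 0; i < len; i++) {
            pixels[i] = (uint8_t)(pixels[i] >> 1);
        }
    }

On the JS side it's just WebAssembly.instantiateStreaming (or the Emscripten-generated glue) plus copying the pixels into the module's linear memory; everything else in the app stays plain JavaScript.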
It's when people want to write entire apps in their language of choice, and want to accomplish this with Wasm, that the technology is still missing pieces. A lot of people want to do this, on both the browser side and the server side.
I agree that's a common desire, but how likely is it that actually becomes feasible? I think programmers underestimate the degree to which languages and VMs are coupled.
Adding GC to WASM makes it essentially like the JVM, because it has to know about the layout of every type (to find pointers, etc.). As far as I can see, this effort is like bolting a VM that's 2x-5x as big (in terms of semantics) on top of the existing small WASM VM.
I think they will end up with something like the union of JVM and CLR [1], and even that's not enough.
JS already has garbage collection, but its runtime data types can't really host something like Java or OCaml efficiently.
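To make the layout point concrete: "knowing every type to find the pointers" means the collector needs something like a per-type pointer map. A sketch in C with made-up names (this is not the actual Wasm GC proposal's representation):

    #include <stddef.h>

    struct Node {
        int          value;  // not a pointer: the collector must skip this word
        struct Node *next;   // pointer: the collector must trace through this
    };

    // The kind of per-type metadata a tracing GC has to carry around.
    struct TypeDescriptor {
        size_t size;               // object size in bytes
        size_t pointer_count;      // how many fields hold references
        size_t pointer_offsets[4]; // byte offsets of those fields
    };

    static const struct TypeDescriptor node_desc = {
        .size = sizeof(struct Node),
        .pointer_count = 1,
        .pointer_offsets = { offsetof(struct Node, next) },
    };

A JS engine already carries metadata like this for its own object model, but that model (property maps, boxed values, prototype chains) isn't the layout a Java or OCaml compiler wants, which is the mismatch described above.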
----
The CLR is supposedly language-agnostic, but I'd argue it's not. Visual Basic was "broken" for this reason -- VB.NET is more like C# than VB6. The old code doesn't run.
I've heard PowerShell described as a weird shell-like syntax for writing C# programs.
And I remember F#'s behavior around null, algebraic data types, and exceptions was heavily influenced by the CLR. In some ways it's probably closer to C# than to its primary influence, OCaml.
So while I don't know anything about the WASM GC effort (and haven't kept up with it), I'm skeptical that we'll get a true polyglot experience. What's more likely is that some languages will be favored over others, with the "losers" experiencing 2x-10x slowdowns.
And this doesn't even get into the runtime library issues. For example, PyPy has been essentially fully compatible with CPython at the language level for over a decade. Still, many applications have difficulty migrating to it because they lose bindings to native libraries -- linear algebra with NumPy, OpenGL, Win32 bindings, etc. -- and those ecosystems are enormous.
I expect the analogous issue to be a big problem for using WASM in a polyglot fashion too.
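The bindings problem is visible in the shape of a native extension. Here's a minimal CPython C-API module (module and function names are illustrative): anything built this way -- NumPy's core, GL and Win32 wrappers -- is written against CPython's object representation and reference counting, which an alternative runtime can only emulate at a cost.

    #define PY_SSIZE_T_CLEAN
    #include <Python.h>

    // Toy extension function: multiplies two Python floats in C.
    static PyObject *dot(PyObject *self, PyObject *args) {
        double a, b;
        if (!PyArg_ParseTuple(args, "dd", &a, &b))  // talks to CPython objects directly
            return NULL;
        return PyFloat_FromDouble(a * b);
    }

    static PyMethodDef methods[] = {
        {"dot", dot, METH_VARARGS, "Multiply two doubles."},
        {NULL, NULL, 0, NULL}
    };

    static struct PyModuleDef moduledef = {
        PyModuleDef_HEAD_INIT, "fastmath", NULL, -1, methods
    };

    PyMODINIT_FUNC PyInit_fastmath(void) {
        return PyModule_Create(&moduledef);
    }

A polyglot Wasm host inherits the same problem once per language: every ecosystem's native bindings assume a particular runtime's ABI.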
----
As a separate issue, WASM is still not up to par with native code in terms of protections around the stack and the heap: https://lobste.rs/s/a9ghhz/maintain_it_with_zig#c_ghawis . Thus it favors Rust over C/C++, since Rust enforces more integrity at compile time.
Real apps need to poke many holes in the VM to get anything done, and each of those holes becomes a more significant attack vector as that happens. WASM follows the principle of least privilege better, but it has regressed in other dimensions (at least if you want to run legacy C code, which was the original use case advertised).
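The linked thread is about failure modes like this one: a textbook overflow that native hardening (stack canaries, guard pages) tends to catch, but that inside Wasm's single linear memory just silently clobbers whatever the compiler placed next to the buffer. Illustrative C; the exact layout depends on the toolchain:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char secret[16] = "admin-token";
        char buf[16];

        // An oversized copy into a 16-byte buffer. Natively this is likely to
        // trip a stack canary or fault; compiled to wasm32 the write stays
        // inside linear memory, so nothing traps and neighbouring data
        // (possibly `secret`) is quietly overwritten.
        strcpy(buf, "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA");

        printf("%s\n", secret);
        return 0;
    }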
The CLR includes instructions for closures, coroutines, and the declaration and manipulation of pointers; the JVM does not.
Another smaller difference is that the CLR was built with instructions for dealing with generic types and for applying parametric specializations on those types at runtime.
The language that fully exposes the CLR's capabilities is C++/CLI, not C#.
In fact, most of the performance improvements since C# 7 have been about surfacing those C++ capabilities in C# while keeping the generated MSIL verifiable. C++/CLI is also able to generate unsafe MSIL sequences.
This is especially relevant since C++/CLI is Windows-only, so one cannot just switch to it in cross-platform .NET.
Secondly, .NET was also an excuse to reboot the VB language, so some QuickBasic quirks were thrown out as well; additionally, over time they have made it less painful to migrate old VB 6 code.
Other than that, you are right, and this is also a reason why platform languages always have an edge over guest languages, even if they aren't as shiny in getting new language features out the door.
Hm very interesting! So what it sounds like is that CLR started out like a JVM-like VM, but now they're adding a WASM-like VM to it :)
That is, WASM is a much more natural target for C++/Rust than anything higher level. It can also express unsafe code, given the issues brought up in the paper, i.e. that it's unaware of heap integrity. Conversely, the JVM/CLR was traditionally better for Java/C#-like languages, and you couldn't run C/C++ on them naturally.
This makes sense, as I've heard some of the more recent C# features are there to recover performance, like value types, slicing, etc.
You got that wrong: the CLR has supported C++ since version 1.0, released in 2002, including the concept of safe and unsafe code.
It is WebAssembly that tends to be "sold" as if it were the first of its kind.
In fact, even the CLR wasn't the first; there were earlier bytecode formats for languages, like EM from the Amsterdam Compiler Kit, and the ones used on IBM and Unisys mainframes/micros.
And around 2003 there was a Swedish startup trying to push a mobile OS that used a VM capable of running J2ME, C, and C++. I can no longer remember the name; some Sony Ericsson models used to have it.