I just serve my ESM modules directly from the filesystem. It works well and builds in literally 0 seconds.
You also don't need a transpiler like Babel if you drop support for IE11. Almost all ES6+ features, such as classes, Proxies, arrow functions, and async/await, are supported by modern browsers and several versions back. As long as you avoid the newest additions, like static and private class fields and ??=, there's nothing to compile, so your "compile" times are instant.
It's not 2015. You don't need a transpiler to write modern JavaScript. As of writing, the latest version of Safari is 14 and the latest version of Chrome is 88. These features fully work even in Safari 13 and Chrome 70, and anything older than that adds up to less than 1% market share.
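To make that concrete, here's a small, purely illustrative snippet that uses only the kinds of features mentioned above, so it runs as-is in evergreen browsers with no build step:

```js
// Illustrative only: classes, arrow functions, async/await, and Proxy all run natively.
class Counter {
  constructor() { this.n = 0; }
  increment() { return ++this.n; }
}

const load = async (url) => {
  const res = await fetch(url);
  const data = await res.json();
  return new Proxy(data, {}); // assumes the response body is a JSON object
};

// Per the comment above, the things to double-check before shipping untranspiled are
// the newest additions: static/private class fields and logical assignment like `??=`.
```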
I would say the primary motivators at this point have shifted from browser compatibility to syntaxes like TypeScript and JSX that will (likely) never be supported natively.
> I would say the primary motivators at this point have shifted from browser compatibility to syntaxes like TypeScript and JSX that will (likely) never be supported natively
I suggest using a single-pass transpiler like Sucrase[0], which doesn't translate down to IE11-compatible syntax. It just loops through the source string once without generating an AST, stripping TypeScript annotations and desugaring JSX, which makes it much faster.
If you additionally need to support IE11, you can use a development build with Sucrase and a production build with a bundler like Snowpack. C/C++ developers have been doing this kind of thing for ages: compiling individual object files during development, and linking them into one big binary for production.
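As a rough sketch of the development-time step (the file path is hypothetical, and this assumes Node's ESM mode so top-level await works), Sucrase's transform() API strips the annotations and desugars JSX in a single pass:

```js
import { readFile } from "fs/promises";
import { transform } from "sucrase";

// Strip TypeScript annotations and desugar JSX without downleveling to IE11 syntax;
// the output is modern ESM that the browser can load directly during development.
const source = await readFile("src/App.tsx", "utf8"); // hypothetical path
const { code } = transform(source, { transforms: ["typescript", "jsx"] });
console.log(code);
```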
Sucrase is often slower than esbuild, since it is single-threaded. The benchmark posted in its README forces esbuild to run single-threaded, and is thus a bit misleading.
> 1. How do you handle npm dependencies that don't expose ES modules?
See my comment above.
> 2. How do you handle the performance impact of your browser doing multiple sequential requests because it doesn't know your whole dependency tree?
Put all your dependencies in the HTML file, so the browser fetches them all in parallel. Instead of loading just the root application file and forcing the browser to resolve dependencies one import at a time, declare every module up front:
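A minimal sketch of what that can look like (the file names here are hypothetical, and this is not necessarily the exact markup the site uses):

```html
<!-- Every module is declared up front, so the browser starts all the fetches
     immediately instead of discovering imports one level at a time. -->
<link rel="modulepreload" href="/js/util.js">
<link rel="modulepreload" href="/js/rendering.js">
<link rel="modulepreload" href="/js/networking.js">
<script type="module" src="/js/main.js"></script>
```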
You can see a demonstration of this technique on my production site, a complex JavaScript SPA that loads everything (including dynamic content) in less than one second: http://vnav.io
For a small performance boost, you can load smaller files first via <link rel=preload>.
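For a module script that hint would look roughly like this (the path is made up). Note the crossorigin attribute: module scripts are fetched in CORS mode, so the preload has to match to be reused. Where supported, rel=modulepreload is the more specific variant.

```html
<link rel="preload" as="script" crossorigin href="/js/colors.js">
```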
> Put all your dependencies in the HTML file, so it fetches them all in parallel.
> ...
> You can see a demonstration of this technique on my production site, a complex JavaScript SPA that loads everything (including dynamic content) in less than one second
But it doesn't work like this in practice! There is a limit to the number of concurrent connections a browser will make to a single host; Chrome's, for example, is <=10 IIRC. You can see this in the waterfall:
`rendering.js`, the last `.js` in the queue, stalled for 300ms.
Additionally, each round trip is dependent on:
- The user's latency
- Your server's response time
So with each connection there is overhead. You would be much better served concatenating these files.
`util.js` took 510ms, of which 288ms was spent stalled (i.e. waiting for a connection), after which it spent another 220ms waiting on the time-to-first-byte.
Furthermore, Lighthouse gives your page a performance score of 60, which isn't great. Key metrics:
- 1.7 seconds to paint (which is okay)
- 7.7 seconds to interactive (which is terrible)
Finally, why on earth are you redirecting to HTTP after serving an HTTPS response? This makes your page load even slower.
> But it doesn't work like this in practice! There is a limit to the number of concurrent connections a browser will make to a single host; Chrome's, for example, is <=10 IIRC. You can see this in the waterfall:
In HTTP/2, which our site will switch to soon, the concurrent limit is 100. The world's changed, change with it, and throw away those bundlers.
Mixed content: HTTPS pages cannot connect to HTTP resources, and the game needs to scale dynamically without expensive load balancers in front. Until I get dynamic DNS and HTTPS working for those backends, the site will not have HTTPS.
So your alternative to build systems has absolutely none of the benefits that build systems do in fact bring, nor access to the ecosystem of a language that you might consider terrible but that actually has a decent amount of quality reusable code. But this works for your very specific use case?
Alas, many of us still need to support IE11, and we also get a lot of value out of TypeScript, which needs to be transpiled. <1 second with esbuild is pretty good!
In development, use an index.html that loads your code directly from the filesystem, with a single-pass annotation remover such as https://github.com/alangpierce/sucrase. A single-pass transpiler doesn't build an AST; it just loops over the source once, which makes it extremely fast for this kind of per-file work.
You can annotate your code with JSDoc comments and get the full value of TypeScript while just writing JavaScript without any transpilation step[1]. It won’t help you with IE11 support though.
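A small sketch of what that looks like (the function is made up): with a // @ts-check pragma, editors and `tsc --noEmit --allowJs --checkJs` will type-check the plain .js file from the JSDoc annotations, and there's still nothing to transpile.

```js
// @ts-check

/**
 * @param {number} price
 * @param {number} taxRate e.g. 0.2 for 20%
 * @returns {number}
 */
export function withTax(price, taxRate) {
  return price * (1 + taxRate);
}

// The type checker flags this call (string passed where a number is expected),
// even though the file is plain JavaScript with no build step.
withTax("10", 0.2);
```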
That's a helpful feature, but it's by no means a replacement for TypeScript. You do not get the full value with just inline comments; in fact, that usage is extremely limited.
I generally avoid third-party JavaScript modules for performance and maintainability reasons. When I do use them, I'll frequently vendorize them (put them in my source tree) and make ESM modules for them.
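As one hedged illustration of that vendoring approach (the library and file names are invented): keep the library's plain browser build in the source tree, load it with a classic script tag, and add a thin ESM wrapper that re-exports the global it defines.

```js
// vendor/chart-lib.esm.js (hypothetical)
// index.html loads the vendored classic build first:
//   <script src="/vendor/chart-lib.min.js"></script>
// which defines `window.ChartLib`; this wrapper re-exports it as an ES module
// so application code can simply `import ChartLib from "/vendor/chart-lib.esm.js"`.
const ChartLib = globalThis.ChartLib;
export default ChartLib;
export { ChartLib };
```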
Right, but that is basically saying we should all go back to something similar to before npm/bower/etc.
I love using plain ESM and I prefer to pull in stuff that only relies on that. But when I need graphing on a single page I'm not going to try to vendor Vega and I'm not going to pull it in on all pages. When I need xls parsing I don't want to vendor xls.js and maintain the diff myself.
So I have to use something like snowpack to make it work with my ESM system and dynamic imports.
If you don't have to require heavy libraries for certain pages then that's great, and I encourage using pure ESM without build systems for that, but you should also recognize that not all use cases are ready for it.
Currently I use snowpack to handle dependencies but still run all my own code unbundled and unminified.
> Right, but that is basically saying we should all go back to something similar to before npm/bower/etc.
That position isn't without good reason. Refer, for example, to the linked post explaining the engineering decisions behind esbuild. ("It's Golang, not JS" is only part of the answer to "Why is esbuild fast?" The implications seem to be lost on many of esbuild's users, given the nature of esbuild itself and the job it's supposed to do.)
So you’re the one writing everything from scratch even though the wheel has already been invented and perfected and tested over years by many developers.
Also your website is serving dozens of cascading files instead of a single minified one.
Bundlers aren't the "wheels" here. The real "wheels" are ESM modules, which have been available in every modern browser for a while. Bundlers are just pre-wheel, prehistoric, stopgap technology from the 2010s that we should be moving away from.
And there's no problem serving multiple files in 2020 because we have HTTP/2 multiplexing now. In fact, it's probably more efficient than bundling, because you can cache much more granularly. Minification is also virtually unnecessary with Brotli, and not minifying has the added benefit of making the debugging experience much better.
And bundlers haven't been "perfected". Not even close. Webpack is without a doubt the worst piece of technology in my stack right now, together with Babel. Those two are terrible by themselves, but they also manage to "infect" other things elsewhere in my stack: for example, ESLint and Jest need Webpack/Babel plugins to work.
Basically everything you wrote is wrong. Loading multiple files is never more efficient, because compression then works one file at a time, and subdependencies only start loading after their parents finish downloading. This is CSS's @import cascade all over again.
The wheels I'm talking about are the actual modules on npm, used in production by millions daily. Everyone here can write some code that does something common like, say, left-pad, and then screw it up in some obvious way that was already fixed in 2014 by the third user of that library.
Sure I agree that having no build is nice, but user performance is not comparable and you’re limited to your own code.
You seem to be still holding a lot of assumptions from 2010.
In practice compression is not a big issue if you use something with a pre-defined dictionary specialized for HTML/CSS/JS, which is what Brotli ships with. The advantages of bundling are not that large: sure, you might not be deduplicating some identifiers, but the lion's share of code in practice is ordinary JavaScript/CSS/HTML.
Also, in the long term, better caching and not shipping the runtime code that bundlers add more than make up for any compression losses.
Also, with multiplexing there's no problem downloading multiple files, because you don't need a TCP handshake and TLS negotiation per file, which were the reasons for bundling back in 2010. In fact, serving multiple files can even be faster in some cases, because the browser can parallelise downloading and parsing rather than having to fetch one big bundle before it can parse any of it.
There are of course caveats to this, but neither bundling nor separate assets is a silver bullet.
If you're using straight ESM then you don't need any build tool a lot of the time. If I need a 3rd party module, I either include it as a global using an ES5 build, or use https://www.skypack.dev/, which serves ESM-ready builds of most npm packages that you can import directly in your code.
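For example (the package is just an illustration), importing straight from the Skypack CDN looks like this:

```js
// No local install or build step: the browser fetches an ESM build from Skypack.
import confetti from "https://cdn.skypack.dev/canvas-confetti";

confetti(); // fire the default confetti burst
```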
This works IF you write 100% vanilla JavaScript. In larger projects a vanilla JavaScript approach usually falls short; I can't see a large project being written without types in 2021. So you still need a build step, whether you are using TypeScript, Flow, or some other typed flavour.
> You also don't need a transpiler like Babel if you drop support for IE11
I really dislike this line of thought: Babel isn't just a polyfill tool. It's an incredible way to enhance your code. Further, JavaScript itself isn't static and shouldn't be; new syntax keeps arriving.