"There is a good chance that upgrading fails and you would need to give it a second or 3rd try."
Sad that this has almost become the norm when developing in the modern JavaScript ecosystem. I dread touching those projects, and creating one even more, because stuff just rots away and your app might break in days, weeks or, if you are lucky, months. I'm sure there are better developers out there who can handle all of this and know how to avoid it, but a bozo like me does not. This is actually causing me stress IRL.
I was recently convinced to start using a packer (webpack, Parcel) and was blown away by the hoops people have to jump through to make stuff work. I've been spending an hour or so a day coming up to speed for the past few weeks, and every yarn/npm package I installed had some kind of error or warning that required a hack or a version contortion. I naively thought I could webpack electron+vue+pug+ts and was soundly smacked down repeatedly, even after trying the boilerplate from each component's docs.
I appreciate what the developers are trying to do, I really do. Only, it is too many cooks + death by a thousand cuts. As I read through pages of closed-but-not-really GitHub issues for each package, sometimes going back years and years, it feels like a constant stream of hacking that erodes the well-intentioned first versions.
> but a bozo like me does not. This is actually causing me stress IRL.
Me too! I dread package upgrades because it can instantly turn into an all-hands-on-deck emergency, and these are just the stand-alone packages, not all the ones I mentioned above.
To me what’s interesting is that esbuild (a Go-based JavaScript bundler) has come out of nowhere in a relatively short period of time, and it’s basically already better than any of the existing JS bundlers. Not just faster: it also has a radically simpler interface.
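To give a flavour of that simpler interface, here is roughly what an entire build script looks like with esbuild's JS API (entry and output names are illustrative):

    // build.js: bundle and minify in one call; the CLI takes the same flags
    require("esbuild").build({
      entryPoints: ["src/app.js"],
      bundle: true,
      minify: true,
      outfile: "dist/out.js",
    }).catch(() => process.exit(1));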
I have written JS since 1997 and kept up with ES6 and so on over the years. I've used it on the frontend until now.
The JavaScript ecosystem just isn't very productive [1], for multiple reasons. The language is dynamically and weakly typed. You have no real multi-threading (workers only pass messages). It's hard to debug. You have to spend many brain cycles learning browser problems, which muddles your understanding of the language itself. And the web way of "we can fix it and everyone will reload the browser" leads to a "don't care" attitude.
For web/backend this is approachable, though still not good.
For build systems this is terrible. Why? Because if a bug shows up in a build system, you stop a million developers from being productive. The bug won't get fixed right away; most people will just work around it, use different plugins, copy/paste yet another webpack config, etc.
Plus, JavaScript tends to make people realize what FP is, and then they all start putting functions in their functions and making everything modular and plugin-able (see higher-order components in React). While this can be nice, it leads to even more confusing bugs.
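For anyone unfamiliar, a higher-order component is just a function that takes a component and returns a new one wrapping it. A minimal sketch (names are illustrative):

    const React = require("react");

    // withLogging: wraps any component and logs its props on each render
    function withLogging(WrappedComponent) {
      return function Logged(props) {
        console.log("rendering", WrappedComponent.name, "with", props);
        return React.createElement(WrappedComponent, props);
      };
    }

    // usage: const LoggedButton = withLogging(Button);

Each wrapper like this is another layer a stack trace has to climb through, which is where the confusing bugs come from.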
Writing plugin architectures is hard in any language. Writing one in JavaScript is a recipe for disaster.
Hence the modern JS ecosystem.
[1]
Google: nullpointerexception -> 4,360,000 results
Google: undefined is not a function -> 800,000,000 results
> Google: undefined is not a function -> 800,000,000 results
You didn't put quotation marks around the second, and the first has no spaces, so this is not a fair comparison. "undefined is not a function" with the quotation marks has 449,000 results.
I don't doubt that there are many results for "undefined is not a function", but I feel quite confident saying there aren't anywhere near 449,000. If you ask Google for just 111 of them, you'll get "In order to show you the most relevant results, we have omitted some entries very similar to the 110 already displayed."
Please, please, please do not use Google's result counts as an indication of really anything at all. Don't take my word for it; do this test: search something, like the name of someone you know who isn't famous. You'll find that Google will report tens of thousands (or millions! lol) of results. But click through 10 or 15 pages. You'll soon find that many of the pages have almost nothing at all to do with what you searched (and may not even contain any word from the phrase!), but if you get far enough, maybe 300 results or so in, you'll see that when Google runs out of results it will magically just update to show the real number, which is often just a few dozen for a search like the one I described.
That makes perfect sense, though. Older projects have a ton of baggage that they carry through all of the revisions. Newcomers to the space can learn from the mistakes of previous approaches and also use the latest and greatest paradigms to solve the problem without worrying about backwards compatibility.
This is categorically not true. It does exactly what it's supposed to do. Yes, it is best that potential users wait for now, but that isn't because it "doesn't do anything".
What it _doesn't_ do currently is provide an API which would allow it to be extended with plugins in a similar way to Webpack (not exactly the same, since the deliberate aim is not to cover every use case the way Webpack does). This is a requirement for wider adoption as a competitor (rather than just as the module bundling step). Plus the tree shaking needs work, though I think that is less important than has sometimes been made out.
Parcel may have a simple API for simple things, but it's extremely slow and generally suffers from exactly the same issues outlined by OP.
> Me too! I dread package upgrades because it can instantly turn into an all-hands-on-deck emergency, and these are just the stand-alone packages, not all the ones I mentioned above.
To all package manager developers: make sure you have a rollback function, and make sure it is flawless.
Committing the package-lock.json file should do this for you: I always upgrade one dependency per commit and put code changes in separate commits from upgrades. Also, you just have to learn which packages to avoid: react-router, for example, likes incompatible major version updates. Something simple and stable like routedux is almost always a better choice.
With all the different package managers out there, it's just not much fun to keep track of all the rules and special cases for each tool. Better to just have a "--rollback" option.
Isn’t this solved more generically by version control? As long as your package manager uses files to control which packages are used, you commit those files between upgrades and revert them to roll back. Nix handles this at the system level by making your system configuration match a declarative specification.
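Concretely, for changes you haven't committed yet, the rollback is two commands (wrapped in Node here just for illustration; `npm ci` reinstalls exactly what the lockfile says, and a committed upgrade would be a `git revert` instead):

    const { execSync } = require("child_process");

    // restore the manifest and lockfile from the last commit, then reinstall
    execSync("git checkout -- package.json package-lock.json", { stdio: "inherit" });
    execSync("npm ci", { stdio: "inherit" });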
Would be interested to hear what issues you ran into with Parcel. I've used it on dozens of projects without doing any sort of deep dive into their pipeline, just using the default no-config compilation, and it's worked pretty much perfectly every time.
Sorry, my post was misleading: I was convinced to use *a* packer, so webpack, Parcel and FuseBox, in that order. Webpack had the best documentation hands-down, which mattered because 4 weeks ago I didn't know what a packer did; I had been purposefully avoiding them for years. Now that I know way more about the concept/theory, I'll go down the list, although I'm reluctant to waste too much time, because my experience trying to bring all of the frameworks I already use into webpack was so painful. But I really am pulled in by hot module replacement...
Strange, because Parcel in particular is zero-configuration, and I have used it to build client projects running in production. No complicated webpack configuration needed.
Parcel is a rare thing in the JS world: a tool I’ve actually found useful and efficient. As a bundler, it does indeed do a lot of things correctly out of the box, and as a direct result the experience of using it is far more pleasant than most such tools.
However, it is also a tool for one job: bundling. If you also want to do things like unit testing or running any sort of linter or style checker over your code, you probably need to figure out how to get those tools to read your TS or ES20xx code as well. You might still need a handful of additional libraries to implement your test suite. You might still need to configure your preferred style and which rules you want to enforce in your static analyser(s).
So even if Parcel was flawless about installing what you needed for your dev and production builds automatically, you’d still end up needing to install a bunch of other tools and write a bunch of other configuration files. The overall efficiency of setting up a realistic development environment, even using Parcel, is still limited by a kind of Amdahl’s Law problem.
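As a concrete example of that extra wiring: even with Parcel handling the bundling, getting Jest to read TypeScript sources still means something like this (a sketch assuming the ts-jest preset, which is one of several options):

    // jest.config.js: not something Parcel can set up for you
    module.exports = {
      preset: "ts-jest",
      testEnvironment: "node",
    };

And that is before you have written a single test or lint rule.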
So far what I've been doing is keeping my webpack config as simple as possible (< 40 lines) and just relying on default behavior. I know I'm losing out by not bothering with some of the more intricate configuration properties like caching, etc.
But one of my big pet peeves is when you clone a project and the installation instructions say all you need to do is run `make install` or `npm i`, but in reality it requires 20 Google searches and an hour of banging your head to get the project running, and even then you end up with 50 cryptic warning messages in your terminal, so you don't even know if you did it correctly.
With a very simple webpack config you might lose out on optimizations but at least you can get just about anyone running a project locally and if things go awry you can generally pinpoint the problem to a specific line of configuration.
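For illustration, a config in the spirit of what I mean might be as small as this (paths and loader choice are just examples, not a recommendation):

    // webpack.config.js: defaults everywhere, one explicit loader
    const path = require("path");

    module.exports = {
      entry: "./src/index.js",
      output: {
        path: path.resolve(__dirname, "dist"),
        filename: "bundle.js",
      },
      module: {
        rules: [
          { test: /\.js$/, exclude: /node_modules/, use: "babel-loader" },
        ],
      },
    };

If a build breaks, there are only a dozen lines it can be hiding in.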
I've found that a slightly longer (~100 lines) webpack config allows me to use all the caching and fancy tricks I want and is still just as debuggable. I think the main thing is setting it up yourself so you know what all the plugins/settings do.
Caching was a major disappointment for me. DllPlugin is a nightmare to set up, and the other caching plugins I tried don't really speed things up. What eventually worked for me is HardSourceWebpackPlugin [1]: two lines added, and builds went from 2 min to 15 s (with the trade-off of a slightly longer initial build).
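For reference, the two lines are essentially these (following the plugin's README; the rest of the config stays as it was):

    const HardSourceWebpackPlugin = require("hard-source-webpack-plugin");

    module.exports = {
      // ...existing entry/output/loaders unchanged...
      plugins: [new HardSourceWebpackPlugin()],
    };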
I got the same message. Still, getting faster builds from one additional plugin, with no configuration needed, on projects locked on webpack 4 or waiting out the transition is a plus in my book :).
Webpack is more of a "build your own bundler" kit than a bundler. The configurability and extensibility are immense, and with that level of complexity it's guaranteed that things won't work smoothly for everyone (incompatible plugins, unexpected configurations, etc.). Anyway, every big project has hiccups with a new major X.0.0 release.
Think about your own code for a second. It's typically well-defined code that doesn't have to accommodate a combinatorial explosion of configurations. Is it bug-free?
Keep in mind that 90% of the code is written by one guy. It's not Webpack Corp. with a legion of QA testers.
"Run it three times", however, is not part of such ordinary bugginess. It means they're randomly encountering different executions/run-states on what should be the same config/setup each time they run it, and they have no idea why.
Or they're running things in parallel, and encountering data races and such, and randomly working as a result.
But the real concern is that they don't really know what a correct installation looks like, or they'd just check-and-retry in a loop until it succeeded (the easiest hack around such problems).
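Such a check-and-retry loop is trivial to write, which is exactly why its absence suggests they can't verify success. A Node sketch (the `npm ci` step and the attempt count are illustrative):

    const { execSync } = require("child_process");

    for (let attempt = 1; attempt <= 3; attempt++) {
      try {
        execSync("npm ci", { stdio: "inherit" });
        break; // install succeeded, stop retrying
      } catch (err) {
        console.error(`install failed (attempt ${attempt})`);
        if (attempt === 3) throw err; // give up after the last attempt
      }
    }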
> "Run it three times", however, is not part of such ordinary bugginess. It means they're randomly encountering different executions/run-states on what should be the same config/setup each time they run it, and they have no idea why.
Maybe we’ve interpreted that message in different ways, but I took it to mean you may need to try again at a later date, once the community has fixed those issues/bugs. I don’t think they meant literally 3 runs of the same setup.
They’re still working through stuff so there may be edge cases for some where 5 won’t work right now.
I think it’s more that if 5.0.0 doesn’t work the first time, you could simply let the community find those issues for you, and by 5.0.xx it will most likely cover your specific setup (except for the documented API changes).
Configuring the modern JS toolkit (webpack, Babel, your framework of choice, Jest, etc) is such a pain. I’ve been doing front-end for 10+ years. You’re probably not a bozo.
Deno does away with most of the tooling configuration. You get a linter, bundler, docs generator, formatter, watcher, version manager, standard library, built-in test runner, and many other things that you would otherwise source from third parties in the Node ecosystem.
Support for WebGPU and local storage is incoming. It makes Deno a delight for writing scripts. You can also scope scripts by permission.
I wrote a CLI tool to try it out. It monitors CPU/GPU temps then generates an HTML chart on SIGINT.
I like top-level await, the standard library and first-class typescript support.
I do wish the sandboxing were more granular. My small CLI tool requires --allow-read, --allow-write, --allow-run and --unstable, when I only need read/write in a single directory and to run a single binary. Unstable is required for signal handling, but that shouldn't be the case forever.
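For what it's worth, the read/write flags do accept path values as far as I can tell, so the scoping I want looks something like this (file names are illustrative; a trimmed sketch of the tool's I/O):

    // monitor.js, run with:
    //   deno run --allow-read=./logs --allow-write=./logs monitor.js
    const csv = await Deno.readTextFile("./logs/temps.csv");
    const html = `<pre>${csv.trim()}</pre>`;
    await Deno.writeTextFile("./logs/chart.html", html);

It's --allow-run and --unstable that I can't narrow down the same way.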
I'm glad someone is re-imagining JS/TS on the back end. A robust and stable standard library could well improve the dependency hell and broken projects issues.
And it will likely have similar issues in five years to the ones Node has now.
Node was once thought to be the cleaner alternative that had a lot of these features built in; it was the supposed savior of JavaScript. And now look at where we are.
I do believe constant iteration is better, but JavaScript has many problems at its very core, and many Node developers transitioning to Deno are going to write solutions that hew close to what they were comfortable with in Node.
This will lend itself to reproducing similar issues as they currently have.
I am primarily a JavaScript developer and I've written both Node and Deno projects, but I don't see these issues being solved by just rewrapping the source output.
I have found that over time, I have become more sceptical about new shiny things in the programming world. Every now and then, I wonder if I’m becoming the dreaded veteran dinosaur type, the developer who has more demands on their time now and who just hasn’t kept up with the tech.
But then I look at all the troubles the kids have with their fragile, short-lived tools. I recognise short-sighted design decisions they’re making because they’re being pushed into them by some shiny new library. I notice that many of those tools and libraries have big names behind them, and that the Zeitgeist people think you’re weird if you’re not using them, and remind myself that these things are only ever means to an end, no matter how popular they become.
Of course it’s useful to keep generally aware of what’s happening in the industry, and from time to time a genuine improvement does come along. However, there is definitely a difference between not keeping up and not wasting your time repeating past mistakes or chasing blind alleys.
Yeah, not-giving-a-damn-driven development is also the only way to stay sane and not be overcome with paralysis from being surrounded by a thousand broken windows in need of fixing.
It's specifically the frontend ecosystem: a problem caused by the numerous languages, assets, and formats that have to come together, the lack of a JS standard library, a poor module system, and the disaster of browser compatibility mixed with transpilers and polyfills.
Webpack has required zero (or very little) config by default for a while now. That's what most people should use, and that's if you even need to use Webpack directly. Otherwise I recommend sticking to the major site frameworks like Next/Nuxt/Gatsby and letting those do all the config wiring for you.
Once I had the freedom to decide, I gave up on using most of the standard JavaScript ecosystem stuff in my front-end projects.
No Node, no node_modules, no build chain. I'm lucky to be able to get away with that. It is incredibly efficient, not least because the tooling no longer gets in the way of what I'm trying to build. On the previous team I was on, we spent roughly 15% of our time tending to these things.
I know the feeling. It's not just modern stuff; I have an old ExtJS app that I dread to update. The best solution I have found is a dev/build environment in a dedicated VM that I can spin up once or twice a year to make updates. Bringing libraries up to date is a fucking nightmare and often a waste of time. For continuously updated apps I have moved away from complicated build processes. Most web apps are fine without a build process; I think there is a lot of premature optimization going on.
I too maintain an old and complex ExtJS 4.1 app, and I never dared upgrade the framework. I carry a big zip around, and it will stay that way until the end of its life or until I finally decide to rewrite it in React. For such an old framework it’s surprisingly bug-free, though.
The way you avoid problems with a .0.0 release is by not upgrading to a .0.0 release, and you would have to do that by hand: npm and yarn have been pinning major versions for a long time now, so that upgrade only happens when you trigger it yourself. So if you don't want to deal with breaking changes, all you have to do is nothing. Meanwhile, webpack 4 is not going anywhere.
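This is just semver doing its job: the caret ranges npm and yarn write by default can never match a new major version. A quick check with the `semver` package (version numbers are arbitrary examples):

    const semver = require("semver");

    console.log(semver.satisfies("4.44.0", "^4.0.0")); // true: minors and patches flow in
    console.log(semver.satisfies("5.0.0", "^4.0.0"));  // false: a new major never does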
I'm reminded of when npm released 5.7.0 (which some got upgraded to automatically because it wasn't tagged as a pre-release) and it had a critical bug that clobbered the ownership of system files.
It was a new minor version, of a totally different package, which was version-tagged in such a way that some distro packagers picked it up when they shouldn't have, and which happened to contain a bug affecting a use case that's been deprecated for at least half a decade now.
Look, I get that there's a lot of hate for modern JS, and it's hardly as if there is any lack of basis for criticism, just as there is with every other highly active, heavily adopted, and fast-evolving software ecosystem. But mistaking this kind of uninformed slagging for meaningful commentary says more about the person who does it than about the subject of their ire.
I've been using ASDF (https://github.com/asdf-vm/asdf) for a while now to manage each of the language (and sometimes build tool) versions for a folder/repo in a consistent manner. I'd definitely recommend taking a look at it.
It helps to pin dependencies so that only the exact specified version is downloaded/installed/used.
I recently updated the dependencies of a frontend project. `vue-apollo` had a breaking change from version 3.0.3 to 3.0.4. The discussion in the commit on GitHub proposed releasing it as v4, but in the dev’s mind it was a fix, so he released it as a patch version.
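This is the failure mode that exact pins guard against: any range at all still trusts the maintainer's idea of a "patch". Sketched with the `semver` package (versions from the story above):

    const semver = require("semver");

    console.log(semver.satisfies("3.0.4", "^3.0.3")); // true: the breaking "patch" slips through a default caret range
    console.log(semver.satisfies("3.0.4", "3.0.3"));  // false: an exact pin stays put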
Started my first React Native app some time ago and I can totally confirm that. It feels like a jungle of very fragile dependencies and mechanisms between code and output.
Yes, RN is certainly the worst for that for me. I can't count how many hours I've had to put in to wrangle the RN packager into a monorepo with TypeScript and code-sharing across packages, and it's still unbelievably fragile and I still can't actually share components between them. At this point I've made so many little alterations to the config from GitHub issues and StackOverflow answers that I literally have no idea how it works or how to replicate it in a new project. It's an absolute nightmare.
I frequently have to rerun `npm i` when it fails for seemingly spurious reasons. We used to laugh at Windows years ago, when the solution was so often "turn it off and on again", but that's basically what you constantly need to do with tools like npm and webpack.
Yarn is much better in this regard; I'd thoroughly recommend it. It's drop-in compatible (except it will generate its own lock file), so it's pretty easy to try.
Yarn also gives much cleaner output. npm is extremely verbose, and the verbosity doesn't really confer any benefit. Yarn is also generally much faster.
Hey, Sean from webpack here. This was really just about our third-party plugin ecosystem needing to catch up to v5. If you are feeling IRL stress, then please don't update yet. Sometimes you have to break things to make progress, ship major versions, and bring new features. You don't have to use them; you can just use webpack's zero-config out of the box.
This is why I’m such a fan of the meta-bundlers/frameworks, like create-react-app or Next.js
I like the features that webpack gives me, but knowing webpack and spending the effort on configuring it just seems like a waste. I’m incredibly grateful to the CRA maintainers for handling that for me.
Webpack itself is quite low-level. `react-native init` or `create-react-app` gives you a ready-to-use development environment, and updating that is usually a breeze.
Depends on what packages you use and how well they do semver. Chances are you will get major version bumps for most of your dependencies after 1-2 years of development.
What works well then is to do upgrades piecemeal: each major library separately, fix issues, then move on to the next one. Otherwise it's harder to track what breaks your app.
My current project has 100 dependencies and 150 devDependencies. Upgrading takes a week if not longer for one experienced dev. We tend to do it every 3-4 months.