> ...you're making your code incompatible with anything that requires a higher or lower version of that library.
Actually that's not correct when using node/npm (or anything else in the ecosystem like browserify). That is one of the impressive things about this platform: any number of different versions of the same module can be required by the same app. It would be nuts to do that in your own code, but as long as the craziness is in your dependencies it really doesn't matter.
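Concretely, here's a rough sketch of how node pulls that off (the package names are hypothetical): npm can install a separate copy of a module under each dependency that asks for a different version, and require() resolves against the nearest enclosing node_modules directory, so each library sees only the copy it declared.

    // Hypothetical layout: lib-a pins util-x 1.x, lib-b pins util-x 2.x.
    //
    //   node_modules/lib-a/node_modules/util-x/   <- 1.x, seen only by lib-a
    //   node_modules/lib-b/node_modules/util-x/   <- 2.x, seen only by lib-b
    //
    // require('util-x') inside lib-a starts the search in lib-a's own
    // directory and walks upward, so lib-a gets 1.x and lib-b gets 2.x.
    // You can inspect the search path node uses for the current file:
    console.log(module.paths); // list of node_modules dirs, nearest first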
And that kind of works in a dynamic language. You could make it work in a statically-typed language too, but there the problems become more apparent. If X depends on Y and on Z v1, and Y depends on Z v2, and Y exposes an object created by Z v2 in its public API, the X developers might look at the Y API docs and try to call Z v1 functions on that Z v2 object! Worst of all, it might very well work in the normal cases, and the issue might not be noticed until it's in production.
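A minimal, self-contained sketch of that failure mode (the class names are made up; pretend the two classes are the same class loaded from two installed versions of the same module z):

    // Stand-ins for the same "Thing" class as loaded from z v1 and z v2.
    class ThingV1 { label() { return 'v1'; } }
    class ThingV2 { label() { return 'v2'; } }

    // Y depends on z v2 and hands one of its objects out through its API:
    function makeThingFromY() { return new ThingV2(); }

    const thing = makeThingFromY();

    // X depends on z v1 and checks the object against *its* copy of the class:
    console.log(thing instanceof ThingV1); // false -- a different class object

    // Duck-typed calls may keep working today and quietly break when z v2's
    // internals change, which is why this tends to surface only in production.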
Using multiple versions of the same library is a code smell. It's a stability issue, a security issue, a complexity-breeder, and an absolute nightmare for packagers.
Yeah I'm sure it sucks for distro packagers. Why are they using npm, though? It's not designed for their use case.
Actually though you're just talking about some bugs in X, or possibly some design flaws in Y. Passing [EDIT, because bare objects are fine:] class instances around like that is the real code smell. So much coupling, so little cohesion. We call them "modules" because we like modularity.
It's not that packagers are using npm. It's that they might want to package a node application for their distro's package system, and now they have to sift through thousands of npm packages (already a nightmare). They can't just make a system package for every npm package, not just because that would violate packaging guidelines for any reasonable distro, but because one project can pull in multiple versions of a single package. The Nix and Guix crews can handle that (and that they can is as much of a bug as it is a feature).
There is no clean way of packaging a typical node application.
> Passing class instances around like that is the real code smell.
Often, yes, but not always. Allowing fine-grained control over performance is one good reason that a library might expose class instances from one of its dependencies.
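For what it's worth, a hypothetical sketch of that legitimate case: a wrapper library that deliberately hands back the pool object created by its dependency so callers can tune settings the wrapper doesn't re-expose one by one. Everything here is made up for illustration; Pool stands in for a class that would normally come from the wrapper's own dependency.

    class Pool {
      constructor() { this.max = 10; this.idleMillis = 30000; }
    }

    function createClient() {
      const pool = new Pool();
      return {
        query(sql) { /* runs sql against a connection from the pool */ },
        pool, // exposed on purpose: fine-grained performance control
      };
    }

    const client = createClient();
    client.pool.max = 50;           // tune knobs the wrapper never wrapped
    client.pool.idleMillis = 5000;

The cost is exactly the coupling described upthread: the pool's API effectively becomes part of the wrapper's API.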
Is Node.js itself appropriate for packaging? I think maybe not. It changes really quickly, and has done for some time. Anyone coding in Node installs the particular versions she needs without regard to the distro. Most Node modules are just libraries installed in and for particular projects. There are tools written in node, but for the most part they focus on coding-related tasks that also tie them to particular projects, e.g. beefy or gulp. There's no need to install such tools above the project level, and certainly no reason to install them on the system level.
A distro that still packages python 2 (i.e. all of them) has a particular "velocity", and therefore it has no business packaging Node or anything written in it. Maybe a distro could package TJ's "n" tool (helpfully, that's written in bash rather than node), which would actually be handy for distro users who also use Node, but that's it.
I'm not talking about packaging node libraries for developers. No node developers are going to use system packages to install their libraries. What I mean is packaging applications written in node for end users.
For example, you can install Wordpress on Arch with `pacman -S wordpress' and you'll have a managed wordpress installation in /usr/share/webapps/wordpress. Then you just edit some wordpress config files, set up your http server to serve php from that directory, and you have a wordpress blog.
It would be nice to be able to do the same with Ghost.
Ghost may be a special case. I wasn't familiar with it, but I just attempted to install it in an empty directory without success. The first time I ran "npm i ghost", with node v5.9, it went into an infinite loop creating deeper and deeper ".staging/ghost-abc123/node_modules/" sub-directories of node_modules, which seems an... odd thing to do. After killing that, I noticed that they recommend Node LTS. Fair enough. I ran "sudo n lts", then "npm i ghost" again. This time I didn't have to kill it, because the preinstall script errored out. Based on the log, this script installs semver, then requires a file that can't possibly exist at preinstall time. Both of those are obnoxious, but at least it's possible to install semver.
I'm sure if I look hard enough there are some silly idiosyncratic steps one might take to install this module. Suffice it to say that it's not installing the "npm way", so it's misguided to blame npm for packaging difficulties.
More generally, I can certainly understand distro packagers' refusal to recreate towering pyramids of node dependencies in their own package system. Some lines have to be drawn somewhere, and "major" modules must bundle many of their dependencies when packaged for distros. If module maintainers don't do this, and distro packagers can't, then the modules can't be packaged.
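npm does have a mechanism that points in this direction: the bundledDependencies field in package.json, which tells "npm pack" / "npm publish" to include the listed packages inside the module's own tarball. A sketch with hypothetical names:

    {
      "name": "some-major-module",
      "version": "1.0.0",
      "dependencies": { "util-x": "^2.0.0" },
      "bundledDependencies": ["util-x"]
    }

That at least gives a packager a single, self-contained artifact to review, for whatever that's worth.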
Or you could have all the bugs introduced in everyone's hand-rolled implementations of it. I'll take multiple versions of the library instead. It's much easier to track the issues and to submit patches upstream so those modules update their dependencies later.
> Or you could have all the bugs introduced in everyone's hand-rolled implementations of it.
Only one buggy implementation per project. Compare that to including the same library at a dozen different versions because dependencies have their own dependencies, and you can neither track those versions nor update them.