Wouldn't build size increase a lot if transitive dependencies were pinned to direct dependency lockfiles? Like if library A says "use version 1.0.0 of library X" and library B says "use version 1.0.1 of library X", then you'd likely end up bundling duplicate code in your build.
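Roughly, I'd expect the installed tree to look something like this (names made up), with a bundler then pulling in both copies:

    node_modules/
      X/                 # v1.0.1, the hoisted copy, shared wherever the declared ranges allow
      A/
        node_modules/
          X/             # v1.0.0, a second nested copy kept just for library A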
Not saying the tradeoff isn't worth it, but pinning to dependency lockfiles isn't without downsides.
FWIW, that's not what Go does. In your scenario, a Go binary ends up with a single copy of library X -- the 1.0.1 version. That's because library A is stating "I require at least v1.0.0 of X", and library B is stating "I require at least v1.0.1 of X". The minimal version that satisfies both of those requirements is v1.0.1, and that's what ends up in the binary.
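A rough sketch of how that looks in the go.mod files (the module paths and the versions of A and B here are made up):

    module example.com/a              // a/go.mod
    require example.com/x v1.0.0      // "I require at least v1.0.0 of X"

    module example.com/b              // b/go.mod
    require example.com/x v1.0.1      // "I require at least v1.0.1 of X"

    // the consumer's go.mod after 'go mod tidy': MVS selects v1.0.1 of X,
    // the minimal version that satisfies both requirements
    require (
        example.com/a v1.2.0
        example.com/b v1.3.0
        example.com/x v1.0.1 // indirect
    )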
That behavior is Go's "Minimal Version Selection" or "MVS". There are many longer descriptions out there, but a concise graphical description I saw recently and like is:
That's the default behavior, but a human can ask for other versions. For example, a consumer of A and B could do 'go get X@latest', or edit their own go.mod file to require X v1.2.3, or do 'go get -u ./...' to update all their direct and indirect dependencies, which would include X in this case, etc.
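As a sketch, with a made-up module path, those look like:

    go get example.com/x@latest    # upgrade X to its latest release
    go get example.com/x@v1.2.3    # require a specific version of X
    go get -u ./...                # update all direct and indirect dependencies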
Continuing that example -- in Go you end up with v1.0.1 of X by default even if v1.0.2 is the latest version of X.
That is a difference from many other package managers, which can default to using the latest v1.0.2 of X (even if v1.0.2 was just published) when doing something like installing a command line tool. That default behavior is part of how people installing the 'aws-cdk' tool on a Saturday immediately started experiencing bad behavior due to the deliberate 'colors' npm package sabotage that happened that same Saturday.
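As a sketch (the tool name is made up):

    npm install -g some-cli-tool   # resolves the tool's dependency ranges at install
                                   # time, so a dependency version published minutes
                                   # earlier can be pulled in immediately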
In any event, it's certainly reasonable to debate pros and cons of different approaches. I'm mainly trying to clarify the actual behavior & differences.
What if the requirement was pinned specifically to 1.0.0 in order to avoid a bug introduced in 1.0.1? With a package that also requires a minimum of 1.0.1, that should be an unresolvable set of requirements, and your package manager should fail to produce a lockfile from it.
The practice of dependencies’ dependencies being specified with SemVer version constraints that auto-accept minor or patch changes is the difference compared to Go, and it’s why lockfiles will not always save you in the npm ecosystem. That said, approaches like Yarn zero-install make the installed versions very explicit, because they are distributed with the source. Similarly, the default of using npm install is bad because it will update the lockfile; you have to use npm ci or npm install --ci, both of which are less well known.
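Roughly, the two defaults compare like this:

    npm install    # may rewrite package-lock.json to satisfy the ranges in package.json
    npm ci         # installs exactly what package-lock.json records, and fails if the
                   # lockfile is out of sync with package.json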
So it’s not impossible to fix; it’s just a bad choice of defaults for an ecosystem of packages, with security implications about the same as automatically updating your Go (or JS) dependencies and not checking the changes first as part of code review. Blindly following SemVer to update dependencies is bad from a security perspective, regardless of why or how you’re doing it.
> The difference is the npm ecosystem actively encourages automatically following SemVer because by default it uses a ^ to prefix the version number.
So does Go. In fact, Go only supports the equivalent of ^; there is no way to specify a dependency as '=1.2.3'. That is, whenever two different dependencies use the same dependency at different (semver-compatible) versions, go mod will always download the newer of the two, effectively assuming that the one depending on the older version will also work with the newer one.
The only difference in this respect compared to npm (and perhaps also Cargo or NuGet? I don't know) is that Go will never download a version that is not explicitly specified in some go.mod file, which is indeed a much better policy.
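A rough side-by-side, with a made-up package name, of what the two declarations allow:

    "dependencies": { "x": "^1.0.0" }    // package.json: any 1.x >= 1.0.0, including a
                                         // version published after this range was written

    require example.com/x v1.0.0         // go.mod: also a floor within 1.x, but only
                                         // versions named in some go.mod ever get selected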