Presumably, if your goal is a reproducible build, you just wouldn't do any unconstrained downloading while writing the Dockerfile and building the image. Choosing to use a tool poorly for your requirements isn't a problem with the tool.
The claim the parent was addressing was that Docker helps with reproducibility. It doesn't. Docker does nothing at all in this regard.
If you want a reproducible Docker image, you're on your own. For example, the most common problem is that many build scripts in the wild download things willy-nilly without verifying anything. I've seen NPM package postinstall scripts do the craziest things. Catching all of that is harder than most people realize at first glance, given that tons of build scripts are written in Turing-complete languages. Help from tooling is essential.
When you have to fight the tool to achieve reproducibility, declining to fight isn't "using a tool poorly." It's simply using the tool as is. Especially when the vast majority of Dockerfiles out there happily run something along the lines of `apt install foo`, again without verifying anything.
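To illustrate the fight involved: here's a hedged sketch of what a Dockerfile has to do to pin what it normally leaves floating. The base-image tag, package version, URL, and hash below are all placeholders, not a real build.

```dockerfile
# Pin the base image by digest, not a mutable tag (digest is a placeholder).
FROM debian:bookworm-slim@sha256:<base-image-digest>

# Pin exact package versions instead of whatever apt resolves today
# (version string is a placeholder; old versions routinely vanish from
# the archives, so even this is best-effort without a snapshot mirror).
RUN apt-get update && apt-get install -y --no-install-recommends \
        curl=<pinned-version> \
    && rm -rf /var/lib/apt/lists/*

# Verify any downloaded artifact against a known hash instead of
# trusting the network (hash is a placeholder).
RUN curl -fsSL https://example.com/tool.tar.gz -o /tmp/tool.tar.gz \
    && echo "<expected-sha256>  /tmp/tool.tar.gz" | sha256sum -c -
```

Note that none of this is enforced: forget one `sha256sum -c` or one version pin and the image silently stops being reproducible, which is exactly the point about the tool not helping.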
The Nix package manager. It forces all inputs to a package to be specified precisely. Every file either has to come from the build recipe or have its hash verified. Builds are properly sandboxed and can't freely access the network. Dependencies are completely static, and no "resolution" ever takes place. The tool catches sources of irreproducibility for you.
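As a minimal sketch of what "inputs specified precisely" looks like in practice (the URL, hash, and package name here are placeholders, not a real package):

```nix
{ pkgs ? import <nixpkgs> { } }:

pkgs.stdenv.mkDerivation {
  pname = "example";
  version = "1.0";

  # fetchurl is a fixed-output derivation: the sandbox permits this one
  # network access only because the result's hash is declared up front.
  # If the server ever sends different bytes, the build fails.
  src = pkgs.fetchurl {
    url = "https://example.com/example-1.0.tar.gz";
    sha256 = "<expected-sha256>"; # placeholder
  };

  # The build phases themselves run sandboxed, with no network at all,
  # so a sneaky postinstall-style download simply cannot happen.
}
```

The contrast with the Dockerfile case is that here the verification is mandatory: omit the hash and Nix refuses to fetch, rather than silently producing an unreproducible result.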
Incorrect. Step one of reproducibility is "disable unconstrained downloading from the internet". Docker does the opposite.