So here's my question: are things in a wheel allowed to rely on ANY site-packages? If not, this is spectacular. If so, it's another unfortunate attempt in the long, long struggle.
I'm not sure if this answers your question, but the format includes PKG-INFO, which includes a "Requires" field. How do you envision disallowing dependencies for Python packages working? Would every package have to bundle all of its dependencies? I don't think that would ever happen in Python-land. It would break lots of things, for dubious benefit.
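For anyone who hasn't looked inside one, a minimal PKG-INFO carrying a dependency declaration might look something like this (the name and version spec are invented for illustration):

    Metadata-Version: 1.1
    Name: somepackage
    Version: 1.0
    Summary: An example package
    Requires: lxml (>=2.3)

So the dependency information is there in the metadata; the open question is what installers actually do with it.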
PKG-INFO has included "Requires" since PEP 314, and I think there was actually tool support for it before that. So it might be a gradual process, but we're getting there. In the meantime, if you can, use a modern tool like "pip". If you can't because you're on Windows, it seems like this new "wheel" format is intended to ease construction of "bdist_wininst" or "bdist_msi" distributions from "sdist" distributions, so that should make your life better too.
Only if you want to limit the people who can install your software to those with one of the package managers and configurations you target, and who have root.
To give a specific example of what michaelhoffman wrote, I have multiple Python virtual installations because my full regression suite tests about 12 different installation possibilities, including tests for nice error reporting when a given module isn't installed.
There's no way to do that using the standard package managers, except by the much more complicated solution of having multiple OS instances available.
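Roughly the kind of thing I mean by nice error reporting, as a sketch (lxml is just an example module):

    import sys

    # Try the optional dependency and fail with a friendly message
    # instead of a bare ImportError traceback.
    try:
        import lxml
    except ImportError:
        sys.stderr.write("This feature requires lxml; "
                         "install it (e.g. 'pip install lxml') and retry.\n")
        sys.exit(1)

The suite then just runs this in an environment with the module present and one without it, and checks the message and exit code.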
Leaving dependency management to distros can cause problems when they upgrade a package to a new one that breaks the old API or when they lack a package for a library you need. Dependency isolation is something that Java got right: compile everything into one jar and then deployment is just copying the jar and executing it.
The drawback is that you are responsible for security vulnerabilities in the libraries you install.
> The drawback is that you are responsible for security vulnerabilities in the libraries you install.
That is a much larger problem than most people realize. It makes vulnerable dependencies nearly impossible to deal with.
If an OpenSSL bug comes out, I upgrade OpenSSL on all systems in my enterprise. Everything linked to it is automagically taken care of. If a vulnerable library is buried inside jar files (or whatever the equivalent is in your language of choice), I'm screwed. You'll never find all of those copies, and there is no assurance that your systems are patched.
If I put on my developer or end-user hat, I totally agree with you. But for most software, this is dangerous.
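To give a sense of what "you'll never find those" means in practice, this is roughly the sort of one-off audit script you end up writing, and it still only catches copies that keep a recognizable name (the search root and library name are just examples):

    import os
    import zipfile

    SUSPECT = "commons-httpclient"   # example: library with a known vulnerability
    SEARCH_ROOT = "/srv"             # example: wherever applications get deployed

    for root, dirs, files in os.walk(SEARCH_ROOT):
        for name in files:
            if not name.endswith((".jar", ".war", ".ear")):
                continue
            path = os.path.join(root, name)
            try:
                archive = zipfile.ZipFile(path)
            except (zipfile.BadZipfile, IOError):
                continue
            # Look for a bundled copy by entry name; renamed or
            # repackaged copies won't show up at all.
            if any(SUSPECT in entry for entry in archive.namelist()):
                print("possible bundled copy in %s" % path)
            archive.close()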
> Dependency isolation is something that Java got right: compile everything into one jar and then deployment is just copying the jar and executing it.
So that's why jars are always 100 MB! I don't think the Python community will ever perceive this as the "right" way to distribute libraries. (Actually, I'm not sure even the Java community thinks jars are the right solution for libraries.) You'd probably get more agreement if it were declared the right way to do AMIs or buildout recipes or whatever.
> The drawback is that you are responsible for security vulnerabilities in the libraries you install.
It's one thing to say that the sysadmin is responsible for security, but it's another to make sure they have access to fifty rebuilt binary distributions every time a library upon which those depend has a security update. There's dependency isolation and then there's dependency isolation.
> So that's why jars are always 100 MB! I don't think the Python community will ever perceive this as the "right" way to distribute libraries.
I'm not advocating it as the right way to distribute libraries. It's the right way to deploy applications. Compiling everything into one jar simplifies deployment. It does not simplify public distribution.
> the sysadmin is responsible for security [...]
It just means that whoever is responsible for security (which should be operations _and_ development, IMO) needs a list of the libraries used as part of an application. In Python this is normally stored in a requirements.txt file in the root of the source tree. Java projects have similar things, and so does Haskell.
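As a concrete (illustrative) example, a pip requirements file is just a list of pinned names that can be checked against a vulnerability announcement (names and pins are only examples):

    Django==1.4
    requests==0.13.0
    psycopg2==2.4.5

When a vulnerable release is announced, grepping these files across your projects tells you which applications need to be rebuilt and redeployed.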
> I'm not advocating it as the right way to distribute libraries. It's the right way to deploy applications.
OK, I agree that's a different case, one for which jars may be appropriate whenever you don't have access to an OS-level package manager. But when is that? When using Windows, but not in the enterprise? Surely most would avoid that grim situation. Since I use apt or the equivalent to install and update apps (unless I have a reason to be closer to the current version), I primarily think of Python distributions as providing libraries, although of course they may include applications as well.
> It just means that whoever is responsible for security (which should be operations _and_ development, IMO) needs a list of the libraries used as part of an application.
Of course it's nice when the applications one installs are supported by developers in the same organization, but it's a rare luxury that doesn't obtain for the typical sysadmin. Even in that case, however, aren't you proposing to make a lot of work for yourself that your package manager is happy to do instead?