This seems to be focused on packaging applications, and suggests that wheel is not good enough if you have a dependency on a C library. I am surprised since I thought you could embed a C library in a wheel.
I would like to know: is wheel + PyPI an acceptable way to package and distribute a Python library that includes a C library (.so/.dll)?
I get the impression that big libraries like numpy and tensorflow use this method to pretty good effect. It seems easier than trying to compile the C libraries on the target machine, but is it difficult to achieve multiplatform support this way?
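For what it's worth, the runtime half of "embedding a C library in a wheel" mostly comes down to shipping the .so/.dll inside the package directory and loading it from there, with the build-time half (vendoring and relinking, which tools like auditwheel and delocate handle for projects such as numpy) done before upload. Here is a rough sketch of the runtime side; the `load_bundled` helper and the directory layout are made up for illustration, and it falls back to the system loader so the demo below works without an actual bundled library:

```python
import ctypes
import ctypes.util
from pathlib import Path

def load_bundled(name: str, package_dir: Path) -> ctypes.CDLL:
    """Load a shared library shipped inside the package directory,
    falling back to the system loader. The name and layout here are
    illustrative, not taken from any specific project."""
    if package_dir.is_dir():
        for candidate in sorted(package_dir.glob(f"*{name}*")):
            try:
                return ctypes.CDLL(str(candidate))
            except OSError:
                continue  # not a loadable library; keep looking
    # Fall back to whatever the platform's dynamic linker can find.
    path = ctypes.util.find_library(name)
    if path is None:
        raise OSError(f"could not locate shared library {name!r}")
    return ctypes.CDLL(path)

# Demo: no bundled copy exists here, so this resolves the system
# C math library (libm on Linux/macOS) and calls a function from it.
libm = load_bundled("m", Path("./_bundled_libs"))
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
print(libm.sqrt(9.0))  # -> 3.0
```

The multiplatform pain the parent asks about lives almost entirely in the build step: you need one wheel per platform/ABI tag, which is why projects lean on CI services and tools like cibuildwheel rather than doing it by hand.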
It depends on what you intend to do by including a C library. It’s a good choice if you want to use that library from your Python modules. If you expect users to link against that library directly, wheels would be a poor choice that leads to never-ending headaches (not wheel’s fault, but mainly due to how shared libraries are designed).
> The Conda package system packages both Python packages and C shared libraries and the Python interpreter into Conda packages.
This doesn't sound correct. Conda packages don't actually include the entire Python interpreter in them right?
A prerequisite to installing Conda packages is having Anaconda/Miniconda installed, which brings in the Python interpreter and a given set of packages depending on the version you install. But that doesn't mean each Conda package carries the interpreter with it.