Ah yes, because writing a requirements.txt is complicated, writing a pyproject.toml is complicated, or distributing your app with pyinstaller is complicated.
The Python ecosystem has many options that solve dependency management, and this is what makes the standard evolve (like the PEP about pyproject.toml, which is now a standard). This is healthy, and installing Python deps, even native ones, has never been a problem since the wheel package format was standardized.
Can we stop with these obsolete claims that Python packaging is a mess? The package format has barely changed in the last 20 years, and every solution relies on setuptools and/or pip.
> Ah yes, because writing a requirements.txt is complicated, writing a pyproject.toml is complicated, or distributing your app with pyinstaller is complicated.
Writing it isn't complicated. Getting deterministic behaviour out of it is. (Yes, there's `pip freeze`, which is better than nothing, but that doesn't help if you're actually developing a project: you can freeze your transitive dependencies, but sooner or later you'll want to upgrade something that depends on something you froze, and then you have to unfreeze everything and hope that nothing else breaks.)
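For the sake of illustration, a `pip freeze` snapshot pins every transitive dependency alongside your direct ones (package names and versions here are hypothetical):

```
requests==2.28.1
urllib3==1.26.12          # pulled in by requests, now frozen too
charset-normalizer==2.1.1 # also transitive
```

Upgrading `requests` later means regenerating the whole file and hoping the new transitive set still works together.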
> The Python ecosystem has many options that solve dependency management, and this is what makes the standard evolve (like the PEP about pyproject.toml, which is now a standard).
The Python ecosystem has many options because they all suck. Every few years someone writes a new one that claims to fix the problems, but they still haven't managed to catch up with where Maven was 20 years ago. (Indeed I'd argue they've actually gone backwards in some ways, e.g. including pip in newer versions of Python).
> Can we stop with these obsolete claims that Python packaging is a mess? The package format has barely changed in the last 20 years, and every solution relies on setuptools and/or pip.
The claims are not obsolete, and the fact that things have barely changed in 20 years is a big part of the problem.
As you said, there is `pip freeze`, but also the Pipfile.lock for pipenv and the poetry.lock for poetry.
> but sooner or later you'll want to upgrade something that depends on something you froze, and so then you have to unfreeze everything and hope that nothing else breaks
Same for Rust with its Cargo.lock, same for JS with its package-lock.json or yarn.lock, same for everything that lets you freeze your dependencies.
EDIT: That's also what test suites and CI/CD are for.
> The Python ecosystem has many options because they all suck.
They let me write software, manage dependencies and virtualenvs; there is even pdm, which supports the recent PEP for `__pypackages__` (the equivalent of node_modules) instead of a virtualenv.
Can you clarify how every single one of them sucks?
> they still haven't managed to catch up with where Maven was 20 years ago
What does Maven do to prevent the problem of upgrading frozen dependencies?
> I'd argue they've actually gone backwards in some ways, e.g. including pip in newer versions of Python
Care to clarify?
> What does Maven do to prevent the problem of upgrading frozen dependencies?
You don't need to freeze your dependencies in the first place, because dependency resolution is deterministic. (In the case where you want to upgrade a transitive dependency without upgrading your direct dependency, you specify it explicitly; if you want the latest available version, or the latest available version with the same minor number or whatever, you can run a command to populate your project file with that, but it's an explicit, visible operation.)
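Roughly, that explicit override looks like this in a pom.xml (the artifact and version below are hypothetical examples, not from this thread): a pin in `<dependencyManagement>` wins over whatever your direct dependencies would resolve, and it lives visibly in the project file rather than in a generated lock file.

```xml
<!-- Sketch: explicitly overriding a transitive dependency.
     groupId/artifactId/version are illustrative. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.15.3</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

The "run a command to populate your project file" part is, e.g., the versions-maven-plugin (`mvn versions:use-latest-releases`), which edits the pom in place rather than maintaining a hidden lock file.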
> EDIT: That's also what test suites and CI/CD are for.
That's an admission that the dependency management isn't doing its job.
> Can you clarify how every single one of them suck?
My point is that the huge proliferation of options is not a sign that they're good, but rather the opposite. Ecosystems that have decent dependency management don't feel the need to write a new system every couple of years. Maybe PDM is finally the one that doesn't suck, but at some point after six or seven tries that all suck (and yet were always widely trumpeted as "python packaging is fixed now") my confidence is pretty low.
> > I'd argue they've actually gone backwards in some ways, e.g. including pip in newer versions of Python
> Care to clarify?
Sure. I want a single entry point that I can install on my system (via my package manager) and run to build any Python project, when realistically I'm going to be working with a bunch of projects that each have separate sets of dependencies and each require separate Python versions. Putting pip in the standard library means a) half my python installs have pip and half don't b) I have several different versions of pip installed at any given time. And so it entrenches the complexity where I have to have a separate tool like virtualenv for managing which Python version I'm using for each project as well as pip. (Which wouldn't be so bad if accidentally running pip for one project while your shell was in the virtualenv for another didn't fuck up that virtualenv, potentially permanently...).
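A minimal sketch of that footgun, using only the stdlib `venv` module (the paths and project names are hypothetical):

```shell
# Two independent projects, two virtualenvs:
python3 -m venv /tmp/projA-venv
python3 -m venv /tmp/projB-venv

# Activate project A's environment:
. /tmp/projA-venv/bin/activate

# Now cd into project B and absent-mindedly run pip:
# `pip` on PATH is still projA's, so anything you install
# lands in projA's venv, not projB's.
pip --version   # reports a path inside projA-venv
```

Nothing warns you that the shell's active venv and the project in your working directory have diverged.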