Looks great. I'm switching from wezterm (and kitty before that), and the native feel on macOS is indeed better than their emulated tabs. It is also remarkably close to zero-config: for me, only one line for the color theme is needed, plus setting TERM for ssh. (The default font happens to be the one I use.) Bundling so many color themes felt like a bit of a waste of space at first, but it is nice that I don't need to define them myself, which would also make switching themes tedious. I also love to see it land at 1.0 with features, documentation, and promotional materials that already seem polished.
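For the curious, here is roughly what that "one line plus TERM" setup looks like. This is a minimal sketch, assuming the terminal in question is Ghostty (the 1.0 and macOS-native details suggest it); the theme name is just an illustrative pick from the bundled ones.

```
# ~/.config/ghostty/config: the single line I mean (theme name illustrative)
theme = catppuccin-mocha
```

```
# ~/.ssh/config: a common TERM workaround for remote hosts that lack the
# terminal's terminfo entry (SetEnv TERM needs a reasonably recent OpenSSH)
Host *
    SetEnv TERM=xterm-256color
```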
Chances are, it will become my default terminal in 2025!
The only nitpick is that the docs don't seem to have a search function. It would be great if I could search for "color" or "ssh" and find the relevant page immediately, without navigating the tree or relying on external search engines.
My goal is not to give an overview of available tools, and I know quite little about Conda specifically to begin with - my expertise is in building (and even then, really only on the Python side) more than in specific tools. So I'm not planning a specific critique of it or anything, no.
I will certainly mention Conda in contexts where it makes sense to do so.
Ok, thank you. It's a pity for me, as I think a resource that explains the big picture of both would be great. But I don't have enough of a big picture to write about it yet.
The thing about conda is that it is not just a tool; it is a completely different ecosystem for handling what you describe in your post. You touched on the topics of building & packaging, distribution, environment management, etc., and all of these are different in conda land. To me, the answer to the problems you mentioned in the post is conda. But I think it is commonly misunderstood, and some people hate using it. Not to mention that maintaining releases on both PyPI and a conda channel is laborious.
> it is a completely different ecosystem... all of these are different in conda land.
This matches my understanding. But I also understand that Conda doesn't treat Python as anything special; it just provides its own kind of environments for multi-language computing.
Many people do swear by Conda and I should look into it eventually. (In some far future, maybe Paper can work in a Conda environment and avoid the problems that Pip encounters there.) If I'm going to fix problems (and educate about them) on the Python-first side, I need to have priorities, though.
There's a chain of steps involved in packaging, and the first step in the chain being PyPI limits what one can do to fix it. Conda starts with a different design for how to build a package and also how to bundle it. It specifically handles compiled code much better, because in the non-pure-Python case a compiler is needed at some point, and that is going to cause problems if you have no control over it. Conda essentially achieves reproducibility, with the necessary compiler considered part of the dependencies. While a certain degree of reproducibility can be achieved with PyPI packages, my understanding is that, e.g. with manylinux, it takes a hope-for-the-best approach and can break in exotic cases, which should never happen with conda packages.
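To make "the compiler is part of the dependencies" concrete, here is a minimal conda-build recipe sketch. The package itself is hypothetical, but {{ compiler('c') }} is the actual conda-build mechanism that resolves to a pinned toolchain instead of whatever cc happens to be on the machine:

```yaml
# meta.yaml: minimal sketch of a recipe for a hypothetical C-extension package
package:
  name: mypkg              # hypothetical name
  version: "1.0.0"

requirements:
  build:
    - {{ compiler('c') }}  # the compiler is declared, pinned, and installed
  host:
    - python
    - pip
  run:
    - python
```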
> The genesis of Conda came after Guido van Rossum was invited to speak at the inaugural PyData meetup in 2012; in a Q&A on the subject of packaging difficulties, he told us that when it comes to packaging, "it really sounds like your needs are so unusual compared to the larger Python community that you're just better off building your own"
This was already in my research pile. "Myth #2" there is what I meant about "not treating Python as anything special". The GvR quote is infamous, and seen as short-sighted, but it made sense in context.
The story is often told (though the link doesn't do this) as if the "main" Python community intended to ostracize the PyData folks. But 12 years later there is still much important dialog with them, and NumPy is one of the most commonly downloaded PyPI packages. My discussions on discuss.python.org exposed me to entire websites of analysis of the existing problems (like https://pypackaging-native.github.io/) and people like Henry Schreiner and Ralf Gommers are driving much of the discussion.
> in the non-pure-Python case a compiler is needed at some point, and that is going to cause problems if you have no control over it. Conda essentially achieves reproducibility, with the necessary compiler considered part of the dependencies.
This is one of the major differences, yes: Conda has a system for ensuring you have the right compiler. It's hoped that PEP 725 "Specifying external dependencies in pyproject.toml" (https://peps.python.org/pep-0725/) will be a first step towards resolving this in the Pip world, or perhaps towards better interoperation with Conda. (Of course, the hard part is actually using that metadata to install compilers.)
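For a taste of what that metadata looks like, here is a sketch in the PEP's own style (the PEP is still a draft, so treat the exact key names as provisional):

```toml
# pyproject.toml: PEP 725 sketch for declaring non-PyPI dependencies
[external]
build-requires = [
  "virtual:compiler/c",    # "I need a C compiler", stated abstractly
]
host-requires = [
  "pkg:generic/openssl",   # an external library, named by PURL
]
```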
Interesting. I’m looking forward to your future posts.
One question I have in mind when dealing with PyPI vs conda is: why hasn't conda already solved all the problems with PyPI? My own answers to that include 1) PyPI is still the de facto place to claim a namespace, and it would be foolish to release a package only on conda and not on PyPI; and 2) there are many more packages available on PyPI than on conda, and the effort to make every PyPI package I use available on conda is too laborious.
Another issue is that many packages available on PyPI are not packaged well, from version constraints to setup.py madness to licensing issues. The people managing the conda-forge channel do more gatekeeping on those issues. I wonder whether, even if the tooling around PyPI improves, the community culture will be much more difficult to change.
How do home-manager users handle setting up the home directory on a system they have no control over, where nix is therefore usually not available?
Good point. Not to mention the circulation would be all wrong had the radiator been cool instead of warm. (The primary means of heating by a radiator actually comes from convection rather than radiation.)
In this sense I personally prefer pixi. It resolves using conda channels, like conda, and it also supports PyPI packages via uv.
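As an illustration, here is a sketch of a pixi manifest mixing the two worlds; the project name and the PyPI-only package are hypothetical placeholders:

```toml
# pixi.toml: [dependencies] resolve from conda channels, [pypi-dependencies] via uv
[project]
name = "myproj"                  # hypothetical
channels = ["conda-forge"]
platforms = ["osx-arm64", "linux-64"]

[dependencies]
python = "3.12.*"
numpy = "*"

[pypi-dependencies]
some-pypi-only-pkg = "*"         # hypothetical PyPI-only package
```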
With a background in scientific computing, where many of the dependencies I manage are compiled, conda packages give me much more control.
P.S. I'd like to point out to others the difference between package indexes and package managers. PyPI is an index (it hosts packages in a predefined format), while pip, poetry, and uv are package managers that resolve and build your environments using the index.
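A quick way to see the split: the same index can be consumed by different managers. The --index-url flag below just spells out the default:

```sh
# Two different package managers resolving against the same index (PyPI)
pip install --index-url https://pypi.org/simple/ numpy
uv pip install --index-url https://pypi.org/simple/ numpy
```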
Similarly, but a bit more confusingly, conda can be understood as the index, hosted by Anaconda but also hostable elsewhere, with different "channels" (kind of like a GitHub organization), of which conda-forge is a popular community-built one. Conda is also the reference implementation of a package manager that resolves using Anaconda channels. Mamba is an independent, performant, drop-in replacement for conda. And pixi is a different one, with a different interface, by the authors of mamba.
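Concretely, a channel is just a package source you point the manager at, and mamba accepts the same invocation as conda:

```sh
# Same channel (conda-forge), two interchangeable managers
conda install -c conda-forge numpy
mamba install -c conda-forge numpy
```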
Even more confusingly, there are distributions. A distribution comes with a set of predefined packages together with the package manager, so that you can start running things immediately (sort of like a TeX Live distribution in relation to the package manager tlmgr). There is the Anaconda distribution (if you installed Anaconda instead of just conda, that's what you got), but also Intel's Distribution for Python, Miniforge, Mambaforge, etc.
As far as the solver is concerned, there should be no difference, as it has been upstreamed. But I personally can't see a reason to go back, as mamba is supposed to be a drop-in replacement for conda. I default to using mamba and switch to conda only when necessary. There are some cases mamba can't handle correctly, such as rolling back to an earlier revision: https://github.com/mamba-org/mamba/issues/803
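For reference, the rollback workflow in question looks like this with conda (this is what the linked mamba issue is about):

```sh
# List the environment's recorded revisions, then roll back to one of them
conda list --revisions
conda install --revision 1
```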
As the black hole gets larger, it becomes more difficult to notice this difference (of crossing the event horizon, or to "observe the geometry around you"). And since we are talking about the whole visible universe being inside a black hole, we are at that extremely large scale.
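A back-of-envelope way to see this (my own rough estimate, not a quote from anywhere): the tidal stretching per unit length at the horizon radius r_s = 2GM/c^2 falls off with the square of the mass,

```latex
% Tidal acceleration gradient at the horizon of a Schwarzschild black
% hole: larger M means gentler tides, hence harder to notice locally.
\frac{a_{\mathrm{tidal}}}{L} \sim \frac{GM}{r_s^{3}}
  = \frac{c^{6}}{8\,G^{2}M^{2}} \propto \frac{1}{M^{2}}
```

so for a black hole with the mass of the visible universe, the local geometry near the horizon is practically indistinguishable from flat spacetime.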
Also, I'm not sure why you're arguing about the radial coordinate being time-like. You can only measure in your own local reference frame. You wouldn't necessarily be able to transform from your own local reference frame to the black hole's if you don't know you're in one.
Photon-photon interaction is photon self-interaction, so gravity/graviton self-interaction means graviton-graviton interaction. In general relativity, all forms of energy have a gravitational effect and also react to gravity. Since all matter, including photons and gravitons, carries energy, they should self-interact.
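Compactly, that statement is the Einstein field equations: whatever sits in the stress-energy tensor, photon or graviton energy included, sources curvature, and everything then moves in that curvature.

```latex
% Einstein field equations: all energy-momentum gravitates
G_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```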
In QED, photons do interact with each other too, and you can calculate the effect to be small. In GR, you can likewise expect the self-interaction to be small if the spacetime curvature is small.
(The photon is the particle that mediates the electromagnetic interaction in QED, and the graviton is the hypothetical particle that mediates gravity.)
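For a sense of scale in the QED case (a standard estimate, not specific to this thread): light-by-light scattering proceeds through a virtual-electron box diagram with four vertices, so for photon energies well below the electron mass the cross-section is tiny,

```latex
% Low-energy photon-photon scattering in QED (natural units, omega << m_e):
% four vertices give alpha^4, and the electron loop suppresses the rate
% further by powers of omega/m_e.
\sigma_{\gamma\gamma \to \gamma\gamma} \sim
  \alpha^{4}\, \frac{\omega^{6}}{m_{e}^{8}}
```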