On a somewhat related note: Why do 99% of the things that are packaged by Ubuntu (or any distribution) need to be packaged at all, let alone by the distribution?
A lot of the lack of focus (and fragmentation) in Linux comes from people duplicating each other's work. There is absolutely no need to have hundreds - thousands? - of developers spending their valuable time packaging up software when the app developers could do it just as well - or better, on account of knowing the software they are packaging. People who work or volunteer for a distribution should instead be writing new features for (and generally improving) the user experience of their respective distributions, not taking other people's software and making it work the way it should have from the start.
Two words: app bundle. Developers handle their own packaging. Bundles include dependencies. The end.
> There is absolutely no need to have hundreds - thousands? - of developers spending their valuable time packaging up software when the app developers could do it just as well - or better, on account of knowing the software that they are packaging.
1. Because that puts extra work on developers who don't necessarily know the target system very well. What if the developer doesn't know the FHS (which apparently is quite common, given what I've seen...)? How are you going to manage dependencies when the packaging systems differ widely? (And no, there's no one way to bridge them all, because some of those differences stem from the very construction of the distribution itself!)
2. They'll never do it.
(#1 leads to #2, but seriously, it's enough work getting developers to write a proper Makefile/Gemfile/setup.py or whatever is appropriate for the language they're working in. Now you want them to do it 100 times over, for systems they've probably never used before?)
That works for something like sta.li, in which everything is statically linked, because then dependencies aren't a problem. But when you start talking about dependency tracking and small-but-important configuration differences between distributions, it becomes impossible.
The solution is to use a two-tiered approach, which some distributions sort of already do. If I make a Python program, for example, I create a setup.py file and all of the other things that you need to install it with pip (sidenote: this is embarrassingly complicated for a language that's dead-simple otherwise!). Then, I let each distro's community handle it themselves - after all, they know their system better than I do, and I've already listed all the dependencies, etc., because I've packaged the program itself properly!
Then, the distribution can figure out how to handle those packages - perhaps installing them directly through pip is the best solution, or perhaps they add an extra layer on top of the language-specific package (check out the AUR for something similar to this), or perhaps they want to just hard-code the file destinations. Whatever makes the most sense in the context of the distribution itself.
And who packages glibc? Or udev? Or zlib (which is so quirky that it almost doesn't build at all out of the upstream source)? That's where the packaging bandwidth is spent in the distributions: middleware. Certainly not "apps", which are generally trivial to build as long as you have the dependencies right. Take a look at, for example, the Fedora spec files for the software you use vs. the middleware, and see the difference in complexity.
And note that Android's middleware packaging (which I assume is what you're referring to when you talk about "app bundles") is no less complicated - honestly it's a lot more so in a lot of ways: no firm dependency tracking, and everything must build all at once. Check the AOSP "external" tree.
I'd expect that, as on other OSes, core dependencies would still be managed by the distributor. But these are nothing but the barest essentials to make the OS work - I count udev, glibc, and zlib among them. Mac OS X does this. But they are very few in number compared to the total number of packages stored in, for example, Ubuntu's repositories.
If the price of having apps that just download and work on any distribution is having a more complicated middleware system, then I'll take it.
With bundles I don't know if the software has been tested on my architecture, if it's stable, if I'm downloading a botnet, if it comes with bundled libraries or if it works properly with -fpie and --as-needed.
Then you have software that doesn't have official releases (like half the stuff on GitHub).
I'll stop using a package manager the day that all ruby gems work with MRI 1.8, 1.9, JRuby and REE. Basically, never.