Well, the CGI industry had money, competent people, and 10 years to upgrade. An entire decade. And a LOT of tooling and documentation to help.
I've done a lot of code conversions from 2 to 3. Most of them took me a couple of hours to a few days.
I'm currently working on a 2.7 project that will never migrate, because they literally patched the CPython runtime. But you can't freeze a whole community because some will make bad technical decisions or accumulate tons of technical debt.
Now, Python 2.7 will not stop working after 2020. It's just that we, as a community, will stop paying the price for the ones that didn't move. If you want to stay there, you'll pay a commercial actor for it.
In 2020, CentOS 6 will stop receiving updates. Will you go complain that the new RPMs are not compatible? No: either you update, or you pay for commercial support.
Ruby and node did it in a few months and told the community to move or die. Nobody complained.
Perl took so long that the language has been all but forgotten.
PHP literally canceled version 6 and made it taboo.
We gave years, and the means, then extended the deadline, and have heard nothing but complaining since.
I won't pretend the migration was masterfully executed. The whole 2/3 debacle was painful.
>> It's just that we, as a community, will stop paying the price for the ones that didn't move. If you want to stay there, you'll pay a commercial actor for it.
That's very well put. But I guess this can happen only with big, important projects, where you can afford to lose/upset some users...
Your games don't work on Windows XP anymore. Actually, your USB3 mouse doesn't work on Windows 7 out of the box.
CentOS has LTS, but it still goes EOL.
Ubuntu's init system was changed to Upstart, then to systemd. Its desktop went GNOME, then Unity (with new menu/notification/systray semantics), then back to GNOME (but Shell), and soon Wayland. Each change breaks a lot of things.
Firefox's new add-on system doesn't work with some add-ons from last year.
NodeJS had 3 incompatible forks in its short life.
The Twitter and Facebook APIs break every Sunday.
JS frameworks are just madness.
Python broke compat ONCE.
Once since 1990.
And it gave 10 years to migrate.
In our industry, that's not bad at all.
And the community held. We worked. We wrote tools, docs, blog posts. We were there all the way. We have incredible libs like python-future to help.
And if any of that is not enough, well, Anaconda Continuum will be happy to do business with you.
In fact I can play EarthSiege 2, a game from the 3.11/Win95 era, just fine on Win7 x64. The only things not working are joystick input (I guess it does some shenanigans with the MIDI/joystick port in addition to using the Windows joystick API) and the pause screen, where your vehicle spins around too fast (probably because its speed is tied to CPU speed).
Microsoft puts, with the exception of drivers, a lot of effort into keeping backwards compatibility. And that is why people like it, in contrast to Linux, where "things break" is the norm, and even on OS X it's not unheard of. Oh, and it's also why big enterprises stay as far away as humanly possible from anything NodeJS or more modern than PHP and Java.
Whenever this comes up I feel it is worth pointing out that Linux's "Things break" approach is a problem exclusive to the userland tooling built up around the kernel. Linus takes a very hardline stance against breaking kernel ABI compatibility (except for drivers), but pretty much every piece of software outside of that, including GLIBC, thinks it's totally ok to break things all the time.
It's really very sad that the Linux community never adopted Linus's view.
Because they are entirely different things. Breaking the kernel ABI doesn't just break a few packages for a popular language, it breaks the entire ecosystem. More importantly there's literally no reason to break compatibility in the kernel. There are perfectly valid reasons to break compatibility in programming languages.
Yes and your Python 2.x code will still run after 2020. But going the other way - new programs might not run on older distro releases, which is what GP said.
Meanwhile I routinely use Lisp written before I was born (in 1986) that works out of the box in 2018. I’ve used Java packages (recently) from 1999. My company still uses python/pypy 2.7 and we see no compelling reason to upgrade. If there is any sort of upgrade, it will be off of python.
When, oh when will we ever finally unlearn "worse is better?" It's hard enough writing good code without having to fight your tooling (non-orthogonal, non-homoiconic, non-malleable, non-backwards-compatible languages, some with header files propagating changes upwards and breaking user code, and some with fascist type systems causing combinatorial explosions of containers and factories, ugh)
Python broke compatibility a lot more than once. Python 3.5 and Python 3.6 are not compatible, for example. There are even cases where code written for an older version of Python 3 will not work on a newer version of Python 3. For example, see PEP 479.
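To make the PEP 479 point concrete, here is a minimal sketch (the `truncated` generator is made up for the example). Before Python 3.7, a `StopIteration` raised inside a generator silently ended iteration; once PEP 479 is in force, it gets converted into a `RuntimeError`, so code that relied on the old behavior breaks on a newer Python 3.

```python
def truncated():
    yield 1
    raise StopIteration  # pre-PEP 479: silently ended the generator here
    yield 2              # never reached either way

# On Python 3.7+ (or 3.5/3.6 with `from __future__ import generator_stop`),
# the StopIteration leaking out of the generator body is re-raised as a
# RuntimeError instead of quietly terminating the loop.
try:
    print(list(truncated()))
except RuntimeError as exc:
    print("PEP 479 turned it into:", exc)
```

On an older Python 3, the same code would simply print `[1]`, which is exactly the kind of silent behavior change PEP 479 was written to flush out.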
2.X would still be included as an optional package, correct? I couldn’t determine that from the link, but it seems like RedHat’s practice. Or, they could install it from source.
If so, 2.X users still have two years or so to migrate. That’s plenty of time in my opinion, even when all of your code is 2.X. There’s even a library for it.[1]
> 2.X would still be included as an optional package, correct?
I wouldn't bet on a Red Hat provided 2.7 package. If they provide it, they will be stuck maintaining it, and their long-term support contracts are very long term.
On the Python side: from 2020-01-01, no commits will land on the 2.7 branch. RH is still supporting RHEL 4, which was released in 2005, and supported RHEL 3 from 2003 to 2014. If they keep a similar timescale for RHEL 8, they could end up trying to maintain a working build/test/development system for py27 for an extra 10 years.
"And if any of that is not enough, well, Anaconda Continuum will be happy to do business with you."
Yes, this is a key point that seems to be forgotten. The Python development team will end their support on 1st January 2020, but there are other providers with Python expertise who will almost certainly pick up the commercial opportunity, e.g. ActiveState, and possibly even Red Hat with a separate product.
This is not the end of Python 2, it's going to be a transition to a different support model. Commercial users either pay to move up, pay to rewrite the Python code with something else, or pay for a super long-life version of Python 2.
The awkward part may be academic research, where there's probably a lot of Python 2 code that has no maintenance budget. I would not be surprised if a project appears to build no-support versions of Python 2 with essential fixes after 2020.
> Ruby and node did it in a few months and told the community to move or die.
Uh, Ruby’s big compatibility breaking upgrade (1.8 -> 1.9) took more than a few months, it took years for the community to fully move. (Because of Ruby’s pre-2.0 versioning practice, 1.8.x -> 1.9.x was a major version upgrade, and it was actually a more significant one than the 1.9 -> 2.0 update. But lots of people stayed on 1.8 throughout most of the 1.9 period and the transition didn't really complete until late in that period or early in the 2.x period, and 1.8.7 was getting maintenance releases for four years after 1.9.1 stable release.)
You could argue that 2.y is also a different language than 2.x, but this is beside the point: as long as there is a compiler for that language and it works (for your definition of works), you don't need to translate your programs into another language just because you have money.
You don't need to upgrade your servers to the latest Linux kernel either. But in 10 years, you won't get any security updates for it unless you pay a lot of money.
It will be the same for Python.
You want the excellent, money-making free work of volunteers? Do your part.
You don't want to? That's OK. In 2020, plenty of companies will sell you services at the real market price of your technical debt.
Is 'ignoring security' now the free pass that gets thrown into every argument against opposing positions?
I'm being facetious, but you know that security risks can be reasoned about and mitigated by means other than simply upgrading the software (sometimes less costly ones).
Companies who cannot upgrade their codebases are companies who cannot maintain their codebases.
Eventually, the technical debt hurts your ability to deliver with both speed and safety. When your org can't do that, your org stops being competitive in the market, and if the market doesn't kill your company then the brain drain will.
(small and non-notable exceptions for the literal handful of types of orgs where this is not the case)
True, but is there a situation where upgrading will hurt instead of benefit?
Not every solution is just as good for everyone. Think of the delayed upgrade game that some people play. They wait for others to upgrade first to get rid of new bugs at their expense. Now take this game at a 10 year extreme.
For them, the new branch (e.g. a new language version) is simply too risky. It's not that they're not smart enough to upgrade; they simply have different opinions than you do about what is valuable.
Yes, upgrading is an engineering expenditure like any other. If you devote manpower to maintaining your codebase you're not devoting it to user-facing features. As such, upgrading your codebase can hurt your ability to deliver some priority feature on-time.
But that kind of zero-sum thinking is myopic. My whole point is that if you only ever prioritize feature work, eventually you lose the ability to deliver features. Nobody in their right mind thinks you can have a company with zero technical debt, there's always a balancing act in play, but if your managers are just playing the risk card without, you know, doing an actual risk analysis which includes the risk of not being able to deliver future features, then your company isn't making smart decisions.
Now, there do exist codebases where you can make the logical conclusion that there really will not be any future feature work, but it's still running in production and so it will need support, and it's a fairly large codebase, and so there would be an enormous cost to re-certifying for the upgrade with very little apparent benefit. That happens, but it only (really, truly, only) happens in large enterprises with many, large codebases and only so many engineers. Those enterprises do have the resources to hire new employees and/or outside contractors to pay down the technical debt on these half-alive projects - they just aren't prioritizing those resources on the technical debt, even long after it became clear that the clock couldn't be stretched out any longer. Through painful personal experiences, I have zero sympathy for companies in that position who think that they can solve their problems with more firewalls.
There is cost the individual user of an open source project has to consider, and there is cost the community of that project as a whole has to consider.
So from your perspective, just keeping the old version might be "the right thing to do", and at the same time the decision of the community to not support it anymore is also "the right thing to do" from their perspective.
As the community does their work unpaid, it seems you have no right to impose your perspective on them, except if you pay money for the necessary work. Which you are free to do.
I guess what the parent posters point out is that this will usually shift your own cost/benefit ratio in a way that upgrading becomes "the right thing to do" for you, too.
I don't think CentOS 6 is bug-free, yet it goes EOL in 2020 too.
Actually, I'm using it right now, and I'm positive it has at least one bug when running in VirtualBox.
My client won't upgrade. They pay support, expensive support, to keep their old version.
In 2020 I'll open a shop to convert old Python code bases or fix bugs in them. I'll charge 4 times market rate. For me it's a net win that people don't migrate.
Sometimes I wonder if the opensource movement, with its perennial “update anxiety”, is actually busy generating an industry of legacy maintenance. In a way, it’s a natural extension of the original “development is free but you pay for support” idea, but I don’t think anyone openly elaborated it into a long-term revenue-generating strategy. It looks like a slam-dunk, to be honest, with the only caveat being that the work is extremely unfashionable.
It's not the opensource movement. It's IT in general. A lot of software stopped working with the windows 10 update that was forced on the users.
Android breaks the API regularly by changing the permission rules.
The PSF has limited resources to exist on (a budget of about 3 million dollars, which includes running PyPI and organising PyCon), but I think its track record is quite good, even compared to commercial products.
Actually, Android doesn't break the API, even by changing the definition of permissions. If you have a breakage, check your app manifest and which API version you are targeting. The Android framework shims the old behaviors for apps that declare old target API versions.
E.g.: since API level 19, the alarm system has been completely reworked; one must check the API level and act accordingly. The same goes for permissions since Android 6.0+. But we still need the support library to provide Fragments and the ActionBar on API levels lower than 11, and VectorDrawables on API levels lower than 23. And many more objects.
The shims are managed by targetSdkVersion, i.e. if you declare targetSdkVersion >= 23, you will get the new permission system (because it was introduced in 23) if running on [23,maxSdkVersion] device; if it is running on device that has API level [minSdkVersion,23), you get the old one; if you declare it < 23, you will get the old one, always, on newer devices too.
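The shim decision described above can be sketched as a small decision function (Python is used purely for illustration; `permission_model` and the `MARSHMALLOW` constant are my own framing of the comment, not an actual Android API):

```python
MARSHMALLOW = 23  # API level that introduced runtime permissions

def permission_model(target_sdk: int, device_api: int) -> str:
    """Which permission model an app gets, per the shim rules above.

    The new runtime model only applies when BOTH the app declares
    targetSdkVersion >= 23 AND the device actually runs API 23+.
    """
    if target_sdk >= MARSHMALLOW and device_api >= MARSHMALLOW:
        return "runtime"        # new model: permissions asked at use time
    return "install-time"       # old model: permissions granted at install

# An app targeting 22 keeps install-time permissions even on a new device:
print(permission_model(target_sdk=22, device_api=25))
```

This is why a breakage usually points back at the manifest rather than the OS: the behavior you get is a function of what you declared, not just of what the device runs.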
The targetSdkVersion says what you designed against. If you claim supporting the new API, you should handle the new API (and that includes detecting what the underlying device supports). That does not preclude you from including the support library needed for lower API levels still supported.
Except that if you release a new or updated app, it MUST target a recent (less than a year old) targetSdkVersion.
"2019 onwards: Each year the targetSdkVersion requirement will advance. Within one year following each Android dessert release, new apps and app updates will need to target the corresponding API level or higher."
That's Play Store policy for new apps and app updates, not a technical limitation of the OS. Old apps will keep working just like they did before, the new APIs will not break them.
The old trick is to get your changes into the main branch so you don't need to maintain them yourself. That is also why anyone bothers to upstream in the first place.
> Well, the CGI industry had money, competent people, and 10 years to upgrade. An entire decade. And a LOT of tooling and documentation to help.
I don’t think that’s a money issue. The Python 3 upgrade is not really compelling. You get slower speed (at least until recently); tests might pass on 3.x, but documentation and edge cases are still better on 2.x; etc. While nothing is really exciting in the 3.x branch.
>While nothing is really exciting in the 3.x branch.
Sorry, have you looked at Python 3 lately? I don't think I can sum up all of the amazing work that's been done in one post (async? Cleaned up stdlib? Better errors? Not having an aneurysm from text encoding issues unraveling your whole project? New splat syntax?).
I would really encourage you to check out what's happened in the last ten years. I think you'll find many more exciting developments than you think.
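To put a few of those in one place, here is a tiny, self-contained taste of features that exist only on the 3.x branch: extended "splat" unpacking (PEP 448), f-strings (3.6+), and async/await with the 3.7+ `asyncio.run` entry point. It's a toy sketch, nothing project-specific:

```python
import asyncio

# Extended "splat" unpacking (PEP 448) in assignments and literals
first, *rest = [1, 2, 3, 4]
merged = {**{"a": 1}, **{"b": 2}}

# f-strings (3.6+) for readable formatting
print(f"first={first}, rest={rest}, merged={merged}")

# async/await (3.5+) driven by the 3.7+ asyncio.run entry point
async def fetch_label():
    await asyncio.sleep(0)  # stand-in for real async I/O
    return "done"

print(asyncio.run(fetch_label()))
```

None of this parses on 2.7, which is exactly the point: the gap between the branches is no longer just print parentheses.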
I think that he's kind of right about the problem.
In my opinion, Python's primary business value has been that sweet spot it occupies. It's not the most fun, but it's pleasant. It's not the fastest, but it's not too slow. It's not great on resources, but it's not too bad either.
Good people will work with it, and they tend to be the "let's not get creative, let's just get it done" folks that businesses love. Mediocre people do great with it, because it feels much nicer than many of the other things they've worked with and it channels them towards producing better code and being more productive.
Python makes it a huge pain in the ass to get weird/creative, and generally frowns upon it, so people tend not to as much. Junior and low-skill contributors can't really get in too much trouble as long as they stick to the program. It's an awesome "just sit the fuck down and do your work" language.
As a language, it's carved out a great space being pleasant, consistent, well-rounded, predictable, etc.. I know there are people on here using it to build their rocket ships. And it can go there too... maybe not as flexible or fun as more extreme alternatives, but it's also less likely to blow up in your face. For those software rocketeers (probably most any SV start-up), updating makes undeniable sense. Python 3 is hands-down better, and they can handle the transition no problem.
But a lot of people don't use Python to build rocket ships. They don't build rocket ships at all. They build delivery trucks and conveyor belts and coffee dispensers. And, for them, the very things that make Python a great choice - a very clean and conservative focus on simplicity and stability - are the very reasons not to switch. There's just not really anything that's been introduced in the last 10 years that is going to make any real difference for their use cases. Whatever trouble they've had with unicode and the other bs has long been lived with. Whatever's missing they've long lived without. And things have been just fine.
Python's killer feature - being a really nice and well-rounded option that works pretty damn well for a large swath of people - is sorta its undoing here. It's hard to become even more of that than it already is.
I mean I switched to 3. It's clearly better. But if I was running a Python team that had been effectively just trying to get the job done in 2.7 for over a decade I would have to admit that I wanted to switch for my own sake - I think it would be unlikely to make them that much more happy or productive.
Hell, many people can't even set their same environment back up within a week after losing their laptop.
> But if I was running a Python team that had been effectively just trying to get the job done in 2.7 for over a decade I would have to admit that I wanted to switch for my own sake - I think it would be unlikely to make them that much more happy or productive.
Part of being a software engineer is keeping your environment up to date. Sure, I could keep using Node 0.10. But I'll get no updates for security or fixes for bugs.
If you're on a team that hasn't updated to a version that's supported, then you're neglecting one of the fundamental ongoing maintenance tasks involved in software engineering. If that's "too hard" to do over the course of a decade, then perhaps there's something wrong with your engineering culture.
Sure, you don't have to do it. But don't expect the rest of the world to continue to support your old setup. If you have to compile Py2.7 from source because RHEL doesn't come with it, that's fair penance for not keeping up with the community. It's just about the most entitled thing in the world to say "I didn't spend the two weeks in ten years time to upgrade to a newer version [using the numerous automated tools] and I'm mad that the world at large isn't making it easy for me to continue to not do anything."
I'll just say this... I still have 2.7 as my base install... for Ansible. Because they haven't switched yet.
Ansible is owned by... Red Hat. They acquired it in late 2015.
Seems like Red Hat - the people in the post that we're talking about that are shoveling folks off 2.7 - has been neglecting one of the fundamental ongoing maintenance tasks involved in software engineering, and perhaps has something wrong with their engineering culture.
Sorry, this shit is just too funny.
And it seems that Red Hat is booting Ansible from the core repos as well (looks like it's in that deprecation notice!), presumably instead of spending the "two weeks" to update it (you might want to go ask the Ansible team why they're too damn lazy to spend those "two weeks"; see what they say).
However, unlike Red Hat and Ansible, some organizations depend on the software in question and can't just sideline it 'cause the shit they've successfully run on for a decade-plus has lost its blessing.
The 3.x branch has developers committed to improving it, and the 2.x branch basically doesn't. You shouldn't bet on stagnation in a technology industry.
Most of CGI’s customers (if it’s the CGI I’m thinking of) are used to supporting software ecosystems like Fortran, Java, and COBOL for 20+ years on major releases. 10 years is “mid-cycle” comparably. When the IRS pays Microsoft millions and millions to keep supporting Windows XP, these “niche” organizations may be small in actual engineering persons involved but from a capital perspective extremely overweight.
Once again, I'm not saying Python shouldn't move on. Actually, as a CGI "dev" (Technical Director, we say) myself, I would LOVE to trash 2.7 and move on.
I was just rambling about the "most popular packages", which are only web/network/academic oriented, while a huge part of the Python user base doesn't use them at all. It's just something the "Python community" can't see, because most of them are web/network/internet/academic devs (and maybe that explains why they thought breaking compat' could go well; internet bubble?).
I really want Python to move. I had the opportunity to discuss this with some vendors involved in the industry; they told me the Python 3 switch was a running gag... I hope 2.7 ends up so full of security holes that my industry finally does this once and for all and stops joking (otherwise they will never move on). But please stop saying the "most popular packages" represent how Python is used, as if that were the whole picture; it's definitely not.
TBH, I think the Python community gives great tools for the transition, and writing closely compatible Python 2 and 3 code is not that hard.
Once my industry is on 3+, as a dev, I will gladly follow the deprecation cycle.
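For what it's worth, the straddling style is mostly a handful of `__future__` imports plus sticking to the common subset of the two languages (a minimal sketch using only the stdlib; python-future and six layer extra helpers on top of this):

```python
from __future__ import absolute_import, division, print_function

# With these imports, the lines below behave identically on 2.7 and 3.x:
print(7 / 2)    # true division on both branches: 3.5
print(7 // 2)   # floor division on both branches: 3
print("works on both {}".format("branches"))  # str.format instead of %
```

On Python 3 the `__future__` imports are harmless no-ops, so the same file can ship unmodified to both interpreters.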
> Most of them took me a couple of hours to a few days.
Since 2008, the automated tooling has vastly improved. I'd bet most of those "few days" issues would either not exist or take hours, given today's advances in automated code translation and the new knowledge on Stack Exchange.
OK, but then can the Python community stop saying "pretty much all of the packages are converted anyways!" if the answer is really "we don't care anymore if yours aren't"?
But come on, it's been since 2008.