Can we call it public broadcasting when it fails to even dimly reflect the diversity of ideas in the areas it serves? Milquetoast conservatives like Juan Williams were deemed intolerable long ago, so calling it public radio at this point is a misnomer and a sad farce.
Juan Williams wasn't let go because he's conservative; it's because he's a bigot (unless you think being a bigot is a conservative qualification):
"Look, Bill [O'Reilly], I'm not a bigot. You know the kind of books I've written about the civil rights movement in this country. But when I get on the plane, I got to tell you, if I see people who are in Muslim garb and I think, you know, they are identifying themselves first and foremost as Muslims, I get worried. I get nervous."
"NPR, like any mainstream news outlet, expects its journalists to be thoughtful and measured in everything they say. What Williams said was deeply offensive to Muslims and inflamed, rather than contributing positively, to an important debate about the role of Muslims in America."
Using WAL2 should make that problem better: it alternates between two WAL files when making writes, so the system always has an opportunity to checkpoint whichever WAL file is not in use.
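A quick way to see which journal mode a database actually ends up in. Note that `wal2` is only honored by SQLite builds compiled from the wal2 branch; a stock build silently ignores the unknown mode and reports the mode actually in effect, so this sketch is safe to run anywhere:

```python
import os
import sqlite3
import tempfile

# Request wal2; stock SQLite ignores unrecognized journal modes and
# the PRAGMA simply returns whatever mode is actually in effect.
path = os.path.join(tempfile.mkdtemp(), "app.db")
conn = sqlite3.connect(path)
mode = conn.execute("PRAGMA journal_mode=wal2").fetchone()[0]
print("journal mode:", mode)  # "wal2" only on a wal2-branch build
conn.close()
```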
As of this release, no-GIL/free-threaded Python has moved out of the experimental phase and is officially supported. It is not yet the default (that's potentially the next phase of the project), but running no-GIL/free-threaded Python is now officially blessed.
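You can check at runtime whether you're on a free-threaded build and whether the GIL is actually disabled; a small sketch (it also runs on older versions, where the config var is simply absent):

```python
import sys
import sysconfig

# Py_GIL_DISABLED is 1 on free-threaded ("t") builds, else 0 or None.
free_threaded_build = bool(sysconfig.get_config_var("Py_GIL_DISABLED"))

# sys._is_gil_enabled() (3.13+) reports whether the GIL is active right
# now; a free-threaded build can still re-enable it at runtime, e.g.
# when an incompatible C extension is loaded.
if hasattr(sys, "_is_gil_enabled"):
    print("GIL currently enabled:", sys._is_gil_enabled())
print("Free-threaded build:", free_threaded_build)
```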
VSCode's marketing was that it is an open source editor you could rely upon, complete with open source extensions for popular languages like Python. Then, once it became popular and vscodium (a vscode fork) was growing in popularity, MS locked things down. Now the Python extensions are closed source, and MS has artificially prevented vscode forks from using them. A bait and switch if I've ever seen one.
> MS locked things down... and MS has artificially prevented vscode forks from using those extensions
Or, like any growing and maturing project, they established boundaries - one of which was that the plugin marketplace is proprietary, which is a perfectly reasonable position. Their existing and continued contributions to vscode are significant, so I think they can be allowed to keep some cards up their sleeves like the plugin marketplace or their Python extension. I'm just flabbergasted at this idea that somehow we're entitled to everything vscode-adjacent "just because", or that Microsoft is obligated to subsidize other billion-dollar businesses by giving them free features for their vscode forks.
> A bait and switch if I've ever seen one.
Where's the bait? Where's the switch? If the best you have is that they released a closed source plugin I'm going to bucket this as another borrowed opinion.
> Their existing and continued contributions to vscode are significant, so I think they can be allowed to keep some cards up their sleeves like the plugin marketplace or their Python extension.
It would have been fine if MS had started with their Python extension being proprietary, that would have been up front and transparent. Instead, they lured folks in (no small part due to open source), and once it became popular, they started turning the screws and making things proprietary and locking it down.
> I'm just flabbergasted at this idea that somehow we're entitled to everything vscode-adjacent "just because"
You're not arguing in good faith at this point. I don't think it is unreasonable to ask someone to make their intentions known up front, do you? Instead, MS waited until vscode became popular (partly because everything was open source) and then altered the deal, Vader-style, closing off parts of vscode and extensions that had been open. That doesn't feel particularly transparent.
> or that Microsoft is obligated to subsidize other billion-dollar businesses by giving them free features for their vscode forks.
I have no idea what you're talking about here. Vscodium is an entirely free and open source fork, no one makes any money from it afaik.
> Where's the bait? Where's the switch? If the best you have is that they released a closed source plugin I'm going to bucket this as another borrowed opinion.
They released the Python stack as fully open source. Then they released the proprietary one, deprecating the open source one. Then they made double certain that vscodium or any of the other forks could not use it at all, even if the user manually downloaded the extension. How is that not a bait and switch?
> It would have been fine if MS had started with their Python extension being proprietary
Except that never happened. Pyright was released first and was and continues to be open source. Pylance was built on Pyright but has never been open source. No promises or commitments were made otherwise. Deprecating the open source Python Language Server in favor of Pylance is also a perfectly reasonable and valid decision - the community was more than welcome to continue maintaining it, but most people I know continue to rely on Pylance.
> Instead, they lured folks in
Saying this doesn't make it true.
> Instead, they lured folks in (no small part due to open source), and once it became popular, they started turning the screws and making things proprietary and locking it down.
Microsoft has not once backtracked on anything vscode-related that's been open sourced. Trying to villainize them for not making everything open source is an argument with no legs.
> I don't think it is unreasonable to ask someone to make their intentions known up front, do you?
They have. Point me to a single actual example of Microsoft operating in bad faith, that isn't them deciding to keep some parts of the ecosystem proprietary while 99% remains FOSS.
> Vscodium is an entirely free and open source fork
Microsoft and the vscode team are not making long-term decisions with vscodium in mind. But they are probably worried about Windsurf and Cursor, the latter of which (a billion-dollar company) was caught violating MS's TOS around the plugin ecosystem.
Microsoft has spent over a decade investing in, curating, and improving the vscode first-party plugin ecosystem and being a rather good steward. I think they're perfectly reasonable in keeping it to themselves. Creators are free to upload their plugins to any alternative marketplace. I don't see any arguments being made that can diminish the open source contribution they've made with Code - OSS just because parts of the branded vscode are proprietary.
> They released the Python stack as fully open source.
Again, no they didn't. Pyright open source. Pylance always closed source. PLS deprecated. But you're entitled to what you borrowed.
> Pyright was released first and was and continues to be open source. Pylance was built on Pyright but has never been open source.
No, the first Python extension, which shipped with vscode 1.0 in 2016, was called the "Microsoft Python Language Server" and was based on the Jedi LSP. The Pylance launch post below announces its deprecation:
> In the short-term, you will still be able to use the Microsoft Python Language Server as your choice of language server when writing Python in Visual Studio Code.
> Our long-term plan is to transition our Microsoft Python Language Server users over to Pylance and eventually deprecate and remove the old language server as a supported option.
https://devblogs.microsoft.com/python/announcing-pylance-fas...
> But they are probably worried about Windsurf and Cursor, the latter of which (a billion-dollar company) was caught violating MS's TOS around the plugin ecosystem.
If that were so, I would certainly understand. However, MS started closing vscode and the extensions years before Windsurf and Cursor (initial releases in 2023). This was their business model all along: get adoption partly by leveraging the open source community, then slowly close things off once they have a chokehold (similar to Android/AOSP). I could scarcely agree more that Windsurf and Cursor are supremely sketchy and generally scummy companies.
Consider MS's launch announcement, which focuses on open source, extensibility, an open community, and a promise to be transparent about their intentions (i.e., vision) and roadmap...
> From the beginning, we’ve striven to be as open as possible in our roadmap and vision for VS Code, and in November, we took that a step further by open-sourcing VS Code and adding the ability for anyone to make it better through submitting issues and feedback, making pull requests, or creating extensions.
Except they weren't open, and they did a U-turn on the community a few years later: MS closed sources and locked things down despite touting the benefits of openness and open source in the announcement above. Now they have architected the Python extension so it only runs on vscode and will not run at all on any fork, which is pretty shady after promising transparency and openness.
Pylance isn't the same extension as what was originally shipped; it's an entirely different product. Your link backs up my argument, not yours. Releasing an open source project does not obligate them to continue supporting that project indefinitely, and the decision to migrate to a closed-source plugin is a perfectly valid and reasonable one. Disagreeing with it doesn't mean they've somehow magically violated some implicit obligation you think they owe "the community".
> MS started closing vscode and the extensions years
They never "started". The plugin marketplace and vscode - the proprietary version of "Code - OSS" - have always been proprietary and closed. At no point did they give you something and take it away. Deciding to release a closed-source replacement for an open-source tool is not the same thing, and it's bad faith to argue otherwise to fit your fundamentally flawed argument.
> This was their business model all along: get adoption partly by leveraging the open source community
>Consider MS launch announcement that focuses on open source, extensibility, open community
You're relying on hand-wavy assertions without any evidence to back it up.
> Except they weren't open, and they did a U-turn on the community a few years later.
Where's the u-turn? I don't see anything in this post that's not true in 2025. Microsoft offers a curated plugin marketplace that's proprietary to vscode, and they provide distribution and hosting for free without requiring anything from creators and users. Pylance continues to be free but closed, Code - OSS continues to be FOSS, vscode continues to be a proprietary version of Code - OSS, plugin authors continue to upload products free-of-charge, and users continue to benefit from that community that Microsoft has fostered.
They've firmly established what their role is in this relationship. There's never been ambiguity between what's vscode closed-source and what's Code - OSS, unless you've not put in the effort to find out.
Point to an actual, concrete example of where they've acted in bad faith, did a "u-turn", or reneged on a public statement rather than hand-wavy generalizations. It's on you if you've relied on second-hand HN comments and news headlines to build your opinion, and relying on misunderstanding of context isn't a convincing argument.
You can use uv to pin, download, and use a specific version of a Python interpreter on a per-project basis. You shouldn't use your OS-provided Python interpreter for long-running projects, for all the reasons you mentioned, including reproducibility. If you insist on using the vendor-provided interpreter, then use RHEL (or a free clone) or any other long-supported Linux distro (there are many). Five years is a very long time in technology. It is not reasonable to expect more from core maintainers IMO, especially considering we have better options.
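For example, a sketch of the per-project flow (assumes uv is installed; the version number is illustrative):

```shell
uv python pin 3.12        # writes .python-version for this project
cat .python-version       # the pin is just a plain text file
# uv python install       # fetches the pinned CPython if it's missing
# uv venv                 # creates .venv using the pinned interpreter
# uv run python script.py # always runs under the pinned interpreter
```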
But it's not a long time in the OP's field of science. Unfortunately despite a strong preference for Python in the scientific community, the language's design team seem to ignore that community's needs entirely, in favour of the needs of large technology companies.
I was hopeful that in the transition from a BDFL-based governance system to a Steering Council, we would see a larger variety of experience and opinions designing the language. Instead, I don't think there has ever been a single scientist, finance worker etc on the Steering Council - it's always software developers, almost always employees of large software companies.
Thanks for understanding. I think the responses other than yours here are making me reconsider how invested my research group is in Python. I think we will be doing far more Rust and R components in the future to explore the nature of language stability and package stability.
Just this week I had difficulty integrating the work of a team member because they used some new typing features only available in Python 3.13, but we have many library dependencies on numpy < 2, and in their great wisdom somebody decided that with Python 3.13 there would be no more precompiled wheels of numpy < 2. That means arduous multi-minute compilation for any new venv or Docker build, even with uv. This sort of pointless version churn - wasting many valuable research hours investigating chains of dependencies and which libraries are ready or not, to serve the whims of some software engineer who decides everyone must update working code to novel APIs - is not something I experience in other languages.
Hopefully Python Steering Council members reconsider the motivation of continual churn, but it's much harder to get promoted and get acknowledgement for judicious tending of a language than it is to ship a bunch of new features. Combined with fear over Anaconda charges, Python is quickly becoming a very unfriendly place for science, or anybody else that values function over form.
> somebody decided that with Python 3.13 there would be no more precompiled wheels of numpy < 2
Wow. NumPy 2.0 was released less than a year ago and they're already starting to drop support for 1.x?! It only supports Python versions up to 3.12, all of which are themselves either unsupported or in "security fixes only" mode.
The developers plan to end-of-life 1.x entirely in less than six months time. Do they not know their target audience at all?
> it's much harder to get promoted and get acknowledgement for judicious tending of a language than it is to ship a bunch of new features
Yes - as Mark Lutz (author of "Learning Python") puts it [0]: "[Python's] evolution is largely driven by narcissism, not user feedback" and "Python remains a constantly morphing sandbox of ideas, which is great if you just want to play in the sandbox, but lousy if you're trying to engineer reliable and durable software".
> This is factually incorrect. Latency is limited by the speed of light and the user's internet connection.
This is a solved problem. It is simple to download content shortly before it is very likely to be needed using plain old HTML. Additionally, all major hypermedia frameworks have mechanisms to download a link on mousedown or when you hover over a link for a specified time.
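For instance, htmx's preload extension does exactly this; a sketch (URLs illustrative):

```html
<body hx-ext="preload">
  <!-- fetched on mousedown, before the click even completes -->
  <a href="/next-page" preload>Next page</a>
  <!-- fetched after the cursor has hovered for a moment -->
  <a href="/docs" preload="mouseover">Docs</a>
</body>
```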
> If you read one of the links that I'd posted, you'd also know that a lot of users have very bad internet connection.
Your links mortally wound your argument for JS frameworks, because poor latency is strongly linked with poor download speeds. The second link also has screenfuls of text slamming sites that make users download 10-20 MB of data (i.e., the normal JS front ends). Additionally, in the parts of the world the article mentions, devices are going to be slower and access to power less consistent, all of which are huge marks against client-side processing vs. SSR.
> This is a solved problem. It is simple to download content shortly before it is very likely to be needed using plain old HTML.
No, it is not. There's no way to send the incremental results of typing in a search box, for instance, to the server with HTML alone - you need to use Javascript. And then you're still paying the latency penalty, because you don't know what the user's full search term is going to be until they press enter, and any autocomplete is going to have to make a full round trip with each incremental keystroke.
> Your links mortally wounds your argument for js frameworks, because poor latency is linked strongly with poor download speeds. The second link also has screen fulls of text slamming sites who make users download 10-20MBs of data (i.e. the normal js front ends).
I never said anything about, or implied, "download 10-20MBs of data (i.e. the normal js front ends)" in my question. Bad assumption. So, no, there's no "mortal wound", because you just strawmanned my premises.
> Additionally, devices in the parts of the world the article mentions, devices are going to be slower and access to power less consistent, all of which are huge marks against client side processing vs. SSR.
As someone building web applications - no, they really aren't. My webapps sip power and compute and are low-latency while still being very poorly optimized.
> No, it is not. There's no way to send the incremental results of typing in a search box, for instance, to the server with HTML alone - you need to use Javascript.
Hypermedia applications use javascript (e.g., htmx - the original subject), so I'm not sure why you're hung up on that.
> And then you're still paying the latency penalty, because you don't know what the user's full search term is going to be until they press enter, and any autocomplete is going to have to make a full round trip with each incremental keystroke.
You just send the request on keydown. It's going to take about ~50-75ms or so for your user's finger to traverse into the up position. Considering anything under ~100-150ms feels instantaneous, that's plenty of time to return a response.
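This is essentially htmx's active-search pattern; a minimal sketch (endpoint name and delay are illustrative):

```html
<input type="search" name="q"
       hx-get="/search"
       hx-trigger="keyup changed delay:200ms"
       hx-target="#results">
<div id="results"></div>
```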
> As someone building web applications - no, they really aren't.
We were originally talking about "normal" (js) web applications (e.g., react, angular, etc.); most of these apps have all the traits I mentioned earlier. We all have used these pigs that take forever on first load, cause high cpu utilization, and are often janky.
> My webapps sip power and compute and are low-latency while still being very poorly optimized.
And now you have subtly moved the goalposts to only consider the web apps you're building, in place of "normal" js webapps you originally compared against htmx. I saw you do the same thing in another thread on this story. I have no further interest in engaging in that sort of discussion.
> Hypermedia applications use javascript (e.g., htmx - the original subject), so I'm not sure why you're hung up on that.
Because you falsely claimed otherwise:
>> This is a solved problem. It is simple to download content shortly before it is very likely to be needed using plain old HTML.
So, another false statement on your part.
> You just send the request on keydown. It's going to take about ~50-75ms or so for your user's finger to traverse into the up position. Considering anything under ~100-150ms feels instantaneous, that's plenty of time to return a response.
No, it's not "plenty of time" because many users have latency in the 100's of ms (mine on my mobile connection is ~200ms), and some on satellite/in remote areas with poor infra have latency of up to a second - and that's completely ignoring server response latency, bandwidth limitations on data transport, and rehydration time on the frontend.
> Considering anything under ~100-150ms feels instantaneous, that's plenty of time to return a response.
Scientifically wrong: "the lower threshold of perception was 85 ms, but that the perceived quality of the button declined significantly for latencies above 100 ms" [1].
> We were originally talking about "normal" (js) web applications (e.g. react, angular, etc.).
Factually incorrect. We were talking about normal frontend technologies - including vanilla, which you intentionally left out - so even if you include those heavyweight frameworks:
> most of these apps have all the traits I mentioned earlier. We all have used these pigs that take forever on first load, cause high cpu utilization, and are often janky.
...this is a lie, because we're not talking about normal apps, we're talking about technologies. All you have to do is create a new React or Angular or Vue application, bundle it, and observe that the application size is under 300k, and responds instantly to user input.
> And now you have subtly moved the goalposts to only consider the web apps you're building, in place of "normal" js webapps you originally compared against htmx.
Yet another lie, and gaslighting to boot. I never moved the goalposts - my comments have been about the technologies, not what webapps people "normally" build - you were the one who moved the goalposts by changing the discourse from the trade-space decisionmaking that I was talking about to trying to malign modern web frameworks (and intentionally ignoring the fact that I included vanilla webtech) based on how some developers use them. My example was merely to act as a counter-example to prove how insane your statements were.
Given that you also made several factually incorrect statements in another thread[2], we can conclude that in addition to maliciously lying about things that I've said, you're also woefully ignorant about how web development works.
Between these two things, I think we can safely conclude that htmx doesn't really have any redeeming qualities, given that you were unable to describe coherent arguments for it, and resorted to lies and falsehoods instead.
> Unfortunately, that means that the tradeoff is that you're optimizing for user experience instead of developer experience
Not really, your backend has rich domain logic you can leverage to provide users with as much data as possible, while providing comparable levels of interactivity. Pushing as much logic (i.e., state) while you're developing results in a pale imitation of that domain logic on the front end, leading to a greatly diminished user experience.
> your backend has rich domain logic you can leverage to provide users with as much data as possible ... Pushing as much logic (i.e., state) while you're developing results in a pale imitation of that domain logic on the front end
False. There's very little that you can do on the frontend that you can't do on the backend - you can implement almost all of your logic on the frontend and just use the backend for a very few things.
> leading to a greatly diminished user experience.
False. There's just no evidence for this whatsoever, and as counterevidence some of the best tools I've ever used have been extremely rich frontend-logic-heavy apps.
Regarding #3, you can already do this with Jinja2 template functions perfectly well. You can also do it with template tags, but it isn't as nice.
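A sketch of the Jinja2 approach, assuming Django's Jinja2 template backend and a made-up `badge` helper (in settings.py you'd point TEMPLATES[...]["OPTIONS"]["environment"] at the factory):

```python
from jinja2 import Environment

# Hypothetical helper we want callable from any template.
def badge(user):
    return f"[{user}]"

# Environment factory referenced from Django's TEMPLATES setting;
# anything added to env.globals is callable in every template.
def environment(**options):
    env = Environment(**options)
    env.globals["badge"] = badge
    return env
```

Templates can then call `{{ badge(request.user) }}` directly, with no `{% load %}` boilerplate.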
100% agree with #5, the diaspora of Django's community, dev process, and lack of a single decision maker when consensus isn't quickly reached, makes it virtually impossible to correct past mistakes.
> Regarding #3, you can already do this with Jinja2 template functions perfectly well. You can also do it with template tags, but it isn't as nice.
This is exactly the problem, though. Every Django solution is "install this thing that replaces or adds a component that should be standard". People adopt frameworks to help them stay on track, not because they want to immediately install 5-10 packages just to correct the framework. Not to mention this creates a big split, since every project ends up very different.