I'd disagree that this has any relation to it being 'better for developing web apps'.
As someone who dealt with the nightmare of trying to develop Ruby on Rails apps on Windows in 2006, I assure you that this was a significant issue.
I'd expect the shift to be more a function of cost, since it's a lot easier for a startup to spin up on OSS vs. procure Windows licensing, and it's also a lot easier to get free hosting deals like Heroku ...
This is also true.
In general, Windows was not an ideal environment for developing for the web.
Oh, no debate on trying to build Ruby apps on Windows (been there, done that, which is why I keep a separate Ubuntu partition for it). But my point is more that if I'm building a web app in, say, ASP.NET MVC using Visual Studio 2010/2012, the tooling is excellent.
Since the article is talking about Windows Developers, it's probably fair to assume that most of them are knocking out code in Visual Studio. And regardless of one's opinions on Microsoft, it's hard to argue with the quality of their developer tools.
I'd agree then that Windows is not an ideal environment for Ruby development, but not all web dev is in Ruby, and Windows is an excellent web development environment if you're building on the Microsoft stack.
The article is talking about the 2004-5 time period. ASP.NET MVC didn't exist, .NET was still at v1.1, IE was still at v6, Windows was still at XP, etc.
(I'm probably an anomaly in that I actually switched to Windows - from Linux - a few years after that, around 2008)