Web Standards (dcurt.is)
358 points by maccman on Feb 10, 2012 | 195 comments



This is the result of a standards process running in reverse: first the specifications, then the implementation. There was a time when the "Internet" could, if you squinted right, be defined as "the collection of people who make fun of that process and do the opposite"; their motto was (literally) "rough consensus and running code".

First you get things working. Then you standardize on them.

Complain when the Webkit team refuses to implement standards. Have they shown any unwillingness to support W3C standards once published? No? Then what's the gripe? The browser vendors are doing exactly what they're supposed to do: come up with new things. If the W3C was doing its job, it would watch those new things carefully and, when enough momentum existed, start ratifying them as standards.

It's not enough to say, "you could have used the same argument to defend Internet Explorer". To indict someone for "embrace & extend", you need both halves of the behavior: creation of new features, and unwillingness to support subsequent standards for those features.


"Have they shown any unwillingness to support W3C standards once published? No?"

Unfortunately for your argument, the answer is yes: Apple is unwilling to support standards for touch events, due to its patent war against Android.

Sources:

[1] http://9to5mac.com/2011/12/19/apple-submits-invalid-patents-...

[2] http://www.webmonkey.com/2011/12/is-apple-using-patents-to-h...


Did the W3C standardize touch events, and Apple prevent Webkit developers from implementing the standard?


They showed an "unwillingness to support subsequent standards for those features", by blocking the creation of a standard in the first place. Stop being obtuse.


I think yours is the obtuse argument. The W3C has proceeded at a glacial pace on everything, not simply touch events. Is it your argument that Apple is preventing the W3C from standardizing corner radiuses?

Either way, I can hold two thoughts in my head at the same time: that Apple is indeed retarding standardization of touch events, while the W3C is retarding... everything else --- and then complaining about it.


Apple is a member of the W3C, so talking about Apple retarding the W3C process is not really in opposition to the idea that the W3C is responsible.

I do think it is fair to say that Apple is a key player in the W3C's issues, along with Microsoft.

It's a situation very much like the UN, where people opposed to the idea of cooperation use their lack of cooperation as proof of the idea that collaborative action is infeasible. Apple really wants people to use their walled garden as much as possible, so they are definitely among the parties intentionally sabotaging the W3C process.


Isn't the "walled garden" we're talking about here one of the most successful open source projects ever?


I'm actually talking about iOS, especially the Apple App Store. Apple is threatening to assert touch patents to prevent other vendors from implementing rich touch interfaces in the browser.

Eight years ago one might have called Java one of the most successful open source projects ever. Webkit could easily end up in a similar position. Given Apple's heavy litigation of their opponents, it seems almost inevitable.

So, the game has barely even begun yet. I suggest we revisit in 10 years and see how successful WebKit has been, for Apple and for our broader software community. (Because "success" means different things to different groups.)


No. Apple maintains a fork of Webkit, which is the version present on all their devices. The browser in their walled garden is the non open-source browser they force you to use on their devices.


Contrary to character, they have allowed other browsers on iOS: http://itunes.apple.com/us/app/opera-mini-web-browser/id3637...


That's not a browser, that's a pseudo-browser. All rendering and JavaScript execution is done on the server.


You just totally changed what he said. He asked "Have they [the WebKit team] shown any unwillingness to support W3C standards once published?" Let's review the facts of the case you're talking about:

1. The WebKit team wasn't involved

2. Nobody was failing to support a published standard

Was somebody pulling a dick move in the article you posted? Yes. But it was a different group pulling a different dick move than the one tptacek asked about. If you want to argue that Apple's lawyers are a disagreeable lot, I suppose that's a valid point you could make, but this is not the thread for it.


Blocking it? They just submitted their patent disclosures by the deadline, right? That's why you propose deadlines... if the planning assumed everyone would submit their stuff before said deadlines, well, maybe they need a refresher on project management?


> The browser vendors are doing exactly what they're supposed to do: come up with new things. If the W3C was doing its job, it would watch those new things carefully and, when enough momentum existed, start ratifying them as standards.

Both the browser vendors and the W3C could be doing a better job. But neither is really the problem here.

The problem is that WebKit, a single implementation, is utterly dominant in the mobile space (except for Opera on low-end phones in the Far East). This isn't a criticism of WebKit. Any single implementation that becomes so dominant is bad.

Any single implementation, once it has vendor prefixed stuff in release builds, will lead to the problems the mobile web has now: People will use the vendor prefixes, after which that implementation can't remove them without breaking websites. And after that, other vendors will be forced to implement those prefixes in order to render the web in a non-broken way.
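
To make that concrete, here is a hypothetical sketch of the kind of CSS that creates the lock-in (the selector and values are invented for illustration):

  /* Page styled against the dominant engine only: */
  .cta-button {
    -webkit-border-radius: 6px;  /* renders only in WebKit browsers */
    /* No border-radius, -moz-, -o-, or -ms- fallbacks, so other engines
       draw square corners; once enough sites ship this, competing engines
       face pressure to honor -webkit-* properties themselves. */
  }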

One way to avoid that is for vendors not to vendor prefix at all in release builds, but today every browser ships prefixed features in its release builds. If that is out, the only thing that can save us is for no single implementation to become dominant. But WebKit won on mobile - Android and iOS are by far the leaders, and both use WebKit; in fact, even most of the runners-up do (BlackBerry, Bada, webOS, etc.).

My only criticism of WebKit is that, when it saw itself becoming so dominant, it should have been much more careful with vendor prefixing, preferably stopping entirely in release builds. It's ok to vendor prefix when you have a competitive market, but if you do so when you are dominant or about to be, you end up with the current debacle.


In case people aren't aware of just how badly competing browsers are locked out of the mobile web, imagine a world where most of the buttons in Google Maps are broken in Firefox. Where bing.com doesn't work in Opera. Where non-WebKit browsers get a degraded experience in Google web search, and a barely-functional version of Gmail.

All of this is true on mobile phones today. And it's getting worse over time rather than better. If you want to build a browser that provides decent compatibility with popular mobile websites, you have little choice but to emulate WebKit's experimental (and often unspecified) features.

[Disclosure: I work for Mozilla on Firefox for Android, and as the editor of the W3C Touch Events spec.]


The web - the browser programmers, the HTML developers, certainly all levels of management - is incapable of learning from its mistakes. We've been over this with Netscape, then IE6, and now the exact same thing is happening with WebKit.


In my view what we've learned is that the standardization process doesn't work. We had IE and Netscape doing what they liked and we got innovation; we got JavaScript, we got iframes, we got embedded video. Sure, each of these is a mess, to a lesser or greater extent. But they let us build the sites we want. Then in ten years of W3C we got nothing - just the blind alleys of XHTML and CSS 2.1. The web has only started moving again after the WHATWG basically said to the W3C "we're going to implement this stuff, you can either call it HTML5 or become irrelevant". We have learned from our mistakes. We're returning to the Netscape vs. IE6 days because they were better for innovation than the ten years of emptiness that followed.


The ten years of emptiness were caused by the browser wars producing an ultimate, unequivocal victor, whose entrenched technology then stagnated for an extended period of time. It's not as though the IE team wasn't innovating because the W3C stopped them.

During the emptiness, technologies like alpha-transparent PNGs started out but never got anywhere because the gorilla didn't support it and what were you going to do? We got this idea to create layouts without using tables, but oops, box model bugs; I sure was glad to be working around the legacy of the days of innovation, and hey, wouldn't position:fixed be great? You had upstart browsers that couldn't access many websites because in the happy days of innovation the web decided that if (document.all) ... else if (document.layers) was a decent way of writing code, and it's not like anyone was going to rewrite those. Remember that innovative native client technology a whole bunch of banks and DRM sites decided could be good for a secure, controlled internet experience? I think they called it ActiveX; it sure did wonders for the usability and practicality of actually innovating browsers that could not support it.

You say we've learned from our mistakes but to me the use of -webkit- and continued use of user agent sniffing (buggy, natch) suggests otherwise. Who'll be updating those sites two years from now, when Firefox Mobile's or X Mobile's rendering engine is the innovative stuff all developers love?


It constantly amazes me how many people do not understand this. So many web standards advocates will cry foul the moment any vendor tries something out that has not already been standardized. Even Brendan Eich cried foul about Dart, though JavaScript was out in the wild for a year before Netscape even submitted it to a standards body.

Committees are not the right process for innovation. 10 years of overly complicated and unneeded standards from W3C should have taught us this.


"So many web standards advocates will cry foul the moment any vendor tries something out that has not already been standardized."

While some misinformed people might make this argument, I don't think you'll find anyone making it at Mozilla or other groups that actually participate in web standards development. We all know that the normal process is to implement first and then standardize. That's what Mozilla has done in the past with features like Geolocation and WebGL, and it's what we're doing today with new work like Audio Data, Game Pad, and Telephony. In fact, the W3C process requires two interoperable implementations before a specification can reach the Recommendation stage.

But there's a difference between a standard that is developed in the open, with the opportunity for all parties to provide feedback and influence development, and one that is developed and implemented in secret, and then deployed to production browsers and evangelized to content authors before anyone else has a chance to comment on it -- or even on whether the fundamental approach is a good idea. (And obviously these are two extreme ends of a spectrum, with a lot of gray area between them.)

The problem with many of Apple's "contributions" (and some of Google's) is that they never write specs for them or bring them to standards bodies. (This is the case for Apple's -webkit-text-size-adjust.) In some cases they actively fight efforts to standardize them. (This is the case with Apple's touch events.) Or sometimes they do push specs to standards bodies, but only after a highly complex specification has been fully developed behind closed doors. (Google's Dart, NaCl, and Pepper have this problem to some extent.)

That's fine if what they really want is a proprietary ecosystem that works in their own browsers. But if they actually want these efforts to be part of the web, and if they want to be seen as stewards of an open web, then just implementing something is not enough. The web has lots of stakeholders, and the initial implementer needs to find a way to bring them to the table.


mbrubeck and I, who are good friends IRL, just had a spirited discussion about this over IM; I'll just repeat a few of my points here for other readers. :)

NaCl was first announced in 2008, along with a research paper and source code (http://googlecode.blogspot.com/2008/12/native-client-technol...). The next year Google sponsored the Native Client Security Contest that awarded prizes to security researchers who could break out of the sandbox. A lot of the design surrounding PNaCl has happened in the open (see http://www.chromium.org/nativeclient/pnacl). I don't think it's accurate to say that NaCl was developed behind closed doors.

Also, other browser vendors seem to oppose NaCl in principle, and it seems unlikely that any other process Google could have taken to get buy-in would have resulted in other browsers accepting it.

I agree that there is a continuum of behavior here; in the NaCl case in particular though, it really seems like Google has done the right thing.


"Never write specs"? I have to say, that some of the rhetoric from Mozilla seems a bit over the top. Mozilla doesn't seem to have had a problem dumping all kinds of Javascript and CSS features into Firefox over the years not all of it making it into draft spec proposals.

And how could NaCl have been done any better? It's been publicly published for years prior to the first shipping implementation. If Mozilla was interested, they could have put in requests for how they'd like to see NaCl and Pepper changed to better fit a Firefox implementation, or even a counter-proposal for running untrusted native code. But I've seen no such proposals; instead, just a blanket refusal to even consider it. I think there's a philosophical difference such that Mozilla isn't really interested in anything that isn't JS-based, so suggesting that bringing NaCl to the working groups would have changed anything is somewhat disingenuous. It's kind of like hoping that Republicans would contribute to a National Single Payer healthcare plan design if it had been brought to the right committee.

It's a shame really, because proprietary native mobile platforms are hurting the web, and consumers don't really care; they just want fast apps which don't kill their battery or phone performance. Something like NaCl for mobile could preserve the DOM, CSS, browser security model, and other good things about the web, but replace JS with something much more performant for games, and this would go a long way toward hedging risk against the proprietary native platforms. I'll take portable NaCl C code calling into browser APIs and WebGL any day over Obj-C code calling into Cocoa that can only be installed through an app store.

If the Web is to be around for decades more, the religion over programming languages needs to give way to polyglotism.

As for Dart, the language was released at version 0.1. The grammar isn't all that complex, the library is minimalist. It came with an early spec and source. It is not a "fully developed" specification. No one designs language grammars by committee. Languages are highly personal and opinionated. Usually one or two people design the overall structure of the language privately, and then release an initial spec, and it evolves from there.

When Mozilla developed Rust, the first time most of the web heard of it was Graydon dumping a gigantic codebomb into GitHub and it being covered on Lambda-the-Ultimate.

Which I think is perfectly fine, but let's try and be a little level-headed and fair, and not arrogantly think that what Mozilla does is uniquely better than what others are doing.

(I speak only for myself, and I do not work on Dart, NaCl, or Chrome.)


"Mozilla doesn't seem to have had a problem dumping all kinds of Javascript and CSS features into Firefox over the years not all of it making it into draft spec proposals."

Certainly, every vendor has been guilty here. I at least think that every extension Mozilla is currently encouraging use of is hoped to be a potential future standard. (And no, I'm not claiming this is unique -- some other vendors like Google are also good citizens in this regard.)

I also agree that proprietary platforms are hurting the web, and that the web needs new abilities in order to fight the threat. However, to preserve the openness of the web, new features need to be things that more than one vendor has the resources and motivation to implement; even if Mozilla and Opera thought Dart or Pepper were great ideas, adopting them would put us on a treadmill where Google (with its much greater resources) controls the speed.

And no, I am not claiming that Apple/Mozilla/Opera would have adopted Dart or NaCl if they were brought before the right committee at the right time. Part of getting early feedback means finding out if the community even thinks that your solution is something that's desired in the first place.

"When Mozilla developed Rust, the first time most of the web heard of it was Graydon dumping a gigantic codebomb into GitHub and it being covered on Lambda-the-Ultimate."

This is actually an interesting contrast between Mozilla and Google. The initial Rust code-dump was a barely-running proof-of-concept; it was shared with the world on the same day it was first shared within the Mozilla community. Zero lines of the current self-hosted compiler were written at the time. Most of the planned features (like the task system) were unimplemented. The first actual release (Rust 0.1 alpha) came a year and a half later, and that first release already reflected significant community feedback and contributions. For example, the packaging system in Rust 0.1 was written by a volunteer contributor who happens to be a Google employee. The stage of maturity at which Rust became open source was far earlier than Go's.

Some things really are different at a non-profit like Mozilla, and the goal of opening everything up as early as possible is one thing that's unique among tech companies I've worked for or with. We don't always do it perfectly, and we do struggle with the need for people to do some types of design and experimentation without the world looking over their shoulders, but it's very much a unique core principle that affects all of our work.


SPDY is an example of a great idea, good enough that Mozilla is adopting it. Apparently you're saying that even if everyone agreed that Dart and NaCl were, like SPDY, great ideas and great for the web, Mozilla would still vote against them for political or aesthetic reasons. So part of "getting early feedback", as you alluded to above, is getting early politically motivated feedback. How is this a good thing? You should vote on technical proposals based on their intrinsic merit.

Almost all of the additions to the browser API that Google has made are intended to make it into spec proposals. There is no desire for Chrome to be the only implementor of Dart or NaCl.

I'm a little concerned by what you're saying here: that if an idea is good, it won't be adopted anyway, just because someone else has greater resources to implement it. It sounds like not-invented-here syndrome to me. No matter what technical specs are proposed, Google is always going to have greater resources to develop them.

Everything you say about Mozilla, I can say about my experience at Google. The first release of Dart was very incomplete: it didn't implement a lot of the type system, there was no method-missing support, and some stuff like default method parameters didn't exist. It generated really slow and bloated code. It was in a barely working state. The DartC compiler, which was implemented in Java, was not a production compiler designed to be used by people; it was a prototype for research. This idea that Google sprang a fully-formed, already finished product on everyone is a gross exaggeration. The Dart VM isn't even in Chromium yet.

The self-hosted compiler (Frog, written in Dart) has been developed completely in the open.

Google is a large company with hundreds and hundreds of projects, so not all of them operate the same way, but many of them are open sourced very early in the process, and Googlers push to release source early for client-side web technologies. Google has spent hundreds of millions of dollars on acquisitions which it immediately turned around and open sourced, like WebM, the first Google Web Toolkit, and Instantiations' products (donated to the Eclipse Foundation).

Really, I think the comments on NaCl and Dart are patently unfair. They were released practically as early as they could have been; NaCl took, what, 2-3 years of development in the open before it was finally pushed to production.

Ultimately, if standards committees get gridlocked over political and emotional stuff instead of the merits of technology, you will see people start to ignore them. A similar thing happened with the OpenGL ARB, where vendors who had fewer resources voted down and blocked companies like NVIDIA and ATI, who were progressing chips at a faster pace. Ultimately, it took DirectX surpassing OpenGL to push it to finally start evolving faster again.


Yes, SPDY is a great idea, and of course I don't think that it -- or other ideas -- should be rejected just because of who proposes them or when. (As a concrete example, I am the lead editor of the W3C Touch Events spec and worked on implementing it in Gecko; it is an API that I absolutely believe should be standardized even though I strongly dislike how Apple originally designed and deployed it.)

I do think we agree on fundamentals though we may disagree on some cost-benefit analysis. All I meant about the "treadmill" is that some proposals have a much greater cost to implementers than others, and different organizations will weigh those costs differently. While some debaters may throw out "political or emotional" responses, I believe the decision-making process should be (and is, in the cases I've participated in) about allocation of resources, complexity of implementation, and the desired architecture of the web.

Technical issues and process issues are separate, but affect each other. When I say the process behind something like Touch Events was bad, I'm not giving that as a reason to reject the feature -- I think the feature should be accepted or rejected on its merits, like you said -- but it does mean that any technical objections that do appear will be much harder to resolve than they might have been. And as you say, Dart and NaCl have had better process behind them than Touch Events or various other extensions I could name. Dart and NaCl may succeed or they may fail (just as many of Mozilla's current experiments may succeed or fail), but at least their fates will be influenced by feedback from all stakeholders.

You are obviously better informed than I am about Dart and NaCl implementation. (Also, I accidentally conflated Go and Dart in my previous comment, which is embarrassing!) I apologize for any uninformed comments and defer to you on all the facts there.

I really don't want people to see this as a "Mozilla versus Google" thread. I hope everyone cares about standards issues because they help or hurt the web, not because they help their favorite vendor or hurt a competing one. In the big picture, Google and Mozilla are close allies when it comes to web standards. As you say, the real risks to the open web are elsewhere.


It is unfortunate that you have intermixed good points with unfair and untrue attacks against mbrubeck ("disingenuous", "arrogant"); I know Matt in real life and he is none of these things.


I didn't accuse Matt of being disingenuous; I accused Mozilla as a whole, primarily based on some of the comments of Brendan Eich when Dart was first released, because of somewhat contradictory statements.


Not a problem, I am not easily offended. :) I was careless in my own comments and I apologize for being combative.


Err, on second reading, I can see how it reads like I'm personally accusing him, my apologies, it was not my intent.


[deleted]


I actually agree with that: the point of a standard is to specify something in a way that multiple interoperable implementations can exist. If there is only one implementation, it is extremely difficult to say what behavior should be specified in the standard and what should be left implementation-defined.


Brendan Eich cried foul about Dart because other browser manufacturers, and web developers themselves, said they didn't want it, and Google did it anyway.


I was unaware that a poll had been taken of all web developers asking whether they wanted Dart. If developers really don't want it, they'll vote with their feet, so what's the big issue?

Also:

  > The big issue I have with Dart, which you
  > seem to consider inconsequential, is whether
  > Google forks the web developer community,
  > not just its own paid developers, with Dart,
  > and thereby fragments web content.
  >
  > A Dart to JS compiler will never be "decent"
  > compared to having the Dart VM in the browser.
  > Yet I guarantee you that Apple and Microsoft
  > (and Opera and Mozilla, but the first two are
  > enough) will never embed the Dart VM.
  >
  > So "Works best in Chrome" and even "Works only
  > in Chrome" are new norms promulgated intentionally
  > by Google. We see more of this fragmentation
  > every day. As a user of Chrome and Firefox (and
  > Safari), I find it painful to experience, never
  > mind the political bad taste.
  >
  > Ok, counter-arguments. What's wrong with playing
  > hardball to advance the web, you say? As my blog
  > tries to explain, the standards process requires
  > good social relations and philosophical balance
  > among the participating competitors.
--Brendan Eich, http://news.ycombinator.com/item?id=2982949

Basically he's arguing that the browser vendors of today (not developers/users) are the gatekeepers of what new technologies can be tried. It's like turning browser innovation into a UN Security Council where every member has a veto over doing anything. If you value progress and innovation, I do not see how you can support such a position.

If Google introduces something like Dart or NaCl, is willing to support standardization of it and even release a free implementation, but other browser vendors don't want to integrate it because they don't like it, that's their problem, not Google's. If that means popular apps become "Works best in Chrome," that is a self-inflicted wound.


> I was unaware that a poll had been taken of all web developers asking whether they wanted Dart.

A few hours ago at jQuery UK someone asked Paul Irish what he thought of Dart. Then the whole room laughed. PI mentioned "it's complicated" and then that "Dart has a place" and then everyone laughed again.


A few hours ago at jQuery UK someone asked Paul Irish what he thought of Dart. Then the whole room laughed. PI mentioned "it's complicated" and then that "Dart has a place" and then everyone laughed again.

So a knee-jerk, unintelligent response by some random group of JavaScript programmers, given without context, is supposed to mean something?


Paul Irish works with the Chrome team at Google.


Who said anything about Paul Irish?

The knee-jerk reaction I was referring to was the "whole room" of JS guys that "laughed".


I think you missed the point of the comment.


Maybe. It sounds like they made fun of PI and/or Dart.

But I cannot understand either the point or why. If you could chip in with some context, I'd be grateful.


Haberman's post mentioned he was unaware of a poll of all web developers on Dart. I mentioned a recent example of such a thing with a large sample size. That's all.


"Basically he's arguing that the browser vendors of today (not developers/users) are the gatekeepers of what new technologies can be tried."

Anyone's free to try things. But those things will never become a part of the web platform if other vendors don't think they will benefit from implementing them. The people who make the browsers are gatekeepers of what ends up in the browsers. That's not a statement about how things should or shouldn't be; it's just a tautology.

The job of anyone who wants to create a web standard is to accept that fact, not ignore it. Ignoring it leads to single-browser "standards", which may get adopted by developers but will not become part of the open, interoperable web platform.


What are the things Webkit has "proposed" by releasing builds that you don't think should be part of the web platform?

And, what are the things Webkit has built that were "overridden" by the standards process? Where did Webkit actually get things wrong?

Just what comes to your mind.


First of all, I don't want to come off as argumentative. I think we agree on the fundamentals: Implementation should come before standardization. Vendors should feel free to experiment. The W3C should focus more on quickly standardizing things that are already working in the field.

Today, experimental and prefixed features used in the wild are creating real problems for interoperability and competition. But I don't think there's a villain here, and I'm not blaming WebKit. Both the browser vendors and the W3C are using processes that were designed to solve certain problems. Now those processes are creating new problems and might need to change in response.

But, to answer your direct question (although I think this is getting on a tangent from the original post):

> "What are the things Webkit has "proposed" by releasing builds that you don't think should be part of the web platform?"

Dart is an example of something that Google has proposed (not "WebKit" as a group) that I and many others - even others like Oliver Hunt within the WebKit community - do not think benefits the web platform. Some people would put Web SQL Database in this category; I don't have an informed opinion about that one. (Though I do know it's the reason some versions of Gmail wouldn't even load in mobile Firefox.)

> "Where did Webkit actually get things wrong?"

Apple's "meta viewport" tag is something that they released in Safari and evangelized to developers without specifying it or getting community feedback. I had the pleasure of reverse-engineering it for Gecko. Among other things, it's not even implemented in a layer where it makes sense. (It should be part of style, not markup.) Apple never proposed it as a standard; other people eventually did and are trying to fix some of the problems. Unfortunately, the basic design is locked in by legacy content (and has been since soon after Apple shipped this feature in the first release of mobile Safari).

But frankly, I don't care so much whether they get things "right" or "wrong." As bad as meta-viewport is, at least Firefox and Opera and Android and IE can implement support for it without controversy, and be compatible with sites that use it. The way that prefixed CSS properties are (a) released to market by browser vendors, (b) evangelized to web developers, and (c) deployed on production web sites is creating a situation where new browsers are either locked out of existing sites or have to implement each others' prefixes. This is not a stable equilibrium.


This is an awesome comment, and thanks for writing it.

One thing I didn't have going into this thread that I have now is appreciation for how different this problem is in the mobile space than on the "desktop" web. It's easier to be frustrated with web standards in the desktop world, where Webkit seems like an unalloyed force for good. I can see how much trickier it is for your team given the fact that Gecko has been relegated to second class status by the market.


What doesn't benefit the Web platform is its being tossed aside in favor of superior native development frameworks. That's a far bigger danger than someone forgetting to add a '-moz-' prefixed CSS property.

A number of people seem to have an extremely optimistic view that they're going to make JavaScript competitive in performance and memory use with native code on mobile platforms. There's just no evidence that it's going to happen anytime soon. It's just not even close, despite how magical emscripten or Mandreel might seem.

With more and more browsing activity shifting to mobile, and with the majority of apps being entertainment/multimedia/games, holding steadfast to the idea that there are no benefits to alternate VMs is more dangerous than transient breakage of pages.

Yes, the prefixes are a problem, but the Web needs to evolve fast to keep up. A period of incompatibility and fragmentation can be tolerated. It's happened before: chaos and disequilibrium followed by periods of relative calm.

I'm more worried about stagnation, and a future where the web gives way more to the client-server internet days, with everyone running OS specific native apps that read and store data behind walled-garden non-HTTP clouds.


WebKit CSS gradient syntax was pretty explicitly overridden by the standards process in favor of a better one. Thankfully, WebKit changed its syntax to match.
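
For reference, a rough sketch of the change (color stops are arbitrary):

  /* Original WebKit-only syntax: */
  background: -webkit-gradient(linear, left top, left bottom,
                               from(#ffffff), to(#000000));
  /* Standardized syntax, which WebKit later adopted: */
  background: linear-gradient(to bottom, #ffffff, #000000);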

Note that in a world where the Web was defined as "whatever WebKit renders", which lots of HN readers seem to be in favor of these days, we would of course be stuck with the old gradient syntax forever.


Do you trust the standardization process to make good choices? We got the stupid now-standard CSS box model over IE's sensible one (and now, twelve years(?) later, there's finally an experimental way to use the sane definition of the width of an item). We got IE's <object> with its clsid nonsense over Netscape's <embed>. We got the wrong choice for iframes as well. I'm happy to believe the new gradient syntax is better (though if it's really that much better, surely WebKit could have changed it themselves), but taken as a whole I'd sooner trust WebKit's decision-making than the W3C's.
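
(The "experimental way" is presumably the box-sizing property, still prefixed in Gecko at the time; a minimal sketch:)

  .sidebar {
    width: 200px;
    padding: 20px;
    border: 5px solid #ccc;
    /* Default W3C model: rendered width = 200 + 2*20 + 2*5 = 250px. */
    -moz-box-sizing: border-box;  /* Gecko prefix circa 2012 */
    box-sizing: border-box;       /* IE-style model: rendered width = 200px. */
  }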


> If Google introduces something like Dart or NaCl, is willing to support standardization of it and even release a free implementation, but other browser vendors don't want to integrate it because they don't like it, that's their problem, not Google's. If that means popular apps become "Works best in Chrome," that is a self-inflicted wound.

I completely disagree.

First, if a technology is really bad for the web, are you saying other browsers should still implement it, just to not suffer the situation where popular apps work best in Chrome? That's bad enough as it is, but those popular apps are likely going to be Google properties - google.com, gmail. That means Google can basically force other browsers to adopt technologies as it sees fit, even if they are bad for the web.

Second, the WebKit situation on mobile exactly shows why this is wrong. If one popular browser adds special features and leaves them there indefinitely, in release versions of the browser - as WebKit does with its special CSS properties, and like Chrome does with NaCl and perhaps soon Dart - then people will develop for those features. If that browser then becomes dominant, those features will become a de-facto standard. Given Chrome's incredibly rapid rise, we should not assume it will not become dominant on the desktop, so this should concern us all.


Yes, they may be forced to develop these things and to include them (if by "being forced to" we mean they would otherwise not have enough users), but I don't see that the browsers should have any say in what is good or not for the web. Browsers should focus on making the best browsers possible.


I think this is a valid concern. It is very similar to ActiveX with Internet Explorer, and, with the browser share Chrome is gathering, it isn't unreasonable to see people focusing on that.

--BUT--

With ActiveX, there was no other way to accomplish the same task. With Dart versus JavaScript, I can't accomplish anything in Dart that I can't in JavaScript. And, even if I decided to use Dart for some reason, it would be idiotic of me not to provide a JS version for non-Chrome browsers.

What would be really nice (heck, even something a standards body should have done) is coming up with a standard byte-code for browsers, so I can use any language I want. JavaScript isn't good enough, because I want my sites to load faster, and parsing the code shouldn't have to happen every time somebody visits a page.


> What would be really nice [...] is coming up with a standard byte-code for browsers

I used to think that too, until I read convincing arguments against it like this: http://www.2ality.com/2012/01/bytecode-myth.html


However, JavaScript as the intermediate language does not follow. It may be true that a source-level intermediate representation has benefits, but the idea that the best intermediate representation is JS doesn't follow from that.

For example, late binding, boxing/heap numbers, etc. all serve to make it extremely difficult to maximize startup performance or minimize heap overhead. It also makes the VMs much more complex in order to optimize such languages. TypedArrays don't really fix the problem; they just introduce others, for example by forcing you to implement your own manual memory management everywhere. This is somewhat problematic if you're compiling a language with GC into TypedArray-heavy code: since there are no finalizers or reference queues in JS, you can't really do automatic release if a regular heap object holds onto pseudo-objects in manually allocated memory.

The problem is particularly acute on mobile devices which are very resource constrained.


It wasn't "any member having a veto over doing anything". Don't attack a straw man. It was every other browser manufacturer unanimously saying Dart is a bad idea, and specifically outlining reasons for doing so. The only possible position you could have in support of Google here is that Google should be able to push whatever it wants, on account of Chrome being open source. In other words, monopolistic behavior is fine as long as it's open source. I don't agree with that.

Also "Works best in Chrome" is hardly a self-inflicted wound. With the might of Google, that's actually a recipe for browser monopoly. This is the reason why Google pushed ahead with Dart despite the opposition.


How would your opinion change if Dart was just a cross-compiler to JS like GWT or CoffeeScript?

And if you have no problem with that, then why is there a problem with a Dart VM in Chrome? All it means is that potentially, Chrome apps will start up faster or run a little faster.

However, this is not really much different from V8 vs. other JS VMs. There was a time when V8 was way, way faster than other JS VMs, so the same speed differential existed anyway.

There's a difference between "doesn't run" and "runs, but faster".


I am a web developer and I very much want it. I don't want Javascript to be the only game in town for front end programming.

But this is beside the point: Eich never asked if web developers want it or not.


I'd go further than that. I think the era of the technology standard as a formally specified, committee-managed, cross-platform entity is coming to an end. What we're seeing instead is the emergence of open-source implementations that are themselves the de-facto standard. This is particularly true of languages. Look how Apple has managed to run circles around the C++ committee in their single stewardship of Obj-C. Or think how ridiculous it sounds to think of a "standard" definition of Python or Ruby.


If it is true that we have moved past the era of formal standards, we will have lost something that is important. More important than most people understand.

If a technology has only one implementation, even an open source implementation, that implementation will have all sorts of hidden assumptions and biases. No one will know about or understand those assumptions and biases until the time comes to use that technology in a new context.

In my work, I've dealt with multiple implementations of things like simulators, models and even languages. The additional implementations usually had some specific reason behind them (performance, alternative features, supporting different execution environments, different mathematical properties and so on). Nevertheless, the value of each alternative implementation (in terms of surfacing hidden assumptions, finding bugs, illuminating more general and flexible alternatives, etc.) went well beyond the specific reason that implementation was made.

These considerations are even more important in the case of software that is part of a large, chaotic, heterogeneous system like the Web. Imagine where the mobile web would be today if there hadn't been a W3C and, at the turn of the century, the Web really had been "what Internet Explorer renders" with no attempt to document and codify anything else.

Or, to take your example of Objective C, how much does Objective C matter outside of Apple's ecosystem?

Contrast this with:

- How much does JavaScript matter outside of Netscape's ecosystem?

- How much does Java matter outside of Sun's?

- Or, to take the granddaddy of examples, how much does C matter off of the PDP-11 and outside of Bell Labs?

None of those languages would be anywhere near where they are today without standards. On the other side of the coin, there is a reason languages like Python, Ruby and Haskell have had (and, each to a varying extent, continue to have) a "bus problem". Standards matter and even just the standardization process itself matters. I wish more people understood that.


How much does Objective C matter outside of Apple's ecosystem?

It doesn't need to. Apple has demonstrated the agility of a wholly owned, vertical stack. The pace of change has qualitatively changed since the days of language standards set by plodding standards bodies.


It depends on what your ambitions are. I'd agree that it doesn't matter much to Apple that Objective C isn't standardized (and not standardizing may be an intentional part of Apple's business strategy by increasing porting and switching costs).

But that also means that, no matter what its true potential is, Objective C will never be more than what Apple makes of it. It won't be seriously used outside of application areas Apple is interested in. It won't develop new features Apple doesn't care about. And if Apple decides to go in a different direction and/or runs into serious trouble, well that's it.

Apple has changed the world, but, unlike some other, standardized languages including the ones above, Objective C never will. If we lose the full potential of future great languages because they are limited by a single company and/or a single implementation, we will all be poorer because of that.


I used to think so but I've come around to the idea that the language is just part of the platform. Learning the APIs and idioms is always more work than learning a new syntax anyway.


I guess this is where we're actually parting ways. I agree that learning a platform, its APIs and idioms is important, non-trivial work, but...

As you might guess from my username, I'm a fan of more than a few languages where (at least I believe) the differences go well beyond syntax. That means I want standards because I want to know what I can (or could) take with me to another platform and what I can't.


The notion of a standard python is very real and meaningful, given that Jython, IronPython and PyPy all exist and have their differences from CPython. The only way you can write code that runs on all of them is with a standard.


> Or think how ridiculous it sounds to think of a "standard" definition of Python or Ruby.

There are ongoing efforts to standardize a quite sizable subset of Ruby in Japan, for later ratification by ISO. Ruby is also de facto standardized by the test suite RubySpec.

It is not ridiculous at all; in fact, reference documents such as RubySpec have been one of the main reasons why you can move from one Ruby implementation to another without many issues and without having many compatibility hacks included in each compiler/interpreter.


Or think how ridiculous it sounds to think of a "standard" definition of Python or Ruby.

Really? It doesn't sound all that ridiculous if you know CPython is the reference implementation of the language Python, which is extended by the PEP process.


My point exactly. There's a reference implementation, not a thousand page standard document.


>not a thousand page standard document

It has those too. And IIRC the W3C requires a browser to implement a reference version before something is standardized there.


There are plenty of W3C standards that WebKit is not willing to support. That's fine. Some of the standards just suck.

There are also plenty that it purposefully implements incorrectly (CSS Selectors, for example). That's a lot less fine.


> There was a time when the "Internet" could, if you squinted right, be defined as "the collection of people who make fun of that process and do the opposite"

Are those the same guys who brought FTP upon us? BGP? 32-bit IP addresses?

There were a few moments where it would have been better if somebody had said: wait a minute, we need to think first, then code.


Yes, and this is the same shotgun process that managed to avoid congestion collapse. Not by spending 6 years fretting about it, but by writing the TCP stack in 4.3BSD-Tahoe.

32-bit addresses were a good idea. It's easy to look back on it now and say "they should have used 128-bit integers", but until recently, 32-bit addresses were the largest addresses that worked as scalars on all mainstream platforms.


>FTP upon us? BGP? 32-bit IP addresses?

Those are effectively entrenched de facto standards, but it's not the fault of the creators; it's the fault of the people unwilling to move with the times. I'm not cool enough to run an AS, but I know for the other two better alternatives are already here. As Vint Cerf has said, 32 bits was enough for an experiment; it just never ended.

IPv6 and SFTP are here and _in use_ by anyone who isn't clinging on to old tech like people clung to IE6.

>we need to think first, then code

Thinking first, implementing and then improving is how technology advances. Iteration is mandatory; otherwise you're stuck in a waterfall-type scenario. Without it, I sure hope you enjoy riding your horse down to the local Gutenberg press to pick up this week's Hacker News and read my comment there.


> its the fault of the people unwilling to move with the times.

It's the network effect. Once something is "out there", it's really, really hard to get rid of it. If the first version that reaches critical mass has unfixable flaws, you'll be stuck with them forever.

I don't blame Vint; he couldn't have known how successful the experiment was going to be, but I can't read HN over IPv6.

But now we know how web standards spread and stick around forever.

HTML5 special-cases parsing of the <XMP> element. CSS has quirks mode, almost-standards mode, and standards mode, and two syntaxes for clip().
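
The clip() case, for the curious (both forms are accepted for compatibility):

  /* CSS 2.1 syntax, with commas: */
  .thumb { clip: rect(0, 100px, 100px, 0); }
  /* Older comma-less syntax, still supported for legacy content: */
  .thumb { clip: rect(0 100px 100px 0); }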


If you read the article referenced, WebKit isn't being picked on especially. The concept of using vendor prefixes is being questioned.

There's a valid rationale for not using vendor prefixes, because they could potentially add confusion and erode standards.

From my understanding, in an effort to display web pages at their best, some browser manufacturers are starting to honour the browser prefixes used by other vendors. I can see how this could create problems.


It's a little hard for me to avoid the impression that W3C partisans hate vendor extensions because they make it easier for websites to handle cross-platform support for extensions. -moz-border-radius and -webkit-border-radius is ugly but nothing more; conflicting definitions of "border-radius" is a showstopper.

Which, if you're cynical, is how the W3C probably wants it.


The alternative isn't to stop using the properties - it's to stop using the vendor prefixes. That's all that's being suggested.

I think your aggressive anti-W3C stance is really misguided. They're not the bad guys.

If you actually read the article referenced in the dcurt.is article, you'd see how vendor extensions are actually becoming something more than just ugly. Which is the whole point of the suggestion.


The argument in that article is adequately summed up by Dustin Curtis. The problem with vendor extensions is that when one browser ends up dominant, the others eventually end up having to implement that browser's extensions.

What I don't get is how this makes that browser a "monopoly". The only thing WebKit won was the string "-webkit" in the CSS property. It's the market dominance that gives the monopoly to WebKit, not CSS vendor prefixes. WebKit has a virtual monopoly among mobile browsers.


The problem isn't that one browser is dominant. The problem is when Apple develops a feature entirely in secret and then doesn't bring it to a standards body, forcing other browser vendors to reverse-engineer that feature. Apple is shipping those features, and evangelizing them to developers, before the other browser vendors can even get started reverse-engineering them (which they shouldn't even have to do in the first place).

Now, you could argue that that's in Apple's interest, and you could argue that "it's their browser and they can do what they want with it". But you can't dispute the fact that they're not helping web standardization in the mobile space. Which is a problem for anyone who wants a more standardized web.


WebKit has an advantage with WebKit prefixes when the prefixes in question have no spec. As DarkShikari said when criticizing the VP8 spec, code is not a spec.


Let's see if we can't tell the difference between -webkit-transform-style, a fiddly CSS property that to be useful has to be documented enough for web developers to employ it, and the interpolation filter embedded in the guts of a black box video codec.


Good point. WebKit even changed the way it handled gradients about a year ago to be in line with standards (at the time I liked the WebKit way).


the developers of WebKit added the prefix -webkit to the experimental stylesheet declarations. This ensured they worked only in WebKit.

No. This ensured that they didn't pollute the global namespace with experimental, partially-implemented ideas. If Mozilla, Opera, and WebKit each add e.g. a "rounded-rect" declaration, and one browser takes the width as the first argument while another takes the radius first, then we're screwed. At least having "-moz-rounded-rect" and "-o-rounded-rect" means developers know how their pages will behave in each browser. I agree that the W3C is unacceptably slow, but that doesn't mean prefixes are a terrible idea.
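
To spell out the collision with that hypothetical property (argument orders invented for the example):

  /* Without prefixes, two engines could claim the same name with
     incompatible grammars: */
  .box { rounded-rect: 100px 10px; }  /* engine A: width, then radius */
  .box { rounded-rect: 10px 100px; }  /* engine B: radius, then width */
  /* With prefixes, each experiment stays in its own namespace: */
  .box {
    -moz-rounded-rect: 100px 10px;
    -o-rounded-rect: 10px 100px;
  }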


What I still don't understand is why the global namespace isn't used straight away.

You could, for example, just use one line of css for border-radius for all browsers. If one vendor does it differently you use their vendor specific namespace to override it. Surely that makes more sense than having to do a line of css for every vendor?


I support this methodology. Have the globally-namespaced attribute already function, but give the vendor-specific one precedence. That way the developer only has to add vendor-specific declarations for the minority that implements it differently.
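
A minimal sketch of that methodology, using border-radius and an invented -vnd- prefix for the one deviating vendor:

  .panel {
    border-radius: 8px;       /* globally-namespaced declaration for everyone */
    -vnd-border-radius: 4px;  /* hypothetical override, added only for the one
                                 vendor whose implementation differs; under this
                                 proposal it would take precedence there */
  }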


But if there's no standard, then how do you know which behavior to expect in the "global" case and which ones should be vendor-specific? And if only one browser experimentally implements a new feature, should you use the vendor-specific namespace for it, or hope that all future implementations in other browsers are the same?


Just to clarify I'm not suggesting there should be no standard, there definitely should be.

However, for the initial implementation what's wrong with survival of the fittest? The most popular way will win through and would most likely become the standard. For those browsers that don't implement it in the same way, you use their vendor-specific namespace instead.

Yes it's likely to be hairy when it's first being put to use if browsers have slightly different implementations but no more so than using an experimental tag in the first place.

The difference appears to me to be that for the next five-plus years I could be writing 8 lines of basically the same CSS for every instance of linear-gradient (and that's assuming we don't get any new browser vendors in the coming years) vs. writing 2-4 lines with backwards compatibility, or, shock, maybe even 1 line if I didn't care about older browsers (e.g. http://www.colorzilla.com/gradient-editor/).
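
For anyone who hasn't had the pleasure, the stack in question looked roughly like this circa 2012 (in the style of the gradient editor linked above; version notes approximate):

  .header {
    background: #ffffff;                                         /* old browsers */
    background: -moz-linear-gradient(top, #ffffff, #000000);     /* Firefox 3.6+ */
    background: -webkit-gradient(linear, left top, left bottom,
                                 from(#ffffff), to(#000000));    /* old WebKit */
    background: -webkit-linear-gradient(top, #ffffff, #000000);  /* newer WebKit */
    background: -o-linear-gradient(top, #ffffff, #000000);       /* Opera 11.10+ */
    background: -ms-linear-gradient(top, #ffffff, #000000);      /* IE10 previews */
    background: linear-gradient(to bottom, #ffffff, #000000);    /* standard */
  }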


Exactly. The vendor prefixes are a compromise. The majority of the time the prefixes are used to implement features which have been proposed but haven't yet been ratified.


I think if features are standardized, the economy in typing alone will force the vendor-specifics away. Just a phase that (CSS) standards might go through.


Welcome to the 'standards' world. It sucks big time.

I've participated in standards in both POSIX working groups (network file systems, network apis), IEEE working groups (PCI express Advanced Switching), IETF working groups (RPC and XDR, heck I'm the 'owner' of port 111!), various CAD standards groups, and a whole bunch of things that never even rose to a level of relevance to get published.

If standards bodies are effective, they take away power from companies to distort the playing field, if they are ineffective they allow folks to distort independently. So large companies send representatives to standards bodies to distort them in favor of their own company (Rambus is the canonical example) and other companies send representatives to make them ineffective so that they won't level the current field (Microsoft and XML is the canonical exemplar there).

The bottom line is that standards bodies whine that nobody lets them do their job, and they tolerate members who actively prevent them from making progress. I have long since given up feeling any sympathy for them whatsoever.


> Welcome to the 'standards' world. It sucks big time.

Because a hegemony is so much better right? What fun that was last time around.


Dustin, I totally agree with you that the current system is whack. Glacial adoption of W3C standards has a real cost for developers and users, and ends up with rounded corners taking 16 lines of CSS and gradient fills taking 8.

I don't think that we'll be better off just by making the current process "go faster," though.

There needs to be some proper vetting of features. You'd want to avoid the tyranny of the monied interests. It can't just be Apple/Microsoft/Google making the decisions about what goes in and what doesn't -- even though they're going to shoulder the cost on the user-agent side to implement the feature.

It might just be as simple as leaving the current ecosystem as it is, looking at the adoption rate and success of a given construct, and folding those into the "living" spec (without the prefix, of course) -- and just do this process on a regular, fairly frequent interval, like every 6 months.


I disagree about the proper vetting of features being a necessity. As long as open-source rendering engines like Webkit or Gecko are pushing forward and providing ideas and implementations of those ideas, we can let web developers vote with their websites.

Everyone understands that advanced features of CSS can't be relied upon—this is now part of the collective front-end developer consciousness.

Get rid of the W3C and let Mozilla and the Webkit project define the web moving forward; they've done a great job so far.


> Get rid of the W3C and let Mozilla and the Webkit project define the web moving forward; they've done a great job so far.

The standardization of required codecs for the `<video>` element is indeed a wonderful accomplishment by the non-W3C HTML5 WHATWG.

Seriously, a neutral third-party organization will always be needed. Who will you appeal to when the main companies start having different ideas about something? Another thing that organizations like the W3C or IETF do is provide some kind of shelter from patents and similar problems.

We can argue about the efficacy of such shelters and about their ability to mediate between different options, but the web is definitely in a better situation than wireless telephony, a field where company-run consortiums decide standards and "settle" disputes.


Who will you appeal to when the main companies start having different ideas about something?

No one. If the browsers refuse to agree on something (codecs being a good example), there's nothing that anyone can do about it. Just say "no standard is possible in this area" and move on.

OTOH, if two or more major browsers agree on the syntax and semantics of a feature, let's issue it as a standard (and de-prefix it) ASAP.


Speaking of `<video>`, I came across a site today that used <embed src="foo.wav">. As long as video and audio formats are loosely specified, we could've used the existing syntax and existing browser infrastructure. (You still can't disable HTML5 audio/video in Chrome[1], e.g. for data-cap reasons.)

__________________

1. http://crbug.com/50132
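
To make the comparison concrete, a rough sketch of the two approaches (the file name is hypothetical):

    <!-- The never-standardized way that has worked for years: -->
    <embed src="foo.wav">

    <!-- The HTML5 way, with the old way as a fallback for older browsers: -->
    <audio src="foo.wav" controls>
      <embed src="foo.wav">
    </audio>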


The irony is that <embed> was never a standard (unlike <object>); if you believe the W3C fans, then you must be evil to use it. But it works, and has always worked, better than any of the alternatives. I hear that with HTML5 they've given up and declared it part of the standard.


The "they" that "gave up" is a different group of people than the "they" who thought it was evil, by the way.

The current group working on the contemporary HTML living standard just describes reality, by and large, so there was never really any doubt that "embed" would be part of the standard when we started back in 2004.


I think you've hit it with watching the adoption rate. It's not as if Microsoft or Apple or Google can force these features onto people. Web developers have to actually use them.

I fail to see how "monied interests" can affect much unless developers who are very much not part of these interests actually use these features. I think the way things are going (i.e., engines add features, developers use them or don't, the ones that get used get approved by de facto adoption, etc.) is perfect and fits the best with the pace at which things are moving in this industry.

I don't think, as many do, that it will lead to the proliferation of a bunch of confusing, bloated features being shoved into the spec (as loose as that word can be applied here). We haven't seen much of that yet, and I really don't think we will.


If you don't mind - I think I'm missing something here - does the W3C have any legal power whatsoever to force vendors who want to ignore them and develop proprietary closed features to fall into line? I don't believe they do, beyond generating some bad PR.


Somewhere in building 50 on Redmond campus, the IE team is wondering why nobody ever said this in their defence when the exact same scenario was playing out with IE (dominant browser, not available on all platforms, glacial standards bodies, browser specific extensions, etc).


This is a very different situation in quite a few ways. For one, IE was not just a big browser, but was almost completely dominant. For another, many of IE's browser-specific features were Windows-only by design, such as ActiveX.

Here, we have an open source rendering engine that clearly delineates browser-specific features (each of which is well-documented). The use of a prefix makes it possible to use the browser-specific features offered by WebKit without sacrificing compatibility with other browsers, something Microsoft actively resisted. For quite a while, it was very difficult, if not impossible, to make sites that looked and felt right both in IE and in other browsers.
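
A quick sketch of why the prefix mechanism degrades gracefully (the property here is chosen purely for illustration): CSS error-handling rules require engines to drop declarations they don't understand, so a prefixed experiment simply falls away in other browsers:

    .panel {
      border: 1px solid #ccc;          /* every engine renders this */
      -webkit-box-reflect: below 2px;  /* WebKit-only; other engines skip it */
    }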

Had Microsoft extended standards in a way that played nice with the rest of the community, maybe they would have had someone saying something like this in their defence. Instead, they used browser-specific features as a way of undermining standards.


I don't think Microsoft was acting "not nice" - most of their extensions were submitted to the W3C (and even ActiveX was standardized by X/Open, however pointlessly). However, the major vendor, Netscape, simply was not going to adopt any IEism under any circumstances.

The biggest issue with being frozen on IE6 was not so much the standards but all the terrible bugs.


Netscape had a cross-platform browser. Fully implementing ActiveX in a cross-platform way was not feasible.


Keep in mind that at the time, Netscape was even more proprietary than Microsoft. (Layers instead of the DOM, JSSS (JavaScript Style Sheets) instead of CSS)

And if you want to get technical about it, in theory you could support ActiveX on any platform. The problem was the controls were all Win32 software. There probably would have been some marginal benefit for Netscape to support it on Windows.


NaCl is specific to individual processor architectures by design, much like ActiveX was specific to Windows.

Prefixed WebKit features are certainly not consistently well-documented. CSS animations, transforms, etc. had to be carefully reverse-engineered by other browsers.


CSS animations and transforms were released along with formal specs. The Webkit developers read mailing lists, hang out on IRC, etc. If anyone thought these were underspecified or ambiguous, they could easily ask for answers or adjustments to the spec. It’s not like the Webkit folks were trying to hide anything.


The spec was not released alongside the implementation of animations and transforms. It was released much later.


You’re right. CSS animations were announced and added to Webkit betas in October 2007, and it took until the beginning of April 2008 for Apple to put up their proposed formal spec. I’m not sure when they first hit a publicly released version of Safari. Still, 6 months is pretty short in browser spec time, and pretty much no one was using these features in the wild until much later.


NaCl is not part of WebKit.


As others have said, what Microsoft did was arguably worse, but I generally agree: Apple and Google are industry darlings, and because of this are getting away with things people would be crying bloody murder over if Microsoft did them.

I get the distinct sense that a lot of people don’t really care about web standards so much as they care about their favourite company controlling standards. If it were Microsoft who was forging ahead with new features, would the MacBook-wielding, Gmail-using technorati be so enthusiastic?

I’m not even entirely against the new -webkit- stuff, I just hate the hypocrisy from people who claim to be all for web standards, except for when there’s a shiny new feature that only works in Safari, Chrome, and iOS. Microsoft has arguably been doing the best job of respecting web standards since IE9, and they get very little credit for it.


I think you're right. Today I've been reading the exact opposite of what people were saying about IE a decade ago.

> I just hate the hypocrisy from people who claim to be all for web standards, except for when there’s a shiny new feature that only works in Safari, Chrome, and iOS.

I'm right there with you. For a while I've been coming to realize that the standards never actually mattered for these folks; it's just that "IE6 doesn't conform to standards" made a more convincing argument for Firefox than the real reason, which was "IE6 is a pain in the ass and I hate it."

Now the people who want the new hotness are realizing that the standards they tied themselves to have become an albatross. If they actually ever gave a shit about standards, they would have been pitching ideas about how to update the standardization process to work with today's implementation realities. They actually would have been doing it along the way, and we would have never ended up where we are now.


I may be taking this too far, but I think there’s more to it than IE. Many in the “hacker” community seem to wrap themselves in the flag of “open”, but only when it’s in their favour.

It’s the people who argue for a decentralized, open-source alternative to Facebook, but embrace Google+. It’s the people who explain how terrible it is to tie your business to a single vendor, but decide to build businesses on iOS. It’s the people who tell you they would never put any personal documents on the cloud, but embrace Dropbox. It’s the people who claim to support open, DRM-free data formats and think the Khan Academy is the future of education, but think iBooks textbooks are great.

Like I said, I don’t have a particularly strong attachment to open source and open data formats — I just hate the way people use openness and standards as a prop. Saying standards aren’t necessary when WebKit has neat new features is (loosely) analogous to only supporting democracy when the party you vote for wins.


It wouldn't have been a problem if they had done it the way the WebKit, Opera, and Mozilla developers have: prefixing clearly lays out that it is a vendor-specific property implementation.

That implementation can be discussed, modified, or accepted and become a standard non-prefixed property. It's happened a few times that the WebKit implementation has had to be changed to match another vendor's approach. This is good.

It has gotten us closer to using CSS3 in a shorter period of time than we have with any W3C standard.

The way that both Microsoft and Netscape did it in the 1990s was not only the wrong way to do it; it was a malicious approach aimed at fragmenting the web.

And Microsoft's biggest sin was that they froze IE at version 6 for over 5 years, ensuring that their mishmash of standards and proprietary implementations would weigh on the Internet.


I remember plenty of people making exactly this argument actually.


This is a somewhat ungrateful perspective, just after the Eolas patent case was thrown out.

Yes, the W3C process takes about half the lifespan of many HN readers. Yup, it's not ideal.

But it's the only game in town, and it's been going on so long because it works. Because it successfully brought Microsoft into the fold and conversation at a time when, had they decided to, they could have broken everything.

It's worth exercising a little perspective. Can you think of any endeavor that has so successfully united vendors to create something as broad and interoperable as the web, over such a relatively small period of history? The W3C has been fundamental to this.

- What has the W3C ever done for us?

-- Well, HTML.

- Oh yeah, they gave us that. Yeah. That's true.

-- And CSS you could view source and copy.

--- Yeah, you remember how scared we all were of Microsoft Blackbird?

- All right, I'll grant you that HTML and CSS are two things the W3C have done...

(you get the idea.)


The W3C isn't the only game in town. There's the WHATWG, which exists because of the W3C's shortcomings.


Seriously, what has the pre-WHATWG W3C done for us that wasn't in HTML 3.2? I can't think of anything better in HTML 4 than what came before, and plenty of things that are worse. As for CSS, it's evolved so slowly and refused to implement such basic features that people now use preprocessors to generate it, defeating the point entirely. (Not to mention that the most W3C-ey version was 2.1, which has still never been implemented.)

So what has the W3C actually done for us? XHTML?


Why are they the only game in town?

And HTML is arguably a bad document language. Look at all the different conflicting ways in which it is rendered. And don't even get me started on CSS...


My complaint is not that -webkit CSS extensions are bad. I love how fast the browsers move. It's that people aren't careful to make sure they use all the prefixes when they're available. It's an education problem, but people using CSS preprocessors such as LESS and SASS have no real excuse.
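
For instance, a minimal Sass mixin along these lines (the names are made up) makes emitting every prefix a one-liner at each use site:

    @mixin rounded($radius) {
      -webkit-border-radius: $radius;  // Safari, Chrome
      -moz-border-radius: $radius;     // Firefox 3.6 and earlier
      border-radius: $radius;          // the standard property wins where supported
    }

    .card { @include rounded(6px); }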


That wouldn't be a problem if the non-vendored variants of the CSS attributes didn't take so god damn long to become proper standards.

The problem is not the webkit vendor attributes, the problem is that proper standards aren't able to keep up.

HTML5 is still not a proper W3C standard (or "recommendation"), it's still just a working draft, and was last updated in May of 2011. It's been six years. So far.


I don't really understand this objection. Sure, HTML 5 isn't officially a standard, but neither is -webkit-whatever; in both cases, you can use whichever non-standard you like, subject to browser support. The advantage of the W3C draft is that it is a non-standard that the browser makers are all working towards.


The objection is that the standardization process is too slow. Not a little bit too slow, but so glacially slow it's almost indistinguishable from abandonment. So progress of key web technologies needs to be put into vendors' hands.

CSS border-radius is still not a W3C recommendation. It's been a draft for nine and a half years. So far. It was last updated one year ago.

With a standardization process that broken, vendors need to take things into their own hands, and them doing so, especially webkit doing so, has been an astounding success, IMHO.


Yes, but why does it matter that the standardization process is slow? What is the practical relevance of something moving from "Candidate Recommendation" to "Recommendation" status? Why are draft standards any worse than vendor extensions?


Prefixed attributes bring features to developers while they are still experimental, which means we don't have to wait a decade before trying them out.


HTML is a living standard and was last updated yesterday:

   http://whatwg.org/html
The W3C Recommendation process is obsolete.


100% agreed. The process has to move faster.


This is something that the discussion hasn't touched yet, but I don't know that you can assume that devs are omitting the vendor-specific prefixed border-radius out of laziness. I'm frequently guilty of coding a lowest-common-denominator CSS base, then enhancing the WebKit experience -- not because I'm not aware of the vendor-specific attributes for other rendering engines or because I'm too lazy to type them out, but because only the WebKit rendering is satisfactory.

I'm not entirely certain about this approach though, so what is the real harm in a web where boxes have slightly rounded corners in one browser and not another? I'm also curious if anybody has examples of WebKit-only sites (that don't actually function in other browsers) that don't fall under the heading of "look at this cool new CSS thing that this site exists to demonstrate."


Here are some examples. I have a lot more. It's hard to develop a new browser when everyone has a "must have" site that serves degraded or broken markup to everything except WebKit.

Google Maps in WebKit: http://dl.dropbox.com/u/2128410/maps-webkit.png

Google Maps in Firefox: http://dl.dropbox.com/u/2128410/maps-firefox.png

Gmail in WebKit: http://dl.dropbox.com/u/2128410/gmail-webkit.png

Gmail in Firefox spoofing a WebKit UA header: http://dl.dropbox.com/u/2128410/gmail-firefox-webkit-ua.png

Gmail in Opera or Firefox normally: http://dl.dropbox.com/u/2128410/gmail-opera.png

Twitter in WebKit: http://dl.dropbox.com/u/2128410/twitter-webkit.png

Twitter in Firefox: http://dl.dropbox.com/u/2128410/twitter-firefox.png

These are just some of the big-name sites that I knew off-hand. At least these high-budget sites usually have some sort of fallback. Some sites just have degraded styles; some have parts that are unreadable. Some provide partial functionality to non-WebKit browsers; others won't work at all (like Bing web search in Opera).

This is becoming more common on the desktop too, with sites like Tweetdeck that just lock out non-WebKit browsers using UA sniffing: https://web.tweetdeck.com/web/unsupported.html


Great examples. I think even the WebKit touch events version of Google Maps is broken, but, not being a Gmail user, I had no idea they were blowing it that badly. I use both Maps and Twitter every single day, but -- and here's another facet of the problem -- always through applications. I encounter Maps embedded in (WebKit) pages only enough to be annoyed by them, and I haven't been to twitter.com in at least a year. This is eye-opening. Clearly we're not just talking about drop shadows.


Developers are ambitious. We want to build better apps now, not in six years' time when the W3C has finally moved on this or that issue. Right now WebKit is enabling developers, so it wins. Supporting every browser is either economically unviable or impossible without sacrificing features. If mobile Firefox wants to compete, it needs to step up. If developers are voting with their feet and actually using WebKit features in the wild, then those features are de facto standards. Firefox had better pull its finger out or it will die.


In all of the above cases, Firefox actually does support the features used by these sites, and has for a long time (years, in some cases). In most cases the developers would just need to tweak their UA sniffing, or add some -moz properties alongside the -webkit properties in their styles.
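
In CSS terms, the fix is often as small as one extra declaration (a hypothetical rule for illustration):

    .toolbar {
      -webkit-transform: rotate(2deg);  /* what the site already ships */
      -moz-transform: rotate(2deg);     /* the one missing line */
      transform: rotate(2deg);          /* for whenever it's unprefixed */
    }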

For some technical details, see for example http://bugzil.la/668218

Mozilla does need to "step up" -- but the work required to fix cases like these is not technical. It's about gaining user marketshare and developer mindshare, and improving the standards process.


It's true, and since everyone upgrades their phones every two years it's not like there's a chance of older versions of a hegemonic browser haunting developers' dreams, right?


Aside: Please don't have a flashing beacon thing where I'm trying to read. I get visually distracted very, very easily.


I'm going to have to agree that's a usability gaffe. Peripheral animation that isn't in service of a notification or something else that actually requires the user's attention is, quite literally, only a distraction.


It's worse than that actually - hold your mouse over it and you give the author "Kudos." I find that particularly offensive /rant


To me, the article comes across as a bit of an uninformed rant.

> I hope it does kill the W3C and CSS Working Group standardization process.

To this, I have to ask "Mr Curtis - are you _CRAZY_?"(!)

Do you not realise that it's taken over 12 years to say goodbye to IE6? Have you not had the pleasure of battling with the ramifications of a browser that doesn't play by the rules?

--

There are some understandable reasons for not making use of vendor prefixes.

The article that Dustin Curtis links to makes some valid points about the dangers of continuing to use vendor prefixes. A similar line of thinking is presented here [1].

The alternative to using vendor tags isn't to stop using the features - it's to simply drop the vendor prefixes and use the property names from the (non-ratified) CSS specifications before they're fully approved.

For all its failings, the W3C does a grand job. Negotiations take time. Best we don't throw our toys out of the pram.

--

[1] http://www.quirksmode.org/blog/archives/2010/03/css_vendor_p....


I think you're misinformed about what "the rules" are.

W3C standards are not, in fact, an innovation speed limit. Vendors are free to implement whatever they'd like.

The only way a vendor should be encumbered by the W3C is that if they choose to implement a feature, they should also support the W3C incarnation of that feature. That's it, full stop.

The problem with IE6 isn't that it did extra things. It's that it did a variety of things (some extra, some not) badly, and then Microsoft refused to bring it into compliance. That's not a problem we have with Webkit. So what are you saying when you try to tar Webkit with the IE brush? How familiar are you with the development process behind Webkit?


If you read the referenced article, you'd realise that WebKit isn't being picked on. The concept of using vendor prefixes is being questioned.

I'm not tarring WebKit with an IE brush. I'm stating that without the W3C or a similar body to set standards, we would be in a far worse situation.

The article spectacularly misses the point.


Do you have something to say about vendor prefixes? That's what we're talking about. I'm not particularly concerned about what particular blue ribbons we affix to the W3C. They're great. Now what?


These links [1][2] provide some reasoning behind not using vendor extensions. The second is referenced in the article we're commenting on. I find it difficult to understand why so much vitriol needs to be expended.

[1] http://www.quirksmode.org/blog/archives/2010/03/css_vendor_p...

[2] http://www.glazman.org/weblog/dotclear/index.php?post/2012/0...


That QuirksMode article you posted includes an update acknowledging that it's wrong. I think the stance in the newer post it links to is more reasonable, though I'm still not sure it's all the way there.


The article provides a summary of potential problems with vendor prefixes, which still stands.

The author acknowledged that his suggested solution was wrong. See the redux [1] (which is actually mentioned in the article) for alternative solutions.

My main concern is that the dcurt.is article misses the point entirely.

[1] http://www.quirksmode.org/blog/archives/2010/03/css_vendor_p...


Way to go, dcurtis, ranting about web standards and webkit's greatness while your website doesn't render sanely on my android devices' webkit browsers.


That's unfair, it says right at the top of the page that he's a superhero not a web developer.


This seems to be a fairly standard gripe from the W3C. "Listen to us, because if you don't the sky will fall!"

They are making themselves increasingly irrelevant because they exist primarily to restrict and limit rather than to expand and innovate.

Partly that's the nature of the beast, standards are after all rules.

The problem is that the rules need a clear and timely mechanism to evolve. Clearly the W3C is the wrong answer to that problem.

I remember well what things were like in the days of horrible tag soup trying to support internet explorer and netscape, and that sucked too.

Standards are good, having a single rule set for everyone to test against is good.

The W3C though, yeah they suck hard.

Way past time to move to a usage-based standards adoption mechanism. Let people innovate; if a change sees wide support, adopt it as a standard.


Does anyone have any insight as to why the W3C has taken over 10 years to converge on the spec for rounded corners? Any info more than just "bureaucracy"?


You really do have to properly vet these things. I didn't realize the work involved either until I started following the RDFa spec. Beyond the bureaucracy (which is actually pretty good, like a software development process), there are unintended use cases and side effects, and the concept needs to be as complete as possible and not self-contradictory. There is also prior art to be researched, understood, and incorporated wherever possible, so as not to throw out good ideas with an off-the-cuff implementation. Second, the process does move slowly, but it happens with your input if you want to give it, and you can start to use a lot of the core features of drafts, just like a beta program.


While I agree the W3C CSS WG does take a long time to produce standards, I should point out that "border-radius" is in fact finalized already, and is supported without a vendor prefix in Firefox 4+, Chrome 4+, IE 9+, Opera 10.5+, and Safari 5+.


Browser developers are going to innovate new features. A standards body doesn't drive innovation. Innovation is not a top-down process. A standards body exists to ensure some level of interoperability between browsers.

What Prefixgate shows is that a W3C standard is not a rigid set of laws like the Ten Commandments. Rather, a W3C standard is like a peace treaty: all adherents agree to respect certain protocols and behave in certain ways, but are free to pass whatever laws they want within their own boundaries, assuming those laws do not conflict with existing treaty stipulations.

Peace treaties give us access to the best of both worlds: vendors are free to innovate as they see fit, yet they still commit themselves to some level of interoperability. Innovations influence the standard and the standard ensures that all vendors adopt the best innovations at some point.

The comparisons between IE of old and Webkit are incorrect. Microsoft has a history of implementing standards incorrectly (http://www.quirksmode.org/css/quirksmode.html) and bullying standards bodies (http://techrights.org/2011/09/06/michel-levy-comes-out-swing...). When it comes to standards Microsoft does not act in good faith.


>Innovation is not a top-down process.

I don't see why it can't be. For example, going back 10 years the W3C could have said we're standardising a series of layout definition attributes (or multiple backgrounds, or colour fades, or bordering styles, or animation modes or whatever) and then waited for those to be implemented by the browsers. I don't see why the standards bodies can't anticipate a need that hasn't yet been coded for and implement a preliminary standard definition for browser writers to use.

It probably wouldn't work in practice because it appears to rely on the standards body moving faster than the browser makers which is a laughable suggestion as things stand now.

As I, probably shortsightedly, see things now, it's the JavaScript (jQuery, MooTools, etc.) writers who're best placed to speak to the next iteration of web standards, as they are actively filling the holes in the current iteration of HTML+CSS (and associated tech).


> For example, going back 10 years the W3C could have said we're standardising a series of layout definition attributes

So 10 years ago, before the iPhone and the iPad and the entire mobile web? You really expect that standards body 10 years ago would have anticipated the need for multi-touch events, for example?


I didn't say that all innovation is exclusively top-down. I just said that I couldn't see why it couldn't happen. For example when a client gives you a design brief for a system that's a limited type of top-down innovation.

W3C do appear to create some standards before there is a working implementation too.


> I don't see why it can't be. For example, going back 10 years the W3C could have said we're standardising a series of layout definition attributes (or multiple backgrounds, or colour fades, or bordering styles, or animation modes or whatever) and then waited for those to be implemented by the browsers. I don't see why the standards bodies can't anticipate a need that hasn't yet been coded for and implement a preliminary standard definition for browser writers to use.

Well, people who have worked with standards bodies know why it can't be: because of the nature of the processes involved and the interests that have to be balanced.

The fact that something is at all possible (i.e., no law of the universe prevents it from being so) doesn't mean it has a high enough probability. Yes, a committee could work faster and more flexibly. Experience has shown us that, historically, they never do.


I find it funny that the rant holds up as an example the magic bit of CSS animation on the right, which treats a brief hover over it as a "kudo".


I'm glad that the webkit-specific features were prefixed. I just wish they would have chosen a prefix that was a wake-up call to anyone relying on them.

Perhaps a prefix like "this-feature-will-be-deprecated-in-2013-".

Slightly off-topic, but in a similar manner: when we struggled with users permanently deleting records, ignoring the warning that they were permanently deleting records, and calling us to retrieve the irretrievable, we changed the confirmation box from "click OK" to "type the word 'irreversible'". This cut the calls to near zero.


It is correct that the W3C lacks the ability to coordinate different vendors. But is it good to abandon the coordination process altogether? No: the problem here is the standardization process, not coordination itself. You at least have to coordinate vendors to maintain the basic baseline, whether or not a Web standard is the means. (If you don't agree with this, you'd do well to look at the past...) Critics should really propose a better alternative to Web standards, not just criticize them.


"Despite having representatives from all of the major browser developers, the Working Group has not been able come up with a solution."

IMO there's no "despite" about that; "due to" would be more appropriate.

Design by committee sucks even under the best of circumstances. When the committee is made up of very different companies each trying to use the web as a chip to bolster different parts of their businesses, of course the process is going to be broken. It probably can't not be broken.


I don't see how deciding the syntax for a gradient fill might affect one business concern vs. another.


Actually, there are very real ways in which this sort of thing can impact one vs. another. E.g., Adobe may push for a format that natively supports the double-sided color-filling rules they use in Flash, to make it easier to produce Flash-to-HTML5 conversion frameworks without a lot of code rewriting, while other parties may not have this as a concern.

That aside, you picked a ridiculously low-hanging fruit of a concern. Consider something larger, like JavaScript and the fate of ECMAScript 4 due to the very situation I'm talking about. But even the minor bikeshedding concerns can be used as political levers one way or another regardless of how (in)significant one implementation is compared to another.


There ought to be at least some things that literally none of the participants in the W3C etc. have any conflict of interest about, and which they would therefore be able to decide on quickly. The debate over the <video> tag, for example, may never be decided. But I have trouble thinking of any counter-examples.


Ohhhh, so it's okay when webkit does it... just not IE.


I think the reason for this kind of slow evolution is the lack of generic properties (like --surprise-- Internet Explorer's filters) that let developers push the boundaries on their own within a reasonable limit. For example, support for border-radius, linear gradients, box shadows, text shadows, etc. all deal with the rendering of elements at various levels. If there were a generic property that let us control rendering and positioning of elements to a limited extent, the current evolution and standardization processes would be much less of a problem. (I'm not saying Internet Explorer's filters are the way to go, but you have to give them credit for putting such a generic property in developers' hands.)
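
For anyone who never used them, a sketch of IE's generic escape hatch (proprietary, never standardized, and quirky, but illustrative of the idea of one catch-all property carrying gradients, shadows, opacity, etc.):

    .fancy {
      filter: progid:DXImageTransform.Microsoft.gradient(startColorstr='#FF1E5799', endColorstr='#FF7DB9E8');
      zoom: 1;  /* classic hack giving the element "layout" so the filter applies */
    }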


The guy proposes to kill standardization bodies and let web engines do whatever the hell they like, regardless of standards.

Coming from what appears to be a WebKit dev, that's pretty bad.

Standardization bodies certainly need deep fixes, but this kind of reply is wrong.


>The reason the -webkit prefix was necessary is simple: the W3C and the CSS Working Group are ineffective, failed organizations.

This is a bit too far. The -webkit prefix was a gesture of courtesy. What if other browsers implement it differently, and the spec changes? Then older WebKit browsers will render it incorrectly.

The solution is simple: code to the standards, then target the fringe browsers. It doesn't matter if it's webkit or IE, this is a future proof, cross compatible solution that will solve both problems.


Let's broaden the picture a bit here and look at the internet as a whole. There are no standards - there are suggestions (RFCs), lots of open stuff (open source, free, whatever), and some closed stuff, and then there's what really happens.

It turns out that IPv4 and DNS worked pretty well, though of course the internet at large only uses a portion of those "standards". We didn't need an organization to tell us every last little detail of how it would work and "approve" it before we used it. People found out what worked, documented it, and it became the de facto standard. We've never waited on the various internet "governing" bodies while keeping our own progress at bay. There's absolutely no reason to.

Know why everyone (more or less) obeys the allocations issued by ARIN? Because we recognize someone has to manage that limited resource pool, and they do an okay job of it. We're all free, more or less, to set up networks in whatever way we want, with whatever IP space we want - we just can't expect cooperation from our neighbours without some discussion and agreement so we don't stomp on each other's space - so we look to ARIN and their brethren to manage this. If they ceased to do so efficiently, they'd be replaced.

The W3C has no power to force a developer or vendor to implement a given feature or not, and nobody really cares whether or not something is "officially" compliant. Most web pages are not fully compliant... yet the web, built by kids and adults and everything in between grew up and here we are, talking on HN.

It'd be great if people could agree when we end up with overlapping features from different vendors... it makes things hard for developers, right? That should be obvious to all. But we've switched browsers over the years, and will again, and whichever ones actually do what people need them to do, while keeping developers happy, will prosper. The W3C has little to do with it as far as I can see.

Same for DNS (gripes aside - let's be realistic).

We implemented IPv4 and it works great, but does the internet at large obey the type-of-service bits which are part of that standard? Heck no. It's a big, evolving, organic thing.

The web - same deal. Know what makes a standard? Something people adopt and continue using. Usage defines the standards - nobody is compelled to do anything just because it's a "standard" according to some group.


Front-end developers seem to disagree a lot. Some embrace the -webkit prefix and sprinkle it all through their CSS. Others prefer not to resort to old-fashioned site-building techniques and instead aim to get the functionality working everywhere in one hit, without coding like it's 1996.


I fully agree with this. CSS set the web back years with its inability to handle simple layout. The W3C is run by representatives of major corporations, and the standards take years to roll out.


W3C is a web standard, on paper.

http://www.webkit.org/coding/lgpl-license.html

WebKit is living, breathing, open standard that anybody can fork and contribute to.

Code talks, paper walks.


Code is crucially important. That is why good standards bodies require implementations. Nevertheless, that doesn't make an implementation a standard (unless you are satisfied by under-specified, poorly documented, fragile, de facto "standards"). That you don't seem to care about (or possibly don't understand) the distinction suggests you are part of the problem.


Why doesn't the standards body ask webkit to deprecate and begin to phase out any -webkit-* CSS selectors which have been fully standardized? That seems like an effective way to force people to use the standards once they actually exist.

Complaining about people using -webkit-* before the standards exist is obviously unreasonable, considering the implications of populating the global namespace with conflicting versions of the same feature.


The standards body has asked WebKit to do that, in fact.

Apple flat-out refused to ever remove -webkit prefixes that they have once shipped.

Note that no one is complaining about the mere existence of the prefixes; the problem is:

1) The way they're being used by web developers.

2) The fact that some of these properties are not standards-track, with Apple actively refusing to allow them to be standardized.


[deleted]


Bringing up racism, sexism, and homophobia in a discussion of Web standards is a little bit extreme, don't you think?


... dcurt.is is an abuse of the TLD standards...


Great points. I'm going to automatically concur on the grounds that I don't think bureaucracies that produce nothing should have a say in the production of anything.


The aggressive dismissal of the concerns is unwarranted and a bit ignorant. The concern is not, per se, vendor extensions -- such a mechanism exists for a reason -- but rather that many users of those extensions have lazily taken to only bothering with the WebKit extensions. Most of the time for no reason other than an IE-only-like "suck it" attitude (many demos front-paged here on HN work in only a single browser, despite often needing just trivial changes to work elsewhere).

Dismiss the W3C and the purpose for standards at your peril. Webkit and its offshoots have the ability to innovate on the edge because that body and its impact kept the web open.

EDIT: It's worth noting, with sober consideration, that exactly the same argument was made to support Internet Explorer during the ugliest days of the web. This could have been cribbed verbatim from something a Microsoft advocate would have said in the late 90s.


"Webkit and its offshoots have the ability to innovate on the edge because that body and its impact kept the web open."

I don't agree. I'd say the web saving itself from IE's grip had a lot more to do with Microsoft sitting on its ass and doing virtually nothing to push things forward once they got 90% share. If Firefox hadn't had its impressive extension mechanism, or WebKit in the form of Chrome hadn't been so damned fast, or mobile browsing hadn't become a thing, people would still largely be using IE. And none of those circumstances has a single thing to do with the W3C, really.

IMO, the fact that this is even being discussed but now in the context of Webkit instead of IE shows that the W3C isn't really that relevant when it comes to pushing the web along.


You miss the fact that Gecko and WebKit are open source projects. There will never be a case where something like ActiveX can exist within these projects.


The fact that they're open source doesn't matter. We (humanity) have largely compatible open-source implementations of ActiveX in Wine and ReactOS, and if W3C or ISO or someone had deigned to standardise ActiveX this code would've found its way into Gecko and KHTML.

Plus, NaCl is closer than "something like" to ActiveX. What do you consider to be the salient difference between the two technologies?


The salient point is that Native Client is open source, and the standard can be adopted by Firefox, Opera, and Safari if developers find that it meets their needs. Native Client is built on OpenGL (cross-platform compatible), whereas ActiveX was developed against the Windows APIs.

Wine has been a long time in the making and certainly wasn't ready to be integrated into Mozilla for ActiveX compatibility on Mac OS, Linux, or even alternative browsers on Windows in 1996, so it's kind of a silly point to make.


Native Client is built on Pepper, a proprietary Chrome technology. The NPAPI port for other browsers was abandoned.


> Native Client is built on Pepper, a proprietary Chrome technology

http://src.chromium.org/viewvc/chrome/trunk/src/ppapi/

Why are you ranting about things you clearly know nothing about?

> The NPAPI port for other browsers was abandoned

You're bitching up and down this thread about standards, and now you're defending the ancient, non-standard Netscape plugin architecture? Do you have any clue what you're talking about, or are you just here to be anti-Google?



How do the projects being open source prevent something like ActiveX from coming to them?


It doesn't prevent a technology like ActiveX from coming to them; it prevents the CONSEQUENCES such a technology had when it was in IE.


Ok, I still don't understand. I really got into web development around ten years ago, and have been a Firefox user since back when Firefox was simply called Mozilla. So I have no experience, developer-side or userland, with ActiveX. What were the consequences of ActiveX, and how could open source have prevented them?


Well, ActiveX was a proprietary technology developed by Microsoft that ran only on Windows, and in particular only in IE. So sites using it worked only with IE.

Webkit and Mozilla being open source means that no single company can control the projects or add a proprietary single-platform exception to them.

Now, a company could FORK WebKit/Gecko and add something like ActiveX to it, but the fork wouldn't be part of the WebKit/Gecko project anymore, and we'd still have WebKit/Gecko proper. For example, Google added NaCl, which is something like ActiveX, to Chrome, but not to WebKit itself.


Okay, so now you have edited the source code and compiled a binary; is your recommendation that the binary be downloadable and installable by the user from the website?

Open source is one thing, but if you're not able to convince Google and Apple to ship your changes, you've hit a wall. Firefox implementing support for the WebKit extensions is what would drag the web back into the IE-dominant era.


Note that, to get the web out of the IE-dominant era, it was first necessary for Firefox to implement IE extensions like document.all and innerHTML.

Having a competing browser is useless if it won't work with the content that people want to use.


For the example given: were ActiveX open source, its Windows-specificity would have been rendered moot, as some fork would have displaced it.

Or: were WebKit not open source, it would probably not have ended up being a joint Apple-Google production. Joe-Bob's ideas for CSS that he compiles into Firefox himself may not be relevant, but eventually, if FF sucks enough, some fork of it will displace it, as Firefox did the Mozilla suite (cf. V8 displacing JavaScriptCore), per some version of a Kuhnian paradigm shift.


> It's worth noting, with sober consideration, that exactly the same argument was made to support Internet Explorer during the ugliest days of the web. This could have been cribbed verbatim from something a Microsoft advocate would have said in the late 90s.

I hate Microsoft, but the argument you mention was made because it was mostly right. IE "won" because it provided a better platform and better user experience, not for any other reason.

The focus on IE in antitrust battles has been a disgraceful waste of resources. There are far better things to attack Microsoft for than providing a better browser than the piece of shit later versions of Netscape turned into.


And why did Netscape turn into a "piece of shit"? Because Microsoft used their considerable monopoly power (bullying OEMs out of preinstalls, bundling IE with Windows,...) to make it impossible for Netscape to build a business around making a better browser.


The stagnation and backslide started circa 1995-1996. IE was barely a blip at that time. By 1998, Netscape was basically useless and their share was just starting to drop below 50%.

If you look at timelines instead of listening to hysterical lawyers, the truth becomes clear. If you listen to the people who were actually at Netscape during this time, it becomes even clearer. They were utterly lost in the woods before Big Bad Microsoft ate them alive.


Relax. Just relax. The web browsers are automatically updating themselves for a reason.


I am going to go out on a limb and say: fuck standards.

Make a powerful, but extremely simple (as in: small instruction set) virtual machine and protocol. That way anyone can make their own standards.


Sir, do you understand the World Wide Web?


Apparently not - what am I misunderstanding?


More-or-less assured interoperability and complexity of content creation stand out as two of the larger issues. Episodes of the web's history when organizations did create their own standards might be another one.


Kudos to the author of the article. I read the referenced article on glazman.org, and once "glazou" described -webkit-* as having "hardware market dominance", I could not stop LOL'ing until I finished reading his blog post. Seriously?!

I second that the web is constantly evolving, whether or not it benefits the ego-fat cats at Google, Apple, or Mozilla. Get used to it and adapt; at least, that's what I remember hearing in a Google Tech Talk back in 2008. Does anyone still tune in to those?



