Because if everyone used SQLite there would be no second implementation, which is something that is necessary for a proposed standard to become an actual standard.
I believe you, especially since it makes several remarks I've heard around this make more sense. But why? Especially in this case, where there is a standard and it has already been implemented by several parties? Is it just pedantry because they don't get to call the shots?
It has one implementation, SQLite, shipped identically in more than one browser. And SQLite itself is a de facto standard with a reference implementation to compare against.
Anyway.
Yes, there is only one implementation, one standard, one non-W3C spec. Why does it matter to them that there's no competing implementation, when they could just support something that is already out there? The SQL dialect can be specified by pointing to http://www.sqlite.org/docs.html, and the API for accessing it is mindlessly simple and small; documenting it would take a day at worst.
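To give a sense of how small that surface is: the whole round trip through SQLite (open, execute, read results, close) fits in a few lines. This sketch uses Python's stdlib `sqlite3` binding rather than the WebDatabase JS API itself (which is essentially just openDatabase(), transaction(), and executeSql()); the table and column names here are made up for illustration.

```python
import sqlite3

# Open an in-memory database, run a couple of statements, read the results.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE notes (body TEXT)")
db.execute("INSERT INTO notes VALUES (?)", ("hello",))
rows = db.execute("SELECT body FROM notes").fetchall()
db.close()
```

That's the entire lifecycle; everything else is just SQL strings.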
It seems they're trying to run it into the ground because they don't control it, because it didn't go through their proper procedure, whereby it must have a competitor.
Why? In what way is this helpful? And if they're not being helpful, what's their purpose? To be dogmatic and prevent good ideas they didn't come up with from spreading?
---
edit:
>The World Wide Web Consortium (W3C) is an international community where Member organizations, a full-time staff, and the public work together to develop Web standards. Led by Web inventor Tim Berners-Lee and CEO Jeffrey Jaffe, W3C's mission is to lead the Web to its full potential. Contact W3C for more information. (from http://www.w3.org/Consortium/)
How does trying to prevent it aid in "lead[ing] the Web to its full potential"?
>Why does it matter to them that there's no competing spec
Because ALL web standards are standards only because there are MULTIPLE implementations. This is a basic principle behind the Web, and making even a single exception leads to a dangerous slippery slope.
>It seems they're trying to run it into the ground because they don't control it
No.
>because it didn't go through their proper procedure, whereby it must have a competitor.
Yes. In the long term, having only one implementation is a dangerous sign -- it suggests that developers are incapable of producing another. What if SQLite can't be ported to some hypothetical future architecture/stack -- for example, a fully managed OS? What is the guarantee that someone will be able to write another implementation of the same spec that runs on that OS? With multiple implementations you at least know that the problem is tractable.
>Why? In what way is this helpful?
It is helpful because it demonstrates long-term thinking.
>And if they're not being helpful, what's their purpose? To be dogmatic and prevent good ideas they didn't come up with from spreading?
And the multiple implementations of IndexedDB are...? (I don't actually know; if you do, I'd be interested in links.)
There's an early one in Firefox 4 beta right now [1], and an upcoming one in Chrome [2]. Microsoft will probably implement IndexedDB in a future version of IE [3]. (IndexedDB isn't a standard right now either, but it'll be on track to be one once two implementations are complete.)
I don't think any browser shares all the peculiarities of any other browser. That's hardly an argument against there being a W3C spec for HTML, CSS, the DOM, or JavaScript. Why pick on SQL?
By "peculiarities" I didn't mean minor implementation differences but the major ideas behind the implementation. Be reasonable here -- the differences in implementation in the DOM between Gecko and WebKit (say) are trivial compared to "type checking" vs "lack of type checking". There's a bit of subjectivity involved, agreed, but I think any reasonable person would agree about the differences.
There are indeed differences, and they're much greater between SQLite and (most) other SQL engines than between browser DOM implementations.
I still see it as killing something out of some ulterior motive rather than supporting something highly effective. SQLite exists because it has survived against many competitors for a long time. It's not like this is some new, untested thing that cropped up out of nowhere.
edit: just realized there's a much stronger argument here. And a bigger threat.
By not supporting WebDatabase, because the implementers all used SQLite, they've essentially guaranteed that they will never support an SQL-based database. Few are going to want to compete with SQLite on its own turf as an embedded database when it's already so optimized. Especially if they all have to agree on a spec similar to SQLite.
SQLite isn't even all that poor of a match for a JS-interfaced database. JS is nearly as fast and loose with its types as SQLite, and the types that do exist are very close to what JS has.
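The parallel is concrete: SQLite's "type affinity" coerces values much the way JS does. A column declared INTEGER turns the string '42' into the integer 42, and a TEXT column turns the number 99 into the string '99'. A minimal sketch via Python's stdlib `sqlite3` (table name is illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (n INTEGER, s TEXT)")
# Insert a string into the INTEGER column and a number into the TEXT column.
con.execute("INSERT INTO t VALUES ('42', 99)")
# Affinity coerces each value toward the column's declared type.
n, s = con.execute("SELECT n, s FROM t").fetchone()
con.close()
```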
You still haven't answered what you're going to do if the current SQLite code simply can't run on a particular OS stack. There's no guarantee that every future stack will have a C compiler, or even allow C programs.
All that matters is the behavior, and SQLite's behavior is very well defined, even where it's a bit wonky, precisely because of its commitment to backwards compatibility. Even the fundamental algorithms will translate, so all that's needed is to compile it, or something logically similar, for the system with whatever compiler it allows. Heck, it would probably be easier than implementing from an abstract spec, especially since most of the optimizations will carry over rather than having to be rediscovered.
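One example of that "wonky but well defined" behavior, kept for backwards compatibility, is SQLite's flexible typing: a value that can't be coerced to a column's declared type is simply stored as-is rather than rejected. A sketch using Python's stdlib `sqlite3` (table name is illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (n INTEGER)")
# 'not a number' can't be coerced to an integer, so SQLite stores the
# text unchanged in the INTEGER column instead of raising an error.
con.execute("INSERT INTO t VALUES ('not a number')")
(val,) = con.execute("SELECT n FROM t").fetchone()
con.close()
```

Most other SQL engines would reject that insert, which is exactly the kind of behavioral detail a competing implementation of the spec would have to reproduce.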
If such a system exists, it won't be able to run the vast majority of stuff out there. Re-implementing SQLite will be the least of its concerns.
Well, I think having two implementations is necessary for a "standard" to actually truly be one, because if there's just one then you might as well be standardizing on the implementation rather than on the spec. I don't agree with standardizing on implementations in principle.
Hardly. An API can exist entirely without implementation, and pulling an API from an implementation doesn't need to expose any details of the implementation. That's kind of the point of an API - it's an abstraction. And what are the specs, if not APIs?
Not standardizing on an implementation: makes sense. Refusing to standardize on an API until there are two+ implementations: ?
>Refusing to standardize on an API until there are two+ implementations: ?
There are good reasons for this, like uncovering undocumented-but-essential behavior. I think if you are going to standardize an API you should have two implementations, but I also think it's reasonable to standardize a single implementation as long as it is licensed so that everyone can use it (e.g. SQLite).