Yet years later there is still no good solution for that space, and IndexedDB is a total clusterfuck.
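To make that concrete, here is roughly what storing and reading back a single record looks like with the standard IndexedDB API (a minimal sketch; the database and store names are made up for illustration):

    // Minimal sketch: store and read back one record with IndexedDB.
    // "notes-db" and "notes" are made-up names for illustration.
    const request = indexedDB.open("notes-db", 1);

    request.onupgradeneeded = () => {
        // Runs only when the database is created or its version is bumped.
        request.result.createObjectStore("notes", { keyPath: "id" });
    };

    request.onsuccess = () => {
        const db = request.result;

        // Every read and write happens inside an explicit transaction.
        const tx = db.transaction("notes", "readwrite");
        tx.objectStore("notes").put({ id: 1, body: "hello" });

        tx.oncomplete = () => {
            const get = db.transaction("notes", "readonly")
                .objectStore("notes").get(1);
            get.onsuccess = () => console.log(get.result);
        };
    };

    request.onerror = () => console.error(request.error);

All of that boilerplate replaces what would be a single INSERT and a single SELECT in a SQL-backed store.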
I'd be far more worried about the mess at the core of the web, CSS and rendering, than about exploitable bugs in SQLite.
The fact that an RCE in SQLite is HN-worthy is indicative of that.
Browsers have tons of RCEs that are fixed every year, but it happens silently because everybody is so numbed to it.
The quoted argument is a cop-out on their part.
HTML is also a "Living Standard", a.k.a. "we just implement whatever we feel like and write it down once we feel it has stabilised a bit". They could have done the same for SQL, but NoSQL was in vogue at the time, so they pretended that SQL needed to somehow hold up to much higher standards than the usual mess they produce.
SQLite is probably one of the few pieces of software that is actually trustworthy, unlike the dumpster fires of C++ and feel-good essays that we call browsers.
Web standards are meant to have multiple independent implementations. That’s pretty much the entire reason that Google pays for Firefox at this point. “Everyone should just use SQLite” is a slippery slope to “everyone should just use Blink”.
Blink is actually a good example of why starting out with SQLite would have been a good idea.
Blink is a fork of Webkit, an engine so much better than the alternatives that it almost overnight became the de facto standard.
Did Webkit ruin the web?
Eventually Apple and Google disagreed, and Blink was forked off of Webkit.
The same thing would have happened to SQLite as the foundation of a living WebSQL spec.
It's ironic that Mozilla pushed IndexedDB through, yet they were among those too lazy to provide their own implementation. Instead they simply dump everything into SQLite, the same strategy Apple uses.
They left it to Google to provide the only differing implementation, based on LevelDB.
But hey, it's totally important to have multiple independent implementations...
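For what it's worth, "dumping everything into SQLite" is less crazy than it sounds; the object-store API maps onto a table almost trivially. Here is a toy sketch of the idea, assuming the better-sqlite3 Node binding just to make it runnable; the schema and names are invented and this is not Firefox's or Safari's actual implementation:

    // Illustration only: a toy IndexedDB-style object store backed by SQLite.
    // Schema and names are invented; real browser backends are far more involved
    // (structured-clone serialization, indexes, quotas, etc.).
    import Database from "better-sqlite3";

    const db = new Database("storage.sqlite");
    db.exec(`
        CREATE TABLE IF NOT EXISTS object_data (
            store TEXT NOT NULL,
            key   TEXT NOT NULL,
            value BLOB NOT NULL,
            PRIMARY KEY (store, key)
        )
    `);

    function put(store: string, key: string, value: unknown): void {
        db.prepare(
            "INSERT OR REPLACE INTO object_data (store, key, value) VALUES (?, ?, ?)"
        ).run(store, key, Buffer.from(JSON.stringify(value)));
    }

    function get(store: string, key: string): unknown {
        const row = db
            .prepare("SELECT value FROM object_data WHERE store = ? AND key = ?")
            .get(store, key) as { value: Buffer } | undefined;
        return row ? JSON.parse(row.value.toString("utf8")) : undefined;
    }

    put("notes", "1", { body: "hello" });
    console.log(get("notes", "1"));

The browser's own code ends up being glue of roughly this shape around SQLite doing the actual storage work.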
There are at least three implementations of IndexedDB. Two are built on top of SQLite, but are different codebases aside from that, as far as I'm aware - I don't think Mozilla just merged in Webkit's IndexedDB implementation. Firefox's implementation came out well before Safari's, for a start.
What would be your opinion if everybody just decided to use Google's implementation of WebRTC wholesale, rather than building out their own systems? What if Mozilla decided to rebase on top of Blink, and give up on building its own rendering engine, tomorrow?
I wouldn't mind if they realised that their implementation is bad enough to be replaced; it would be a win for all.
The implementations would start to diverge again anyway.
The engineering time saved by piggybacking off a working implementation can then be reinvested into improving the fork massively, or into simplifying the design and specification.
Like I said, Webkit and Blink are the perfect example of this.
As for the first point: it doesn't really matter if they have their own glue code; all the important parts are shared.
Right, so the web was a wonderful place when IE6 was the only browser anyone developed for, and for the short period of time when Chrome was the only browser anyone developed for. This definitely didn't affect anyone's ability to choose a browser which met their needs, and definitely didn't result in half-baked and overly complex specifications being forced through the standards process by the only browser vendor with any power.
If Google got their way, we'd be shipping modified LLVM bitcode to clients ("PNaCl"), and every browser would be shipping some random fork of LLVM stuck in the past forever. If Microsoft got their way, Gmail would be an ActiveX plugin.
Gecko has massive improvements over Webkit/Blink, btw - WebRender is huge.
You're setting up a false dichotomy, and you're mistaking competition between codebases for competition between institutions.
Having different organisations with different goals is what prevents these scenarios.
Otherwise the Webkit/Blink split wouldn't have been successful.
Mozilla could also have forked Blink and started replacing it with Rust, and you would've gotten the same improvements.
Mozilla could have taken SQLite as a foundation, started a living spec, and immediately begun translating the codebase to Rust.
The effort would have been the same as for their half-assed IndexedDB stuff, but the result would have been much better.
It doesn't matter where a codebase comes from; it matters where it goes. And when it comes to diversity of implementation, the repelling forces of different ideas, viewpoints and aesthetics, which normally result in dreaded project forks, work to everyone's advantage in browsers.
Conway's law: the structure of a project's software mirrors the social structure of the organisation that builds it.
"Rewrite it in Rust" is not the only difference between Gecko and Webkit/Blink (which are still similar enough that they might as well be one codebase), and believing so is showing your bias. WebRender, for example, is not simply "rewriting part of a renderer in Rust". There's significant differences between how Gecko and Webkit handle media under the hood. And both have pushed various specifications that would be easy to implement in one but not the other. Google are, admittedly, much better at being incredibly loud about "standards" they try to force through.
In theory, the purpose of a standard is to allow other people to implement it, from spec. The spec cannot be "just use this existing codebase". Otherwise we'd have one HTML parser that sits entirely undocumented, and the HTML spec would be "do whatever libhtml does" - we've seen that in the form of OOXML. The media streaming spec would be "just use this binary blob from Adobe, or you can't do video at all". If I came along today, and wanted to implement WebSQL, which is entirely specified as "do whatever SQLite does", from scratch... how exactly would I start? In theory, right now, with enough time and money, I could implement a javascript interpreter or html renderer or whatever else without ever referring to any other browser's source code or depending on anything - a clean room implementation. Some companies still actually do that, because Webkit, Blink and Gecko don't meet their needs and wouldn't without a complete rearchitecture. Imagine if the javascript spec was "just do whatever V8 does", and we could never get things like QuickJS or Duktape.
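For reference, the abandoned Web SQL Database draft exposed roughly this much API surface; everything behind executeSql, i.e. the SQL dialect itself, was effectively defined as "whatever the SQLite version you ship accepts". Names and queries below are illustrative, and the cast is needed because openDatabase is long gone from the TypeScript DOM typings:

    // Roughly the shape of the abandoned Web SQL Database API (illustrative).
    const db = (window as any).openDatabase("notes-db", "1.0", "demo", 2 * 1024 * 1024);

    db.transaction((tx: any) => {
        tx.executeSql(
            "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)"
        );
        tx.executeSql("INSERT INTO notes (body) VALUES (?)", ["hello"]);
        tx.executeSql("SELECT * FROM notes", [], (_tx: any, result: any) => {
            for (let i = 0; i < result.rows.length; i++) {
                console.log(result.rows.item(i));
            }
        });
    });

The thin wrapper is easy to re-implement; the hard part, and the part the spec never pinned down, is the SQL behaviour underneath it.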
When I, a web developer, come across something that looks like a bug in the One True Codebase, how do I know whether it's a bug or something someone forgot to document properly? What if that bug isn't present in another implementation? Do we have to be 100% bug compatible with some arbitrary version of SQLite/V8/Blink forever? Getting rid of most "quirks" was the best thing to happen to the web from a developer perspective in a very long time, IMO.
What about when someone comes along and suggests something that would work really well in the One True Codebase, GeBlinKit, but it turns out that nobody else with a different code design could reasonably implement it?
You really keep bringing up false dichotomies.
Nobody ever argued that it was about the programming languages or about identical implementations, but about project stewardship and diverging codebases. They influence each other; it's not only about one or the other.
I don't know where you get your weird 100% bug-compatibility idea from; that's literally how nothing is handled anywhere.
This is also orthogonal to specs; you can have specs that completely dictate the implementation (like CORBA) or that are super loose in what they allow (ANSI C).
There are not only reference documents but also reference implementations; as projects grow it's OK to diverge from them and find common ground in other documents like specs.
Sometimes they cover reasonable behaviour so well that they can work as an alternative to a specification, like SQLite and https://sqljet.com/. That doesn't mean they'll never change; SQLite regularly has bugs discovered and fixed. If the SQLite devs don't even adhere to your assumed "all bugs and behaviours are sacred and must be kept indefinitely" philosophy, why would anybody else?
As if there were some rigid, black-and-white process for these complex projects: either a good base implementation, no spec ever, and 100% backwards compatibility forever, or waterfall spec development followed by implementations that asymptotically approach the spec.
Where there's a will there's a way; these projects and documents are all about people and the ways they collaborate and work. It's not as rigid as you make it out to be.
>When I come across a bug how do I know whether it's a bug or something someone forgot to document properly? What if that bug isn't present in another implementation?
You do what you currently do: you go to wherever the people who steward the project reside, and you ask.
Why and how do you think specs get revised? They contain ambiguities, bugs, and unspecified behaviour. Somebody stumbles upon one and asks a question.
>What about when someone comes along and suggests something that would work really well but it turns out that nobody else with a different code design could reasonably implement it?
You'd do what you currently do. You talk about it, and in the end you might even write it down somewhere: in a spec, in an RFC, in a piece of documentation.
You seem to think that SQLite would stay the reference implementation forever, which is simply not true. It's a good starting point, yeah. But Webkit didn't stay the reference implementation either, nor did Netscape.