Isn't the point of JS that you don't need an SDK!? If I need to use something other than Notepad to write a program for FFOS, that would be a big turn-off.
I think you're confusing an SDK (software development kit) with an IDE (integrated development environment). An SDK is nothing more than a set of libraries that let you write code for the target hardware. In this case, the SDK consists of a Firefox extension you can install, which gives you an emulator that simulates the phone hardware.
You can totally write Firefox OS apps in Notepad if you want to. It's just HTML, CSS, and JavaScript (plus a JSON manifest that tells the OS where the entry point for your app is).
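For reference, a minimal manifest.webapp looks roughly like this (the name, paths, and icon here are just placeholder examples; launch_path is the field that tells the OS where the entry point is):

    {
      "name": "My App",
      "description": "A minimal Firefox OS app",
      "launch_path": "/index.html",
      "icons": {
        "128": "/img/icon-128.png"
      },
      "developer": {
        "name": "Your Name",
        "url": "http://example.com"
      }
    }

Drop that next to your index.html and the OS knows how to launch the app.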
I thought the whole point of Firefox OS was that the hardware layer would be accessed through JS APIs like window.ondevicemotion, Canvas, navigator.geolocation, window.DeviceOrientationEvent, navigator.getUserMedia, navigator.vibrate, the 'devicelight' event, navigator.battery.level, SMS, contactList, etc., and that the Firefox team would keep adding features accessible from the VM layer. Combine that with app permissions and you'd make Firefox OS the most secure OS out there, with a VM that can't be bypassed unless the device gets re-flashed.
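In code, a few of those look like this (a rough sketch; availability and permission prompts vary by device and OS version, and navigator.battery in particular was the early, since-deprecated form of the battery API):

    // Geolocation: prompts the user, then reports coordinates
    navigator.geolocation.getCurrentPosition(function (pos) {
      console.log('lat:', pos.coords.latitude, 'lon:', pos.coords.longitude);
    });

    // Vibration: a single 200 ms pulse
    navigator.vibrate(200);

    // Device orientation: alpha/beta/gamma angles as the phone moves
    window.addEventListener('deviceorientation', function (e) {
      console.log(e.alpha, e.beta, e.gamma);
    });

    // Ambient light sensor
    window.addEventListener('devicelight', function (e) {
      console.log('light level (lux):', e.value);
    });

    // Battery level, via the old navigator.battery object
    if (navigator.battery) {
      console.log('battery:', Math.round(navigator.battery.level * 100) + '%');
    }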
This might sound funny, but considering Moore's law and where we currently are on that exponential curve, a fully virtual OS will be the next step in PC evolution. You shouldn't have to write low-level interrupt handlers or accept that apps need full root access anymore.
>This might sound funny, but considering Moore's law and where we currently are on that exponential curve,
In case you haven't noticed, that exponential curve has already started to level off. Each new generation from Intel brings smaller and smaller gains, as transistor sizes run into fundamental physical limits imposed by quantum mechanics, and thermal and battery-life constraints cap how much power a chip can use. Combine that with the fact that RAM speed stopped scaling about a decade ago, and the conclusion is clear: Moore's Law isn't ending, it's already gone. As far as desktop application developers are concerned, it ended about a decade ago, when single-thread performance plateaued.
Yes, we have more and more cores, and more and more cache. But the cache is only there to make up (poorly) for the fact that RAM speeds haven't scaled. And more cores don't help, because most desktop applications don't parallelize easily beyond ~2 threads. Yes, the user is able to run more applications at the same time, but each individual application is still running at about the same speed.
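Amdahl's Law puts a hard number on that. A quick sketch (the 50% parallelizable fraction is just an illustrative assumption, not a measurement):

    // speedup = 1 / ((1 - p) + p / n)
    // p = fraction of the work that parallelizes, n = core count
    function speedup(p, n) {
      return 1 / ((1 - p) + p / n);
    }

    console.log(speedup(0.5, 2));   // ~1.33x on 2 cores
    console.log(speedup(0.5, 8));   // ~1.78x on 8 cores
    console.log(speedup(0.5, 1e9)); // -> 2x: the ceiling when half the work is serial

If half the work is serial, you can throw a billion cores at it and never even double your speed.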