So write your own open-source reverse-engineered drivers, or find a manufacturer that gives you docs for its sensors. That's what the Linux crowd has done for a decade now, and that's what hacking and tinkering are all about. There are free and open Adreno drivers in progress, for example. Also Tegra. Saying Android is not open source because it doesn't come with open drivers for all current and future hardware is just plain impractical. It just means you want everything handed to you on a plate. It doesn't mean Android isn't open.
Edit: Noticed people might be taking offense to my strong wording.
Quite, but by the time you finish that, the next generation is out and everybody has moved on.
The same thing happened with the HTC devices and Linux ports: it took years to get a port working, and by the time it did, everybody had a newer Android device (or whatever). A few people still use the HD2, and I still see questions on our (HTCLinux) Facebook page, but for the most part even XDA has moved on.
The 'current and future hardware' part is what Linux is really about: it's an abstraction layer over the specific hardware, so that the user experience can be implemented once and used on many devices. Unfortunately, the lack of support for current 3D hardware means that the abstraction is incomplete. The development processes of the OEMs also break this abstraction, with everything built around one version of the kernel, one version of the phone HAL, one version of the camera driver and module, and the rest. Android, as delivered by OEMs, is not designed to adapt to other hardware. It's open in the sense that the core code is available, but not in the sense that the higher layer can be used as is on a new device.
Support for future hardware is not necessary, but a profile that allows that hardware to be used once drivers are provided is desirable. This requires a strong abstraction between the Android userland and Linux kernel: GLES with standard DRI/DRM interfaces, UVC for the camera module (or an SoC-specific open specification much like ASoC is for audio), standard IOCTLs and sockets for network devices (wifi could use net80211 for instance), an upstream Bluetooth stack including WiPHY, HE, and LE support, with working A2DP audio routing, input devices with valid metadata in the driver so they can be probed, a supported flash filesystem, and probably some others.
Then Google takes this and builds a set of tests that verify that these requirements have been met, and states that the next 3 versions of Android will use this base.
They also provide an ABI guideline (a shared-object version) for the phone interface, audio playback, sensors, and the other aspects provided by libhardware. They make the same commitment: the next 3 versions will require this .so version of libhardware.
The OEMs get this set of guidelines and build a device to meet them, passing the tests provided by Google.
Then Google releases an OTA version of Android that works on top of this base. Then they release another, and another.
As features require it, they amend the guidelines, adding NFC for instance. Not all devices support NFC and on those that don't, the functionality is not exposed by Android.
>Android, as delivered by OEMs, is not designed to adapt to other hardware. It's open in the sense that the core code is available, but not in the sense that the higher layer can be used as is on a new device.
Don't you think that's more due to the way the ARM/mobile hardware scene works than due to Android? There isn't something like a BIOS or ACPI level of standardization to make that kind of thing happen. Also, the pace of obsolescence is way faster in the mobile space than it ever was in the PC space.
Initially it was, but Android/Google have become the establishment. And ARM-based devices are becoming our general-purpose computing devices. Whether we have the option of running the software we choose is the question of whether we have open computing devices in the future.
There actually is a solution at the hardware/firmware/driver level now: the ARM Linux community has adopted DeviceTree, specifically a variant called FDT (flattened device tree), and drivers have migrated from the one-off platform buses to FDT. Even vendors like Qualcomm are embracing it, and the time to upstream is falling.
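For readers who haven't seen it, a device tree moves board wiring out of C code and into a data file the kernel parses at boot. An illustrative fragment (the node names and part choice are just examples, though "fsl,mma8452" happens to match a real mainline driver):

```dts
&i2c1 {
    status = "okay";
    accelerometer@1d {
        compatible = "fsl,mma8452";   /* generic driver binds on this string */
        reg = <0x1d>;                 /* I2C slave address */
        interrupt-parent = <&gpio1>;
        interrupts = <5 2>;           /* pin 5, falling-edge trigger */
    };
};
```

The same driver binary then works on any board whose tree describes compatible hardware, which is exactly the decoupling the platform-bus era lacked.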
As far as bootloaders go, there are a number of open options. I also don't personally see anything wrong with EFI on ARM, as long as there is a way for the end user to choose what signatures are accepted when booting software. A standard for describing memory layouts independent of hardware is important, as that allows for the newer multi-SoC ARM Linux kernels along with DT for defining the hardware and what driver should bind to what hardware device.
There are standards that exist at this layer, as long as the OEMs and SoC makers adopt them.
The basic concepts of SoCs are not being obsoleted, and the set of devices that need to be present to boot a system is not really changing that much. Specific buses change, and newer hardware supports better DMA mapping, including IOMMUs, but the fundamentals stay pretty much the same.
My proposal takes place at a different layer though, and is more focused on the fragmentation problem of Android.
Essentially you have an ARM ABI standard that defines which registers a call into the Linux kernel uses; this used to be more complicated, with multiple ARM ABIs. Now it's pretty simple, and ARM binaries on Linux work across devices, across SoCs, across Android OEM vendors. This is not much better or worse than the same compatibility on x86/x64.
The issue lies in the "expanded ABI": the suite of drivers, libraries, daemons, configuration, RPC procedures, hardware-specific IOCTLs, and other particulars of each OEM's hardware. A Samsung camera APK isn't going to work on a device with a different sensor, even though they are both coded to Android's camera APIs. They might handle zoom differently, or lighting. They might handle sending the uncompressed stream to the DSP differently, or drawing into the LCDC differently for real-time previews. They might handle flash timing differently. But what would happen if this was a plugin for the standard Google or even AOSP camera instead? What if this was a defined API/ABI that CyanogenMod could use on that hardware, or Firefox Apps with embedded WebAPI camera media source and canvas? What about capturing from the command line with something like uvctool?
Numbers like 4.4 at 2% (from another story today) should scare Google. It should scare them when apps built for Chrome Mobile don't work on all Android devices. It should scare them when the improvements made to Google Now won't be usable by an overwhelming proportion of their users. It should scare them when others like Yahoo and Mozilla can come in, poach Android users stuck on an outdated version of Android, and build alternative economies on their own (Google's) partners' flagship devices. And they should respond by competing.