
Sadly, a lot of people prefer this over an open environment: locked-down environments usually work out of the box, and consuming recommended information is easier than searching for it yourself. Centralization brings convenience at the expense of freedom, but it is much easier for us to feel convenience (or be frustrated by inconvenience) than to understand the importance of freedom.



I'm not sure whether people prefer it or whether they are strongly steered towards that environment.

My Apple Watch is super cool from a hardware perspective, but so locked down that I can't use it in the ways I would like to. For example, it has a barometric pressure sensor, but my apps can't read the sensor data directly; instead, a filtered update is pushed to the app approximately every 1.5 seconds. Why? I know the sensor is capable of sampling at 20+ Hz.
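For reference, the public barometer interface on watchOS/iOS is CoreMotion's CMAltimeter, and it is push-only: you register a handler and the OS delivers filtered updates on its own schedule. A minimal sketch:

    import CoreMotion

    let altimeter = CMAltimeter()

    if CMAltimeter.isRelativeAltitudeAvailable() {
        // The OS decides the delivery rate (roughly every 1.5 s in practice);
        // there is no public call to poll the raw sensor at 20 Hz.
        altimeter.startRelativeAltitudeUpdates(to: .main) { data, error in
            guard let data = data else { return }
            print("pressure: \(data.pressure) kPa, altitude delta: \(data.relativeAltitude) m")
        }
    }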


Hardware data being routed through a centralised service that dishes it out on its own schedule is almost always about battery saving.


So that someone doesn't write an app that polls at 20+ Hz and burns the battery. Apple optimizes for user experience and simplicity, and that includes precluding bad behavior. If you want a real-time weather station, a smartwatch is not the right tool.

My kettle heats water; why can't I wire it up to be my whole-house heater?


I mentioned it specifically because I write software for hobby devices that poll at 20 Hz and use literally the same Bosch sensors as Apple. The use case is gliding variometers (audio altimeters).

The sensor that is in the Apple Watch draws significantly less than 1 mA when polled at 20 Hz. Without an EE degree, I have my devices, including a '90s-era processor and a piezo speaker, running for 100+ hours on a 150 mAh button cell.

I cite this example because I KNOW what is possible. This is a pure software issue.

I suspect that Apple rate-limits because the raw sensor data is quite noisy and would look glitchy in a badly designed app. But there is a lot of signal in that noise that I want access to. Instead, people in the gliding hobby spend hundreds on devices that have the same sensor package as an iPhone 6 but expose the sensors in a way that is actually useful.
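To make the variometer use case concrete, here is a hedged sketch of what you do with 20 Hz pressure samples: convert each to pressure altitude with the standard ISA barometric formula, differentiate, and apply your own low-pass filter to pull the climb/sink signal out of the noise. readRawPressure() is a hypothetical stand-in for an I2C read of the Bosch sensor, and the smoothing constant is illustrative:

    import Foundation

    // Hypothetical stand-in for an I2C register read of the Bosch sensor;
    // returns station pressure in pascals.
    func readRawPressure() -> Double {
        return 101_325.0  // placeholder value for illustration
    }

    // Pressure (Pa) to pressure altitude (m), ISA barometric formula.
    func pressureAltitude(_ p: Double, seaLevel p0: Double = 101_325.0) -> Double {
        return 44_330.0 * (1.0 - pow(p / p0, 0.1903))
    }

    let dt = 0.05                  // 20 Hz sample period
    let alpha = 0.1                // illustrative low-pass constant
    var smoothedVario = 0.0        // vertical speed, m/s
    var lastAltitude = pressureAltitude(readRawPressure())

    for _ in 0..<100 {
        Thread.sleep(forTimeInterval: dt)
        let altitude = pressureAltitude(readRawPressure())
        let rawVario = (altitude - lastAltitude) / dt   // noisy at 20 Hz
        smoothedVario = alpha * rawVario + (1 - alpha) * smoothedVario
        lastAltitude = altitude
        // Drive the piezo beeper from smoothedVario here.
    }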


Well, some of my friends prefer Apple's walled garden because they think the applications there are better than open-source ones, and they consider policies requiring Apple to allow third-party app stores dangerous to users...


I think the parent's point was that they might think that way because of Apple marketing, not any kind of innate preference.


There can be many answers to a "why" question like this. While we see "a sensor" with properties like data format, refresh rate, etc., that sensor is merely an "implementation" of functionality the product design calls for.

It used to be that the designed feature or function would be very close to the implementation, but that really hasn't been the case for a very long time. People aren't buying "a large bank of memory addresses" but rather "a device that contains pictures", for lack of a better example.

With the watch, a customer isn't actually buying a package of sensors, ARM cores, a BMS, a lithium-ion battery, and a display; they are buying the experience of a device that tells the time, notifies them when something happens, and tracks some aspects of their life so they have an overview of it later (be it for turning their life into a game or simply for tracking their energy use/consumption). And all of that has to last at least an entire day.

Why would the implementation result in a sensor that can be polled at a high frequency but whose data is only pushed at a lower frequency? It's anyone's guess, but here is mine:

The sensor has its own specs, but those are measured in isolation and may differ once the part is mounted inside a casing, so the only way to get true data is some form of calibration or offset: a low-power CPU core dedicated to sensor tasks reads the raw values and applies the correction. Then there is power consumption: they may have found the best-balanced duty cycle between data that has had enough time to settle and be useful, and the power requirements of the sensor core and the sensor itself. So they have some sort of RTOS doing the reading and processing on a low-power core at a lower interval to gain a 1% battery-life increase. Do that for 10 sensors and suddenly it's worth it.

It's quite an investment to have a team of people dive into the hardware, firmware, and application development to do all that, so it's likely not a matter of "how can we spend a multi-million-dollar R&D chunk on making the hardware less useful", but rather "how do we make millions of mass-produced devices use a little bit less power".
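A back-of-the-envelope version of that duty-cycle argument, with purely illustrative numbers (not Apple's actual figures):

    // Purely illustrative numbers, not Apple's actual figures.
    let activeCurrent = 0.7      // mA: sensor + low-power core while sampling
    let sleepCurrent  = 0.003    // mA: everything asleep
    let dutyCycle     = 0.05     // awake 5% of the time

    let averageCurrent = activeCurrent * dutyCycle +
                         sleepCurrent * (1 - dutyCycle)
    // ~0.038 mA average, versus 0.7 mA if the sensor path ran continuously.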

This is also where the push vs. pull distinction comes from: instead of having every application do its own interrupts or scheduling, you ask to join a list of observers and get notified when the data changes. That's much more efficient, and if everyone has to do it that way, there is a much smaller chance of the user experience suddenly degrading and of support personnel (phone, in-store) getting complaints about something they can't fix because a third-party app caused it.
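The "list of observers" model described above is essentially the classic observer pattern. A minimal sketch of how such a sensor service might expose it; the names here are hypothetical, not Apple's actual API:

    // Hypothetical sensor service: one sampler, many observers.
    final class PressureService {
        private var observers: [(Double) -> Void] = []
        private var latest = 0.0

        // Apps register once instead of each scheduling their own polls.
        func addObserver(_ handler: @escaping (Double) -> Void) {
            observers.append(handler)
        }

        // The service samples on its own schedule and pushes to everyone.
        func publish(_ pressure: Double) {
            guard pressure != latest else { return }  // only notify on change
            latest = pressure
            observers.forEach { $0(pressure) }
        }
    }

    let service = PressureService()
    service.addObserver { p in print("app A sees \(p) kPa") }
    service.addObserver { p in print("app B sees \(p) kPa") }
    service.publish(101.3)   // one sensor read, every observer notified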





