I was gifted AirPods, but I'm in the Android ecosystem. There's a huge difference between what I can configure on an Apple device versus my Android devices. For example, on an Apple device I can enable features that help me hear better in noisy environments, which would be nice to have.
I wonder whether they're using a proprietary configuration API that's deliberately kept secret, or whether no Android devs have figured out how to reverse engineer it yet (which seems very unlikely).
If it's the former, as seems likely, I'd like someone to address that too, because once they're getting into hearing-assistance features, accessibility becomes important.
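For what it's worth, parts of this have been reverse engineered: open-source Android apps like OpenPods and CAPod read battery and in-ear status from the BLE beacons AirPods broadcast as Apple manufacturer-specific data (company ID 0x004C). Here's a minimal Kotlin sketch of that approach; note that the 0x07 "proximity pairing" message type and the payload layout come from community write-ups, not anything official.

```kotlin
// Sketch: listening for AirPods status beacons on Android.
// Assumes the community-reverse-engineered format (see OpenPods/CAPod):
// AirPods advertise Apple manufacturer-specific data (company ID 0x004C),
// where a leading 0x07 byte marks the "proximity pairing" message carrying
// battery/in-ear state. Requires BLUETOOTH_SCAN (and location permission
// on older Android versions).
import android.bluetooth.BluetoothManager
import android.bluetooth.le.ScanCallback
import android.bluetooth.le.ScanFilter
import android.bluetooth.le.ScanResult
import android.bluetooth.le.ScanSettings
import android.content.Context

private const val APPLE_COMPANY_ID = 0x004C
private const val PROXIMITY_PAIRING = 0x07 // per community write-ups, unofficial

fun startAirPodsScan(context: Context) {
    val scanner = context.getSystemService(BluetoothManager::class.java)
        .adapter.bluetoothLeScanner ?: return // null if Bluetooth is off

    // Empty data array: match any advertisement carrying Apple's company ID.
    val filter = ScanFilter.Builder()
        .setManufacturerData(APPLE_COMPANY_ID, byteArrayOf())
        .build()

    val settings = ScanSettings.Builder()
        .setScanMode(ScanSettings.SCAN_MODE_LOW_LATENCY)
        .build()

    scanner.startScan(listOf(filter), settings, object : ScanCallback() {
        override fun onScanResult(callbackType: Int, result: ScanResult) {
            val data = result.scanRecord
                ?.getManufacturerSpecificData(APPLE_COMPANY_ID) ?: return
            if (data.isNotEmpty() && data[0].toInt() == PROXIMITY_PAIRING) {
                // Battery/charging/in-ear bits are packed into the bytes that
                // follow; the exact layout is only known from reverse
                // engineering and varies by model and firmware.
                println("AirPods beacon: " +
                    data.joinToString(" ") { "%02X".format(it) })
            }
        }
    })
}
```

That only gets you passive status readouts, though. Actually toggling things like noise control means speaking Apple's proprietary accessory protocol to the earbuds directly, and as far as I know the public Android SDK doesn't expose the raw Classic Bluetooth channels needed for that, which is a big part of why the Android experience lags.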