A New Standard for Mobile App Security (googleblog.com)
75 points by theafh on April 15, 2021 | hide | past | favorite | 34 comments



> The Internet of Secure Things Alliance (ioXt) manages a security compliance assessment program for connected devices. ioXt has over 300 members across various industries, including Google, Amazon, Facebook, T-Mobile, Comcast, Zigbee Alliance, Z-Wave Alliance, Legrand, Resideo, Schneider Electric, and many others.

Apple being noticeably absent from this list while Facebook is on it speaks volumes.


This is interesting to me because Apple pushes security so hard for connected devices. My understanding of the early HomeKit tech was that it was in fact prohibitively secure, and most low-end manufacturers were unable to meet the security bar at the price they were targeting.

This could be a bad indication about Apple, but I think I'd end up interpreting this as a bad indication about ioXt. Big consortiums like this often end up creating a lowest common denominator, and Apple are known for not bowing to pressure to engage with things like this where they believe they can do better.


It's more like Apple loves to push their own standards on products and then hide them behind NDAs.

The amount of hoops you had to jump through to make an MFi Bluetooth device was insane.


It's more like Apple loves to push their own standard on products

People said the same thing about Apple and USB. iPod and FireWire. MacBooks and Bluetooth. Apple not supporting HD-DVD. Apple not adding Blu-Ray to SuperDrive. Apple and DisplayPort. Apple when it released its first Airport.

It didn't matter that other companies were also using the same standards and methods. People are hypersensitive to what Apple both does and does not do.


I think that is the parent's point...


Yeah, I wasn't sure which way it was leaning, so I thought I'd add some more details to the discussion.


Looking at the details for an ioXt Certified Mobile Application [0] vs. the details for an iOS app [1], at first glance the ioXt specifications do not include privacy details like the iOS App Store does.

[0] - https://compliance.ioxtalliance.org/product/173

[1] - https://apps.apple.com/us/app/facebook/id284882215


Apple is a member of Zigbee Alliance / Connected Home over IP. Also https://github.com/project-chip/connectedhomeip#connected-ho...


In which way? In that this alliance deserves a lot more skepticism (if Apple of all companies didn't join)? Or in that Apple should be more supportive of securing open, non-Apple platforms (if Facebook of all companies joined)?


The former. Apple is supportive of an open, non-Apple platform for IoT / connected home stuff: CHIP (Connected Home over IP).


> Apple being noticeably absent from this list while Facebook is on it speaks volumes.

My thoughts, exactly.


Funny how the S in ioXt stands for security...


A lot of this seems to focus on certification of the cloud things that you have to use with most IoT devices.

Since HomeKit operates locally, with the exception of remote access to your home through a HomePod/Apple TV/iPad hub, could that be why Apple hasn’t shown an interest in this?


Subterfuge has been a successful strategy. Take on the mantle of something people are keen on, water it down to your advantage and push it as being superior.


The standard itself looks kind of good, honestly! Have a look at 4.2 (pg 9):

https://static1.squarespace.com/static/5c6dbac1f8135a29c7fbb...

I think tech as an industry has come quite a long way in the past few years of clarifying (and hopefully adopting) what security best practices are. This, OWASP, and other similar initiatives are good: you can have a rough checklist of what's required to have a roughly "secure" app, and what the common loopholes are. This is not easy stuff.

I don't know what this means for the institutional/enterprise side, whether certification will be meaningless, etc. But the document itself seems relatively sensible to me!


Their certification requirements do seem solid, but the concerning thing about ioXt is that, if you look at the list of certified devices / apps, there's a grand total of 3 pages containing fewer than 50 entries. Given the large list of members, it seems odd to have so few issued certifications.


This is kind of a ridiculous announcement, but I appreciate what they're trying to do. I find it ironic that they're trying to improve security & privacy for app users, but then the first apps they certify are almost exclusively VPNs, which are some of the worst for user privacy.

VPN vendors collect all kinds of data on their users and are sometimes even backed by intelligence agencies. Sure, use them to get around region restrictions for something uncontroversial, but don't send all your traffic through them and expect privacy.

I also see that they didn't tackle the hardest part of mobile app security -- the backend services. Many apps scrape data from the device and then push it to the service, where it is logged (for how long?) and reused in who knows how many ways. The lack of transparency around backend processing is the real problem for app users.

How many users have had their data exposed by an open S3 bucket or database versus by a vulnerability in the app code itself?


I bet mandatory device updates isn't part of the list.



I stand corrected, let's see if Google goes beyond 3 + 1 on their devices.


Isn't this just STIGs by another name?


"The ioXt Mobile Application Profile [...] leverages existing standards and principles set forth by OWASP MASVS"

In a way DoD (STIGs), NIST, Mitre, SANS, OWASP are all different takes on the same general idea.



I wouldn't be surprised at all if this turns out to be a way of locking out users and destroying right-to-repair/modify... all in the name of "security".


What? It's a self-certification checklist for apps (really the cloud services behind the apps, it only has one section about the app itself and one of the items is just "provide a privacy policy"). Nothing to do with the OS or devices.

The document the post is about: https://static1.squarespace.com/static/5c6dbac1f8135a29c7fbb...


The problem is that it mixes good recommendations with user-hostile ones. I have no qualms about things like "Detect and throttle guessing attacks" and "Require authentication for remote services containing user data.", but then there's also...

SE1.1 End of life notification policy is published

SE1.2 Expiration Date is published

Planned obsolescence.

AA4 Security Updates applied automatically, when product usage allows

VS4 Anti-Rollback

User-control and herding. "You want this feature we removed? Too bad, fuck you."

SI113 Enforce x509 certificate pinning for primary services.

You can't easily MITM and see what data it's exfiltrating.
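For context on what that pinning requirement amounts to: a pin is usually just a hash of the server key the app will accept. Here's a minimal sketch of computing an Android/HPKP-style SPKI pin — the key bytes below are made up for illustration, not a real certificate:

```python
import base64
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """Compute a pin-sha256 value: base64(SHA-256(SubjectPublicKeyInfo DER))."""
    return base64.b64encode(hashlib.sha256(spki_der).digest()).decode("ascii")

# Dummy bytes standing in for a real certificate's SubjectPublicKeyInfo.
fake_spki = b"\x30\x82\x01\x22" + b"\x00" * 16
print("pin-sha256:", spki_pin(fake_spki))
```

The app embeds that digest (e.g. in an Android network security config `<pin digest="SHA-256">` entry) and rejects any TLS chain that doesn't contain a matching key — which is exactly why a user-installed MITM proxy CA stops working against a pinned app.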


What is it that suggests that might be the case?


https://developer.android.com/training/safetynet

When they say "security threats" they really mean "root access".


History.


It may spell "security threats" but it reads "user control"


We don't need a security standard for apps from bluechip companies, we need better vetting of apps uploaded by nobodies to the app stores. And the Chrome plugin store while we're at it. I always thought Google should have a score system for Android developer accounts. You need to earn trust to have access to sensitive Android APIs.


What's your desired process for earning trust?


New Google accounts start off with little trust. New apps start off with little trust. Established apps should have good trust along with Playstore verified developers. Android is destroying a lot of great file manager apps with the new scoped storage. These are great apps that have been in the store for 5 years plus. Surely these can be trusted.

Of course, there'll be ways around this, but it would stop a lot of dodgy apps. Granted, this is a lot harder than it sounds, as you'd need all Android apps that want access to sensitive APIs to interact with the Play Store API. But it's better than Android just imposing system-wide security changes that break a load of apps.
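To make the idea concrete, here's a toy sketch of the kind of scoring being proposed — every signal, weight, and threshold here is invented for illustration, not anything Google actually exposes:

```python
from dataclasses import dataclass

@dataclass
class DeveloperAccount:
    account_age_days: int
    published_apps: int
    oldest_app_age_days: int
    policy_strikes: int

def trust_score(dev: DeveloperAccount) -> int:
    """Hypothetical 0-100 trust score; a higher score would unlock more sensitive APIs."""
    score = 0
    score += min(dev.account_age_days // 365, 5) * 10    # up to 50 pts for account age
    score += min(dev.published_apps, 5) * 4              # up to 20 pts for track record
    score += min(dev.oldest_app_age_days // 365, 3) * 10 # up to 30 pts for app longevity
    score -= dev.policy_strikes * 25                     # policy violations hurt a lot
    return max(0, min(score, 100))

# A five-year-old file manager developer vs. a month-old account:
veteran = DeveloperAccount(2000, 3, 1900, 0)
newcomer = DeveloperAccount(30, 1, 10, 0)
print(trust_score(veteran), trust_score(newcomer))  # 92 4
```

Under a scheme like this, the long-standing file manager apps mentioned above would clear the bar for scoped-storage exemptions automatically, while brand-new accounts would have to earn it.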


What if you want to make an app that doesn't work without a trust feature?

Can you sell your trusted account?



