
The scenario they're protecting against appears (to me) to be a bad actor changing the hardware. If the Touch ID sensor is bad, and you propose falling back to a PIN, who's to say the digitizer isn't compromised too and is recording touches?

Sounds a bit like a downgrade attack if they didn't fail fast and hard at the earliest opportunity.



> If the Touch ID sensor is bad, and you propose falling back to a PIN, who's to say the digitizer isn't compromised too and is recording touches?

So why not brick the phone out of the box? Even if the fingerprint scanner is original, somebody may have compromised the digitizer anyway.


I don't think they are protecting against that scenario so much as not accounting for it. I expect Apple's assumption is that they provide all components for their devices. People who install unauthorised third party components can no longer have those devices serviced by Apple — so it no longer matters to Apple whether those devices are compromised, because they aren't really "Apple" devices at that point anyway.

This botched handling of replaced hardware that hasn't been paired with the Secure Enclave ties into the above. Apple doesn't expect people to replace such hardware through a third-party, so they don't think to engineer their software to fail gracefully when it happens.


There's no need to be so hostile and assume malice when there are plenty of perfectly sound explanations otherwise.

Apple is a fairly security-conscious company now, so security tradeoffs should not be a surprise.


Assume malice? How do you mean?


There is only one valid reason to authenticate the fingerprint scanner before using it, and that is to prevent the use of aftermarket replacements.

No matter what the motives behind this mechanism were, it was put in place exactly to prevent 3rd party scanners from working.

And if they implemented authentication and didn't even test what happens if it fails, then well... how do they know it works at all?


If you have a secure enclave within the device, then any hardware which has a direct connection to that secure enclave must be authenticated. It doesn't matter about aftermarket replacements.

The entire purpose of the secure enclave is defeated if it trusts any hardware connected to it.

I'm not saying they didn't test what happens when it fails. I'm saying they didn't do user testing on what happens when it fails. I'm sure the engineers tried out the hardware authentication system. They just didn't test the whole scenario once iOS was sitting on the end product.

So yes, it was put in place to stop any hardware that could not be trusted from accessing users' secure data. But no, it was not done to prevent aftermarket replacements.

The only reason I can see Apple caring about aftermarket replacements is that they are often low quality and end up sending customers back to Apple after an unauthorised repair. (I've witnessed this more than once in an Apple Store: someone comes in whose screen was replaced outside Apple and whose touch digitiser is failing, and Apple just sends them away.)


> If you have a secure enclave within the device, then any hardware which has a direct connection to that secure enclave must be authenticated.

Consider reading the description of iOS security features linked somewhere in this thread.

Because what you are describing is a disaster, not security. If some off-chip sensor had access to fingerprint data or crypto keys, anybody capable of installing such a chip would also be able to simply dump all the data themselves in the comfort of their lab.


If I understand this correctly (not an iPhone person), the Touch ID sensor is just a fingerprint scanner?

As a standalone measure, biometrics make a shitty password substitute because you can't change a fingerprint if it's compromised. So shouldn't the iPhone be secured on the premise that the fingerprint scanner is already compromised, and hence losing it should not qualify as a downgrade attack?


Touch ID is a fingerprint scanner, but the Touch ID system is paired with the "Secure Enclave" in Apple's AX chips.

Secure Enclave is a separate coprocessor running its own L4-based microkernel. This hardware is directly paired with security-sensitive hardware (Touch ID, Apple Pay NFC chip, etc). It provides all cryptographic operations for data protection key management and maintains the integrity of data protection even if the kernel has been compromised.

So when you stick a third-party Touch ID sensor in an iPhone it's obviously not going to be paired with the secure coprocessor. It doesn't really matter whether biometrics are shitty passwords, the iOS update process realises there is compromised hardware touching the Secure Enclave and fails in the worst possible way for the user.
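Conceptually, the pairing check boils down to the sensor proving it holds a key that was provisioned for that specific enclave. A loose Swift/CryptoKit sketch of the idea (the real handshake isn't published, so the key sizes and flow here are assumptions, not Apple's implementation):

    import CryptoKit
    import Foundation

    // Hypothetical pairing check: only a sensor holding the factory-provisioned
    // key can answer the enclave's challenge.
    let provisionedKey = SymmetricKey(size: .bits256)   // assumed shared at pairing time

    // Enclave side: issue a random challenge.
    let challenge = Data((0..<16).map { _ in UInt8.random(in: .min ... .max) })

    // Sensor side: prove possession of the provisioned key.
    let response = HMAC<SHA256>.authenticationCode(for: challenge, using: provisionedKey)

    // Enclave side: a third-party replacement without the key fails here,
    // which is the condition the update process chokes on.
    let sensorIsPaired = HMAC<SHA256>.isValidAuthenticationCode(response,
                                                                authenticating: challenge,
                                                                using: provisionedKey)
    print("Sensor paired:", sensorIsPaired)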


Does this mean your fingerprint never leaves the scanner's coprocessor and is inaccessible to iOS and the other processors and OSes within the phone?


Basically, yes. The Secure Enclave is hardware isolated from the rest of the chip.

Apple's own security guide explains it best [1]:

> The Secure Enclave is responsible for processing fingerprint data from the Touch ID sensor, determining if there is a match against registered fingerprints, and then enabling access or purchases on behalf of the user. Communication between the processor and the Touch ID sensor takes place over a serial peripheral interface bus. The processor forwards the data to the Secure Enclave but cannot read it. It’s encrypted and authenticated with a session key that is negotiated using the device’s shared key that is provisioned for the Touch ID sensor and the Secure Enclave. The session key exchange uses AES key wrapping with both sides providing a random key that establishes the session key and uses AES-CCM transport encryption.

Regarding the actual fingerprint storage, it looks like the encryption key is kept in the Secure Enclave and the entire decryption and verification process occurs within the Secure Enclave. However the encrypted data itself may be stored outside the Secure Enclave:

> The raster scan is temporarily stored in encrypted memory within the Secure Enclave while being vectorized for analysis, and then it’s discarded. The analysis utilizes sub-dermal ridge flow angle mapping, which is a lossy process that discards minutia data that would be required to reconstruct the user’s actual fingerprint. The resulting map of nodes is stored without any identity information in an encrypted format that can only be read by the Secure Enclave, and is never sent to Apple or backed up to iCloud or iTunes.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf
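To make the quoted key exchange a bit more concrete, here's a heavily simplified CryptoKit sketch of the wrap-then-encrypt pattern it describes. CryptoKit doesn't expose AES-CCM, so AES-GCM stands in for it, and generating the session key on one side is a simplification of the "both sides providing a random key" step; none of this is Apple's actual implementation:

    import CryptoKit
    import Foundation

    do {
        // Key provisioned for this sensor/enclave pair at the factory (assumed size).
        let provisionedKey = SymmetricKey(size: .bits256)

        // Establish a session key and wrap it under the provisioned key,
        // so only the paired peer can recover it.
        let sessionKey = SymmetricKey(size: .bits256)
        let wrappedSessionKey = try AES.KeyWrap.wrap(sessionKey, using: provisionedKey)

        // Receiving side unwraps it with the same provisioned key.
        let unwrapped = try AES.KeyWrap.unwrap(wrappedSessionKey, using: provisionedKey)

        // Scan data then travels under authenticated encryption
        // (AES-GCM here, standing in for the AES-CCM Apple describes).
        let scanFrame = Data("raster scan bytes".utf8)
        let sealed = try AES.GCM.seal(scanFrame, using: sessionKey)
        let recovered = try AES.GCM.open(sealed, using: unwrapped)
        print(recovered == scanFrame)   // true
    } catch {
        print("crypto error:", error)
    }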


Yeah, that's the source of the 'sub-dermal ridge flow angle mapping' quote, which at the time was described as 'how we know it's really your finger', along with supposedly measuring 'micro RF fields' to ensure it was a live finger.

Except it could be defeated (initially, at least) by a laser-printed fingerprint on a piece of paper.


Yes, at least according to Apple: https://www.apple.com/business/docs/iOS_Security_Guide.pdf (page 7).

"The Secure Enclave is responsible for processing fingerprint data from the Touch ID sensor, determining if there is a match against registered fingerprints, and then enabling access or purchases on behalf of the user. Communication between the processor and the Touch ID sensor takes place over a serial peripheral interface bus. The processor forwards the data to the Secure Enclave but _cannot_ read it".


Yes. The OS can only tell if the fingerprint matched or not; the checking is done solely in the Secure Enclave. Presumably the OS also gets told which enrolled fingerprint matched?
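That split shows up in the public API too: through LocalAuthentication, an app only ever gets a success/failure back. A minimal sketch (the reason string is just a placeholder):

    import LocalAuthentication

    let context = LAContext()
    var authError: NSError?

    if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &authError) {
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock the example") { success, error in
            // `success` is all the app learns: no fingerprint data, and no
            // indication of which enrolled finger matched.
            print(success ? "Matched" : "No match: \(error?.localizedDescription ?? "unknown")")
        }
    }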


I'm pretty sure there was no direct malice here (except for the usual Apple disregard for 3rd party components), but the security reasoning behind this move is still questionable. The only danger of a compromised Touch ID sensor is that it could record your fingerprint and let attackers replay it to access all your encryption keys in the Secure Enclave.

That would be a huge vulnerability if there weren't thousands of other ways to record your fingerprint; most of them are less accurate than a trojan Touch ID sensor, but they're also much easier to pull off.

And at the very worst, if a sophisticated malicious actor got the chance to meddle with your phone, they could just skip the Touch ID sensor altogether and install a stealthy fingerprint digitizer in the touch screen or on the back of the phone.

So in short, Apple's security measure, if my understanding is correct, does absolutely nothing to protect the user.


THE critical security system of your phone has been tampered with. PIN data, Touch ID data, crypto data and everything else related to security is on that same bus. You do not send secure info over that channel.


Aha, that makes more sense.



