>> but if Apple creates this tool (one that can be applied to any phone, or at least a large subset of their phones), it's essentially the same as if the decryption tool were present in every phone, even ones sold before the commission of a crime.
I find this concept interesting. If Apple did have the ability to create this tool, then is there much of a difference?
I guess the difference is whether Apple were to give such a tool to the FBI, rather than creating it, extracting the data and deleting the tool. But who is to say that some clever individual can't create their own tool to achieve the same thing?
Until Apple deliver on their desire to create phones that even they can't decrypt, I would therefore consider all iPhones vulnerable.
It's true, it's a fine distinction, but there's still a significant difference between "There is no software in existence that can break into this phone and we are the only ones that could ever create such software" and "We have software that can break into the phone, but we promise not to use it. Unless someone tells us to, and we've already got dozens of such requests in the queue."
Third parties can't do anything because iPhones accept software only if it's signed by Apple. So Apple is the only party that can create software that lowers the protection level enough to allow brute-forcing.
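To illustrate the gate being described, here's a minimal sketch of a device that only installs a firmware image carrying a valid vendor signature. It uses the third-party Python "cryptography" package and made-up names; the real iOS boot chain is far more elaborate (a chain of trust rooted in the boot ROM, personalised signatures per device), so treat this as nothing more than the shape of the idea.

    # Hypothetical sketch: install firmware only if its signature verifies
    # against a public key pinned in the device at manufacture. Requires the
    # third-party 'cryptography' package.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    def accept_firmware(pinned_key: ed25519.Ed25519PublicKey,
                        image: bytes, signature: bytes) -> bool:
        """Return True only if 'image' carries a valid vendor signature."""
        try:
            pinned_key.verify(signature, image)
            return True
        except InvalidSignature:
            return False

    # Demo: only images signed with the matching vendor private key pass.
    vendor_private = ed25519.Ed25519PrivateKey.generate()
    vendor_public = vendor_private.public_key()
    image = b"firmware-with-weakened-passcode-limits"
    signature = vendor_private.sign(image)

    assert accept_firmware(vendor_public, image, signature)
    assert not accept_firmware(vendor_public, image + b"tampered", signature)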
And Apple can't directly decrypt the phone: what it can do is remove the brute-force counter and/or the exponential delay. Both are essential parts of a security architecture built around letting users unlock the phone with a 6-digit code (4 digits on non-Touch ID phones like the one under discussion), which in turn is a UX compromise aimed at the goal of having 90% of a billion iOS devices with a passcode lock enabled.
Allowing a remote brute force basically undermines the whole security architecture.
To make a phone that isn't brute-forceable, you need to give up on the 6-digit concept in favour of a longer alphanumeric passcode, which is already an option (I turned it on on my phone) but is highly unlikely to gain widespread acceptance.
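To make concrete what those counters and delays buy, here's a rough sketch of an escalating-delay policy; the thresholds are illustrative, loosely modelled on the delays iOS applied at the time, and not Apple's actual implementation.

    # Illustrative escalating-delay policy (approximate thresholds, not
    # Apple's code). Removing this logic is exactly what turns a short
    # numeric passcode into a trivially guessable secret.
    DELAYS = {           # failed attempts -> enforced wait before next try
        5: 60,           # 1 minute
        6: 5 * 60,       # 5 minutes
        7: 15 * 60,
        8: 15 * 60,
        9: 60 * 60,      # 1 hour
    }
    WIPE_AFTER = 10      # optional setting: erase the keys after 10 failures

    def delay_for(failed_attempts: int) -> int:
        """Seconds an attacker must wait before the next guess."""
        if failed_attempts >= WIPE_AFTER:
            raise RuntimeError("device wiped, data unrecoverable")
        return DELAYS.get(failed_attempts, 0)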
Apple can decrypt the phone -- once they remove the brute-force protections, it would be trivial for them to brute-force the typical small phone-unlock keyspace.
The FBI hasn't asked for this since they can trivially do it themselves, but Apple could certainly do it if they wanted to.
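To put numbers on "trivial": assuming the roughly 80 ms per guess that Apple has cited for the hardware-entangled passcode key derivation, and no retry counter or delays, a back-of-the-envelope estimate looks like this.

    # Rough worst-case guessing time once the retry limits are gone.
    # Assumes ~80 ms per try (Apple's cited key-derivation time); actual
    # throughput on a given device may differ.
    SECONDS_PER_GUESS = 0.08

    for digits in (4, 6):
        keyspace = 10 ** digits
        hours = keyspace * SECONDS_PER_GUESS / 3600
        print(f"{digits}-digit code: {keyspace:,} guesses, "
              f"worst case ~{hours:.1f} hours")
    # 4 digits: about 13 minutes; 6 digits: under a day.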
And if they are forced into creating this hack, the next request will be to force them to decrypt the phone too: once manufacturers can be coerced into doing anything the government demands in a warrant, what restriction would prevent the government from asking for full decryption?
Brute-force protection doesn't mean giving up on a 6-digit passcode. It means making the hardware security module completely hardware-only, with no way to alter its software (or at least not without the unlock code), so that after too many brute-force tries the hardware wipes its copy of the key. That is essentially the same as wiping the disk, but it can't be intercepted by an iOS software change. For added security, it could have a timer: if the phone is not unlocked within X days, it wipes its key, so if someone steals your phone it's completely unrecoverable after X days.
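As a toy model of that design (purely illustrative; a real secure element would enforce this in silicon, not in OS-updatable code, and the limits here are placeholders):

    import time
    from typing import Optional

    # Toy model of the hardware-only design described above: the element
    # holds the key, counts failed attempts, wipes the key after too many
    # tries, and also wipes it if no successful unlock happens for X days.
    class SecureElement:
        MAX_TRIES = 10        # placeholder limit
        MAX_IDLE_DAYS = 30    # placeholder dead-man timer

        def __init__(self, passcode: str, key: bytes):
            self._passcode = passcode
            self._key: Optional[bytes] = key
            self._failures = 0
            self._last_unlock = time.time()

        def _idle_wipe(self) -> None:
            if time.time() - self._last_unlock > self.MAX_IDLE_DAYS * 86400:
                self._key = None                   # dead-man timer fired

        def unlock(self, guess: str) -> Optional[bytes]:
            self._idle_wipe()
            if self._key is None:
                return None                        # key already destroyed
            if guess != self._passcode:
                self._failures += 1
                if self._failures >= self.MAX_TRIES:
                    self._key = None               # wipe: like wiping the disk
                return None
            self._failures = 0
            self._last_unlock = time.time()
            return self._key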
Thank you, the bit I hadn't appreciated was that any software to do this has to be signed by Apple. So yes, only Apple can (but won't) do this, which makes it secure provided Apple don't change their mind and aren't forced to.
Once the government can force Apple to create software against their will, why do you think they won't also force Apple to hand over signing keys that let them target any phone they want? I don't see how the slippery slope can end at one phone (and that slope is already sliding to encompass dozens of phones - when it reaches hundreds or thousands of phones, the government is going to decide that it needs to be able to target phones on its own, without Apple in the way).