In cases like this it is almost always the FBI and the government that have the PR upper hand. They wait for a particularly abhorrent crime that nobody sane would want to defend (terrorism seems to work well today) and use that as an example. "Oh look everyone, Apple doesn't want to fight terrorism. We are keeping people safe; what side are they on?" It is almost too easy.
Fighting that is an uphill battle. One can present the technical details ("They could have cracked that particular model themselves") or appeal to more general ideals of freedom, privacy, and so on. Those typically are not as effective at convincing the average Joe when the other side uses the "T" word.
But here they are playing the same card as the FBI -- invoking a crime that most people can actually fear: their phone getting stolen, their identity being used. Everyone has heard the stories, or at least has friends this has happened to. So it works well. Terrorism is scarier, but this is more real. Great work.
Good PR, and also grounded in truth. If the DOJ understood the technology and implications of what they were requesting then they would not ask for it. Literally every person who speaks out against Apple begins with "I'm not a tech expert, but I know our world is going to be different if we can't get warranted access to criminals' phones". They completely fail to see how the private tech sector has fought for public safety. Comey said,
They [Apple] sell phones, they don't sell civil liberties, they don't sell public safety, that's our business to worry about. [1]
> If the DOJ understood the technology and implications of what they were requesting then they would not ask for it.
Oh, they understand what they are asking; they just don't care if the average person's data is stolen every once in a while, as long as they can get all the information they want when they want it. This is pretty similar to how the NSA can buy zero-day exploits and not feel the need to inform the American people.
Their "truth" is a little sketchy. Their argument is essentially "you can't trust us to protect this software if we make it, and if we failed to keep it out of criminals hands you could be harmed by it". Phrased that way it doesn't really sound all that great.
That is FBI director Comey's statement, not Apple's. Comey says,
Apple is highly professional in protecting its own innovation, its own information [1]
Comey even articulates Apple's position clearly,
Apple's argument, I think will be, that's not reasonable because there are risks around that. Even though we're good at this, it could still get away from us. And the judge will have to figure that out [1]
So Comey claims Apple is very good at technical stuff and so they must have a way to keep this software secure. Yet at the same time he claims we should not believe Apple when Apple says this backdoor would weaken everyone's security. On one hand he is saying we should believe in Apple's technology expertise, and on the other hand he is saying we should not. Well, given that Comey is self-admittedly not knowledgeable about technology, and given that many other tech companies and independent tech experts have stood up to support Apple, it's clear to me who to trust on this issue about tech security.
Apple has demonstrated their commitment to safety by improving the iPhone's security with successive iterations of software and hardware. The FBI has spent over a year demonstrating its deep desire for guaranteed backdoor entry to devices, yet claims this case is only about one phone. Apple's track record and rhetoric are much more consistent and trustworthy than the FBI's.
Framing the FBI as a major threat to national security is serious hardball, and publishing that piece in the FBI's hometown paper is a roundhouse kick to the head. I'm pretty sure the FBI didn't see this coming.
Are you implying it's Apple's fault they must defend themselves against the DOJ's attempts to get a backdoor?
Apple's job here is to educate the public. If this ever becomes a political issue, we want the public to be informed. We don't want the public to be reactionary.
Criminals already know how to use tech. We need a security force who can figure out how to maintain public safety without relying on backdoors.
How is Apple teaching them anything? The FBI can't outlaw math. The real bad guys are already using encryption apps that neither Apple nor other US corporations control. And if people start getting busted while using iPhones, they'll just switch to Cryptocat, Cypherpunks OTR, or any of the other encryption services. Maybe they'll just use PGP and email.
What do we think the likelihood is that at least some of the three-letter agencies already have the capability to get into a locked iPhone (likely their method would be based on some sort of baseband vulnerability) and this whole thing is simply a bid to gain the ability to openly unlock these phones?
A baseband vuln seems likely. Additionally, I've always wondered if the NSA or the like could just decap the chips and read out data (i.e., the passcode hash). Then again, I'm not a computer engineer and don't really know anything about the physics of all this!
EDIT: This paper [1] seems to document such an attack, decapping the chip and reading the memory cells using lasers (although the details are over my head).
I think most of us are fine with the government being able to decap our phones and steal data, or at least try. I'm fine with that because it costs money and time and would be difficult for them to do in a massive, dragnet-style operation.
The idea is once they have the phone they can do whatever they want with it, but they can't go to a manufacturer and force them to weaken the security of _all_ phones because they don't want to do any work.
Yeah, I'm completely fine with that as well - I'm happy to see the NSA do computer-security stuff that isn't this dystopian "monitor everybody all the time" nonsense.
If I were in their place (FBI, NSA) I would infiltrate agents into all the big companies to place hidden vulnerabilities in the code. Some of the employees could be contacted covertly and convinced to "work for their country" and "be real patriots". If they aren't doing this, they aren't doing their job as spies.
The same is true for China and other major powers. There are enough foreign-born people working at these companies to pick from. So the NSA is basically just keeping up with everyone else. Even smaller groups or organizations could have a spying system in place.
There is no real safety when it comes to information, as long as we use operating systems and software created and compiled by other people. It's just trust that keeps it running.
This topic was brought up by Apple's lawyer during the hearing, in support of the theory that the FBI's request was more about setting a legal precedent than getting help with this one phone.
Such a vulnerability would almost certainly require the phone to be powered on and the boot-time PIN code to have been entered (to allow the phone to decrypt user-space data).
Its mechanism of action would be along the lines of reading the PIN code out of memory, tricking the user into entering their PIN into a fake screen, reading the decryption key out of memory, or making unauthorised calls to decrypt data.
In the case that we're all talking about, this capability would be pointless.
On the other hand, could a 3rd party create a crack that manipulated the PIN retry counter, in a way similar to what the FBI is requesting? Potentially, with the right vulnerability.
(Note, I've only skimmed the Apple white paper, so I may be wrong here).
For a few weeks Patrick Gray has been saying he has a source (or sources) claiming there is a viable attack involving pulling the NAND from the phone, reprogramming it offline, and reattaching it.
We'll have to see if the reverse engineers are able to reproduce and publish, but between this and my overall security cynicism, I lean towards there being a way in, and wouldn't be surprised if the intelligence agencies have it and are simply not sharing.
It seems Apple is committed. As strange as it may sound, I wonder how that will influence recruiting. Apple was never really on my radar as an employer (silly as it may be, I still categorize them in the "hipster and annoying Apple store" bin). Their recent hard line on privacy has made them a notch more attractive. And that's despite me usually favoring non-proprietary stuff whenever possible.
Strange to catch yourself in these thoughts but the narrative seems to work.
While I find security fascinating, I would not consider myself an expert in the low-level implementation details or hard limitations of these specific systems.
Perhaps someone more knowledgeable could clarify, but...
Isn't the entire point of good security that I don't have to "trust" Apple to do the right thing? Shouldn't it be impossible for them to do the wrong thing?
For example, if I encrypt my phone with my key and set it to securely delete after so many failures, how could Apple circumvent this in a truly secure system? Shouldn't they be unable to push an update without my permission? Permission that can't be given, because the phone can't be decrypted?
So the fact that Apple can circumvent the encryption if they want is an indication of a vulnerable system?
Apple can't circumvent its own encryption (not that we know of, anyway, and that hasn't been claimed). The FBI knows that the government has tools capable of breaking the encryption, given time; they essentially scan for the correct key.
Apple has security measures in addition to the encryption, though. In this case there are two: one is a timing system that limits the guessing rate, and the other is a 'self-destruct' system that destroys the data after too many failed guesses. These are the systems the FBI wants compromised. They would then set their normal password-cracking toolchain to work.
It's disturbing, because the government is definitely not the only party with such a toolchain. And when you increase the number of devices that can be scanned (by producing and signing a version of iOS without these protections), you increase the ability to crack at least one of them. In general 'the bad guys' have it easier, because they are rarely interested in cracking a specific phone; they are interested in cracking some phone out of a larger set.
The aim is to have these kinds of security features in the silicon, so we don't even have to trust Apple to keep the extra protections in place. But that's not the case here.
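To put rough numbers on the "crack at least one of a larger set" point above: even if any single scanned device only falls with small probability, the odds of cracking at least one grow quickly with the number of devices an attacker can try. A minimal sketch, with purely illustrative figures:

    # Illustrative only: probability of cracking at least one device out of n,
    # assuming each independent attempt succeeds with probability p.
    def p_at_least_one(p: float, n: int) -> float:
        return 1 - (1 - p) ** n

    for n in (1, 100, 10_000):
        print(n, "devices:", round(p_at_least_one(0.001, n), 3))
    # -> 0.001 for one device, ~0.095 for 100, ~1.0 for 10,000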
You're right in that the system is vulnerable, but this doesn't mean that it's not "secure" in practical terms.
One could argue that the phone is secure because even Apple has no way of recovering your passcode, nor do they have a master key that might give access to your data. That's fairly secure.
On the other hand, the system is insecure because it can have updates applied (which is what the FBI wants Apple to do) without requiring user consent, only physical access.
Imagine that updates required a user to put in their passcode, or otherwise wiped all the encrypted data; it would be more secure, but is that enough? Theoretically, there are still ways of getting at the data - someone might find a bug in the software, or they might physically crack open the CPU to get at a piece of needed data.
Security is always a balance - we want to trust Apple as little as possible, but I know of no way to create an invulnerable system.
I'm probably just not remembering it correctly but I know of no way to apply updates to a locked iPhone. Even when unlocked if you plug it into a new computer the phone will ask if you trust the new computer. Is there some method of updating the phone when it's locked I'm unaware of?
I think it's some kind of device recovery mode designed to recover from "bricked device" situations. It's not the same method that you use when updating via iTunes.
Ultimately, all security is founded on a basis of trust. Be it trust that their word is good, that their signature is difficult to forge, that the code they open-source is actually what is running, etc. We trust these companies to do right by us as their customers.
There's an entire tower of technology and thousands of people that you have to trust. Unless you are going to build a computer by starting with a pile of sand, you build on things made by others that you have to trust to some degree.
They can't circumvent the encryption. They can only bypass the failsafe mechanisms to allow for a brute force attack rate limited by the hardware (80ms per retry). Those two mechanisms are: exponentially longer password retry attempt delays, and deleting all data after 10 failed attempts.
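To give a rough sense of scale (my own back-of-the-envelope numbers, not Apple's): with those two failsafes removed, guessing is limited only by the assumed ~80ms-per-attempt hardware cost, and short numeric passcodes fall quickly. A minimal sketch:

    # Assumed ~80ms per attempt, enforced by hardware key derivation.
    PER_TRY_SECONDS = 0.08

    for digits in (4, 6):
        keyspace = 10 ** digits
        hours = keyspace * PER_TRY_SECONDS / 3600
        print(f"{digits}-digit PIN: at most {hours:.1f} hours to exhaust")
    # -> roughly 0.2 hours for 4 digits, ~22 hours for 6 digits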
I imagine they are working hard on preventing themselves from being able to bypass the failsafes.
Yes, in an ideally secure system even the manufacturer should not be able to get in. I am not aware of any such system, however. No system is completely secure.
Sure. But that doesn't happen very often. Or often enough to make much difference, anyway.
Sometimes the NSA feeds information to the FBI, DEA, etc, etc. Recipients do parallel construction. And then lie to courts about it. Which is perjury. How often does that get revealed?
Have any prosecutions resulted from widely practiced perjury about obfuscated stingray use? That behavior is common knowledge, no?
I think it seriously hurts the NSA's ability to recruit the best and brightest people (as if the mass surveillance hadn't already hampered things enough) if there is a public perception that they are just the IT support of other three-letter agencies, mindlessly doing whatever they're ordered to without any discretion.
I hear people say we can't have universal healthcare because decently priced TRICARE is one of the main draws of joining the military, and we'd have to pay more to recruit and retain soldiers if we had universal healthcare. What basic rights do we curtail so it will be easier for the NSA to recruit in the future?
"Sometimes the NSA feeds information to the FBI, DEA, etc, etc. Recipients do parallel construction. And then lie to courts about it. Which is perjury. How often does that get revealed?"
Well, first you'd have to have evidence that such a thing happened.
I believe that some evidence has become public. I recall reading about this, and it was based on more than "We know that they must do it". I'll see if I can find some examples. But it won't be today.
Legally yes; in practice, no. Last year a story broke about a small-town sheriff's office running guns and drugs to another country in what must be the most inept undercover sting ever conceived. They had no jurisdiction or legal authority to operate in other states or countries.
Yet no one was charged.
[EDIT]
Had some facts way wrong. Here's the story I was thinking of. It was money laundering outside of their jurisdiction. No one was charged.
The FBI has a long history of colluding with criminal informants. FBI agents tend to become virtually indistinguishable from said informants. Look at the Silk Road cases. Or all those murders in Boston, back in the day.
"They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013." This is interesting. Perhaps instead of requesting a custom version of iOS with specific vulnerabilities, the FBI should simply request a signed version of iOS 7.0. They could use existing vulnerabilities to gain access to the phone, without placing an undue burden on Apple.
No. Among other things, the order the FBI sought requires Apple to build a RAM-only version of iOS.
Anything that requires reflashing the phone, or that writes log files at boot or alters as much as a single bit on the phone is a no-go from an evidentiary standpoint.
Edit: but anyway, iOS 7 has no understanding of the encrypted format introduced in iOS 8
I'm pretty sure many of the on-device storage formats have changed as new features were introduced. I'm pretty sure the Notes app in iOS 7 doesn't have the capability of inserting doodles or checklists, for example. And the Messages app doesn't support named groups.
FBI doesn't just want an unlocked phone, they want an unlocked phone with data intact, so Apple still needs to write and test a lot of code that performs reverse database migrations, possibly losing information in the process.
If systems can be technically downgraded without losing data, just making it inaccessible, then they can downgrade, crack the code, then upgrade.
I don't know enough to know whether this works. I suspect apps that didn't exist in iOS 7 would have their data deleted, although perhaps this wouldn't happen if it wasn't unlocked (and it would be power-cut as soon as it was cracked).
Presumably any request from the FBI for new software includes the fact that the new release has been signed. What good would an unsigned software release do for the FBI? The resources to create that signature likely exceed the ones to jump past the code that does an erase after too many attempts.
The signed software with bad behavior is precisely what Apple is afraid to unleash upon the world. Some FBI agents have recently shown themselves to be extremely untrustworthy with valuable items [1].
Simply put "The FBI and other law enforcement agents want to make it easier for many criminals to commit many new crimes for the sake of solving a few crimes already committed"
Basically, the FBI's actions are tantamount to being an accomplice to every crime they enable if they succeed in forcing us to make the iPhone knowingly vulnerable to law enforcement and criminals alike.
Would be funny (by funny I mean sad) if Apple gives them the new hacked iOS build and the password turns out to be something like 'NYOEC"T80BkLYExMU7JYWaz&P}dtMBR', i.e., they can't ever brute-force it.
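A rough illustration of why that would be hopeless even with the failsafes gone, assuming the same ~80ms hardware-limited attempt rate (illustrative figures only):

    PER_TRY_SECONDS = 0.08
    SECONDS_PER_YEAR = 3600 * 24 * 365

    pin_space = 10 ** 4            # 4-digit numeric PIN
    passphrase_space = 95 ** 31    # ~31 random printable-ASCII characters

    print(f"4-digit PIN, worst case: {pin_space * PER_TRY_SECONDS / 60:.0f} minutes")
    print(f"31-char passphrase, worst case: "
          f"{passphrase_space * PER_TRY_SECONDS / SECONDS_PER_YEAR:.1e} years")
    # -> roughly 13 minutes vs. ~5e52 years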
Yes, rubber-hose cryptanalysis. One of the reasons the FBI wants Apple to make a new version of the encryption software is that you can set your phone such that after ten failed attempts it's wiped. Key erased, data bye-bye.
You misunderstand, or at the very least, misrepresent the situation. The only threat Apple apparently didn't design for is the threat of the most powerful nation on earth overreaching their authority by ordering Apple to write and sign an entire operating system with the express purpose of defeating their own safeguards. Which is next-level crazy.
Nonetheless, if Apple had designed the update mechanism to require both a signed firmware image, and authentication by the user, then the FBI's request would not be possible. Such a design would also help to mitigate the end-user effects of an insider attack at Apple, where a rogue employee signs malicious firmware, or leaks the signing key to some adversary.
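For what it's worth, a minimal sketch of what such a dual gate could look like. This is hypothetical (the names are invented, and HMAC stands in for real public-key signing), not Apple's actual update path:

    from dataclasses import dataclass
    import hashlib
    import hmac

    VENDOR_KEY = b"stand-in for the vendor signing key"

    @dataclass
    class FirmwareImage:
        payload: bytes
        signature: bytes

    def vendor_signature_ok(image: FirmwareImage) -> bool:
        # Gate 1: the image must carry a valid vendor signature.
        expected = hmac.new(VENDOR_KEY, image.payload, hashlib.sha256).digest()
        return hmac.compare_digest(expected, image.signature)

    def allow_install(image: FirmwareImage, entered: str, device_passcode: str) -> bool:
        if not vendor_signature_ok(image):
            return False
        # Gate 2: the owner must approve with the passcode; without it the
        # update is refused (or, in a stricter design, data is wiped first).
        return hmac.compare_digest(entered.encode(), device_passcode.encode())

Under this kind of design, possession of the signing key alone would not be enough to push new firmware onto a locked device.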
While that may be a good design for various practical reasons, the technical properties of Apple's security design are not very relevant. This is a political issue about how far Apple (and eventually, any of us) has to go in creating access for law enforcement.
Right now the FBI is being reasonably diplomatic and "asking" (formally, through the court) to create this backdoor. If the situation were as you suggest and no update capability existed, they would simply skip the current step and move on to the next step: forcing Apple to include explicit "lawful intercept" capabilities. It is very important for us to win this earlier stage of the fight, because it will be much harder to fight a law.
In case you've forgotten, the traditional telecom industry already lost that fight (CALEA).
The way it is currently implemented, by which Apple can - if in physical possession of the device - unilaterally force new firmware onto it without the consent of the user, yes.
Whether we want to call it a backdoor or get senselessly pedantic about terminology and call it a "front door", the practical reality is yes; it is that functionality that the FBI is asking Apple to take advantage of.
Has Apple addressed this? I was under the impression the PIN was needed to update the OS. Why isn't it? Some sort of recovery method that doesn't wipe data?
Who is "we" here? Apple could do that, for sure. In the present contest, I can see why they'd want to. It would be a great move to forestall this sort of attack in future.
But "we", as in you and I, cannot easily do this. Because we can't sign stuff as Apple. But maybe it's doable.
I think this is where things are headed. I think we will soon see a new version of iOS by Apple with more stringent security (no updates without authentication or information is wiped when you force an update).
If this doesn't happen, I would highly suspect there is an NSL (or something similar) involved forbidding it.
> The encryption technology built into today’s iPhone represents the best data security available to consumers.
I appreciate Apple's willingness to fight, but this article could have benefited from less bragging PR talk and a little more humility. If the issue really is larger than Apple (and they are claiming it is) then we don't need this kind of messaging.
The message is that being forced to create a backdoor is bad for everyone, and they're doing it to Apple today but they'll do it for everyone else tomorrow.
I would be much happier if more companies actually tried on security, competed on that vector, and then bragged to high heaven about their successes. Usually when you hear a company touting its security, it's snake oil. The companies that are actually putting forth a solid effort to compete on security (and I think Apple has a reasonable claim to being one of them, even if there are nits to pick) have my blessing to brag as much as they can about it. I'm talking multimillion-dollar ad campaigns touting their (actual) security advantages over competitors. Have at it. Donald Trump style. Go wild!
Any company that "bragged to high heaven about their successes" with security is setting itself up to fail. As the old saying goes, "pride comes before a fall".
Not saying that Apple aren't doing a good job. Just saying, bragging and actively trying to differentiate, followed by a major security flaw, is a good way to lose trust.
> Usually when you hear a company touting its security, it's snake oil.
I guess this is part of the reason I have an aversion to companies speaking about their security. I know Apple security is very good (anyone curious can read the white paper), but most of the time corporations speak about security in a way completely detached from reality.
I think you missed the point of the piece, which was to reframe the issue in a very aggressive fashion. By pivoting away from the privacy angle to one focused squarely on national security, Apple is using the FBI's own canard against them. The fact that they're on such solid ground puts the FBI in a vastly harder position to defend.
In Cook's initial letter he was so careful about stating his belief that the FBI's intentions were good that he said so twice. That was the warning shot. This piece, from Federighi, was a shot to the head.
> The message is that being forced to create a backdoor is bad for everyone, and they're doing it to Apple today but they'll do it for everyone else tomorrow.
I still got that vibe from the article, despite the one line you pointed out.
That message becomes more clear when you see the support coming from Google, Facebook etc. It's clear they stand with Apple on the side of keeping encryption legal, secure, and backdoor-free. Apple doesn't need to speak for them too much, and I think Apple can still argue backdoors are bad for the whole tech industry while speaking about their own perspective, experience, and products.
The answer can be "No." Or the answer can be unknowable.
That's the problem with compromise positions: If you make a product that helps keep Chinese, Russian, Iranian, etc. dissidents out of prison, it might very well be good enough to keep the NSA out, too. It's too dangerous to try to fine-tune the level of security.
We're facing an election in the US about which people are half-joking it might be the last. That should be a wakeup call for people who think we can arrive at a technological compromise.
They can. The arguments are over whether the FBI has the legal standing to compel Apple to create code to do that, and what precedent, legal or otherwise, it sets for similar cases.
A better analogy is whether the FBI can compel your doctor to develop and then administer a medical procedure that allows them to extract 'unencrypted' thoughts from your brain against your will.
There are laws that protect people against self-incrimination in most countries, so that analogy doesn't work, but if a doctor can uniquely perform a procedure that can help an investigation because of some relation to the person under investigation, she can be compelled to perform that procedure in most countries, including the US.
You're not wrong about the Fifth Amendment, but you're assuming the compelled testimony is self-incriminating. What if the FBI decides to round up all the people in the area of a crime, and coerce a doctor to administer some "thought extraction" procedure that would offer evidence of the culprit? If one of the witnesses turned out to be the perpetrator, their testimony would be unconstitutional, but the other witnesses would (in theory) be valid and admissible.
It's more like if the safe deposit maker already has a master key that is harder to use than a normal key but can still open the safe in a day, can a judge compel the safe deposit maker to use that key on the FBI's behalf?
Even that example's not quite right; it's the judge compelling the safe deposit maker to _give_ the key to the FBI, such that they can use it in perpetuity.
Yes, they do, or the FBI wouldn't bother Apple with a request, and if the FBI sent a request anyway, Apple would say it is impossible. The FBI sent Apple a request, and Apple didn't say the request was impossible, so Apple has what amounts to a backdoor (really, a front door).
It is closer to a safe designer (Apple) who sells a safe that destroys its contents when broken into. And the state compelling the safe designer to participate in cracking it open.
That's the closest non-digital analogy I have seen yet. To make it closer, the safe designer has a code to disable the autodestruct mechanism that nobody else has.
> ...the safe designer has a code to disable the autodestruct mechanism that nobody else has.
No, not quite. The safe designer can more easily figure out how to bypass his autodestruct mechanisms than anyone else, but that doesn't mean that he "has a code" that bypasses them. In this analogy, it's still nontrivial for the safe designer to perform the work, but it's (probably) easier for him than for anyone else.
It takes a day of work to use this code to disable the autodestruct, but that doesn't really change anything. That they have this code (that nobody else has) at all is the main reason a judge can force the safe maker to assist with the investigation.
They don't have any bypass code. They have detailed plans for the autodestruct mechanism. That's it. They are probably the entity on the planet with the best understanding of the autodestruct system, but the autodestruct system does not have an "off" switch.
To reverse the analogy: The safe designer's bypass "code" is analogous to a back door already built into iOS that opens when presented with some information held in secret by Apple. (Think CALEA wiretapping infrastructure built into US-destined telecom infrastructure equipment.)
Because Apple has to write a special version of iOS to fulfil FedGov's request, we can reasonably assume that there is no pre-built back door in iOS that sits around, awaiting the secret information.
The code in Apple's case is the release build key. Only Apple has it, and it takes a day to use it to disable the autodestruct. Ergo, the analogy fits perfectly.
> The code in Apple's case is the release build key.
No, the "code" [0] in Apple's case is the yet-to-be-written version of iOS that can both be loaded at boot time to bypass the PIN retry count and retry delay functions and -to a sufficiently high degree of certainty- only be run on the intended iPhone.
Ergo, your analogy is -as I said- not quite right.
[0] And it's not a "code", but a bypass procedure.
I won't keep up this conversation if you're not going to put in the effort to understand what I'm writing. The code we've been talking about is the safe manufacturer's code that only the safe manufacturer has and that takes a day of effort for the safe manufacturer to use. The equivalent for Apple is the release build key, which has exactly the same properties.
You then made further mistakes. The signed build does not need to be limited to a single device, so it has no place in the analogy. The FBI allows Apple to keep the build themselves and delete it after use.
Please try to keep up, or you'll continue to tilt at windmills, and you won't learn anything.
> Please try to keep up... The signed build does not need to be limited to a single device... The FBI allows Apple to keep the build themselves and delete it after use.
You misunderstand what business this safe manufacturer is in. He's in the business of high-security safes. Any bypass technique he comes up with must be as specific as possible, and defended in every way possible, lest he fail his customers [0] and destroy his reputation (and -thus- his entire business) as a high-security safe manufacturer.
So, yes, the technique developed must be limited -as far as is possible- to a single safe (or phone). Anyone who reads HN knows that leaks happen. The more valuable a piece of information, the more likely someone with the ability to snatch it will be targeting it. And -as we should all know by now- it's tremendously difficult to prevent a state-level actor from acquiring data that's not very securely locked away.
Again, if you're in the business of providing high-security services to your clients, you are going to ensure that any bypass mechanism you're forced to make is as difficult to create and one-time-use as possible.
[0] And -in some jurisdictions- aid in the "legal" torture, imprisonment, or death of his customers.
The way to make it difficult to reuse is to keep it at Apple (where nobody can extract it from the device) and dispose of it after use, not to try to limit it to a single device and allow the build to leak. The signing key is a much more valuable target than this build, and the build only needs to be secured for a very limited time.
One can make any complex issue simple with a blunt enough analogy. That's why analogies are great for introducing a concept and poor for drawing conclusions about them.
I have sympathy for Apple's position but when they resort to hyperbole they seriously undermine that goodwill.
The "if we make this software, criminals will get their hands on it" argument is absurd. Does Apple have no faith in their own security? The FBI has at no point suggested the modified firmware would ever leave Apple's possession.
Does Apple have a problem with theft of internal code that I am not aware of?
Further, I'm just waiting for someone to transpose the argument, for example: "Apple says guns shouldn't be manufactured for any reason because they could kill someone" or "Apple says cars are unsafe and should be banned because people are killed in accidents."
facepalm
EDIT: Behaving like a bad actor even if you believe your cause is just still makes you a bad actor, albeit one with good intentions. If they want to win the argument they should stick to realistic positions and leave the manure shovelling to "those other people".
Yeah for this level of a crack, you're talking about the CIA/NSA (not to mention other countries) embedding people as spies inside of Apple to 'acquire' the hacked version. Illegal as shit but they would do it anyway.
Once Apple does it for the USA, what's stopping the UAE and China from requesting the same?
If a government can compel you to break into your own device, what will happen later on when you close those backdoors? What's stopping governments from demanding you leave a backdoor in by pointing out, "hey, you could do it before"?
I'd wager that the FBI and the other three-letter agencies want to unlock phones themselves; they don't want Apple to know what they're doing. Soon every agency and county police department would have that software.