It's unclear if you're trolling, but assuming you're not, the analogy doesn't work because digital things and physical things behave in fundamentally different ways.
In the physical example, according to the FBI's "just this one iPhone" claim, one would reasonably expect that the company could then destroy the hypothetical master key as soon as it's used. This makes sense in a physical world, but the analogy breaks down completely in a digital world. [Returning your spider doesn't solve the problem](http://www.27bslash6.com/overdue.html).
In the digital world, you can't guarantee that the key hasn't been copied, and you can't guarantee that destroying the "original instance" of the key destroys all others.
The custom OS that the FBI is asking Apple to build will also take development time, and likely more than one person to develop, meaning that if there's a security breach during development, any number of intermediate builds could be stolen before the FBI can even access the particular phone in question.
> It's unclear if you're trolling, but assuming you're not, the analogy doesn't work because digital things and physical things behave in fundamentally different ways.
I'm not trolling at all. Genuine question.
Assuming what Apple said in the open letter is true:
> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
That's like asking the company to create a tool/master key able to open "just that safe".
I know that in software you can make the OS self-destruct (a TTL, etc.), but still, imagine that after they get the OS they copy it and start reverse engineering it.
>In the physical example, according to the FBI's "just this one iPhone" claim, one would reasonably expect that the company could then destroy the hypothetical master key as soon as it's used. This makes sense in a physical world, but the analogy breaks down completely in a digital world.
Why not? Can't Apple just delete it?
>In the digital world, you can't guarantee that the key hasn't been copied, and you can't guarantee that destroying the "original instance" of the key destroys all others.
You can't do that in the physical world either. But you can be pretty damn certain that it hasn't been copied.
>The custom OS that the FBI is asking Apple to build will also take development time, and likely take more than one person to develop, meaning that if there's a security breach during the OS's development, any number of intermediate builds may also be stolen during development, before the FBI can even access the particular phone in question.
Apple already takes this risk with every single iOS release.
Yes, but leaks of pre-release iOS software can't be installed on locked phones as a means of unlocking them, so the risk is not nearly the same.
If you really want to carry this analogy to term, fine: I'll concede that you can't be 100% sure a physical key wasn't copied before you destroy it. But then you must take into account the complexity of manufacture and duplication. If the complexity of duplication is high, you only make one, and you guard it at all times, you can have fairly high confidence (barring ridiculous film plots) that the key you're destroying is the only one.
With digital things, the complexity of duplication is beyond trivial. One copy leaks, and instantly there are tens of thousands, if not millions of copies in all corners of the internet. Physical objects simply do not behave this way.
Are you kidding? Law enforcement across the country wants to use this on thousands of phones.
Even if it were truly going to be used only on one phone, there's still a risk that the signed software gets out. That risk is exponentially increased when you realize that Apple will be required to retain this software for responses to law enforcement requests indefinitely.
>> It's unclear if you're trolling, but assuming you're not, the analogy doesn't work because digital things and physical things behave in fundamentally different ways.
I'm not sure I fully agree that things are different because it's physical over digital.
Let's say that the way a safe manufacturer could circumvent the lock on the safe is to build a custom tool that can rapidly try every combination, much faster than any currently known method (sketched below). Such a tool could be reverse engineered (i.e. copied) after it's returned.
I agree that copying software is far easier than copying hardware, but it's the design of the tool that's important, not its physical representation.
The only way I would agree with the FBI's "just this one iPhone" statement is if they got Apple to crack it and they just returned the data but not the method. Which of course they won't do.
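To make the "rapidly try every combination" point concrete: once the software-level retry delays and auto-erase are disabled, a four-digit code offers essentially no resistance. A minimal sketch in Python (the `try_passcode` check and the secret code are hypothetical stand-ins, not anything from iOS):

```python
import itertools

SECRET = "4391"  # hypothetical passcode we're pretending to attack

def try_passcode(guess: str) -> bool:
    """Stand-in for the device's passcode check."""
    return guess == SECRET

def brute_force_four_digits():
    # With escalating delays and the 10-attempt auto-erase disabled,
    # all 10,000 four-digit codes can be tried almost instantly.
    for digits in itertools.product("0123456789", repeat=4):
        guess = "".join(digits)
        if try_passcode(guess):
            return guess
    return None

print(brute_force_four_digits())  # -> "4391"
```

The only thing slowing that loop down on a real device is software, which is exactly the part the requested build would change.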
> I'm not sure I fully agree that things are different because it's physical over digital.
...
> I agree that copying software is far easier than hardware, but it's the design of the tool that's important, not it's physical representation.
So you do understand that software is different. It is easily copyable, and all software is copyable.
> The only way I would agree with the FBI's "just this one iPhone" statement is if they got Apple to crack it and they just returned the data but not the method. Which of course they won't do.
I wouldn't even agree to that. Creating a signed copy of this software creates a vulnerability in iPhones worldwide that does not exist today.
Is it admissible (under the current laws) to ask the company to make a master key?
That's exactly what's under debate. The FBI believes the All Writs Act does allow that, and Apple (and many others) disagree.
The safe analogy is a bit flawed[1], but to play it out, the debate is over whether or not a company should be allowed to build and sell an uncrackable safe. Law enforcement appears to say no, and privacy advocates say yes.
[1] ... since if you build a safe that's actually uncrackable, it's by definition impossible to later modify it so it will accept some sort of master key, whereas adding software to allow disabling the passcode prompt on a phone is doable after it's built.
Please read more about this case and inform yourself. Creating or giving that master key, Cook says, will put every device owner at risk. And given that there are millions of iPhone users worldwide, that risk is huge.
Creating signed software like this is something Apple feels is unsafe, and not something Apple should be compelled to do.
Additionally, we all know very well that eventually law enforcement will demand their own copy of the software, or find it with the help of the NSA.
By the way, I'm totally with you that Apple should improve the security of their phones. But this case and that feature are separate issues. You can advocate open hardware & software and defend Apple's stand against the government compelling companies to act at the same time. Those ideas are not mutually exclusive. Hate the game, not the player.
If they ask for the software, Apple should deny that request because that's very different from asking for assistance on a case-by-case basis with a court order for each one.
The question reasonable, non-technical people are going to ask is: how is this different from asking to search a house? There's a court order; it's not just the FBI acting on its own. They're asking Apple to do it, not hand over a master key, which would not be reasonable at all.
Somehow Apple has managed to keep its signing key from getting out into the wild. Only Apple knows the systems required to keep that safe.
The same cannot be said of signed copies of software.
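Roughly speaking (and this is only an illustrative sketch with a freshly generated key pair, not a description of Apple's actual signing setup), the asymmetry looks like this: devices verify against a public key, so anyone who merely holds a signed build can pass that check without ever touching the signing key.

```python
# Illustrative only: a generated Ed25519 key pair standing in for a vendor's
# signing infrastructure (requires the `cryptography` package).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

signing_key = ed25519.Ed25519PrivateKey.generate()  # stays locked up at the vendor
public_key = signing_key.public_key()                # effectively baked into every device

firmware = b"custom build with passcode limits removed"
signature = signing_key.sign(firmware)               # signed once, by the key holder

def device_accepts(image: bytes, sig: bytes) -> bool:
    """What a device does at install time: verify the signature, then trust the image."""
    try:
        public_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

# Whoever copies (firmware, signature) -- not the key -- gets True here.
print(device_accepts(firmware, signature))
```

That's why protecting the signing key and protecting a signed build are different problems: the key never needs to leave Apple to be useful, while the signed build is useful to anyone who obtains a copy.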
> If they ask for the software, Apple should deny that request because that's very different from asking for assistance on a case-by-case basis with a court order for each one.
If it comes to that, I'm sure they will put up the same fight. What's the difference to you whether they fight now or later? Must Apple act exactly as you would?
> The question reasonable, non-technical people are going to ask is: how is this different from asking to search a house? There's a court order; it's not just the FBI acting on its own. They're asking Apple to do it, not hand over a master key, which would not be reasonable at all.
To those reasonable, non-technical people I would say the following: "If we create special signed software for the government for unlocking the iPhone, then terrorists will just use something else that is out of the reach of our government. We will have weakened our privacy and this will not keep us safe from terrorism. If the signed software gets into the wrong hands, it could be used by anyone in the world to hack into iPhones and get location data on your kids, your home, and other information you'd rather keep private."
> Somehow Apple has managed to keep its signing key from getting out into the wild. Only Apple knows the systems required to keep that safe.
> The same cannot be said of signed copies of software.
Presumably, you could employ the same systems for both pieces of data, since it should all stay within Apple.
> What's the difference to you whether they fight now or later?
Because they should respond to the request that was made on its merits, not on "what might happen".
I'm just pointing out the kinds of arguments that reasonable, non technical people are making.
For what it's worth, I'm pretty technical, and all these arguments in Apple's favor seem a lot like arguments for why law enforcement is generally untrustworthy, warrants are meaningless, and everyone should resist every search under every circumstance because what will they ask for next?
Aside from the fact that's all probably actually true, that is not the case Cook is making. Cook is, in fact, trying to pretend like this represents a cryptographic back door, when it plainly absolutely does not.
> Cook is, in fact, trying to pretend like this represents a cryptographic back door, when it plainly absolutely does not.
He's not saying that. He's saying, yes there's a back door, and no we don't want to open it for you because once we do, everyone will have a chance to open the door, including possibly your neighbor or a foreign government who wants to snoop on US embassy employees.
What politicians do not understand is you can't outlaw encryption. You can't tell math to stop working. It just does. If they did understand that they would not pursue using the AWA with Apple. They would instead have more open discussions with Apple.
But our politicians keep thinking there must be some way around this, and there isn't. The sooner they get it into their heads that terrorists will simply choose a new device, the better, because
(1) In their attempt to shore up technology, the current administration is going to do more damage to the US tech industry. They already helped destroy Yahoo
and
(2) Law enforcement needs to find a way to keep people safe other than relying on back doors to seemingly secure devices. There are no doubt terrorists using secure communications now that the NSA cannot access, and no amount of begging or pleading from the DOJ is going to change the fact that math is not governable by humans.
Let's say the gunman had a safe in his apartment, and the FBI can't open it without destroying its contents.
The FBI (as in other countries) can ask the company that made the safe for help opening it, and that company should do its best to help them open it.
If that company is unable to open the safe, is it admissible (under the current laws) to ask the company to make a master key?