The thing I think should be emphasized is that while the majority of people may now trust the FBI, the NSA, and the US government in general to do the right thing, this is the same FBI that just a generation ago investigated Dr. Martin Luther King, Jr. as a <strike>terrorist</strike> communist and apparently tried blackmailing him into committing suicide[0]. (The communist threat at that time filled the same place in the American psyche as the terrorist threat does today.) The CIA recently spied on its own congressional oversight committee. Without strong controls on government investigative powers, we're just one J. Edgar Hoover away from a completely unchecked and out-of-control security apparatus.
It's not about trusting the American government to do the right thing. It's about trusting this government and all future American governments to do the right thing.
EDIT: Also, at least one legal expert estimates that the average American accidentally commits three felonies per day, meaning that most of us are mistaken when we think we don't have anything to hide.[1]
To add onto this, in addition to the governmental threat, another point that Cook is trying to make is that security is hard. It's hard for the government, which he backs up by citing breaches, and it's hard for Apple (which he omits, but remember those iCloud breaches?).
I completely agree with your point about future governments and unchecked power, but there is also the point that, if Apple creates this tool, organizations (or even just ordinary people) besides the anointed three-letter agencies may end up with the same power: access to a good chunk of your entire digital life (and the locations of friends and family who use Find My Friends) should you ever lose your phone.
Good point. The thing Cook can't say publicly is that once Apple creates FBiOS, every government around the world, especially the most oppressive regimes, will block imports of iPhones until Apple gives them a copy or they steal it from the FBI.
I believe the FBI is asking for FBiOS to refuse to run if the IMEI doesn't match a compiled-in whitelist, but my understanding is that phone thieves routinely desolder a chip to change the IMEI on a stolen phone. The FBiOS is for use when they have physical access to the phone, so I don't believe the IMEI whitelist is a large hurdle to get over.
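As a rough illustration of why that whitelist is a weak hurdle, here's a minimal Python sketch of the kind of compiled-in check the request implies; the names (APPROVED_IMEIS, read_device_imei) are hypothetical stand-ins, not anything from Apple's code.

    # Hypothetical IMEI allowlist baked into the signed build.
    APPROVED_IMEIS = {"358820052301234"}

    def read_device_imei() -> str:
        # On real hardware this queries the baseband. An attacker who can
        # desolder and rewrite the baseband chip controls this return value,
        # which is exactly why the check is weak.
        return "358820052301234"

    def boot_allowed() -> bool:
        return read_device_imei() in APPROVED_IMEIS

    print("FBiOS will run:", boot_allowed())  # True for a spoofed IMEI too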
>EDIT: Also, at least one legal expert estimates that the average American accidentally commits three felonies per day, meaning that most of us are mistaken when we think we don't have anything to hide.[1]
Sadly, beyond the title "Three Felonies a Day," he apparently does not back up these claims or mention them at all.
>It's not about trusting the American government to do the right thing. It's about trusting this government and all future American governments to do the right thing.
But you don't have to trust the FBI of today or tomorrow. The court order doesn't require that the FBI have access to the modified software.
I'm not particularly concerned about FBiOS for the pre-Secure Enclave iPhones. I'm concerned about the precedent of allowing the security apparatus to demand the creation of new surveillance/investigative tools, as long as they whisper the magic word "terrorist" 3 times before asking.
1) They'd still need Apple's signing key; otherwise you can't update the phone. The software, without the key, is totally useless.
2) The same thing that prevents the FBI from subpoenaing Apple's key: there isn't a good legal basis for it. It's almost certainly unduly burdensome for Apple to give out their key.
If the FBI were to win a case like that, it doesn't matter what happens in this situation.
I think that if the government were asking gun manufacturers to install an electronic lock on guns that would allow the FBI to disable any gun if they suspected the person might be about to commit a terrorist act, there would be a completely different reaction.
Those things aren't really comparable. One is a pre-emptive action taken by government before the commission of a crime, the other is a reactive investigation tool. I also suspect your gun example might actually generate more outrage at governmental overreach.
Side-note: your observation may be valid for this case, but not in the case of backdoored encryption. Backdoored encryption is exactly a pre-emptive action. Relevant b/c I think the real worry in this case is that the FBI is laying the groundwork for a PR campaign for mandated backdoored encryption.
Software and hardware are fundamentally different so it's hard to find a close analogy, but if Apple creates this tool (that can be applied to any phone (or at least a large subset of their phones)), it's essentially the same as if the decryption tool were present in every phone, even ones sold before the commission of a crime.
The closest parallel I can think of is the Gun Microstamping law (which would require that bullet casings be stamped with a mark identifying the gun it was shot with). This has (as you might expect) generated a lot of controversy and lawsuits.
>> but if Apple creates this tool (that can be applied to any phone (or at least a large subset of their phones)), it's essentially the same as if the decryption tool were present in every phone, even ones sold before the commission of a crime.
This concept I find interesting. If Apple did have the ability to create this tool, then is there much of a difference?
I guess the difference is whether Apple were to give such a tool to the FBI, rather than creating it, extracting the data, and deleting the tool. But who is to say that some clever individual can't create their own tool to achieve the same thing?
Until Apple delivers on their desire to create phones even they can't decrypt, I would therefore consider all iPhones vulnerable.
It's true, it's a fine distinction, but there's still a significant difference between "There is no software in existence that can break into this phone and we are the only ones who could ever create such software" and "We have software that can break into the phone, but we promise not to use it. Unless someone tells us to, and we've already got dozens of such requests in queue."
Third parties can't do anything because iPhones accept software only if it's signed by Apple. So Apple is the only one that can create software that lowers the phone's protection level to allow brute-forcing.
And Apple can't directly decrypt the phone: it can only remove the brute-force counter and/or the exponential delay. Both are paramount parts of a security architecture that lets users unlock the phone with a 6-digit code (4 digits on non-Touch ID phones like the one under discussion), which in turn is an important UX compromise made to reach the goal of having passcode lock active on 90% of a billion iOS devices.
Allowing for remote brute-forcing basically undermines the whole security architecture.
To make a phone that is not brute-forceable, you have to give up on the 6-digit concept. A longer passcode is already an option (I turned it on on my phone), but it's strongly unlikely to gain widespread acceptance.
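For a sense of what those software-enforced delays buy, here's a rough simulation sketch in Python. The delay schedule only approximates Apple's documented behavior (a handful of free tries, then waits of roughly 1, 5, 15, and 60 minutes), and the ~80 ms per guess is the hardware key-derivation cost described in Apple's iOS security whitepaper; treat the exact numbers as assumptions.

    # Approximate the escalating delays the court order asks Apple to remove.
    def delay_after(failures: int) -> float:
        """Seconds of forced waiting after the given number of failed tries."""
        if failures < 5:
            return 0.0
        schedule = {5: 60, 6: 300, 7: 900, 8: 3600}
        return schedule.get(failures, 3600.0)  # one hour per try from then on

    def time_to_exhaust(keyspace: int) -> float:
        """Worst-case seconds to try every code at ~80 ms key derivation each."""
        total = 0.08 * keyspace
        for failures in range(keyspace):
            total += delay_after(failures)
        return total

    for digits in (4, 6):
        years = time_to_exhaust(10 ** digits) / (86400 * 365)
        print(f"{digits}-digit code: ~{years:.1f} years with delays in place")
    # Without the delays, the same keyspaces fall in ~13 minutes and ~22 hours.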
Apple can decrypt the phone -- once they remove the bruteforce protections, it would be trivial for them to bruteforce the typical small phone unlock keyspace.
The FBI hasn't asked for this since they can trivially do it themselves, but Apple could certainly do it if they wanted to.
And if they are forced into creating this hack, the next request will be to force them to decrypt the phone too. Once manufacturers can be coerced into doing anything the government demands in a warrant, what restriction would prevent the government from asking for full decryption?
Brute-force protection doesn't mean giving up on a 6-digit passcode. It means making the hardware security module completely hardware-only, with no way to alter its software (or at least unalterable without the unlock code). With too many brute-force tries, the hardware wipes its copy of the key, which is essentially the same as wiping the disk, but it can't be intercepted by an iOS software change. For added security, it could have a timer: if the phone is not unlocked within X days, it wipes its key, so a stolen phone is completely unrecoverable after X days.
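A minimal behavioral sketch of that idea in Python, purely illustrative: the try limit and the "X days" expiry are the hypothetical parameters from the comment above, not any shipping design, and a real version would live in dedicated silicon precisely so no iOS update could alter it.

    import time

    class HardwareKeyStore:
        MAX_TRIES = 10        # wipe after this many failures
        EXPIRY_DAYS = 30      # the "X days" knob; value is arbitrary

        def __init__(self, key: bytes, correct_code: str):
            self._key = key
            self._code = correct_code
            self._failures = 0
            self._last_unlock = time.time()

        def try_unlock(self, code: str):
            if time.time() - self._last_unlock > self.EXPIRY_DAYS * 86400:
                self._wipe("expiry timer elapsed")
            if self._key is None:
                return None
            if code == self._code:
                self._failures = 0
                self._last_unlock = time.time()
                return self._key
            self._failures += 1
            if self._failures >= self.MAX_TRIES:
                self._wipe("too many failed attempts")
            return None

        def _wipe(self, reason: str):
            # Losing the key is equivalent to wiping the disk: the ciphertext
            # is still there, but permanently undecryptable.
            self._key = None
            print("key wiped:", reason)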
Thank you, the bit I didn't appreciate was that Apple must sign the software to do this. So yes, only Apple can (but won't) do this, making it secure provided Apple doesn't change its mind and isn't forced to.
Once the government can force Apple to create software against their will, why do you think it won't also force Apple to hand over signing keys that would let it target any phone it wants? I don't see how the slippery slope can end at one phone. That slope is already sliding to encompass dozens of phones; when it reaches hundreds or thousands, the government is going to decide it needs to target phones on its own, without Apple in the way.
I tend to agree. People who love when governments control things and people who hate terrorists are both probably pro-FBI. People who want reduced government control or people who value digital freedom are probably pro-Apple. Not that these groups are mutually exclusive, it just depends on what you value most.
Isn't that how it's supposed to work? Everybody tries to make the best possible argument for their desired outcome and then some ostensibly neutral decider reads all the arguments and decides who's right.
I think the reaction will be exactly the same except that many of the people in the debate will immediately switch positions :).
Many of the people who take privacy and the 4th Amendment very seriously have no qualms about being very nuanced and "progressive" about the 2nd (and vice versa). I find that fact entertaining.
The two are related in principle (the whole All Writs Act thing), which means that insofar as you agree with Apple but disagree with the hypothetical analogous gun manufacturer, maybe it's not so much the common principle (applying the All Writs Act) you take issue with. Which was your point, and it's a good one.
At the same time, it's hardly an unreasonable or inconsistent position to take. You can be opposed to the government asking Apple to install a backdoor regardless of constitutional concerns.
For reasonably sane people, yes, reactions would not be different. But for the evangelical nutjobs that implicitly support that kind of crap, there would definitely be an outrage about how the government is overstepping their boundaries.
Interesting perspective, I hadn't thought of it that way. I was thinking the opposite, and we wouldn't be having a conversation, because Apple would have already complied.
2nd Amendment vs. 4th Amendment: why is one more important than the other, once there is a precedent to bypass the 4th? I just think it is something to think about. I get that it is not a perfect analogy.
Most people aren't going to just come out and say that they think one amendment is more important than another. They will just interpret each amendment to be compatible with their beliefs. Someone who proposes stricter gun control will interpret the 2nd Amendment as a reference only to militias, or they might argue that there is a gun-death epidemic in the US that is worth a "small" violation of the 2nd Amendment. Similarly, someone who approves of the government's ability to decrypt a phone might argue that the threat of terrorism is worth a "small" violation of the 4th Amendment.
Yeah, I agree. I was only criticizing the analogy (which I fumbled), because privacy is a complex enough topic on its own without entangling it with gun rights.
I personally do not hold this opinion, but many gun owners see gun ownership as an inalienable right, and any attempt to limit or take it away as a violation of that right and overreach by the government.
I agree. To me the common thread is government overreach. I think it is too much to ask Apple to write a back door into their product. I can see getting a warrant to look at phone records, etc. I am a gun owner but not fanatical; I am OK with some reasonable controls. The key in both cases, 2nd and 4th Amendment restrictions, is reasonableness. But that is always in the eye of the beholder, hence the controversy.
The FBI isn't asking for this on all iPhones. It's asking Apple to help them hack into a single iPhone, which it already has in its possession. And it's pursuant to a court order, which limits the chance that this is a fishing expedition.
It's more like asking the gun manufacturer to make a tool that fits the mold of the gun, to take it apart so they can gather forensic evidence. Hell, I bet most people would be okay with that even if it affected all guns. But this Apple situation isn't going to affect all iPhones.
With digital technology, proliferation is orders of magnitude easier than with anything that exists only as a physical object.
Nobody is stopping the FBI from doing whatever they want with the phone. They have it in their possession and have a warrant to search it. Nobody is questioning that.
What is being questioned is whether a software manufacturer can be forced to build a way for the FBI to compromise the phone. This is much like the question of whether a firearms manufacturer can be forced to build a way for the FBI to compromise that type of device.
No, those questions are not at all similar: one is about a device the FBI has, the other is about everyone's device.
The actual question is: can the FBI force Apple to open the phone for them? That's it. Can the FBI force a bank to open a safe for them? It's not any different.
Just because there is technology involved does not change the fundamental nature of the question.
If you want a gun analogy: can the FBI force a manufacturer to tell them the proper disassembly instructions for the gun? (For example, if it needs some special tool, can the FBI force them to sell that tool?)
Somewhat fair points, with a couple of key pieces of data left out:
> one is about a device the FBI has, the other is about everyone's device
Except that one claim Apple is making, whether realistic or not, is that once the "device the FBI has" is unlocked, it potentially exposes "everyone's device" to the same exploit.
> If you want a gun analogy can the FBI force a manufacturer to tell them the proper disassembly instructions for the gun
Except that in the case of the gun, this is information the manufacturer already has. In Apple's case, they are claiming they do not currently have this information, and would need to do work to create the information being requested.
> Except that one claim Apple is making, whether realistic or not, is that once the "device the FBI has" is unlocked, it potentially exposes "everyone's device" to the same exploit
The court will not care about that. The FBI is expected to open devices only with a warrant and that's it. Just because they have a tool doesn't authorize them to use it.
Just because the FBI bought lockpicks, doesn't mean they can go around opening locks.
> this is information the manufacturer already has. In Apple's case, they are claiming they do not currently have this information, and would need to do work to create the information being requested.
Also irrelevant. At best Apple can request to get paid (and that would probably be approved).
It's completely ordinary for companies to be ordered to go collect and collate information relevant to a warrant. They can charge a reasonable fee to do so, but they can't refuse to unless it would severely impact them (and the definition of severely is up to the judge).
In Apple's case, the judge is unlikely to believe that doing this would be a big problem for them; they are a huge company.
> Just because the FBI bought lockpicks, doesn't mean they can go around opening locks
Similarly, just because Schlage makes high security unbreakable locks, doesn't mean they have to create lockpicks for the FBI that will work on those locks. They spent a lot of money developing those high security locks and if they create lockpicks that can break through them with ease, they lose much of the value of those locks. The unbreakable lock is no longer unbreakable. So the loss of value for the company is not just the employee time to create the lockpicks, but also the loss in value of the product line when they can no longer say that there's no lockpick that can break in.
The FBI is free to study and try to break into the locks themselves, but if it is a truly unbreakable lock, then hooray, the world is arguably safer on the whole, even if a few criminals use the locks to hide their secrets.
> Just because they have a tool doesn't authorize them to use it.
Why do you think they wouldn't use it? Of course they would use the tools that are available. The FBI's regular use of "stingray" devices ("IMSI-catchers") was in the news recently, as just one example.
Law enforcement does a lot of stuff without a warrant. The "exclusionary rule" doesn't prevent them from conducting a search; it only allows the defense to get the evidence from those warrantless searches thrown out. The problem with modern data technology (for both mass surveillance and easy access to private data such as an Apple phone) is that it makes parallel construction much easier, where data found during a warrantless search is laundered illegally into a new search that can be used in a court.
Besides, the FBI has stated many times how they intend to continue accessing phones and other network devices.
> they are a huge company
So you believe Apple should be de facto deputized into a law enforcement role against their will and at their expense, just because they are "huge"?
> Apple can request to get paid
As this will continue into the future (the FBI apparently already has a list of cases in waiting), do you really think the government will continue funding Apple for their new law enforcement role? The government doesn't get to nationalize part of a corporation just because it would be convenient for law enforcement.
>They can charge a reasonable fee to do so, but they can't refuse to unless it would severely impact them (and the definition of severely is up to the judge).
What is a reasonable fee? Tim Cook reportedly spent months trying to convince Chinese officials to allow the iPhone to be sold in China.
If China were to then ban iPhones on the premise that they don't want the FBI to be able to seize the data on iPhones belonging to Chinese nationals (remember, warrant protections only really apply to US citizens), could Apple then charge for the loss of the Chinese market (~$13B/quarter)? Should taxpayer money be used to secure the investments of Apple's shareholders, who banked on the fact that proper encryption would give Apple an appropriate foothold in China?
It's a disaster, and an absolute stain on HN, that your sensible comments are getting downvoted.
I am a huge advocate for privacy, and I understand the technical details, but like you, I can't see what magically protects iPhones from search warrants.
I _love_ that Apple have made it so hard for unauthorized access to my device, but I cannot understand on what basis they're making the current stand, given that the FBI are asking specifically for a modification that will let them attempt to brute-force the device, rather than anything more magical.
Because Apple realizes (rightfully so), that it's a slippery slope.
The only way to prevent a tool from being misused is to not create that tool in the first place.
If you _love_ that Apple is making it so hard for unauthorized access to your device, why do you also think they should create a tool that allows unauthorized access to your device? Are you so sure that you'll never be on the wrong end of a government fishing expedition?
I'm generally law abiding, yet I still don't think the government should be able to view my private records just because my name came up on some secret watchlist that I'm not allowed to know about and could entirely be because my name matches someone else's name.
The only reason the FBI didn't ask that Apple fully decrypt the device is because they know that bruteforcing a small keyspace is trivial if the device doesn't have countermeasures - not because they thought they were overstepping the bounds of their request.
You're assuming that this request is at the bottom of the slippery slope rather than at the top.
The FBI made clear that this was a special case, just one solitary phone to decrypt from a nasty terrorist. And no one likes terrorists, so we can all agree that we need to read his secret data.
But then it's revealed that there are dozens, perhaps even hundreds of these requests in queue, and few involve terrorism.
And after the government is successful at coercing Apple into creating special software, why do you think the next step won't be to coerce Apple into handing over signed code that will work on any phone (or even the signing keys themselves)? After all, if the government needs to read data off of thousands of phones, surely they can't wait for Apple to do them all.
> The FBI didn't ask Apple to decrypt the device because that would be impossible.
Decrypting the phone is only impossible if the user had a strong passphrase, which is unlikely. If Apple turns off the brute-force checks, decrypting the phone means cycling through only 10,000 or 100,000 passcodes: a few hours, or a few days at most.
If decrypting the device is impossible, then the FBI wouldn't be asking for Apple's help.
Not all slopes are slippery. But Americans have had several hundred years to get to grips with search warrants allowing the government to search stuff. Nobody seems to want to answer why an iPhone should be more privileged than, say, a diary.
Apple cannot decrypt the phone. Removing the brute-force checks is not the same as decrypting it, because even in the nightmare scenario where the FBI gets its hands on this modified software and randomly starts installing it without a warrant, it only works if they have the device and you've used a crappy PIN code. If you use a strong PIN, you're still protected. Except for the whole contempt-of-court thing.
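Some back-of-the-envelope numbers behind the "strong PIN still protects you" point, in a short Python sketch. The ~80 ms per guess is the hardware-entangled key-derivation cost from Apple's iOS security whitepaper; the rest is plain arithmetic, and it deliberately ignores the delay/wipe countermeasures:

    SECONDS_PER_GUESS = 0.08  # per-attempt key derivation cost, per Apple

    def worst_case(keyspace: int) -> str:
        secs = keyspace * SECONDS_PER_GUESS
        if secs < 3600:
            return f"{secs / 60:.0f} minutes"
        if secs < 86400 * 365:
            return f"{secs / 3600:.0f} hours"
        return f"{secs / (86400 * 365):,.0f} years"

    print("4-digit PIN:          ", worst_case(10 ** 4))   # ~13 minutes
    print("6-digit PIN:          ", worst_case(10 ** 6))   # ~22 hours
    print("10-char alphanumeric: ", worst_case(62 ** 10))  # ~2 billion years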
Don't we have freedom of speech in this country? What other situations could the government compel Apple to say or write something for the FBI? Or do you think that computer code is not a form of protected speech?
One kills people (gun), one connects people (phone).
Shutting one down pre-emptively (the gun) ostensibly prevents deaths; the other reveals the plans of the killers.
I do not have a more appropriate analogy, but I feel that the government has NO FUCKING RIGHT to ask apple to do what they are doing in a supposed free-market capitalist state.
These are the acts of tyrannical/dictatorial states.
So fuck the government in this case. no WAY should apple comply.
1) There is no search warrant in the case in question, but an order under the All Writs Act.
2) If Apple had designed the phone properly in the first place (ie, impenetrable even with their help), then they would indeed be protected from attacks via search warrant.
A sufficiently robust security solution doesn't care whether the attack happens to come from the state.
Apple is going to lose this, and it should, under the law. It should also be free to develop a Secure Enclave that prevents iOS updates while the phone is locked, which sounds as though it is already in development.
> 1) There is no search warrant in the case in question, but an order under the All Writs Act.
I was under the impression that there is a search warrant that the government has secured to search the phone, and an order under the All Writs Act dragooning Apple into writing custom software to support the search authorized by the warrant.
Right, sure. I meant that there is no search warrant at issue in FBI v. Apple (or whatever the case ends up being called). Nobody disputes that the FBI properly effected a search warrant against the home of the shooters.
But Apple is not a party to that document, and its legitimacy is not in dispute in this case.
Are you sure? (Other than your inflammatory "involuntary servitude" silliness.)
A warrant actually can compel third parties to help in a search. What do you think happens when a business gets a warrant? They have to assign someone to go search for and find the records the warrant asks for.
Or if you serve a bank a warrant to open a safe - the bank has to go and help open the safe.
While true, in your examples the third party is obtaining something that is already within their possession and they are not being asked to create something new. There is a reasonableness to searching for a document, but essentially Apple is being asked to create something that does not exist.
Think about the first court case where this "FBiOS" is used against a person on trial. Wouldn't the defense have a potential field day with this, and wouldn't Apple then be dragged into every trial to testify? I could see the defense claiming that, because the OS was replaced, it is impossible to determine for sure what the original device contained versus what the government may have added to it.
Essentially from my very limited understanding of forensic techniques, techs are not allowed to ever add software to a device and can only search the device. Otherwise they are essentially "modifying" evidence. But I am not an expert in this area so I may not be fully understanding some of the intricacies.
Yeah, it may be true that in this one case they are not looking for evidence so much as leads. The problem is this one case affects hundreds or potentially thousands of others in the future, by setting legal precedent.
Based on news reports, there are currently hundreds of criminal cases and many prosecutors waiting to use the "FBiOS" once it is developed. If the DOJ's goal were strictly terrorism, there are alternative ways to solve this, and Apple would likely be more accommodating, as it appears they have been in the past. But the DOJ is now looking to establish legal precedent within the courts so it can be used routinely, and not just for terrorism cases.
No. That is what the draft does. And if it is legal and constitutional for the government to take a kid, ship him halfway across the world, and get him killed, I think that commenting out two ifs in a codebase and recompiling is considerably less onerous.
The draft is authorized by express constitutional provisions regarding the militia and powers related thereto assigned to Congress (for providing rules for calling them into service) and the President (for directing them once so called) and implemented through statutes adopted by Congress.
Unless you imagine that the All Writs Act is intended to be an exercise of Congress's powers relating to the militia, that's not really going to support this order.
Well, I guess if this gives the government too much trouble, they will forbid phones in the USA from having encryption at all. Just like forbidding torrent software on your computer, or radar detectors in your car (in some countries).
All simple, available tech products, but not legal because they bother them...
I think the FBI might have made a mistake in picking a privacy fight with a gay, middle-aged Alabaman who controls the (2nd) most valuable company on Earth.
"and whose brand has taken a strong market stance on privacy and security."
I think people expect Microsoft and Google to cave given how cavalier they've been with user data, particularly of late, so seeing them back this is a bit two-faced.
However for Apple, this could have a significant impact on how their brand is perceived in the market depending on how high of a priority the "privacy/security" aspect is with their customer-base. They will fight this as hard as they can because those dollars aren't just wasted legal costs, they are building trust in the brand since they are putting their money where their mouth is.
Former Apple employee here. My impression is that Tim and other high level management are really committed to user privacy in a very Jobsian way. It isn't something that's being driven by some marketing person (although they are clearly using it that way). Honestly, I doubt their business would suffer if they caved. It's their philosophy. I saw it at all levels within the company. I think their stance on user privacy is a natural outcome of the need-to-know company culture within Apple.
Great point. And I think my comment may have implied more of a "this is a marketing play" message than I intended.
Everything Apple has done over the past few years has really cemented its stance on privacy/security as a top priority, even when it means sacrificing some obvious revenue in the short term.
A more accurate way to phrase this is I feel privacy and security are now a part of Apple's DNA much like design has been historically. It isn't just what the leadership stands for, it is what the company stands for. So when the question arises internally of "how far do we fight this," I'm left with the distinct impression that the immediate response is "as far as we need to in order to win."
Suppose Apple has chosen that strategy and either loses or wins the trial. How different would its market share look, considering that realistically it's largely a prestige brand, and given the relatively low popularity of secure Android phones like the Blackphone?
Tough call. It would certainly be one of the most public discussions around government privacy we've had with regards to mobile phones.
Apple would take a big PR hit from being associated with having insecure phones, but that would also raise the question of whether other phones are any better (or worse). So it may be a wash all around.
I definitely agree. This, combined with Apple open-sourcing Swift, makes it feel like Apple is grabbing some of that coveted tech/developer-friendly cachet that Google and Amazon seem to have over them.
It's surprising that this is rarely mentioned in this debate. The US is pretty open and tolerant now, but Cook has lived through times where being gay would destroy a career. Even now there are countries where being gay can get you killed, and I'm sure this is part of the reason he's fighting as hard as he is.
Fair enough, although that has the unintended side-effect of very slightly enriching people without your scruples.
Slight tangent - this is a variant of the Trolley Problem. I think it is sometimes morally good to do something slightly morally bad in order to acquire significant resources which can be used in a much more impactful way. For example, I don't think it's hard to argue that it is net morally good to make millions in slightly sleazy mid-2000s affiliate advertising and then give away most of the money to the Against Malaria Foundation.
I agree that it is always difficult to know where to draw the moral line in cases like this, but if the House of Saud decides to sell Aramco, it will be because they are desperate for money. Anyone buying shares will only increase the price, and hence the amount of money going towards propping up what is a totally corrupt theocracy. The sooner the House of Saud is forced by lack of cash to join the 21st century, the better.
Yes, they're comparing the market cap in the article, which changes like every 5 minutes or with stronger gusts of wind. At current market cap AAPL is about 50B more valuable than GOOG. Hence Apple is again the most valuable publicly traded company in the world.
Rather this is what should be happening. Principals set forth their arguments, congress steps in and sets the rules which then the courts interpret and settle.
Government and manufacturer and service provider along with the constituency coming to a decision on what should and should not be allowable.
As I understand it, the government is setting up a commission to investigate this question, as it should [1].
As for Mr. Cook, I agree: what he speculates about should not happen, and as far as I know is not happening. He's essentially setting up a strawperson argument when he states: "If a court can ask us to write this software, think about what else they could ask us to write. Maybe it's an operating system for surveillance, maybe the ability for law enforcement to turn on the camera."
No, the enforcement arms of the government should obey the law and leave making laws to Congress and the constituency. CALEA specifically prohibits government agents from ordering private companies to adopt "any specific design of equipment, facilities, services, features, or system configuration". Congress considered this authority and decided not to grant it to the police.
Precisely, and this is what is beginning to happen. Congress is now taking up the issue, as it should[1]. Mr. Cook, on the other hand, engages in speculation to bolster his case, which is fine, but he should also be called out on it.
I think it's pretty obvious that Cook's statement is speculation. Why "call him out" on it? I think his speculation is actually fairly plausible, but even if I didn't, it's certainly not out of place there.
Because it's not all that different from people who speculate that data valuable to the terrorism investigation is on that phone and others. Maybe there is, maybe there isn't, but we should not make policy based on speculation.
We should leave policy to policymakers and courts as well of course as constituents' input into the congress.
The question for the people, Congress, and the courts is: should phones be different from other communication media and data repositories, such that warrants cannot be executed to reveal or discover data in the course of a criminal investigation? Will the people, Congress, and the courts decide that such data should remain dark, or that it can be revealed by court order? It's rather a basic question.
"If a court can ask us to write this software, think about what else they could ask us to write. Maybe it’s an operating system for surveillance, maybe the ability for law enforcement to turn on the camera."
I don't find it speculative to point out that setting a legal precedent here could have far reaching consequences well beyond what is being asked of Apple specifically in this case.
Did you watch the full interview? Cook is saying the AWA should not be used to compel Apple to act. He's saying that there should be public discourse and that congress should make any necessary laws so that the public has a chance to weigh in during elections.
The government is using courts to decide this. Judges are unelected officials, and that is absolutely in the process of happening.
Conspiracy theory warning: might I ask, is there a remote possibility that this is a coordinated attempt by both parties to make the world feel safe about iOS while a backdoor already exists? Like the Allies and decrypted Enigma data during WW2, used only on strategic occasions to prevent suspicion? Should we trust a closed-source handset?
That would be quite a coordination. I doubt it, because the debate will inform the public about encryption, and there are people watching Apple's moves here who are prepared to call them out if they give anything but the best security in future versions of the iPhone.
Personally, I track the software Snowden endorses and use that as a gauge. You can google it and compare the security features of the software out there. The ones Snowden uses are open source and have the most security features.
That doesn't say anything about Apple, but the idea that this is coordinated, given Cook's private life and his stance on privacy, seems slim to me. Note also that Apple was the last major company to be forcibly joined into the PRISM program. They've always been resistant.
The risk of coordinating an attempt would be huge. If info got out, there'd be irreparable damage to a massive company.
Maybe they have their price and literally billions of dollars could have made it happen, but I think it's pretty easy to understand how their success has instead come from attractive and effective products.
I have wondered before about whether intelligence organisations would be smart to co-opt/fund rising communications companies, but there'd be too many people that'd have to be kept quiet. Probably not practical.
There is an inordinate amount of noise being made about the issue.
Often people make less noise when really trying to get what they want. Sometimes noise is for other reasons. Not saying it's a conspiracy but jeeze.... enough already! We know, we know... Apple can be trusted 100% and are our bestest friends and even those mean governments can't come between us! We get it. And a few of us even might believe it.
It's unclear if you're trolling, but assuming you're not, the analogy doesn't work because digital things and physical things behave in fundamentally different ways.
In the physical example, according to the FBI's "just this one iPhone" claim, one would reasonably expect that the company could then destroy the hypothetical master key as soon as it's used. This makes sense in a physical world, but the analogy breaks down completely in a digital world. Returning your spider doesn't solve the problem: http://www.27bslash6.com/overdue.html
In the digital world, you can't guarantee that the key hasn't been copied, and you can't guarantee that destroying the "original instance" of the key destroys all others.
The custom OS that the FBI is asking Apple to build will also take development time, and likely take more than one person to develop, meaning that if there's a security breach during the OS's development, any number of intermediate builds may also be stolen during development, before the FBI can even access the particular phone in question.
> It's unclear if you're trolling, but assuming you're not, the analogy doesn't work because digital things and physical things behave in fundamentally different ways.
I'm not trolling at all. Genuine question.
Assuming what Apple said in the open letter is true:
> Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
It's like asking the company to create a tool/master key able to open "just that safe."
I know very well that in software you can make the OS auto-destruct (a TTL), etc., but still: imagine that after they get the OS, they copy it and start reverse engineering it.
>In the physical example, according to the FBI's "just this one iPhone" claim, one would reasonably expect that the company could then destroy the hypothetical master key as soon as it's used. This makes sense in a physical world, but the analogy breaks down completely in a digital world.
Why not? Can't Apple just delete it?
>In the digital world, you can't guarantee that the key hasn't been copied, and you can't guarantee that destroying the "original instance" of the key destroys all others.
You can't do that in the physical world either. But you can be pretty damn certain that it isn't done.
>The custom OS that the FBI is asking Apple to build will also take development time, and likely take more than one person to develop, meaning that if there's a security breach during the OS's development, any number of intermediate builds may also be stolen during development, before the FBI can even access the particular phone in question.
Apple already takes this risk with every single iOS release.
Yes, but leaks of pre-release iOS software can't be installed on locked phones as a means of unlocking them, so the risk is not nearly the same.
If you really want to carry this analogy to term, fine, I'll concede that you can't be 100% sure that a physical key wasn't copied before you destroy it, but then you must take into consideration the complexity of manufacture and duplication - if the complexity of duplication is high, and you only make one, and guard it at all times, you can have a fairly high confidence (barring ridiculous film plots) that the key you're destroying is the only one.
With digital things, the complexity of duplication is beyond trivial. One copy leaks, and instantly there are tens of thousands, if not millions of copies in all corners of the internet. Physical objects simply do not behave this way.
Are you kidding? Law enforcement across the country wants to use this on thousands of phones.
Even if it were truly going to be used only on one phone, there's still a risk that the signed software gets out. That risk is exponentially increased when you realize that Apple will be required to retain this software for responses to law enforcement requests indefinitely.
>> It's unclear if you're trolling, but assuming you're not, the analogy doesn't work because digital things and physical things behave in fundamentally different ways.
I'm not sure I fully agree that things are different because it's physical over digital.
Let's say that the way a safe manufacturer could circumvent the lock on the safe is to build a custom tool that can try every combination much faster than any currently known method. Such a tool could be reverse engineered (i.e. copied) after being returned.
I agree that copying software is far easier than copying hardware, but it's the design of the tool that's important, not its physical representation.
The only way I would agree with the FBI's "just this one iPhone" statement is if they got Apple to crack it and just return the data, but not the method. Which of course they won't do.
> I'm not sure I fully agree that things are different because it's physical over digital.
...
> I agree that copying software is far easier than copying hardware, but it's the design of the tool that's important, not its physical representation.
So you do understand that software is different. It is easily copyable, and all software is copyable.
> The only way I would agree with the FBI's "just this one iPhone" statement is if they got Apple to crack it and they just returned the data but not the method. Which of course they wont do.
I wouldn't even agree to that. Creating a signed copy of this software creates a vulnerability in iPhones worldwide that does not exist today.
Is it admissible (under current law) to ask the company to make a master key?
That's exactly what's under debate. The FBI believes the All Writs Act does allow that, and Apple (and many others) disagree.
The safe analogy is a bit flawed[1], but to play it out, the debate is over whether or not a company should be allowed to build and sell an uncrackable safe. Law enforcement appears to say no, and privacy advocates say yes.
[1] ... since if you build a safe that's actually uncrackable, it's by definition impossible to later modify it so it will accept some sort of master key, whereas adding software to allow disabling the passcode prompt on a phone is doable after it's built.
Please read more about this case and inform yourself. Creating or giving that master key, Cook says, will put every device owner at risk. And given that there are millions of iPhone users worldwide, that risk is huge.
Creating signed software that Apple feels is unsafe is not something Apple should be compelled to do.
Additionally, we all know very well that eventually law enforcement will demand their own copy of the software, or find it with the help of the NSA.
By the way, I'm totally with you that Apple should improve the security of their phones. But this case and that feature are separate issues. You can advocate open hardware & software and defend Apple's stand against the government compelling companies to act at the same time. Those ideas are not mutually exclusive. Hate the game not the player.
If they ask for the software, Apple should deny that request because that's very different from asking for assistance on a case-by-case basis with a court order for each one.
The question reasonable, non-technical people are going to ask is: how is this different than asking to search a house. There's a court order, it's not just the FBI acting on its own. They're asking Apple to do it, and not hand over a master key, which would not be reasonable at all.
Somehow Apple has managed to keep its signing key from getting out into the wild. Only Apple knows the systems required to keep that safe.
The same cannot be said of signed copies of software.
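To make that custody asymmetry concrete, here's a hedged Python sketch using Ed25519 from the `cryptography` package as a stand-in (Apple's real signing scheme is different and more elaborate). The point: verifying, and therefore installing, a signed build needs only the public key and the signed blob, so a leaked signed artifact can never be "un-leaked," whereas the private key can stay locked away.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    signing_key = Ed25519PrivateKey.generate()  # never leaves "Apple"
    public_key = signing_key.public_key()       # baked into every device

    firmware = b"fbios-build-1.0"
    signature = signing_key.sign(firmware)      # done once, inside "Apple"

    # Anyone holding (firmware, signature) now passes the device's check;
    # no private key is required, so the artifact itself must never leak.
    public_key.verify(signature, firmware)      # raises InvalidSignature if tampered
    print("device would accept this build")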
> If they ask for the software, Apple should deny that request because that's very different from asking for assistance on a case-by-case basis with a court order for each one.
If it comes to that, I'm sure they will put up the same fight. What's the difference to you whether they fight now or later? Must Apple act exactly as you would?
> The question reasonable, non-technical people are going to ask is: how is this different than asking to search a house. There's a court order, it's not just the FBI acting on its own. They're asking Apple to do it, and not hand over a master key, which would not be reasonable at all.
To those reasonable, non-technical people I would say the following: "If we create special signed software for the government for unlocking the iPhone, then terrorists will just use something else that is out of the reach of our government. We will have weakened our privacy and this will not keep us safe from terrorism. If the signed software gets into the wrong hands, it could be used by anyone in the world to hack into iPhones and get location data on your kids, your home, and other information you'd rather keep private."
> Somehow Apple has managed to keep its signing key from getting out into the wild. Only Apple knows the systems required to keep that safe.
> The same cannot be said of signed copies of software.
Presumably, you could employ the same systems for both pieces of data, since it should all stay within Apple.
> What's the difference to you whether they fight now or later?
Because they should respond to the request that was made on its merits, not on "what might happen".
I'm just pointing out the kinds of arguments that reasonable, non technical people are making.
For what it's worth, I'm pretty technical, and all these arguments in Apple's favor seem a lot like arguments for why law enforcement is generally untrustworthy, warrants are meaningless, and everyone should resist every search under every circumstance because what will they ask for next?
Aside from the fact that's all probably actually true, that is not the case Cook is making. Cook is, in fact, trying to pretend like this represents a cryptographic back door, when it plainly absolutely does not.
> Cook is, in fact, trying to pretend like this represents a cryptographic back door, when it plainly absolutely does not.
He's not saying that. He's saying, yes there's a back door, and no we don't want to open it for you because once we do, everyone will have a chance to open the door, including possibly your neighbor or a foreign government who wants to snoop on US embassy employees.
What politicians do not understand is that you can't outlaw encryption. You can't tell math to stop working; it just keeps working. If they understood that, they would not pursue using the AWA against Apple. They would instead have more open discussions with Apple.
But our politicians keep thinking there must be some way around this, and there isn't. The sooner they get it into their heads that terrorists will simply choose a new device, the better, because
(1) In their attempt to shore up technology, the current administration is going to do more damage to the US tech industry. They already helped destroy Yahoo
and
(2) Law enforcement needs to find a way to keep people safe other than relying on back doors to seemingly secure devices. There are no doubt terrorists using secure communications now that the NSA cannot access, and no amount of begging or pleading from the DOJ is going to change the fact that math is not governable by humans.
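To underline the "you can't tell math to stop working" point: strong, openly specified encryption is a few lines of code with any modern library. A minimal Python example using AES-256-GCM from the widely used `cryptography` package; nothing about it depends on any vendor's, or government's, blessing.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # 256-bit random key
    nonce = os.urandom(12)                     # 96-bit nonce; never reuse per key
    aead = AESGCM(key)

    ciphertext = aead.encrypt(nonce, b"meet at the usual place", None)
    print(aead.decrypt(nonce, ciphertext, None))  # b'meet at the usual place'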
There's this man-on-the-street video where people at an organic vegetable market are asked to sign a petition outlawing dihydrogen monoxide. Many do, as they are advised it is so corrosive that it cuts through rock!
I'm starting to think gov't spying is the techie's equivalent of a boogeyman. I fear this is born of a shallow understanding of dystopian surveillance-state sci-fi.
Currently, law enforcement can already intercept your calls and emails. Yes, that means compelling your telco provider to do extra work. They can already come into your house and take anything they want: read any document, open any safe. All provided they have a court order. If they do any of this without a court order, they go to jail, and all the evidence collected illegally gets thrown out instead of being used against you.
Because we have standards for what can be used as evidence in a trial, we continue to have liberty, not because law enforcement has been kept in the dark technologically. In fact, it's some of the least technologically advanced societies, like North Korea and Afghanistan under the Taliban (which banned music), that come to resemble sci-fi surveillance states.
Finally: your smartphone's logs, your DNA, public surveillance video are far far far likelier to exonerate you if you're innocent than wrongfully convict you. We're thinking about this issue all wrong and technology will continue to play a role of helping the traditionally disenfranchised resist police over-reach - think what portable cameras have done to police interactions already.
Remember, Steve Jobs's vision for Apple was always a closed system end to end. He was security-minded before the net. It is a shame that the government has taken the low road by trying this case in the court of public opinion after they permanently locked themselves out of the iPhone! The worst part is using the grieving victims' families to make Apple look like the bad guys. I kind of wish Jobs were alive for this fight!

Tim Cook can't say it, but 14 people killed by some kooks vs. millions of people's safety being intentionally compromised is not a bargain. 14 people die in a mass shooting damn near every month! What happened to good police work? Doesn't the NSA have everyone's communication data on a server somewhere anyway? If the government is in the business of ordering businesses to work against their interests for the sake of public safety, where is the court order for tobacco companies to permanently stop selling cigarettes? Cigarettes kill hundreds of thousands of people each year.

Luckily Apple is sitting on enough cash to fight this... I just hope their stock price and future aren't hurt by this deliberate attempt to assassinate their brand.
> the government has taken the low road by trying this case in the court of public opinion
But they're not. They're trying the case in front of unelected justices. The court of public opinion would be to create a law with Congress, giving us a chance to elect people who will represent our views.
Security is huge for advancement and social progress, as well as for business secrecy, innovation, and market timing.
This hits the broad political spectrum, and I can't believe our War on Terror has led us here. We now (legally, somehow?) have internal systems that attack our own freedoms far more than any terror attack could; it is the legislation after the fact that has taken our freedoms.
We are here now, where a company has to go public to get back to the plain freedom of the Fourth Amendment: to be secure in your papers (data). That was written for a reason, at a time when people understood how powers meant to protect could be abused. This is a frontline battle.
Thank you Apple, the new killer feature is security...
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
Unless congress is willing to outlaw the ability of Apple (or any OEM) to securely architect their phone hardware such that it is impossible for them (the OEM) to be helpful to law enforcement, this issue will soon solve itself. These OEMs simply do not want to be in the middle of these investigations.
The article doesn't tell the complete story of Cook's position.
I would instead summarize by saying that Tim is not afraid to call out the government on its lack of communication, and he admits his discomfort with the position he is in.
Here are some relevant excerpts from the interview [1]
----------------------
15:12
David Muir: You have talked to the President before on these issues [yes] of privacy and security. Are you disappointed there wasn't more of a dialog with the administration before this swift action from the justice department?
Tim Cook: Yes.
David Muir: You wish there was more done?
Tim Cook: Yes. And I think there should've been... We found out about the filing from the press. And, I don't think that's the way it should be run. And I don't think that something so important to this country should be handled in this way.
---------------------
26:00
David Muir: I'm curious Tim. Did you ever think that you would find yourself at the center of such a crucial national debate?
Tim Cook: No. This is not a position that we would like to be in. It is a very uncomfortable position. To oppose your government on something doesn't feel good. And to oppose it on something where we are advocating for civil liberties which they are supposed to protect, it is incredibly ironic. But this is where we find ourselves. So for all of those people who want to have a voice, but they're afraid, we are standing up. And we are standing up for our customers because protecting them, we view, is our job. And I hope, and I think, I'm very optimistic, I think we will come together. I don't know what will happen. But I think we will come together and there will be one path forward. The US always comes out of these things well. I feel very good that the debate is going on. Even when people disagree with us, it is good that the debate is happening. That's what makes this country so special.
David Muir: And for you, personally, has this been the biggest challenge in being CEO of Apple that you've faced?
Tim Cook: I've faced a lot of challenges. But I've never felt, sort of the government apparatus. And so yes I would say this is right up there. But it's not my sole focus by any means. We're focused on making great products.
David Muir: ... Are you prepared to take this all the way to the supreme court?
Tim Cook: We would be prepared to take this issue all the way, yes. Because I think it's that important for America. This should not be decided court by court by court. If you decide that it's okay to force a company to do something that they think is bad for hundreds of millions of people. Then, think about this for a minute. This case is an awful case. There is no worse case than this case. But there may be a judge in a different district that feels that this case should apply to a divorce case. There may be one in the next state over that thinks it should apply to tax case. Another state over it might apply to a robbery. And so you begin to say, wait a minute, this isn't how this should happen. If there is going to be a law, then it should be done out in the open for people so their voices are heard through their representatives in congress.
David Muir: And if congress decided that there's this small category, this was a terrorist's iPhone. If congress decided that, if the American people signed off on that, you'd entertain it.
Tim Cook: Now let me be clear. At the end of the day, we have to follow the law. Just like everybody else, we have to follow the law. What is going on right now is we're having our voices be heard. And I would encourage everyone who wants to have a voice and wants to have an opinion to make sure that their voice is heard.
If Tim Cook wants this not to happen in this country (or anywhere), then he needs to ensure that future Apple products are sufficiently secure that he can say, "there's nothing I can do to help; this device cannot be broken into by any method we know of."
That doesn't actually prevent the government from issuing orders with dire consequences for violation to manufacturers to break into devices on behalf of the government, it just means that such orders will leave manufacturers no choice but to accept the dire consequences -- and, if they can't, to stop manufacturing devices that they can't break into.
If you are referring to the law invoked in the Apple case (the All Writs Act), a company unable to comply does not face legal consequences. From Wikipedia:
"In the case U.S. v. New York Telephone Co. 434 U.S. 159 (1977), the Supreme Court established a three-factor test for the admissible application of the All Writs Act: the party ordered to perform an action cannot be too far removed from the case, the government's request cannot impose an undue burden on that party, and the party's assistance is necessary."[1]
I'm not a lawyer (and law can be tricky), but forcing a company to do something impossible (breaking into a truly secure device) seems very clearly to be an "undue burden". The real issue here is that Apple has created a device that they themselves are able to backdoor, and the clear solution is to remove that capability from future devices.
Actually, it does. After all, the warrant was issued after Apple claimed they have the technical capability to comply. I am willing to bet that even judges will admit that the laws of physics are a higher power than they are.
Apple screwed themselves by not creating, let's say, double-signed bootloaders - the code must be signed both by the owner and by Apple to run. In that case the only plausible attack is microcode on the CPU, or Apple telling the federal government: here are the transistors you need - good luck getting to them - we don't have the equipment to dig into microchips.
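To make the double-signing idea concrete, here is a minimal sketch using Ed25519 signatures from Python's cryptography package. The key names and the "firmware image" are invented for illustration - a real bootloader would do this check in boot ROM, not Python - but the logic is the same: refuse to boot unless both signatures verify.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def boot_allowed(image, apple_sig, apple_pub, owner_sig, owner_pub):
        # Boot only if BOTH the manufacturer and the device owner signed the image.
        try:
            apple_pub.verify(apple_sig, image)  # raises InvalidSignature on failure
            owner_pub.verify(owner_sig, image)
        except InvalidSignature:
            return False
        return True

    apple_key = Ed25519PrivateKey.generate()  # throwaway demo keys
    owner_key = Ed25519PrivateKey.generate()
    image = b"hypothetical firmware image"

    # Signed by both parties: boots.
    print(boot_allowed(image, apple_key.sign(image), apple_key.public_key(),
                       owner_key.sign(image), owner_key.public_key()))  # True

    # Signed by Apple alone (an "FBiOS" scenario): refused.
    print(boot_allowed(image, apple_key.sign(image), apple_key.public_key(),
                       b"\x00" * 64, owner_key.public_key()))  # False

Under such a scheme a court could still order Apple to sign a malicious update, but without the owner's key it would never run.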
But if you accept that evolution of the internet in this direction is inevitable and a natural process of history, then this is like saying that "there's nothing stopping the government from issuing orders that gravity be temporarily disabled."
> But if you accept that evolution of the internet in this direction is inevitable
I don't accept that the evolution of the internet in the direction of consumer devices with manufacturer-unbreakable security is inevitable, particularly in the face of governments actively punishing manufacturers for failing to break into devices at the governments' demand.
In fact, I would argue that such an evolution is quite evitable.
There is nothing "inevitable" about the process of history. Governments can and have limited the ability of companies to secure data in the past and they will do it again, given the chance.
Do you think this would be happening if this was an IBM product and Cook was in charge of IBM instead of Apple? I think the corporation itself brings out some real politico-cultural prejudices in people.
Different companies have different values, which foster different cultures and attract different people to work there. Apple's credo is "think different." I've no idea what IBM's was, but it's an interesting question whether or not they were friendly with the government on intelligence matters back in their heyday.
Cook should play the card he has: Cook is known to be gay. What if "being gay" were illegal and the government wanted to "find all the gays and round them up"?
He should use that card as a last resort. THAT should help people understand the nefariousness of this matter.
I imagine that the subset of people who are in the "save lives, stop helping the terrorists" camp are generally not in the "worry about gay people under a hypothetical future government" camp.
I'm still not convinced Cook is right. This phone may very well contain information which could save a lot of lives.
According to the article, the ballpark is 2 weeks of engineering, so it could probably be done in even less time.
Why so quickly? Because the user chose an extremely weak passcode.
Security is also the responsibility of the user, not only the tech companies. If you are really concerned with your own privacy, then you shouldn't encrypt your data with a four-digit passcode, which is relatively easy to crack.
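Some back-of-envelope arithmetic shows why. The ~80 ms per guess below is the hardware key-derivation cost Apple has cited for these devices - treat the exact figure as an assumption:

    # Worst-case brute-force time once the retry limit and escalating
    # delays are removed. Assumes ~80 ms of key derivation per guess.
    SECONDS_PER_GUESS = 0.08

    for digits in (4, 6, 10):
        hours = 10 ** digits * SECONDS_PER_GUESS / 3600
        print(f"{digits}-digit passcode: {hours:,.1f} hours worst case")

    # 4-digit passcode: 0.2 hours worst case   (~13 minutes)
    # 6-digit passcode: 22.2 hours worst case
    # 10-digit passcode: 222,222.2 hours worst case (~25 years)

A long alphanumeric passphrase pushes the worst case out by orders of magnitude, which is the point: the weak link was the passcode, not the crypto.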
Considering the crime committed, and the relative ease of cracking the phone, I'm in favor of the FBI - as long as the firmware is not handed over to them, which I presume should be possible.
People worrying about Apple engineers smuggling the software outside should be worried every day. People with access to the codebase can probably do this already.
It obviously is not about this particular phone, but if you are still convinced that is the case, nothing I say will change your mind.
The real reason is the millions of other phones out there. If there was any indication that this couple was acting as the vanguard of a large number of others acting in a similar way, what exactly do you think the chances are that the data would have been left on this phone, when they took the effort of destroying their other phone?
This is the prelude to a large number of fishing expeditions, and it seems the FBI made this particular effort possible by first ordering a third party to make it impossible to reach the data in any other way. If they actually cared about that data, you'd think they would have contacted Apple right away about the best possible way to get it out, without bringing this to a head over a terrorist case where lots of panicky people would make the wrong decision out of fear.
If you really are concerned about your privacy: don't store your important stuff on a phone, or a computer for that matter.
Ironically, terrorists on a suicide mission don't need ironclad encryption. They just need to stay ahead of the law long enough to do their deed, and to make sure the data flows in only one direction through their organization without leaving a source address for any transmissions. So actual terrorists paying even remote attention to their operational security could communicate within their cell in plain text, using a few silly code words, and you'd only know what they were up to when the ambulances arrived.
No, I'm not saying that. I'm saying that they caused the situation to exist in the first place, whether or not it is deliberate is anybody's guess at this point in time.
Your Apple engineers smuggling code example seems a little out of left field.
I haven't been following this super closely, but as I recall the FBI is essentially asking Apple to provide them with a backdoored OS that lets them get around the passcode attempt limits. Apple is concerned that this both sets a precedent and provides the FBI with a way to get around any passcode on any iOS device they possess. Cook has stated repeatedly that there is no way to guarantee that a backdoored OS would only be used for this one instance.
You say you're in favor of the FBI's request as long as the firmware isn't handed to them - but isn't that exactly what they're asking for?
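For reference, the order asks Apple to remove three protections: the auto-erase after ten failed attempts, the escalating delays between attempts, and the restriction that passcodes be typed by hand on the screen. Here is a toy model of that guard logic - the delay values approximate Apple's documented schedule, but read them as illustration, not the shipped numbers:

    import time

    MAX_ATTEMPTS = 10                                  # protection 1: auto-erase after 10
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # protection 2: forced waits (seconds)

    def check_passcode(attempt, guess, passcode):
        # Protection 3: on a real device, the guess can only arrive via the
        # touchscreen, so guessing can't be automated over a cable.
        time.sleep(DELAYS.get(attempt, 0))
        if guess == passcode:
            return "unlocked"
        return "device wiped" if attempt >= MAX_ATTEMPTS else "try again"

    # The requested firmware would raise MAX_ATTEMPTS to infinity, zero out
    # DELAYS, and accept guesses electronically - reducing a 4-digit passcode
    # to minutes of brute force.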
Allegedly, the FBI already has the information (via iCloud backup), or at least had access to it before an error on the FBI's part.
In any case, Apple has already been known to comply with requests to hand over data on an iPhone. If it were simply the case that they needed the information, Apple might have done the whole thing behind closed doors.
The FBI decided to bring this case out into the open, and given the reports about even more phones that need to be cracked open, you have to consider that this may have been a move made to force Apple's hand.
It's not this particular case that has Apple and others up in arms. It's the bone-chilling precedent that the government can compel Apple to create software to compromise their own encryption protections.
The FBI directed the custodians of the device to reset the iCloud password, which brought about this situation. Why would they do that? Because they want the precedent.
IANAL, but by that action alone the FBI's request should be dismissed with prejudice. Besides, most of the data they are seeking can be obtained by other means (such as from phone carriers), as Apple points out in its filing.
We're talking about consumer electronics. This encryption is strong enough that your wife can't see who you called last night. But now it's being used by terrorists who plan attacks on a country.
I mean, an iPhone is not a heavily secured server storing Iran's nuclear secrets. If you really need to protect information, you as a user also have a responsibility.
Contrary to the government's contention that CALEA is inapplicable to this dispute, Congress declared via CALEA that the government cannot dictate to providers of electronic communications services or manufacturers of telecommunications equipment any specific equipment design or software configuration.
In the section of CALEA entitled “Design of features and systems configurations,” 47 U.S.C. § 1002(b)(1), the statute says that it “does not authorize any law enforcement agency or officer —
(1) to require any specific design of equipment, facilities, services, features, or system configurations to be adopted by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.
(2) to prohibit the adoption of any equipment, facility, service, or feature by any provider of a wire or electronic communication service, any manufacturer of telecommunications equipment, or any provider of telecommunications support services.
True! That's why I don't understand the issue at all. It's probably the reason the FBI went out in the open: to get public pressure to get what they want.
Anyway, China and Russia also act according to their own laws, so we cannot rely on that. What if Trump takes his first shit in the White House and wipes his ass with every privacy law he can find?
Hackers should be independent of laws and provide their own tooling, which can protect people like Snowden.
People who care about privacy must take care of themselves. And if some terrorist decides to kill people and uses a 4-digit passcode, I won't defend the guy.
I'm not against privacy - on the contrary, I've worked on compliance/risk management software. (Some of) you guys have no idea about the crazy amount of data which is acquired about almost every person with an SSN. Kept in hundreds of databases, balancing on legal boundaries.
I see your point. In the case of countries without a strong rule of law, one doesn't want to rely on legislation to protect data. Better to possess a technical means, then.
Where my opinion differs is regarding "[people] who care about privacy must take care of themselves." I don't think an activist should need deep technical knowledge of cryptography in order to do activism. Requiring that establishes a de facto barrier to entry. (Cryptography is hard, and not everyone has the means to acquire the expertise.)
I would actually go a step further and argue that knowledgeable hackers have a responsibility to provide activists (whose objectives they agree with) with the tools needed to communicate securely. In the age of media ubiquity, privacy rights are sacrosanct. Something along the lines of what Open Whisper Systems is doing.
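To make "tools" concrete, here is a minimal sketch of authenticated public-key encryption with PyNaCl (the libsodium bindings). This is not Open Whisper Systems' actual protocol - Signal adds forward secrecy and ratcheting on top - just the basic primitive such tools are built from:

    from nacl.public import PrivateKey, Box

    alice_key = PrivateKey.generate()  # each party keeps the private half
    bob_key = PrivateKey.generate()    # and publishes the public half

    # Alice encrypts to Bob. Box both encrypts and authenticates, so Bob
    # knows the message came from Alice and was not altered in transit.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

    # Bob decrypts with the mirrored Box.
    print(Box(bob_key, alice_key.public_key).decrypt(ciphertext))  # b'meet at noon'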
As for the terrorists' case, I won't defend them or their actions. However, it's still a provable fact that the FBI was in possession of a means to access the data they are seeking, yet they chose to deliberately reset the iCloud password and create a need for this additional access. Viewed in the kindest light possible, it's incompetence. Assuming the worst, it's an attempt to manipulate the media and the public into giving them more invasive tools to surveil citizens.
[0] https://en.wikipedia.org/wiki/COINTELPRO
[1] https://en.wikipedia.org/wiki/Harvey_A._Silverglate