Apple – Privacy – Government Information Requests (apple.com)
278 points by declan on Sept 18, 2014 | 204 comments



"On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data"

This is key. The way we engineer software and services can have a major impact on the war against overly invasive government requests. We know that these requests will come; it's our responsibility to design things in a way that, to the greatest extent possible, protects customers from our legal obligations when we're confronted with them.

While this certainly serves their own interests, kudos to Apple for baking this type of consideration into the basic iOS design. They should and will be financially rewarded for it.


What's keeping government agencies from putting keylogging code on the SIM card or baseband processor (whichever has the best access to host cpu/memory) via the carriers to obtain the passcode? Not much I guess.

Has Apple publicly claimed that they will also refuse to push individualized compromising code updates to devices on demand by gov't authorities?


Some very knowledgeable iOS security people have told me how hard it is to break iOS. You need quite a few chained exploits to do anything meaningful. Browsers are pretty much the only things with Read/Write/Execute memory.

Security is always a convenience/security trade-off. iOS is about as good as you can get before inconvenience will turn people to less secure devices.


There is the question of vanilla iOS being susceptible to attack from the NSA or any other malicious attacker, and then there is the question of whether the security design is sufficient such that even Apple could not backdoor your privacy should they desire. The latter is what the original commenter was addressing. I think the statements from Apple do not clarify this matter, and we should assume they can backdoor should they be required to by warrant. I'd need to see detailed technical specifications for their claimed security measures before I would trust these statements (if someone has a RTFM link, much appreciated).


>keylogging code on the SIM card or baseband processor

That won't work.

>Has Apple publicly claimed that they will also refuse to push individualized compromising code updates to devices on demand by gov't authorities?

No. And even though I strongly doubt the government would even try this, and even more strongly doubt Apple would comply, their new tech is telling: Apple Pay and Touch ID use hardware support to protect against even this sort of breach, which shows they are certainly thinking about it. I'm pretty sure Apple won't even have the ability to silently push an OS update (if they did, someone could find out, because it would be in the OS, and then all hell would break loose) - it would have to be accepted by the target, which means there is a reasonably large chance of detection.


Not on the SIM card, but on the baseband processor it very probably will work, since the baseband on many cell phones shares memory with the application processor.


Well it certainly doesn't work like that on an iPhone.


Please illuminate me then as to what the iPhone's processor architecture looks like, seeing as iPhones also use Qualcomm basebands, same as most Android devices...


The baseband processor has no access to the parts of memory it is not meant to touch; possibly some of the early iPhones did allow this, but current models are properly isolated.


Please substantiate with a link.


The baseband firmware is cryptographically signed by Apple (and would thus require their complicity in backdooring), and the A7 models have a so-called Secure Enclave, which handles the encryption and decryption of memory contents.

Link: http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...


Thank you very much, I'll definitely read through the doc over the course of the day. But from a cursory glance it still doesn't follow that rogue software on the baseband cannot compromise your phone - though it is getting more and more difficult to place rogue software on the baseband in the first place.

"And would require Apple's complicity" - Or perhaps rather Qualcomm's...


Considering the iPhone has a custom processor design, this information is most likely confidential. However, it isn't inconceivable that some sort of memory protection similar to an IOMMU could be added to the internal fabric to prevent the baseband processor (or any other peripheral) from accessing arbitrary physical memory.


That's fine, I just wasn't sure how abrittishguy can state his assertions so absolutely as facts.


Why the downvote?


They still don't have physical keyboards.


The quote is extremely misleading. Sure, it's encrypted by your passcode, and that's great.

It's important to note, however, that the passcode is just 4 digits long by default, and could be brute-forced by Apple in milliseconds if they wanted to. So to say that "Apple cannot bypass your passcode" is misleading, as guessing it is absurdly easy.

http://www.slideshare.net/alexeytroshichev/icloud-keychain-3...


It's not quite milliseconds. According to https://www.documentcloud.org/documents/1302613-ios-security... they've increased the iteration count somewhat:

“The passcode is entangled with the device’s UID, so brute-force attempts must be performed on the device under attack. A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds.”

This is still an argument for using a longer code, however, since a simple 4-digit number would only take 800 seconds to brute force.
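To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (assuming the ~80 ms/attempt figure from the document quoted above, and ignoring escalating lockout delays and the optional 10-attempt wipe):

    # Worst-case time to exhaust a passcode space at ~80 ms per attempt
    # (per the Apple iOS Security document quoted above). Illustrative only.
    ATTEMPT_SECONDS = 0.080

    def worst_case_days(alphabet_size, length):
        return alphabet_size ** length * ATTEMPT_SECONDS / 86400

    for desc, k, n in [("4-digit PIN", 10, 4),
                       ("6-digit PIN", 10, 6),
                       ("8-char alphanumeric", 62, 8)]:
        print(f"{desc:20s} {worst_case_days(k, n):18.2f} days")

A 4-digit PIN comes out to about 0.01 days (the 800 seconds above); an 8-character alphanumeric code is already in the hundreds of millions of days.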


This can't be true. If you restore your iCloud backup to a new device, all you need is your passcode. It obviously won't ask you to enter the UID of the device you previously backed up with.


You're not limited to 4 digits. In fact, when you set up Touch ID, the phone prompts you to enter a longer text-based passcode on the basis that you shouldn't have to enter it often.


The government has your fingerprint.


Which doesn't help them much because the fingerprint will only reveal the passcode if they gain physical access to the device's "secure element" and its contents (which is reportedly hard).


It is really surprising how much HN turns a blind eye to / lacks imagination for potential security issues just because it's Apple.

Well, actually it isn't surprising at all.


I'm inclined to reverse that. Whenever Apple tells the world about a way they're attempting to protect their users' privacy above and beyond what any other for-profit technology company of comparable size and scope has done, HN immediately tries to deconstruct it--which is by and large good, since we need to remember that no security model is perfect. But a small but vocal subset always demonstrates a remarkable level of imagination in presenting scenarios in which Apple can actively conspire against their users while remaining within the letter of their guarantee. I'm not positive this happens just because it's Apple, but I have my suspicions.


You've got a lot of reasonable responses above.


I am certainly not "HN". If you want to argue/discuss, do so. I'm well aware that a "secure element" is never 100% unbreakable. But in the use case the GP mentioned (government broadly gathers data), hiding the data in a separate hardware compartment will at least help.


I think it's just PR bullshit. Apple has been known to abuse the law. I'm sure some government gave them the green light to lie about this and have another backdoor.


My government doesn't have my fingerprint, but the US government does...


If you grew up in the US, they already have it. Every elementary school kid in the US gets fingerprinted.

EDIT: It appears I was wrong, this was only in LA county.


Uh, what? Neither I nor my children have had such an experience in public school.


Not the point. The point is that they've done enough to legally respond to government requests by saying "we don't have a way to access the data". Hacking customers' phones, regardless of how easy it might be, is far beyond the scope of anything that a US court can order a private party to do.


> is far beyond the scope of anything that a US court can order a private party to do.

They could order Apple to disclose signing keys so that the government can install spyware themselves. See http://en.wikipedia.org/wiki/Lavabit#Suspension_and_gag_orde... for a case where they have done something similar before.


>the government can install spyware themselves

It would have to be an OS update since applications don't have access to that stuff, even if signed with an apple key.


That's probably not quite true. Take an app that the user is already likely to have given access to their photos, like Facebook. Create a malicious app with the same app identifiers, sign, and push to the user. At that point, the phone thinks it's Facebook, and should allow access.

That said, it'd be kind of unsubtle, and they'd probably get caught.


Ahh yes, I assumed we were talking about messages.


I assumed it was primarily the review process: ensuring that the proper sandbox configuration is included in the bundle and applied to apps at runtime, checking for private API use automatically, etc. - and that Apple could probably ignore their own restrictions if they chose to, especially those private APIs.

iOS won't let even the most permissively configured, "unreviewed" app do things that apps aren't supposed to be able to do?


But it's something the NSA is more than capable of accomplishing.


It is only protected with the passcode when on the physical device - Apple cannot brute-force it because they don't have the capability to access it remotely (they can only wipe it).


Is this still the case in iOS 8? I know it was the case previously, but has anything changed in iOS 8 to prevent Apple from just brute-forcing the 10,000 combinations with a mounted flash drive?


There is a "self destruct" option to wipe the iPhone after 10 failed pass code attempts. It's not bulletproof against a bruit-force but it's something


I am aware of one incident of an Australian state police forensics unit being unable to circumvent a non-simple passcode lock on iOS 6.1.3 with physical access to the device (an iPhone 4S), set to wipe all user data after 10 unsuccessful attempts. Seems secure to me.


Yet the page says "less than 0.00385% of customers had data disclosed due to government information requests."

So they ARE giving data to the government at their request...

BTW, that percentage of say, 100 million customers (which I'm sure Apple has, probably more) is 3850 people, not an insignificant number...

Anyhow, I personally wouldn't trust anyone anyway. Google, MS and Apple are all American corporations, and need to comply with US law, no matter how asinine.


Their detailed report[0] is pretty clear, and defines what they mean for the categories they're reporting. According to that document, they had 20221 device data requests, and they provided data for 12146 of those requests (some of those requests have many individual devices). For account data, they received requests for data from 2803 accounts, and provided some data for 1333 accounts.

0: https://www.apple.com/privacy/docs/government-information-re...


Unless I’m doing the math wrong, 0.00385% of 10 million is 385.


Not to mention that Apple has sold over 800 million iOS devices.


Yeah, my math was bad, too early in the morning (that was pre-coffee). Anyhow, they have way more than 10 million customers (likely more than 100 million too).


I am a bit surprised that there is no explicit declaration that they don't have the user's passcode.

It should be obvious, but I think it should also be stated in a way that makes it a lie if it's later discovered that they left some way to access or keep the passcodes.


Aren't all these data backed up to iCloud? Is it also protected with a passcode?


This seems like the flaw in the argument, yeah. iCloud isn't mandatory, of course, but most people do use it, so peoples' photos and things will be in The Cloud, anyway. Possibly several The Clouds; Google+ on iOS keeps badgering me to enable automatic photo uploading.


Hmmm... Isn't it way easier for the feds to peek into the cloud for my data without me even knowing (so that I'll keep giving them data) than to confiscate my phone and investigate it?

To me, it seems like this passcode thing is a good way to keep my data safe from thieves, but from the govt? I don't think so.


This. In the coming years it will be increasingly convenient, and perhaps mandatory, to have your data on iCloud - if you want to backup, restore or find your devices. The latest iCloud upgrade allows for 1TB of data - it's easy to imagine having your entire laptop, phone, and iPad up there in the near future. So physically preventing access to a single device is not nearly as important as protecting your entire digital life in the cloud. This announcement by Apple is a good sign, but it feels like the real battle for privacy from govt snooping will be in the cloud.


Yes; if this is something you're worried about, you probably shouldn't use cloud services. It's not like they're mandatory.


>"On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data"

Too bad they (and other phone manufacturers) don't protect phone calls with some kind of end-to-end encryption.


There's not much one company can do about the standards. [1] They can only control their own output.

Apple asserts that Facetime and Facetime Audio are end-to-end encrypted. And Google claims Hangouts are encrypted as well.

I don't know whether there are caveats (or how many) to either of those claims. But that's about as much as one could hope for in the current climate. [2]

[1] Particularly upstart computer companies dealing with the telecom oligopoly. Long cozy with governments and law-enforcement, if not an explicit part of government.

[2] It's a serious bummer that FaceTime never developed into the open standard they claimed at introduction. I've been curious about where that fell apart. (Competitor disinterest, patent liability, carrier terms, etc)


I heard a rumor that the FaceTime open spec never happened due to complicated patent issues, possibly related to this one: http://www.bbc.com/news/technology-20236114

All of this is a damn shame because in my experience the competing standards don't have the usability and quality that FaceTime provides.


Yet there's no way to be sure of that...


No way to be sure - but, the profit motive is pretty good. Apple is going all in on Privacy as the basis for their $500B+ business model.


All it takes is one secret court order to force them to leak the key to NSA.

They control the OS, they DO have access to everything.


Not if they designed it properly. That's like saying that all makers of encryption software can, if they really want to, access all data encrypted with their software.


Of course they can. All you have to do is leak passwords/private keys.

An OS is actually much more powerful, it controls everything, it has direct access to memory. If a user can see his photo, the OS can.


> Not if they designed it properly.

You can (and always should) design encryption code without having a single root key. Entirely open sourcing that code should not make any difference to the security of the encrypted data.


And then the NSA forces Apple to issue the OS update that breaks all your efforts.


That sort of order is out of scope of anything a US court can order, and I believe that an Apple employee would leak it before they complied with such an order.


> That sort of order is out of scope of anything a US court can order

How do you know that? Secret court orders are secret. "National security" trumps everything these days.

LavaBit chose to shut down rather than insert a backdoor that was forced on them. I doubt Apple will shut down over that.

And thanks to Snowden we know Apple's products have been backdoored already.

http://www.spiegel.de/international/world/catalog-reveals-ns...

http://www.infoworld.com/article/2609310/hacking/apple--cisc...


They don't need to create a special backdoor for the government; Apple has its own backdoor that it can share with the government if needed... (they have full control of your device already when you are online)


But the laws that make them legal are not secret.


The laws? I'm pretty sure mass surveillance is unconstitutional, that didn't stop them.

Are you saying it's outside of scope for NSA to forcefully install a backdoor or request encryption keys?

How do you explain LavaBit?

How do you explain NSA's hardware in data centers?

How do you explain NSA's hardware in cell phone providers centers?

How do you explain NSA's backdoors in Microsoft's products?


How is that out of scope? Haven't they been doing exactly this for at least five years? Microsoft backdoors et al?

And I think an Apple employee is the last person who would leak anything. They work for Apple after all; starting to work there surely takes some abandonment of principle.

In the end, why would anybody feel assured by what someone believes someone else would do?

Edit: Why is imaginenore downvoted so heavily? His concerns seem rather rudimentary.


Not quite. On the platform I am on you cannot extract passwords at all; they simply are not stored. Encryption at the hardware level may have a backdoor, but the OS level is similar.

Let's be honest: if they want your data, you're the weakest link. If you're paranoid enough to go the route you're thinking of, it's far cheaper for them to take you to a back room and employ a hammer on you.


Doesn't matter if they are stored or not. If a user can access his files, so can the OS he is using.


Yet no comment from them about what being a "provider" under PRISM entails.

* "In addition, Apple has never worked with any government agency from any country to create a “back door” in any of our products or services."

If Apple provides an interface to request user-data to law enforcement / NSA, that's not a back door in the product or the service.

* "We have also never allowed any government access to our servers. And we never will."

If they provide user-data after being served with a warrant (possibly through email or to their legal department), their servers were never accessed, yet the data was provided.

It's always interesting to read what is and isn't said. Word games, I swear.


I find it amusing that people are being outraged at Apple, instead of being outraged at their government. From outside the US, it looks like it became generally accepted that the NSA will spy on US citizens all it wants, unchecked, and there is nothing anyone can do about it. Instead, all the rage is getting directed at companies that disclose the information to NSA.


Is this directed at me? I'm not outraged at Apple at all, I think they're trying to play the long game here. I'm just sick of the word games, deception, and legalese from anyone regarding privacy. It's hard enough fully understanding exactly how all the pieces fit together (technology, laws, politics, etc.) for people that are genuinely interested in this, let alone the average consumer. Pointing out loopholes in wording helps to start conversations about what is/isn't happening as well as what could happen/be-abused.


Well, I think whatever Apple says, you will find a "loophole" in the wording that supposedly allows them to do evil things. Try it: pretend you are Apple and just try writing a statement that you would consider acceptable and that you wouldn't call "word games, deception and legalese".

Myself, I think they did fairly well, certainly the best of any tech company out there today.


> Try it: pretend you are Apple and just try writing a statement that you would consider acceptable and that you wouldn't call "word games, deception and legalese".

Ok. "We're enabling custom encryption key management. You may now generate and use your own encryption keys. In addition to ensuring the device data is encrypted, nothing in your iCloud account can be recovered if you lose your key because it is all pre-encrypted before being sent to iCloud. Again, if you lose your key, there is no recovery possible. We'll also be opening up a new bug-bounty program specifically for identifying weaknesses and exploits in our baseband, firmware, and OS that could result in the leaking of your encryption keys and personal data. Here at Apple, we take your privacy serious."


I don't see any promise there that they won't somehow store your encryption key in a place where they can access it. I see a huge line saying "we invite you to try and catch us, and if you do we will try to find another way that you haven't yet found out about". So I think you failed on your premise.


You'd still get "But they could just push an OS update that sends them the keys"


Absolutely correct, no amount of wording will provide adequate proof. That's why, if you are providing a security product or a product with security features, you should open up the source code and provide a way to validate that that source is actually what is running on the hardware.

Until then I assume it is insecure.
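A published checksum only gets you part of the way (it ties the artifact you received to the one that was published, not the binary to the source -- that's what reproducible builds are for), but as a minimal sketch of the verification step, assuming a hypothetical vendor-published SHA-256 digest:

    import hashlib

    PUBLISHED_SHA256 = "<digest from the vendor's site>"  # hypothetical value

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as fp:
            for chunk in iter(lambda: fp.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    if sha256_of("firmware.bin") == PUBLISHED_SHA256:
        print("matches the published digest")
    else:
        print("mismatch: do not trust this binary")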


Often it seems like the people want businesses to take up the fight for privacy, rather than the people themselves.


And why not? Businesses, through lobbying, essentially control the US government.

It's not unreasonable that American citizens might feel more empowered voting with their dollars than with ballot papers.


It was about $3.2 billion in 2013 from the federally registered lobbyists. The state level is probably equal to that. Some think the real numbers are closer to $9 billion at the federal level, because PR and consulting don't have to be filed with the Senate clerk.

https://www.opensecrets.org/lobby/


Excellent point. So do it. iLobby.co


People don't care about privacy or they wouldn't be using Android.


Explain to me how businesses aren't people?


Businesses are made of people, but a business is not a person in and of itself. That's why we have the saying "design by committee": a group of people doing something will not produce the same result as one person doing it.


I think people are outraged at both.

With the companies in question, it's not hard to vote with your feet and complain. With the government, it's very difficult.


You get the government you vote for.


No, that's not how it works. You get the most heavily marketed party that pissed the least number of people off in the last couple of years.


National elections have become such a farce anyways. A single senator or representative can change the political landscape, yet a single citizen cannot. I cannot vote for Senator Wyden, nor can I vote against Feinstein. Rubio is unfazed with my voting and McConnell couldn't care less about my opinions. Yet, these handful of senators have shaped the issues that I care about to be what they are today. The best I can do is not vote for Hatch, but his death from old age seems to be the looming reason for his departure from the senate.

When someone voted in by a single state can influence the entire country like that, it makes everyone feel powerless. I'd rather give my money to a company that has my interests in mind throughout the years in exchange for products and services than dump thousands of dollars into someone's campaign like a drunken bet at a Las Vegas casino every couple of years.


You get the government the majority of voters vote for.


You also get the one the majority didn't vote for, since it's more or less indistinguishable from the one they did vote for.


Minus that whole lobbying, "corps are people (only when convenient)", and gerrymandering thing, exactly!


I'd agree if US voters actually bothered to go vote and participated in their own democracy instead of turning their brains off and watching TV. But seeing how they don't, the US really gets the government that the most active/noisy minority supports.

(The same is largely true in the EU, even though it's possibly more pronounced in the US.)


Apple has directly addressed the PRISM disclosures, last year:

https://www.apple.com/apples-commitment-to-customer-privacy/

In addition the new section launched today includes a page on government information requests, including "National Security Orders from the U.S. government." One sentence summary: they provide data to the government when required to by law.

PRISM itself is a program that is structured as a request/response, and the requests are approved by the FISA court, so it would fit into Apple's language in that section.

The initial report of PRISM implied that the NSA and FBI had direct, unfettered access to providers' "central servers", but that has been since walked back a bit. The points of concern with PRISM are 1) poor oversight from the FISA court, 2) overly broad request criteria resulting in huge datasets collected, and 3) the lack of public oversight due to gag orders.


> The initial report of PRISM implied that the NSA and FBI had direct, unfettered access to providers' "central servers", but that has been since walked back a bit.

Really? I haven't been able to follow every report, as the Snowden leaks generated a lot of content over the past year. Can you give a source to where PRISM's central server access has been "walked back a bit"?


The Wikipedia page is up to date and contains a lot of links to external sources.

http://en.wikipedia.org/wiki/PRISM_%28surveillance_program%2...

The press report that most directly addresses the issue is probably this one:

http://www.cnet.com/news/no-evidence-of-nsas-direct-access-t...

Incidentally the author of that story is doing a startup now, visits HN, and actually has posted in this thread! Username is "declan."

Essentially, it seems there are a number of NSA programs that we can now distinguish from one another.

PRISM uses the FBI and FISA court orders to directly request records from hosted application providers like Google, Yahoo, Apple, etc.

But there are also other programs that claim to be authorized under the FISA law that target network infrastructure companies like Verizon and AT&T, apparently sucking up and storing huge amounts of raw traffic directly from network infrastructure. This would be the famous "secret room" at the AT&T network building in California. These could suck up Apple traffic (or anyone else's), but Apple would not be aware because it's at the network layer.

Then there is MUSCULAR, in which the NSA helped the British GCHQ hack into the internal networks of Google (without Google's knowledge) to suck data out of the unencrypted connections between Google servers.


Thanks, that's a really great summary.



>If they provide user-data after being served with a warrant their servers were never accessed, yet the data was provided.

Yes, that's true. But if you run an email service that's based in Cupertino, what do you do when served with a lawful search warrant or wiretap order? Say "no," and be found in contempt of court and have all your servers carted away by some men in black so they can find the file they're looking for?

There are a few ways around this. One is not to permanently store warrant-worthy user data (Snapchat, Wickr, etc.). Another is end-to-end encryption (original version of Hushmail), though key distribution and UX become problems. A third is to move your servers to Switzerland, though then you get served with process from the Swiss authorities instead of FBIDHSDEAetc.

Apple has taken none of these steps. Assume our hypothetical Cupertino-based email service has not either. What do you do when the Feds show up with a lawful order?


> What do you do when the Feds show up with a lawful order?

I don't expect them to say no. That's why I don't expect them to imply that they would or suggest that they never have.


> A third is to move your servers to Switzerland, though then you get served with process from the Swiss authorities instead of FBIDHSDEAetc.

And, if you keep your headquarters in Cupertino, quite possibly from US authorities as well (see the currently in-progress Microsoft case about access to servers in Ireland).


I go under the assumption that if I'm not the one generating and providing the encryption keys (and really, pre-encrypting the data), absolutely nothing is secure/encrypted. And in all honesty, if it touched the internet, it's already insecure to some extent.

It's been "fun" to read Jewel vs. NSA proceedings, press statements, and officials' statements about these issues because of the extent to which they play word games to technically not lie according to specific (re)definition of words.


Why stop there? If you're not root, you're not secure either. For example, if someone else is root, they can do whatever they want with your locally generated keys, up to and including sending them to the NSA with a little bow on top.

Even if you are root it's no guarantee, thank you rootkits :/

As with all things security, sometimes it comes down to trust and legal protections.


Open source client libraries providing end-to-end encryption and zero-knowledge search would enable a functional e-mail system that operates without any server-held keys.

You could still NSL a backdoor, but if the service is open source at least there's a chance of the code being audited.
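To make that concrete, a minimal sketch of the end-to-end idea, assuming the PyNaCl bindings to libsodium (key distribution, the hard part raised in the reply below, is hand-waved here):

    from nacl.public import PrivateKey, Box

    # Each party generates a keypair locally; private keys never leave the client.
    alice_sk = PrivateKey.generate()
    bob_sk = PrivateKey.generate()

    # Alice encrypts to Bob's *public* key; only Bob's private key opens it.
    ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

    # The mail server only ever relays/stores ciphertext.
    assert Box(bob_sk, alice_sk.public_key).decrypt(ciphertext) == b"meet at noon"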


End-to-end encryption will improve things, but it is still only as secure as your key management. For email to be convenient you need an easy way to get your friend's public key, and easy ways are hard to make secure.


Agreed.

Without a trusted authority, the only thing that comes close is an in-person P2P pairing ceremony.

Key management might work in an enterprise setting with a central authority, but making sure your friend's public key isn't swapped with the government's is pretty hard if you don't trust the cloud provider, telecom, or intermediate infrastructure.


There's no reason the refusal to store warrant-worthy data or to use end-to-end encryption would prevent men in black from carting away all your servers. In fact, it might very well increase the likelihood of them doing so.


You're forgetting that for all the Snowden leaks have revealed, the info in the press about Prism is still classified. IANAL, but I would imagine Apple could get smacked quite a bit if it started talking about its cooperation (not necessarily voluntary cooperation, that is) with NSA initiatives like Prism.

I mean when the wikileaks cable dump came out, there was quite a bit of noise about how US government people should not visit the site because it would still constitute a breach of security even though it was in the public domain, and that they could lose their jobs for it. This is no different, really. The material is still classified whether it's spread all over the press or not, so discussing it with uncleared persons such as the public would (IANAL, again) I imagine be a breach of security.


> If Apple provides an interface to request user-data to law enforcement / NSA, that's not a back door in the product or the service.

I consider that a back door.

>We have also never allowed any government access to our servers. And we never will

Makes it clear what he means in the first statement.

I've known Apple to engage in hyperbole to a significant degree, but never -- in nearly 40 years -- to lie publicly.

They haven't denied providing information under warrant. Basically, every US company is going to do that.

But that's on a case by case basis, not wholesale.


An interface to request data wouldn't be a back door, as that falls under the "request/response" model. "Here is our warrant for Person X by judge Y in district Z" != "sudo apple-get * /mount/nsa" The important distinction there is that each request is an individual request and can't be automated / done in bulk / wholesale.

I'm very interested to see how the device vs. account data-request ratio maintains/changes with the release and adoption of iOS 8.


I'd like to see a breakdown of exactly what they can provide with a device request and an account request.

I think that would be reasonable information to share, because it'd educate both law enforcement and the general public about what data is available from their devices.

Plus it'd settle the issue of word games- or at least bring it closer to being settled in my eyes. We can't get a good idea of what is going on with these 10k foot marketing pronouncements.


http://images.apple.com/privacy/docs/legal-process-guideline...

Page 4 onwards provides a list of information they provide to law enforcement agencies. Probably prudent if you're an apple customer to simply assume all of this information is as good as public.


I. Extracting Data from Passcode Locked iOS Devices

...

For all devices running iOS 8.0 and later versions, Apple will no longer be performing iOS data extractions as the data sought will be encrypted and Apple will not possess the encryption key.

Interesting. What changed in iOS 8, I wonder?


Law enforcement access is not the same as public access. I think it is extremely important to make that distinction. And there is one.


There are interviews where Tim Cook states the range of warrant requests (he isn't allowed to give the exact numbers): http://www.youtube.com/watch?v=Bmm5faI_mLo

Seemingly, they also have some sort of canary in place in case they get secret requests under the Patriot Act: http://boingboing.net/2013/11/05/apple-hides-a-patriot-act-b...


If you look at the transparency reports, only the report for the first half of 2013 includes that canary (specifically mentioning Section 215).

It was removed for the latter half of 2013, and this just-released first half of 2014 report.

Take that as you will.


I highly doubt the canary would be a valid defence.

By omitting the line you are, for all intents and purposes, saying that you received a request. Now, saying you received a request and saying specifically what request you received are different things, but I imagine the gag order doesn't make that differentiation.

Gag order is not that you can't say a specific thing, it's that you can't communicate a specific piece of information. The canary is communicating something.

A court order can ask you to lie, so the "I can't lie to my customers" defence probably won't work.


Here's an observation, and an idea for testing Apple's claims on iMessage privacy:

China seems quite determined to block IM systems which do not cooperate with the authorities and permit monitoring of communications. Most recently, both Line and the Korean KakaoTalk were blocked [1].

Skype remains usable in China, presumably because Skype permits efficient monitoring [2].

It seems unlikely that China would tolerate such a prominent opaque communications channel as iMessage in the hands of a significant proportion of their citizens.

Thus, if China refrains from blocking iMessage for a prolonged period of time, wouldn't it be reasonable to assume that China is in fact able to snoop on iMessage?

[1] http://www.ibtimes.com/china-restricts-messaging-apps-confir...

[2] http://www.reuters.com/article/2012/01/31/us-china-dissident...


It is interesting you are the only one who has mentioned China. The other side of the coin is Apple being worried about being locked out of China for not being secure enough. http://www.reuters.com/article/2014/07/11/us-apple-china-idU...


...where "not being secure enough" means that location data ends up on US servers. I'd be surprised to see the same sort of statement if location data were collected and kept nationally.

Nonetheless, Apple's relationship with China is an interesting case:

• On one hand, China is poised to be the largest market for Apple in just a handful of years.

• On the other hand, it's hard to imagine China approving of e.g. un-snoopable instant messaging in the hands of the populace.


So the US is now a country where mainstream companies market it as a competitive advantage that they will try to minimize what they will release to the government. I'm glad companies are doing this, but I'm sad that they even have to.


The desire for privacy, and a healthy dose of suspicion of one's government, have both existed for as long as the notion of government itself.

Personally, I'm waiting for the other shoe to drop. Now that the police can't go to Apple for your data, we'll start seeing more judges ordering users to unlock their phones to allow them to be searched. I don't think it will be long before such a case reaches SCOTUS, and then we'll see how that works out...


I am no lawyer.

While lower court decisions have gone both ways, according to Wikipedia:

    in United States v. Doe, the United States Court of Appeals for the
    Eleventh Circuit ruled on 24 February 2012 that forcing the decryption
    of one's laptop violates the Fifth Amendment [1]

So there is hope.

Perhaps a more practical problem is that even with a 5-digit PIN, it's entirely possible to simply try all combinations. You probably need more than 8 digits before that becomes impractical.

[1] http://en.wikipedia.org/wiki/Key_disclosure_law#United_State...


> even with a 5 digit pin, it's entirely possible to simply try all combinations

Given a long time. Don't forget iOS slows your brute-force attempts down substantially. I'd be curious how long a brute-force attack would take given the current behavior of the OS.
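As a toy model (the delay schedule below is hypothetical, for illustration; Apple doesn't publish the exact lockout curve, and the optional 10-attempt wipe ends the game early anyway):

    # Toy model: brute-forcing a 4-digit PIN at the lock screen,
    # with a made-up escalating lockout kicking in after 5 failures.
    ENTRY_SECONDS = 2.0  # time to physically type one guess

    def lockout(failures):
        if failures < 5:
            return 0.0
        return min(60.0 * 2 ** (failures - 5), 3600.0)  # caps at 1 hour

    total = sum(ENTRY_SECONDS + lockout(n) for n in range(10_000))
    print(f"~{total / 86400:.0f} days at the UI, vs ~800 s against raw PBKDF2")

Under those assumptions it's on the order of a year, i.e. the UI-level throttling, not the crypto, is doing most of the work against a casual attacker.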


Is there no way to duplicate the whole filesystem onto a computer and try decrypting that? I didn't think they would actually be thumbing in every password combination...


There is supposedly no way of duplicating the secure element which is what controls the encryption.


The US government forced Lavabit to install a back door that didn't exist before (or go out of business, which isn't really an option for Apple). I wonder what there is, legally, to stop the government from forcing Apple to do the same? I guess one answer is that Apple has more resources to fight or lobby.


Lavabit wasn't forced to install a backdoor, the guy running it just kept refusing to comply with previous legal requests which were much narrower in scope. Since he didn't comply they took the nuclear option (which he could have easily prevented).


Whether or not he was in contempt, that's still a bullshit way to approach the problem.

Contempt of court? Fine and then jail him for non-compliance. Seize Lavabit's assets if required. I doubt he would have let it come to that.

But their solution was analogous to "won't give us back that alleged stolen $20 we told you about? We want your whole bank balance."

Aww hell naw.


None of this would have happened if he had complied with the legal court order.

I guess he made his point, but he screwed over his customers twice in this case: first by dumbing down his crypto to be easier to use, which allowed the feds to break it as they did, and second by not complying and shutting down his service, screwing every customer over.


Prevented by complying, ie. selling out his customers' privacy?


By not complying he 'sold out' every single one of his customers' privacy, instead of making the sensible choice of complying with the legal order to turn over the emails of just one customer.


> Now that the police can't go to Apple for your data, we'll start seeing more judges ordering users to unlock their phones to allow them to be searched

Isn't that against the 5th Amendment?


This is hardly a unique situation for the US. There are lots of countries where the state is surveilling their own citizens en masse, some for much more nefarious purposes, and it's not all tiny dictatorships either - plenty of large Western countries are doing it. From my perspective (as someone in the UK) it's actually really refreshing that in the US, some companies are pushing this as a competitive advantage. That means people actually care about it and will make decisions based on it, which is far better than the total apathy I'm surrounded by here. Companies having to work to minimise what is released to the government is not unique to the US; the fact that they are however is relatively rare.


The honest truth about all of this is: even if Apple were handing over information through back doors or custom database interfaces for the NSA, they wouldn't tell us, and would probably be gagged from doing so anyway. Have we all forgotten about Lavabit? I hope not.

I think we are all intelligent enough to know that even if Apple were handing over information, it wouldn't exactly be good for business to admit you've been complicit in handing over personal details to the Government, would it? "Yes, we have been giving away your information, but we promise not to do it any more. Hey, we just released a couple of new iPhones, want to buy one"

Anyone else notice the page is cleverly worded, and any mention of security seems to be limited to an iOS 8 context? "In iOS 8 your data is secure", "In iOS 8 we can't give law enforcement access to your phone" - maybe I am just overanalysing things here, but I have learned not to be so trusting of companies as big as Apple, considering the amount of information they hold.

You know we're living in a new kind of world when privacy is being used for marketing purposes...


It is pretty naive nowadays to store any of your data online while thinking it is secure and no one will, or can, touch it. Unless you do some hardcore encryption of your data yourself, which is really hard or even impossible with some data like call logs, geographic location data, etc.


> less than 0.00385% of customers had data disclosed due to government information requests.

According to [1], there are about 600 million apple users, so this translates to 23,000 customers exposed due to government information requests.

Seems like a large number. Is 600M correct?

[1] http://www.cnet.com/news/apple-to-reach-600-million-users-by...


There are millions of people arrested in the US alone every year. It doesn't seem unbelievable that 23,000 of them had iPhones/iPads that law enforcement wanted data off of.


The "less than" seems pretty superfluous when they're giving that many digits...


My fundamental issue with Apple's privacy claims is that they are pretending to have a technological solution to what is, ultimately, a political problem. As the laws in the US (and, I imagine, some other countries) stand, Apple can be compelled to provide your data to the appropriate governmental authorities, install back doors, not tell you, and even lie to you and the world about it. As long as that's true, no assurance from any third-party service provider is worth a damn.

I can understand the marketing benefits Apple sees in making these disingenuous privacy claims. I'd be willing to call that "just business" except for one thing: Trying to persuade people they have a technological solution will necessarily get in the way of the absolutely vital political project of destroying the political and legal foundations of the surveillance state.


I'm very skeptical that traditional screen-lock passcodes offer useful protection for the average person. Most people still choose to use 4-digit passcodes for convenience, leaving exhaustive key search [1] well within the reach of even very small attackers.

Are these four-digit passcodes being used to derive encryption keys? If so, I'd like to hear where the additional entropy comes from. There's no use encrypting things with a 128-bit key when the effective entropy of the key is really only ~13.3 bits.

I'm sure the engineers at Apple would not have overlooked this; it would be great to hear more about the specifics.

[1] especially if the attacker can download encrypted data and try an infinite number of times (instead of e.g. typing the passcode on the phone or hitting the iCloud servers)
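For reference, the entropy arithmetic behind that figure (pure counting, no assumptions about Apple's implementation):

    import math

    # Entropy of a length-n code over an alphabet of size k is n * log2(k).
    def entropy_bits(k, n):
        return n * math.log2(k)

    print(entropy_bits(10, 4))   # 4-digit PIN:  ~13.3 bits
    print(entropy_bits(10, 6))   # 6-digit PIN:  ~19.9 bits
    print(entropy_bits(62, 10))  # 10-char alphanumeric: ~59.5 bits
    # A uniformly random 128-bit key has 128 bits. Key stretching (PBKDF2)
    # slows each guess down but cannot add entropy the passcode doesn't have.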


IIRC, there are three crypto keys involved:

1. PIN

2. Random key in effaceable NAND storage (generated on device reset)

3. Burned in permanent CPU-unique key.

I think OP's linked statement from Apple means that Apple is now also encrypting data stored on the iCloud servers.

http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...
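A minimal sketch of how a passcode can be entangled with device-held keys so that guessing must happen on the device, assuming PBKDF2 as the stretching function (the names and parameters here are illustrative, not Apple's actual scheme):

    import hashlib, os

    DEVICE_UID = os.urandom(32)  # stand-in for the burned-in, unextractable CPU key
    NAND_KEY = os.urandom(32)    # stand-in for the effaceable-storage key

    def derive_unlock_key(pin):
        # Salting with device-held secrets means the derivation only works
        # where those secrets live; the iteration count makes each guess
        # slow (~80 ms on-device, per Apple's security guide).
        return hashlib.pbkdf2_hmac("sha256", pin.encode(),
                                   DEVICE_UID + NAND_KEY, 100_000)

    unlock_key = derive_unlock_key("1234")  # this key, not the PIN, decrypts data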


>I think OP's linked statement from Apple means that Apple is now also encrypting data stored on the iCloud servers.

From today's announcement (scroll to "iCloud"): Mail and Notes are not stored in encrypted form on iCloud servers. http://www.apple.com/privacy/privacy-built-in/

From December 2013: Mail and Notes are not stored in encrypted form on iCloud servers. http://support.apple.com/kb/HT4865


Just to be clear, we're talking about mail sent with an @icloud.com or @me.com address, right?

Surely all IMAP traffic doesn't flow through their servers.


Then what has changed?


This is a very good question. I'm now no longer sure.


And this means the passcode has to be broken on the device. The key is generated from the passcode and the other keys you mention via PBKDF2, which slows down the cracking process. You also have to have a non-destructive jailbreak or Apple's update-signing key to do the cracking.

I'd recommend using a longer passcode. If you don't want to use the keyboard, choose a long number for a passcode and you will still get a number-pad when entering it.

Another loophole to be aware of is the "escrow keybag". If you're paired with a laptop, there is a file in /var/db/lockdown that can work in place of the PIN (the device can decrypt the escrow keybag with #2 and #3 above and use the keys therein to decrypt the files it needs). Apple did this to allow backups without unlocking the device.


If you're an iOS user who becomes the target of an investigation by a law enforcement or intelligence agency, remember your data is likely unencrypted in the cloud. So even if your device is inaccessible, your email, your location history, your text messages, and your phone call history will probably remain accessible. Apple acknowledges, for example, that "iCloud does not encrypt data stored on IMAP mail servers": http://support.apple.com/kb/HT4865

[Edited because it now seems unclear which Apple policies have changed.]


> though the celeb hacking shows the limits of that approach

Apple has clearly stated that its system was not compromised.

The user reset questions were socially engineered meaning it is irrelevant whether or not the data is encrypted. From Apple's perspective the owner of the data is downloading it.


> The user reset questions were socially engineered

Yep, you're right. My point, perhaps poorly stated, is that if Random Hacker X can figure out the answers to the iCloud reset questions, so can a law enforcement agency. Then they can log into that account. Impersonating someone this way is legal -- or at least has not been ruled to be illegal -- as long as it's done under court supervision under the Wiretap Act or similar legal authority authorizing prospective surveillance.

Possibly related: I disclosed last year that the Feds have demanded that major Internet companies divulge targeted users' stored passwords, and in some cases the algorithm used and the salt: http://www.cnet.com/news/feds-tell-web-firms-to-turn-over-us...


> if Random Hacker X can figure out the answers to the iCloud reset questions

Answers about very famous people. Wikipedia will not tell me your mother's maiden name.

Also, as much as I sympathise with the women whose accounts were breached, actors aren't always the sharpest tools in the shed, and phishing schemes are a common tool for gaining access to other people's accounts. One of them (I don't remember which) publicly claimed iCloud backup for her iPhone was "too complicated" a while ago. Given that it's as complicated as "turn it on, and make sure it gets plugged into power with Wifi every so often", I don't doubt some of them would fall victim to even a very simple phishing scam.


You don't have to guess -- they specify exactly what is encrypted in iCloud:

  "On devices running iOS 8, your personal data such as photos,
  messages (including attachments), email, contacts, call history,
  iTunes content, notes, and reminders is placed under the
  protection of your passcode."


But "iCloud does not encrypt data stored on IMAP mail servers" or Notes

http://support.apple.com/kb/HT4865


How is Apple supposed to encrypt data stored on servers they do not own?


> iCloud does not encrypt data stored on IMAP mail servers

Well, no shit. If they did that I'd log into my Gmail web interface and see encrypted gobbledygook instead of my emails.


Like how when you use a password to open an encrypted zip file it's all gobbledygook inside?


Google isn't going to let iCloud embed a decryption UI into their webmail tool.


"Last Modified: Dec 12, 2013".


That December 2013 page is linked to from today's announcement, under "Learn more about iCloud security." See: http://www.apple.com/privacy/privacy-built-in/

Also note that today's announcement says Mail and Notes are "encrypted in transit" only. In other words the December 2013 page remains current.


OK. You win 2 internets. And fuck that highly misleading ad copy, Apple. :-(

[Edit] - Clearly something is off here. The iPhone keeps a copy of the last several hundred emails downloaded from my IMAP server; I would expect an iCloud backup of those emails to be "under the protection of the passcode" (a.k.a. encrypted).

That doesn't mean Apple is somehow encrypting the messages stored on my IMAP server. Likewise, it doesn't mean Apple is encrypting customer emails stored on their @iCloud.com (or whatever) email servers....

I'm going to assume there are just some wires crossed here, but I do hope they clean up the document and clarify this.


In the end this is not about an information security solution (which is measured by the weakest link). This is about engineering consumer expectations. Privacy and security must be measured in terms of the overall digital-economic ecosystem. Systems at the margins of everyday consumer experience will determine how absolutely secure any computation can be. Consider the baseband processor in each iPhone.

I think companies like Apple and Google are undertaking PR exercises like this in the hopes of finding that sweet spot between the sense of crisis (excitement?) that smartphone ownership brings and the banal integration of technology into everyday life. There _are_ government requests, but they do not affect _you_. Maybe. So my question: Is government surveillance now officially part of the iPhone experience?

To the extent that a debate exists, apple is engaging and steering that discussion. This is just pure organizational reflex. And it's cynical in some sense, but apple doesn't really have a choice in the matter either. Ultimately it is what the US officials consider to be an acceptable level of visible surveillance, which is a political consideration.


"Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data" Oh really? Privacy is marketing now.


It's one of the strongest marketing moves nowadays. Imagine the number of "secure" messengers out there where all you have is the company's word that it's secure - they deliver a fully closed-source service with no way to verify the claim either way. So marketing works.


That this has become an area companies are competing in is a good thing. The fact that it was necessary is sad, but I don't blame the companies for that. When it is the area for competition it will naturally be the focus for marketing - provided the marketing matches reality and isn't misleading, that is also a good thing, since people can see where the privacy is and choose it.


The terrorists have won.


Apple has taken a shot at Google and Facebook. It has mentioned that, unlike its competitors, its business model does not depend on selling user data. Which is kind of true, but Google's and Facebook's business model is itself built on using user data for marketing.

Sometimes I feel it's not unethical to use users' data for marketing the way Facebook and Google describe it: they don't directly share details with marketers, but they let marketers target the audience.


Except it's not open source. If it's not open source then you have no idea what's going on beyond what Apple tells you.

Ask your self:

Would Snowden use this phone? Your answer to this question is the same as the answer to the question "Is this phone secure?"

I guess I'll get downvoted for this since it goes against the Apple circlejerk, but this issue is more important to me than magic internet points.


Nobody sells a 100% open source phone. Everyone uses Broadcom, Motorola, TI, etc. chips, which have proprietary firmware. (I'm aware there are a few obscure phones that claim to be 100% open, but not many people use them, they're hard to get hold of, etc.)


The "its only secure if its open source" argument only really works if you compile the source yourself - otherwise you 're just trusting that the binaries on the devices are in fact compiled from the source that is published, in which case you may as well trust anything they say.


Finally those numbers of iPhone activations and Macs sold are useful.

#1 Mac unit sales

http://www.macworld.com/article/2062821/apple-by-the-numbers...

2010 @ 13662k

2011 @ 16735k

2012 @ 18158k

2013 @ 16341k

Total = 64,896,000

#2 iPhone unit sales

http://www.statista.com/statistics/232790/forecast-of-apple-...

I only take the numbers from 2013 & 2014 because Apple users tend to upgrade fast.

2013 @ 53.6 Million,

2014 @ 63.2 Million,

Total = 116,800,000

Now, quote from "Government Information Requests"

"less than 0.00385% of customers had data disclosed due to government information requests."

That works out to 699529.6, or roughly 699529 customers, who had data disclosed.


0.00385% of 181,696,000 is 6995, not 699529.

Even if we take the total number of iTunes customers (~800 million), it comes down to 30,800.


It's between 0-250. Tim Cook confirmed it on the Charlie Rose interview on Monday.


That was for the first half of this year alone. They specify this on the Government Requests page.


(116800000+64896000)*0.00385/100 == 6995

Source: http://verizonmath.blogspot.com


#3 iTunes users

http://www.digitalmusicnews.com/permalink/2014/04/24/itunes8....

800 Million,

#4 ipod users

http://www.statista.com/statistics/276307/global-apple-ipod-....

2010 @ 50.31 Million

2011 @ 42.62 Million

2012 @ 35.17 Million

2013 @ 26.38 Million

Total = 154,480,000

#5 Apple Laser Printer user

#6 Apple magic mouse user

#7 Apple Newton user

#8 Apple PowerBook user

#9 Apple MacBook user

#10 Apple Xserve user

Yes, I made a mistake; however, this is all just a word game from Apple overall.


It's not 699529 but 6995, since 0.00385% means multiplying by 0.0000385.


There's probably a big overlap between iPhone and Mac users, but yes, a big number is going to come out the other end.


To put this in context: 0.2% of women and 0.01% of men become rape victims every year. So the chance you get raped in the next year is 27 times bigger than the chance Apple disclosed your data to the government. Seems like we have worse things to worry about...


I'm not sure where you got this data, but probably it doesn't include the prisoners that were victims of rape (where the gender ratios would probably be quite different).


That might be true, but it wouldn't change much about my point, so I don't really care...


The problem is that local police are getting increasingly aggressive about abusing your privacy, and seeking access to your devices during the course of routine stops and searches.

Legally there is a combination of abuse and grey area right now, and it's going to get worse near term.

Those government information requests don't cover how many times an officer at the local or state level has accessed someone's phone either illegally or questionably legally.

This move by Apple can help to blunt that effort by the local police.


[deleted]


Absolute numbers are useless... Pretty much any statistic put into absolute numbers ends up being a stunning number of people.


It's a bit less than 4 in 100,000, which I think is an easier number to relate to.


The second-hand market for iPhones is quite large in Australia, and I imagine it's the same elsewhere. Probably not fair to only include those two years of iPhone sales.

I would think at least 50% of the older models are either on-sold or the user hasn't upgraded.

edit: UVB-76 has a good point, there is probably a very high percentage overlap.


Didn't know that many.


I understand this does nothing to stop the NSA from snooping on me. However, with the rise of the police state, the local/state police are a much more imminent threat to the average person than the NSA and FBI are.

If this turns out to be as good of a move as it seems like it is, Apple has acquired my attention in a way they weren't able to previously (I've been an Android user from day one). Plus I like the new larger iPhone 6.


Android phones have also had an 'Encrypt Phone' feature (since Gingerbread), but it's not on by default.


Edit:

Can someone confirm or deny the following? I think this is the current state of affairs.

A) Apple will unlock PIN-locked devices by government request, but the best they can do is brute-force. This is very slow, as it can only be done using the phone's on-board crypto hardware (which has a unique burned-in crypto key), and the PIN is stretched with PBKDF2. It has been this way for a while. Apple has no "backdoor" on the PIN or any form of cryptographic advantage here that we know of.

B) The new thing mentioned in the OP's link is that things stored on Apple's servers are now encrypted as well, with your iCloud password.

Is this correct?


It's actually really easy to recover the passcode from iTunes backups (so probably from iCloud backups too). I've had to do it before to rescue photos off a friend's iPhone. Don't know about >iOS6 though.


>It's actually really easy to recover the passcode from iTunes backups

What do you mean by this? I doubt the passcode is stored in plaintext anywhere, and if I recall correctly, the passcode is convolved with the CPU's burned-in crypto key before storage, so you couldn't recover it from a backup without the corresponding phone.


iTunes backups are encrypted by default. And I can't imagine a typical person deliberately disabling it.

Your example is a little unique though because you have physical access to both their computer and their phone. In theory you could just brute force their iTunes backup password.


No they aren't. You have to check a box in iTunes to have them encrypted.


They are if you have a passcode lock


> According to Apple, the only way to crack the passcode was via brute force.

I imagine every vendor selling encryption would make this claim. Otherwise, they would have to say there is a flaw in their implementation or publicly reveal their backdoor.


Not surprisingly, that instruction was for the general population, not the police.


No, not all iCloud data is encrypted in storage. Email and Notes are not.


These threads should come with a tin foil hat requirement. There are so many different views on this. But if you wear a thick enough tin foil hat, it really doesn't matter what anyone says: you will think the gov is spying on you regardless...


I don't need to read this. Everything on the iPhone is proprietary software. As has been proven countless times, there is a 100% probability that there are backdoors everywhere on this device. This entire blog post is a lie.


And software built by volunteers, like OpenSSL, has proven to be so much more secure. It's not like heartbleed left practically the entire internet vulnerable to abuse. Oh wait, yes it did.


A bug is not a backdoor. Unless you are implying the OpenSSL devs left the bug there on purpose.


Yay for hyperbole!


19250 people have their Apple accounts accessed by #NSA every year.


Don't Android phones also have an 'Encrypt phone' feature?


It almost seems like it's a feature of iOS8.



