A message from Tim Cook about Apple’s commitment to your privacy (apple.com)
135 points by dombili on Sept 18, 2014 | 131 comments



"Our business model is very straightforward: We sell great products. We don’t build a profile based on your email content or web browsing habits to sell to advertisers. We don’t “monetize” the information you store on your iPhone or in iCloud. And we don’t read your email or your messages to get information to market to you. Our software and services are designed to make our devices better. Plain and simple."

Shots fired.

(At Google, obviously.)


Interesting you mark this section of it, this part also stood out to me.

As a longtime Apple iPhone lover, I switched over to the Google Nexus 5 around 6-8 months ago and, with it, decided to be as open as possible with my data. That meant opting in to letting Google read my mail, save my Chrome searches (I also started using Chrome), and store my notes in Google Keep, then regurgitate all that information back to me on my Google dashboard. I thought to myself: "Once the next iPhone comes, I'll decide whether I should go back."

At first, it was striking that one day I booked a flight and then about two weeks later, my dashboard mentioned to me that I should leave 40 minutes early for my flight because of a traffic accident. It knew my flight information, when I should leave, and routes to the airport. I followed Google's suggestion and made it to the airport on time, taking an alternate route Google displayed for me. Once I landed, Google updated my dashboard about the currency exchange, surrounding events, restaurant reviews, foreign news, and more... it "knew" I wasn't home anymore, but across the world. It was initially strange, but with more trips and more experiences, I grew to like that Google could give me handy data.

Another strange thing, when reading my mail, it would point out words like "... On Saturday, Sept. 4...", then ask me if I wanted to save it to my calendar. Small touches like this grew on me and kept my life in sync.

Now, having spent many months with the phone, and with the release of the iPhone 6, I've come to a crossroads in trying to figure out how "okay" I am with giving Google my data. I ask myself, "How can Google use this data against me?" and come up short. There are things I wish to keep private, but those are rare, and for them I use the appropriate channels (Incognito Mode, other email accounts, etc.). It does sometimes bother me that ads reflect my interests, but at other times I've actually found them surprisingly useful, and they've led me to learn about other products available.

I suppose, with Apple releasing this statement, I agree on the one hand and can see their point, but I wonder... sometimes it's nice to have certain services available to you, even if that means giving some information out.


With my iPhone, I've noticed that the device itself will notice things (though perhaps not as much as Google), without necessarily reporting this information back to Apple.

For instance, it does the content recognition and offers to create calendar events for phrases like "tomorrow", "September 18th", and so forth out of my email. Also, it knows where I live and where I work and tells me how long it'll take to drive back and forth (which isn't helpful to me because I walk), but it also notices when I'm spending the night with my girlfriend because it starts telling me how long it'll take to go to her place.

Perhaps for the level of sophistication Google offers, they need to keep your entire life stored in Google's data centers. But the kinds of features you point out, by and large, are more than possible without it. Your iPhone would know when you have a flight booked, because your boarding pass would be right there in Passbook. It would know when and where you land, because it has GPS. Google might have more clever technology for aggregating and displaying that information, but it doesn't inherently require sending all that data to Google's servers, and the fact that they do so is telling.


And now Apple has promised not to just give all that information, and access to your phone, to people claiming to be cops without a warrant anymore.

This was such a good read until that last word. Including that last word, it makes me feel all warm and fuzzy and full of confidence in Apple's protection of my data.

A part of me wonders who in the PR department thought this was a good announcement to make and whether he/she still works at Apple.


I grew to like that Google could give me handy data.

I agree that this is useful; but sometimes I think Google tries a bit too hard to be helpful. For a few months, I visited my girlfriend every Thursday evening (and at other times as well, of course, but Thursday evenings were consistent). After we broke up, my phone would helpfully let me know every Thursday evening how long it would take to drive to her apartment.


I realize that the reminder might cause unwanted emotion so soon after a breakup, but as long as it's easy to permanently dismiss the reminder, I would guess that the informativeness of the reminders when they were applicable far outweighed the brief heartache. But then again, I generally prefer to be given a stream of relevant information forthrightly and unfiltered, whether from humans or machines. I'll take the convenience even if it comes with the occasional unpleasantness.

But just wait. Someone will be really freaked out when one Thursday Google fails to show him the traffic to the girlfriend's house, then when he gets there, she breaks up with him.


The last bit reminds me of the father who found out his daughter was pregnant via Target's advertising.[0]

[0] http://www.nytimes.com/2012/02/19/magazine/shopping-habits.h...


After a few months of frequently working weekends, Google started giving me directions to work on Saturdays when I stayed at home. I very much like the idea behind this kind of prompt but something like this has so much surface area that it's mostly impossible for them to handle every single edge case (like ours).


And each time you swipe it away, it'll ask you if you no longer want to receive directions to this location.


Ha! In my case it sometimes insists that I'm commuting to random places at decidedly non-office hours. I wish I could tell it I work from home, or better yet, that it would figure that out itself!


The aggressive location-of-work detection is something I've run into as well. Last year it took four trips in three months before my phone decided that I worked at the airport.

Air travel is the context of another "overly helpful" issue: The "remind me where I parked" feature doesn't seem to make any attempt to distinguish between modes of transport, with the effect that when I enter the airport terminal after a flight my phone invariably decides that I need to be reminded where "I" parked the plane.


There's actually an option for this in the Google Now settings, but I don't have my Nexus anymore so I can't tell you exactly where. Should be something like unchecking "give me recommendations based on where I go" or something. I know you can also set home and work locations manually.


I run into this despite having set my work location manually. Either it's failing to handle identical work and home locations or it ignores the manual settings given enough "evidence".


Same here man, eventually Google learned that I didn't want to do that anymore, but for a while it sucked to see that pop up every couple days.


One wonders what Google's cost is to run these services, what margin they'd be happy with, and whether people would pay that in exchange for privacy.

Google's latest quarterly website revenue is about US$11B; assuming 500M active users, primitive math gives $88/user/year. Interestingly, that number is right between Google Apps' two paid tiers. However, I'm not sure if adopting paid Google Apps means you've totally escaped the advertising and privacy implications of all of Google's services.
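That estimate can be sanity-checked with back-of-envelope math; the revenue and user figures below are the commenter's assumptions, not audited numbers:

```python
# Rough per-user value of Google's ad business, using the
# figures assumed in the comment above (not official numbers).
quarterly_revenue = 11e9   # ~US$11B quarterly website revenue
active_users = 500e6       # assumed 500M active users

annual_revenue_per_user = quarterly_revenue * 4 / active_users
print(f"${annual_revenue_per_user:.0f}/user/year")  # -> $88/user/year
```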


> I wonder to myself, "how can Google use this data against me" and come up short.

You are obviously of no importance to the world. You carry no politically weighted message, and you sit at the crossroads of no important information. The best proof I have: Google, and Google's intermediaries, have no interest in you.

Which doesn't mean you shouldn't contribute to protecting the privacy of journalists and members of political parties.


I think you've mistaken what Apple is saying here. Many of the little touches you mentioned are also part of iOS, and Apple receives much of the same information. The statements provided here clarify how Apple will use, or not use, that information.


This motherly relationship Google has with you is going to turn bad. At some point, you'll be betrayed. The more you rely on Google, the more it's going to hurt you.


> "how can Google use this data against me"

I look at it the other way: What do I gain from allowing Google to build this profile of me, including information from potentially multiple accounts, devices, location history, search history and communication history.

Because if Google is offering their services for free to all comers, that data must be valuable to them. And if I can't deduce why the value is what it is, that's far more likely a failure of understanding on my part than any indication of fairness or harmlessness in the exchange. [1]

And to me, all I see is a convenience feature I can't rely upon. It can identify flight or meeting information. Maybe (who knows what parser errors exist, or will exist as things change). And it can only even try some of the time, since Google doesn't have access to every channel by which my itinerary is set and changed (emails to non-Google accounts, phone calls, text messages, verbal communication, etc.).

And if I can't rely upon it, I have to double-check that it is operating off the right information. And at that point, I might as well have explicitly handled the bits Google Now might have gotten right as well.

The rest of your examples are replaceable with client-side data-detection features -- that have been identifying likely dates, addresses, flights, etc and offering convenience functions to simplify dealing with them for some time -- and location-snapshot information [2]. Nothing there requires indexing all my communications and locations across history.

Maybe Google Now can become something that's worth the trade-off. Or maybe Google will offer a (for-pay?) version with guarantees about who has control over what is retained and who has access to it. But, to me, it's not a good deal right now.

And besides all that, I'm personally a pretty private person, so I have a particularly good imagination of the downside risks, even beyond weighing my benefits vs google's benefits.

[1] It's that old saw: "There's a sucker at every table..." Applied to internet services it suggests: If you can't tell why a company with a free service has a massive valuation and a willingness to pay through the nose for users, you literally have no idea what you're trading away by using it. So you better be damn sure you're at least getting something worth the per-user valuation in return.

[2] Yes, it requires some degree of trust or self-deception to believe that there's a difference between a company that states it's collecting your location history and one that claims it doesn't -- as providing snapshots implies the ability to log said snapshots. But such is life. At some point you make peace with the level of paranoia you can handle, or you opt out of society altogether.


Tim Cook has been uncharacteristically (well, for Apple, not for himself) direct in stating that Google is their top competitor and making implicit jabs at their business model.

On the other hand, what this also says (to people who like awesome cloud services) is: "Unfortunately, this also means that our predictive keyboard, Siri, and other predictive services will never be that good."


Apple was very clear in the 80s that IBM was their main competitor, and was very clear in the 90s that Microsoft was their main competitor: in both cases the competition was not just argued against, but ridiculed with unflattering characterizations (IBM as Big Brother and their users as peons, Microsoft's users as not just awkward but physically unattractive businessmen). Steve Jobs himself took shots at Android, including my "favorite": """You know, there’s a porn store for Android. You can download nothing but porn. You can download porn, your kids can download porn. That’s a place we don’t want to go – so we’re not going to go there."""... can you say more about how this is "uncharacteristic for Apple"? Is it just that the jabs are specifically at business models? If so, I bet I can dig up some examples of that from the Mac vs. PC ad series, if not from an earlier Steve Jobs iPhone keynote ;P.


I see your point, but Steve always seemed to think Android never stood a chance anyway. He didn't have to say WHY Apple was better than Android, he just pointed out some major (philosophical, not product) negatives, as you say the porn store, etc.


"We don’t build a profile based on your email content or web browsing habits to sell to advertisers. We don’t “monetize” the information you store on your iPhone or in iCloud. And we don’t read your email or your messages to get information to market to you."

I don't think this precludes Apple from ever building better predictive services. Tim Cook is specifically saying that Apple does not sell or give away your data to third-parties for advertising purposes. He is NOT saying that Apple would never ask permission to read your e-mail if it meant that they could offer a compelling Apple-created product and user experience as a result.

For example, with Apple Pay, Apple is technically using and storing my private information (my credit card data) to offer me a compelling service. They are not going to then figure out which credit card I own and/or how much I spend, and sell that info to banks so that they can target me with specific credit offers.


Apple Pay is actually designed specifically so that Apple does not see your individual credit card data, purchase history, etc.

Apparently they will know total aggregate spending through Apple Pay since it's been reported that they get a cut. But that's completely different from, "Hey we noticed you just bought X, why don't you also buy Y, or maybe next time get a better price at store Z"... which is more like the Google approach, and my personal preference is to avoid such 'features' like the plague.


Apple does not store your credit card data. They generate some sort of tokenised account information with your card issuer when you first add your credit card. They do not store the tokenised account information either; it is only ever stored within the secure area of the SoC on the device.

From that point, Apple is completely out of the payment equation. The tokenised data is used to generate a once-off payment authorisation with your bank when you pay for a product using Apple Pay.

But I agree with the point of your post: if Apple thought collecting data would result in a significantly better product, they would probably do it.


I think he is explicitly saying that. They're not in the business of using your personal data, they're in another business.


How exactly would it make predictive typing and Siri better if Apple decided to invest engineering resources in new and clever ways to sell your private information to advertisers?


Well, seeing that every time I enter my own address into Google Maps, I am first given a result in another state and have to manually type the zip code (and I do have cookies enabled), I don't think they have much to worry about yet...


If you were "in" the Google ecosystem you would just need to type "Home".


And that business model explains why Apple shares have a price-to-earnings ratio of 16 while Google has a P/E of 30. It's very tough to keep designing great products decade after decade - just ask Sony. It will be much easier for Google to keep collecting more personal information than anyone else, decade after decade.


Nope. The price to earnings ratio is based on investors understanding of the technology business.

It's very hard to maintain an advertising based search business for decades. Just ask Yahoo.


During the Charlie Rose interview, Tim Cook says that Google is a direct competitor. And that: "I think everyone has to ask: how do companies make their money. Follow the money."[1]

[1] http://www.macobserver.com/tmo/article/apple-ceo-tim-cook-on...


There are a bazillion social startups out there without even the beginnings of a business model. What will they do with my data?

It might very well be a shot at Google, but it can also be a form of dismissal of an entire segment of the software industry perceived as "less noble" by Apple.


I bought a year's worth of fastmail I haven't set up yet. I should probably get going with that.


... but if Apple Pay ( https://www.apple.com/apple-pay/ ) turns into a big success, it could have the potential to unbalance the relationship Apple has with its customers.


Not if you look at how it actually works. Apple is involved in the process in a way where they don’t have to do or know much. It seems it’s deliberately set up that way.


I think what strikes me the most about the new privacy policy* (and the associated mini-site) is just how /readable/ it is. No legalese. No 50 pages of undecipherable jargon. It's plain text, and Apple clearly /wants/ you to read it and understand what their privacy policy and practices are. Yes, there are still questions about PRISM and the NSA, but in terms of everyday use of their products, what they share with third parties, and their law enforcement policies, I find the language very refreshing.

*https://www.apple.com/privacy/privacy-policy/


This is actually part of a broader and encouraging trend of non-shitty terms of use and privacy policies. Microsoft and Google actually also write in relatively readable, plain English.

Microsoft - http://www.microsoft.com/privacystatement/en-us/windowsservi... (unfortunately the page itself is a bit of an eyesore, though.)

Google - http://www.google.com/policies/privacy/


Google actually takes a similar approach: http://www.google.com/policies/privacy/


So what about this vanishing of the warrant canary? https://gigaom.com/2014/09/18/apples-warrant-canary-disappea...


This is getting weird. (Background interview with Charlie Rose http://youtu.be/Bmm5faI_mLo?t=35s)

I dont understand:

> Finally, I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will

Either he's hiding the fact that they'll provide what data they have (timings, IP addresses, contact metadata) when a warrant comes, he doesn't think it's valuable to publicize the ways they cooperate with the government legally, or he actually thinks Apple will break the law when lawful requests come in.

Second, he doesn't address exactly how everyone lost their data in the nude-celebrity incident. A straightforward answer would address the elephant in the room and give us facts to think about: what we need to look out for, and how their recent moves address the vulnerabilities. Right now all we have is informed guesswork and inferences from (admittedly great) detectives like https://twitter.com/nikcub. That's still different from hearing it from people with the facts.


I doubt he's hiding it; there is a whole page about their responses to government requests:

http://www.apple.com/privacy/government-information-requests...

He seems to be drawing a distinction between providing some information to the government in response to a request, vs. providing "access to our servers."


"On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8."

But law enforcement can still brute force the encrypted image right?


From a Feb. 2014 Apple Security whitepaper[0]:

"The passcode is “tangled” with the device’s UID, so brute-force attempts must be performed on the device under attack. A large iteration count is used to make each attempt slower. The iteration count is calibrated so that one attempt takes approximately 80 milliseconds. This means it would take more than 5½ years to try all combinations of a six-character alphanumeric passcode with lowercase letters and numbers."

There's more info on the file encryption in the paper, around page 8.

[0] http://www.apple.com/ipad/business/docs/iOS_Security_Feb14.p...
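The whitepaper's 5½-year figure follows from simple arithmetic, assuming one attempt per 80 ms against the full keyspace of six characters drawn from lowercase letters and digits (36 symbols):

```python
# Exhaustive on-device search time for a 6-character passcode over
# lowercase letters + digits, at 80 ms per attempt (per the whitepaper).
symbols = 26 + 10                  # a-z plus 0-9
keyspace = symbols ** 6            # 36^6, about 2.18 billion combinations
seconds = keyspace * 0.080         # 80 ms per attempt
years = seconds / (365.25 * 24 * 3600)
print(f"{keyspace:,} combinations, ~{years:.1f} years")  # ~5.5 years
```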


How were the iCloud accounts then attacked if they didn't have physical access to the device?

The passcode is “tangled” with the device’s UID, so brute-force attempts must be performed on the device under attack

Even with the password you need the device, which they didn't.


I think a lot of it was social engineering and bad passwords. If you are using multiple services and one is compromised you can pretty much give up access to everything if you use the same password all around.


Guesses based on known properties of the (well-known, high-exposure) targets.


You can restore an iCloud backup to any device (which is presumably how the Elcomsoft software works). The only thing you need the original device for is the keychain, but I am not sure whether that has changed since iCloud Keychain was introduced.


Social engineering was used to get the answers to the users security questions.


I think the "Cloud" may have been involved.


iCloud backups aren't encrypted with the same key as on device storage.


Isn't that 5.5 years of sequential trying? This is a parallelizable problem. There's also no restriction on how fast a cracking machine is allowed to be, so I doubt 80ms is the lower limit.


Did you not RTFA? The file system keys are stored in the SoC secure storage area and there are no facilities for retrieving them, meaning you must perform the decryption attempts with the phone's own SoC.

Of course an iCloud backup file is another matter... At least they are turning on two factor auth for that, but it's a difficult problem if someone has your password and the keys are on a phone that you no longer have.


That's assuming you can't clone it, and try on 1000 devices at the same time, correct?


I was also under the impression that some of the Snowden slides contradict this point.


I think more interesting is the section about what Law Enforcement can access and what not - which is explained in detail on this page: http://www.apple.com/privacy/government-information-requests... The page also contains three PDFs with more details for US/EMEA/APAC.

I always thought that with iOS 7's encryption everything was encrypted, but it looks like this is only the case with iOS 8.


Pretty interesting - this must have changed in iOS 8. At least according to Jonathan Zdziarski Apple previously had the ability to bypass the iPhone encryption for law enforcement if the phone was physically sent to Apple [1].

[1] http://www.zdziarski.com/blog/?p=2589


The iPhone's full-disk hardware encryption is not active after the device has booted and been unlocked for the first time. There's a second layer of software encryption for important data, which now covers much more (but not everything) in iOS 8.


That's not quite true; iOS doesn't use "full-disk encryption", but only file encryption.

You can read about it under File Data Protection in http://images.apple.com/privacy/docs/iOS_Security_Guide_Sept....


Actually, the parent is correct - the PDF you linked suggests otherwise:

> (NSFileProtectionNone): This class key is protected only with the UID, and is kept in Effaceable Storage. Since all the keys needed to decrypt files in this class are stored on the device, the encryption only affords the benefit of fast remote wipe. If a file is not assigned a Data Protection class, it is still stored in encrypted form (as is all data on an iOS device).

But AFAIK, the way this is actually implemented is that the non-None file protection settings are an additional layer on top of full-disk encryption. On my jailbroken iPhone 5s on iOS 7.1, the /var partition is mounted from /dev/disk0s1s2, where the double partition is due to a CoreStorage block layer between the filesystem and the actual disk. If you dump some data from /dev/rdisk0s1s2, you'll find an HFS+ filesystem with plenty of strings, but if you look at /dev/rdisk0 itself, they're nowhere to be found, i.e. the CoreStorage volume is encrypted. (There's probably a more direct way to determine this, but CoreStorage is closed source and undocumented, so meh...)


"Finally, I want to be absolutely clear that we have never worked with any government agency from any country to create a backdoor in any of our products or services. We have also never allowed access to our servers. And we never will."

The way this was worded is so plain and to the point, it's refreshing. I sincerely hope it's true! Great job on Apple and Tim Cook for laying it all out there.


So when the NSA's internal Top Secret slides say that Apple was added to the PRISM program as a "provider" in October 2012, that provides "Email, chat, videos, photos, stored data, VOIP, file transfers, video conferencing, logins, online social networking details" to the NSA, how then should we reconcile these statements?

Is the NSA lying in its internal slides to itself?

Is Tim Cook lying in his statement to the external world?

Is Tim Cook splitting hairs in some fashion? For instance, is he defining "backdoor" to mean "illegal backdoor", which therefore excludes everything done with the FBI/NSA under a veneer of legality?


People really want to make something out of this, but it's very simple: the NSA found the "goto fail" bug and exploited it. (We know they also exploited heartbleed.)

We know they have active programs looking for holes in open source code and fuzzing commercial services looking for vulnerabilities. How is that so hard to believe?


> We know they also exploited heartbleed.

Do you have links that show this is true?


Based on the public reporting, PRISM collects data via FISA-authorized requests to companies. So Apple is probably covering it in their short paragraph about "National Security Orders from the U.S. government."

https://www.apple.com/privacy/government-information-request...


Maybe it was a frontdoor to their products/services instead of a backdoor. (Only half joking.)


I think you are way over thinking it. The NSA's internal slides said nothing about Apple helping the NSA at all.

I read these statements just as Mr. Cook laid them out.


You'll see similar wording from Yahoo, Google, and Microsoft. The way it works is they deny providing access to their servers but then push out the data to the NSA's servers and are compensated for the cost of doing so. Access to the servers owned by the corporations is not what's important, it's availability of your data. None of these corporations say "we do not provide access to your data" they say "we do not provide access to our servers". As we all know data can easily be copied from one entity to another. Who needs backdoors when all the data is shipped out directly to NSA?


"we have never worked with any government agency from any country to create a backdoor in any of our products or services" seems to rule that out.


A backdoor is a mechanism for giving someone access to an internal system or internal data. In the case of data being "pushed", that could certainly still happen, and the statement would remain true.

If your reading were right, you would also expect data handed directly to authorities in criminal cases where subpoenas are issued to be ruled out, but we know that information is handed to authorities as required by law under subpoena.


Well, like GP said, it depends what is meant by "backdoor." Laws requiring data to be turned over to governmental parties, and in some cases gagging anyone from saying it happened, seems like exactly the function of "backdoor" even if it doesn't take the form of a master password the NSA has or something.

So long as a company can be compelled to do this in secrecy, their assurances about "no backdoor", "no direct access", "we process every request ourselves" are basically meaningless, since not even they have control over their customers' privacy (if they comply with law.)


To me it seems that these statements are needlessly specific. They remain technically truthful if:

- The backdoor was created by a third party; Apple didn't create it.
- Apple created a backdoor without working with a government agency.
- Apple provides a "frontdoor": some sort of bulk access to the data that is not considered a "backdoor."
- Many variations on the same theme.

Same thing with allowing access to their servers. They may provide access to some other part of their infrastructure. The government agency may provide their own servers that are a mirror of Apple's in a way that does not require direct access to Apple's servers. And so on...

Tim Cook should have said something like "We do not provide bulk data to any third party."


Tim Cook should have said something like "We do not provide bulk data to any third party."

In a strict sense that's probably not true, they may have to provide some form of bulk data to outside parties for accounting purpose for example. Trying to craft a statement to include those exceptions would just invite more hyper-parsing of the language.


What about a backdoor in their routers? Or even a "front" door in their entire network infrastructure? Is it really that clear?


plain and to the point? please, it's obvious that they're now working with Al-Qaeda because the statement didn't explicitly mention them.

Goddamn I hate the tinfoil-hat nonsense that happens here...


Snowden pretty much validated those tinfoil-hat people's worries.


It must feel great to have an excuse for everything ever. Just utter his name and all your paranoia is legit!


Well, that might be the most impressive official comment I've ever seen from a large business. It wasn't too technical (from a legal or technological perspective), but also wasn't too dumbed down. It was clear and frank, and not once did my "BS alarm" go off.

But I'm generally a fan of Apple, and a frequent user of iOS devices, and I've always liked Tim Cook. Maybe I'm biased.


> In the first six months of 2014, we received 250 or fewer [national security requests from the US government]. Though we would like to be more specific, by law this is the most precise information we are currently allowed to disclose.

http://www.apple.com/privacy/government-information-requests...


The way user data fits into Apple's business model (ie. to make the product better vs. generating ad revenue) is appealing, but unfortunately it's gotten to the point where pledges of privacy from US companies have a hollow ring to them. I expect a gradual migration away from US-based SaaS/PaaS and OS offerings to open source alternatives and self-hosted utility computing, particularly for business and government applications.


Exactly. Once trust is broken, it's hard to regain.

An entity changing behavior after being caught doesn't really inspire confidence in what it might do when it believes no one is watching.

Trust is a fragile thing. Hard to build, easy to destroy.

I don't necessarily blame these companies though. They are profit oriented and they almost certainly have to play by certain rules to grow to any size. But the surveillance crew has certainly shot themselves (and us) in the foot to some degree. I just hope what they got out of it was worth it, but I'm doubtful. If they had just played a little more by the rules and vetted important matters rather than being lazy, their jobs (and everyone else's in the Western tech industry) might be a lot easier going forward.

Unfounded hubris and lack of foresight seem to be very common pitfalls for military, security and policing services. At least in the U.S. And probably everywhere throughout history.


It's Celine's First Law.


Really great to see a company acknowledging that privacy is a major concern.

It's also a great business decision, because privacy is one thing that Google will never be able to beat Apple on. They pretty much have no way to respond if Apple starts running ads about privacy.


>We have also never allowed access to our servers. And we never will.

That's good to read but, technically, it's not their choice, correct?


They can hand over copies of data (after being presented with a warrant) without giving "access to our servers."


Maybe I'm just really cynical, but now I'm imagining scenarios like mirroring their disks to copies someone like the NSA is allowed full access to. Would that be "access" to their "servers"? Maybe they don't include Apple's own private keys, so at least the NSA can't impersonate them, just read every byte of data they have?

Then again, it's sort of moot because they can probably be legally forced to provide full "access to [their] servers". Some comments here are pointing out they have the choice whether to obey the law, so is Cook actually saying they will disobey it?


Ahh I see, thanks for the comment.


> technically, it's not their choice, correct?

Technically, it very much is their choice to comply or not, if they are ever put in a position to do so.

The consequences of not complying are not up to them, but they can still not comply if they choose. (The consequences might be massive fines and/or jail time for execs, but it's still a choice)


He talked to Charlie Rose about this 2 days ago:

https://www.youtube.com/watch?v=Bmm5faI_mLo


As much as I like this message, I just don't believe him. We KNOW the US government will issue a gag order and not allow him or anyone else to speak about what they receive.

The only way to be sure is to provide the source and allow people to compile it themselves, which is just never happening.


http://images.apple.com/pr/pdf/131105reportongovinforequests... has warrant canary "Apple has never received an order under Section 215 of the USA Patriot Act."
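A canary like that only works if people keep checking for it. A minimal sketch of the check (the function name is mine; the sentence is the one quoted above):

```python
# The canary sentence as it appears in Apple's transparency report.
CANARY = ("Apple has never received an order under "
          "Section 215 of the USA Patriot Act.")

def canary_intact(report_text: str) -> bool:
    # Apple can be gagged from announcing an order, but (the theory goes)
    # it can't be compelled to keep printing this sentence -- so readers
    # watch later reports for its silent disappearance.
    return CANARY in report_text
```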



Yeah, the timing is crazy.


I find it amusing that this is being served on an unencrypted, unauthenticated page. There is no HTTPS redirect or HSTS header.


Fortunately, it's not a secret message.


But how do I know a MITM is not modifying this message to claim things Apple hadn't intended to say? :)


That's some next-gen shit... Also, using HTTP gives Apple plausible deniability. "No, we didn't actually say that, it was a MITM..."


The broadcast isn't secret.

Maybe the fact that you received the broadcast should be.

Slowing down anyone who wants to build a massive database about you to identify your weaknesses is a good thing to do.

Edit: I love being modded down by people who don't reply with evidence based argument, it's better than fake internet points to let you know you're onto something. /me waves to our visitors from the NSA.


For example, a network adversary can make a list of everyone who was interested in Apple's government data access policy. They can also tell which of those people accessed that policy from an Apple device, thanks to unencrypted user-agents. One practical use of that might be in planning future investigations.

More generally, there was the report of an XKeyScore rule to identify people who were interested in Tails. It's hard to know what a particular network adversary uses that information for, but it's disquieting to think that, partly because information about OSes, privacy policies, and communications security measures is served up unencrypted, it's possible to profile people based on their platforms (or intended future platforms), interest in particular aspects of privacy and security, and technical sophistication.

Because of TCP fingerprinting and unique file sizes, as well as sites that have information only or mainly about a single topic, it may be hard to fix this problem with HTTPS alone. But it's important to think of the fact that someone read something as privacy-sensitive even when the thing they read is available to the public. If you don't care for the communications security examples or they just strike you as too meta, try browsing around WebMD for a bit. Nothing there is secret or even customized for any particular user!
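The size-fingerprinting problem mentioned above can be sketched in a few lines; the paths and byte counts here are invented for illustration, not measured values:

```python
# Hypothetical table of public pages and their transfer sizes, which a
# passive observer could build simply by fetching the site themselves.
KNOWN_PAGES = {
    14230: "/privacy/government-information-requests",
    9817: "/privacy",
    22904: "/legal/more-resources/law-enforcement",
}

def guess_page(observed_bytes, tolerance=64):
    # Even under TLS, transfer lengths leak: match the observed size
    # against the precomputed table to infer which page was read.
    for size, path in KNOWN_PAGES.items():
        if abs(observed_bytes - size) <= tolerance:
            return path
    return None
```

Padding responses to uniform sizes is the usual mitigation, which is part of why HTTPS alone may not fix this.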


I couldn't find anything that describes whether Apple can access your iCloud data in response to a law enforcement request. (I.e., whether it's encrypted in situ.) Does anyone know?


Apple will provide iCloud data in response to search warrants. Horse's mouth: https://www.apple.com/legal/more-resources/law-enforcement/


> All your iCloud content is encrypted in transit and, in most cases, when stored (see below). If we use third-party vendors to store your data, we encrypt it and never give them the keys. Apple retains the encryption keys in our own data centers, so you can back up, sync, and share your iCloud data.

If I'm reading this correctly, it looks like most things are encrypted while on iCloud. Namely:

  - Photos
  - Documents
  - Calendars
  - Contacts
  - iCloud Keychain
  - Backup
  - Bookmarks
  - Reminders
  - Find My iPhone
  - Find My Friends
But these aren't encrypted on iCloud:

  - Mail and Notes (though they are encrypted in transit)
I would think this means that Apple is able to hand over Mail and Notes data sans encryption?

On another page it reads: "On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data."

I read this to apply only to items stored on the device itself. Not on iCloud.

But I hope I'm reading this wrong, and Mail storage is indeed encrypted on iCloud.


It doesn't matter what is encrypted, Apple has the keys, as the paragraph you quoted notes.
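The distinction being drawn here can be sketched with a toy cipher (SHA-256 in counter mode standing in for AES-CTR; illustration only, not real crypto). Encryption at rest defends against a stolen disk, but whoever holds the service key can decrypt on demand:

```python
import hashlib
import os

def _keystream(key, nonce, n):
    # Toy keystream: SHA-256 in counter mode, a stand-in for AES-CTR.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_at_rest(service_key, plaintext):
    nonce = os.urandom(16)
    ks = _keystream(service_key, nonce, len(plaintext))
    return nonce, bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_for_warrant(service_key, nonce, ciphertext):
    # Because the provider (not the user) holds service_key, "encrypted
    # at rest" still means the provider can produce plaintext on demand.
    ks = _keystream(service_key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))
```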


I thought that the data would be encrypted with your device's key, not Apple's key.

Here's a quote of his from the Charlie Rose interview: "We're not reading your email, we're not reading your iMessages. If the government laid a subpoena on us to get your iMessages, we can't provide it. It's encrypted and we don't have the key."

Perhaps that's not the case? Or rather, perhaps that's literally only the case with iMessage? I don't think iMessages are ever stored on iCloud, and instead only ever propagate from device to device...


I am pretty sure you can retrieve data on multiple devices and also restore without the phone.

All of which is moot since Apple can simply be ordered to write software to leak or capture passwords and decryption keys.


Actually, within US law and case law, even the horrible Patriot Act, that is not true. The government cannot require you to design your photo service (for example) to allow eavesdropping. They almost made that a law back in the 90s, but enough tech companies freaked that it got shitcanned.

Similarly, you cannot be ordered to lie and say the NSA has not been given access to your servers, you can only be ordered not to discuss it if it has happened.


They do for certain technologies (see CALEA) but even for others they would be required to take reasonable effort to comply with court orders. Intercepting passwords isn't particularly difficult, assuming they don't have them already. Lavabit was required to do something like this.


That is not true. Remember Lavabit.


Hmm, yes. That makes sense. Being able to retrieve from a different device would blow the 'by device' idea apart. And you're also able to reset your password and still have access to all of your stored information, which I think further reinforces the point that nothing you're storing is unreadable by Apple (or any requesting agency...).

I'm not sure how anything could be safely stored on their cloud (or any cloud), given these features.


And Apple can say "Nope" and make it all very public and take it all to the SCOTUS.


Besides the fact that they can't (as we have seen with NSLs), admitting that the outcome is contingent on policy means the (technical) security is broken.


Yeah, just like Lavabit and Yahoo, oh wait...


The paragraph he quoted notes that Apple does not have the keys.


From Apple's site:

> Even if you choose to use a third-party application to access your iCloud data, your username and password are sent over an encrypted SSL connection.

So user/pass is sufficient to access cleartext of your data, and Apple definitely has that.

This should be obvious: If there is any way to recover your data when you lose your phone (without using a secret key that is never shared with Apple) then the security is broken.


How do you propose to solve the problem of a user's house burning down with their phone, laptop, and iPad? Tell them sorry, your iCloud backups are encrypted and now all the keys are gone?

So encrypt the iCloud backup with a password, you say... Except you need a password to get into iCloud already, so how is that any different?

And for the record, Apple stores password hashes, so they don't have your password. Unless you think they are just lying repeatedly for no good reason.

When was the last time you verified the chip layouts for your CPUs to ensure there were no back doors? Didn't think so.


I think you're being a bit overly defensive here. (S)he was just pointing out some facts about the system that I didn't grasp in my initial post. It's important that we remain critical of any system. And it's equally important that we remain conscious of their potential flaws and design trade-offs.


I didn't propose to solve any problem, I pointed out the obvious flaws in claims about Apple's security and resistance to gov requests.

Private keys are different than passwords.

Password hashes are computed server side where the password can be trivially intercepted.

CPU backdoors are a security problem, but that doesn't negate the existence of other security flaws. Moreover, CPU backdoors are far more costly to implement, and effort should be directed first at cheaper/easier weaknesses, like ineffective encryption of data.
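The server-side hashing point, sketched with Python's stdlib PBKDF2 (the function names are mine): the salted hash protects the stored record, but the plaintext necessarily passes through the login handler.

```python
import hashlib
import hmac
import os

def register(password):
    # The server stores only (salt, hash) -- this is what "Apple stores
    # password hashes" buys you at rest.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def login(password, salt, stored):
    # But the plaintext password still arrives here on every login, so a
    # compelled provider could intercept it at this point regardless.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)
```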


> If I'm reading this correctly, it looks like most things are encrypted while on iCloud. [...]

If that list is accurate, that's great.


How long until Apple buys DuckDuckGo, minutes or seconds?


Hope they don't buy it but at least donate! :)


So does the last statement about never sharing data with a government agency serve as a sort of warrant canary? If they were served with a national security letter to hand over encryption keys for an entire service, wouldn't they have to remove that line?


Commitment and capability are two different things.

I am not saying Apple does not have the capability to protect user privacy, but I trust Google more in terms of their capabilities.


But Google doesn't protect user privacy from itself.


Sorry, am I missing something here? What's the context? Is Apple accused of some recent scam? Is this because of them forcing U2 into our accounts?


In light of the recent "iCloud hacking" as well as the launch of health monitoring with iOS8, questions about security arose.


Context is The Fappening incident.


* Celebrity leak

* Health integrations

* Google has the opposite strategy


The celebrity pic hack.


So in general, to be read as: "Please continue to upload nude selfies, they are more secure than ever" :)


The interesting thing is that they obviously think that there is now a market and selling point for privacy.


And yet, they had the gall to force an album down our throats during their reveal last week.

When I saw that U2 album in iTunes, I had to stop and ask myself why I was so pissed off about it. In theory, it was simply a nice gesture. But in reality, it was just a reminder that Apple is in control of my software, not me. It was a reminder that, ultimately, this company (and, in fairness, most companies) is going to fuck me over if it means more money for them.

So, sorry Tim Cook, but you are full of shit. You will totally roll over and fuck me when money is on the line, either as a business opportunity or to avoid a fine. And guess what? I am screwed no matter what I do, because the choice is between convenience and Richard Stallman-esque software self-crippling, and the Dead Kennedys already know how that plays out.



