
I really find this whole thing hilarious.

First, it's Apple sticking it to the FBI by taking a stand for privacy and receiving public accolades as a champion of privacy for doing so. The FBI is the bad guy trying to invade our privacy.

Now, it's the FBI sending a giant "screw you" to Apple by not only letting them know they were able to hack into the phone without Apple's help but at the same time, making a mockery of Apple's entire security claims. And now, Apple is in panic mode, slowly realizing that they went from being the hero of privacy in the modern age to the company that wasn't able to secure its phone from the FBI. From the FBI. Of all things.

No wonder they are freaking out and they want to know how the FBI did it.

Except that if I were them, I certainly wouldn't ask that publicly. I would at least pretend to know how the FBI did it and claim that it's already fixed in the next version of the OS.

Which still means that tens of millions of iPhones are at risk today and will be for months, but at least, you get to pretend that you're ahead of the FBI while right now, it's pretty obvious that Apple has been outsmarted by a government agency.

The bottom line is that in this line of work, it pays to be discreet and humble.




>Now, it's the FBI sending a giant "screw you" to Apple by not only letting them know they were able to hack into the phone without Apple's help but at the same time, making a mockery of Apple's entire security claims.

>it's pretty obvious that Apple has been outsmarted by a government agency.

I disagree with this assessment. It's not very surprising that the FBI was able to hack into a particular iPhone by focusing all available resources on the task. Given their previous duplicity, I would not even believe the claim without real evidence. Furthermore, I would not be surprised if they'd already hacked into it weeks ago. After all, their objective here was never to hack into the phone, but to establish a legal precedent that would force companies like Apple to comply with their future wishes. Apple has well and truly won, and the FBI is at this point trying (and succeeding, sadly) to save face.

I would speculate that Apple are asking them about it publicly because they know that the FBI will not comply with such a demand, thus making the FBI look bad, since it is now (purportedly) putting the security of millions of consumer devices at stake. They may also suspect that the FBI has not legitimately hacked into the phone, which remains a possibility. Or, more likely, that the FBI's method required large resources and could not be applied to many iPhones at once.


An interesting twist would be for Apple to encourage someone to sue Apple for having a hackable phone, and then have Apple file a legal request requiring the FBI to disclose the hack in order to "defend" against the lawsuit. Not sure this would work, but you need a legal reason to force disclosure.


It's perfectly legal to use vulnerabilities on machines that you own, publicly advertise it, and refuse to disclose the details. It's a free country. Funny that the infosec community traditionally supports this and opposes mandatory disclosure laws. If you had to disclose any vulnerabilities that you find, you'd never get paid for it. There wouldn't even be bug bounties, since those exist because of the vulnerability market.

The fact that it's the government also adds in the great deference that the courts give the government on national security issues. And it would be with good reason in this case. Vulnerabilities are used to craft attacks against our country's enemies (see Stuxnet).


Let's find out.

As an iPhone user, I presume I have standing and could argue some form of damages. Now, is there a lawyer out there interested in discussing whether there is really any merit to bringing a suit forward?

Would / could this be a class action, or in this case, is it better to seek an individual outcome?

Hypothetically speaking, of course, until there are proven grounds to stand on.


Legally, why couldn't you sue the government directly?

IANAL, but isn't the US government (via the FBI's court proceedings) publicly stating that they have access to / have used a vulnerability, and isn't that actionable in some way?

If this were done in private that would be one thing, but they've now gone on the record as saying a usable vulnerability exists.


IANAL, but you would probably need to show that you were materially harmed by your iPhone's now-weakened security. Maybe someone could step forward and show that their new iPhone got hacked and that they lost money because of said hack.


This certainly isn't how it looks to me. The FBI comes off looking like complete fools to me. They said that they needed Apple's help to get into this phone, that there was no other way, and that Apple's refusal to help endangered national security, humanity, and the universe. Then they got called in front of Congress and schooled by a Congressman, one of the last places you'd expect to find anyone remotely technically competent, about NAND mirroring. A few weeks later, they call the whole thing off, saying they didn't need Apple's help after all.

They come off looking like complete fools at best, since their huge public spectacle was based on "only Apple can do this," and now that it turns out they could have done it themselves, it just looks like they're incompetent.

Meanwhile Apple comes off as fully committed to their customers' privacy even when it means standing up to the US government. The fact that the FBI was able to get into this phone doesn't really change much; the mere fact that Apple could have gotten into it already means that security was lacking on it, but it seems to me that everybody understood that this was an older model and newer ones are better. Which is funny, because I'm pretty sure both the attack the FBI wanted Apple to carry out and whatever the FBI did themselves would work on the latest hardware too, but just about everyone is convinced that the Secure Enclave would prevent it. And then in September Apple will announce the iPhone 7 with Even Better Security, further demonstrating their commitment in this area.


> Meanwhile Apple comes off as fully committed to their customers' privacy even when it means standing up to the US government.

My conspiracy theory instincts tell me that this is a play to fool us into thinking that Apple is fighting for our privacy, in order to make us trust Apple unconditionally and in the meantime not develop new ways of hiding our communications. In reality I think the government always gets what it wants.


> it just looks like they're incompetent.

They just hacked into an iPhone, something Apple has been claiming for a long time was impossible. Even Apple has no clue how they did it.

Incompetent they are not.

You're looking at political theater, that's it. But there's little incompetence flying around.


>They come off looking like complete fools at best, since their huge public spectacle was based on "only Apple can do this," and now that it turns out they could have done it themselves, it just looks like they're incompetent.

No one ever doubted that some vulnerability exists that could be used without Apple's help. There is almost always a vulnerability. The hard part is discovering it. As soon as a third party told them about it, they paused the case against Apple. The new information changed the facts of the case and made compelling Apple unnecessary. For the FBI to do anything else would have been perjury of the highest order.


That's what makes them look incompetent. They insisted that they couldn't do it without Apple's help. Turns out they could, and they just didn't do their homework first. (Or, if you believe they had ulterior motives, then they didn't want to.)


Funny that they usually get grief from the same crowd for wanting to know everything, but now they're attacked for not knowing everything.


Those are two radically different values of "know everything" you're pretending are in conflict, there.

Yes, I expect the FBI to "know everything" when it comes to techniques available to do their job. No, I don't expect the FBI to "know everything" when it comes to the contents of my private messages. There's no conflict.


>Except that if I were them, I certainly wouldn't ask that publicly. I would at least pretend to know how the FBI did it and claim that it's already fixed in the next version of the OS

Wow, lying doesn't really look like the best course of action for a publicly traded company, I guess.

>Which still means that tens of millions of iPhones are at risk today and will be for months, but at least, you get to pretend that you're ahead of the FBI while right now, it's pretty obvious that Apple has been outsmarted by a government agency.

It's extremely probable that users are not in danger. If it's true that the FBI used Cellebrite's services, they might have used a slightly upgraded version of the company's solution that enables hacking 32-bit devices (i.e. no Secure Enclave). The commercial solution touted publicly by Cellebrite is used by law enforcement all over the world to unlock iOS 8.x iPhones and iPads within 24 hours. There was a recent case in Milan where the court-appointed expert was able to unlock an iPhone 5 in 2 days and retrieve data thanks to the services of Cellebrite's offices in Munich.

So: a 32-bit device with iOS 9, no Secure Enclave, and the involvement of a company that has been known for providing solutions to break into iPhones since forever.

Icing on the cake: they may well have used a method like the one suggested by the ACLU and other experts: NAND mirroring. That's a solution that's extremely phone-specific and doesn't require exploiting any big scary bug.
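
To illustrate why an approach like that doesn't need a software exploit: the idea, as publicly described by the ACLU and others, is to image the flash storage that holds the failed-attempt counter, burn through a batch of passcode guesses, then write the saved image back before the auto-wipe limit is reached, and repeat until the small 4-digit keyspace is exhausted. Here is a rough, purely illustrative toy simulation of that loop in Python (none of this is real tooling; the 10-attempt wipe limit and the SimulatedPhone object are assumptions made up for the sketch):

    # Toy simulation of the NAND-mirroring idea: snapshot the flash state,
    # guess in batches, and restore the snapshot before the wipe triggers.
    from itertools import product

    class SimulatedPhone:
        """Hypothetical stand-in for the device; wipes after 10 failed tries."""
        def __init__(self, secret_pin):
            self.secret_pin = secret_pin
            self.failed_attempts = 0  # lives in flash, so a restore resets it
            self.wiped = False

        def snapshot_flash(self):
            return self.failed_attempts          # image the relevant flash state

        def restore_flash(self, image):
            self.failed_attempts = image         # write the saved image back

        def try_passcode(self, pin):
            if self.wiped:
                raise RuntimeError("device wiped")
            if pin == self.secret_pin:
                return True
            self.failed_attempts += 1
            self.wiped = self.failed_attempts >= 10
            return False

    def brute_force(phone):
        image = phone.snapshot_flash()           # mirror the flash once up front
        for attempt, digits in enumerate(product("0123456789", repeat=4)):
            if attempt and attempt % 9 == 0:     # stay under the 10-try limit
                phone.restore_flash(image)       # roll the counter back
            pin = "".join(digits)
            if phone.try_passcode(pin):
                return pin
        return None

    print(brute_force(SimulatedPhone("1478")))   # prints 1478, no wipe occurs

With only 10,000 possible 4-digit PINs, the limiting factor in such an attack would be how long each physical restore takes, not the guessing itself.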

Most probable outcome: Apple will keep on hardening iOS security and its own cloud security even more. In the end the common user wins.

The FBI is STILL the bad guy, acting like criminals who won't disclose a potential vulnerability (which may well not exist) to the manufacturer.


>The FBI is STILL the bad guy, acting like criminals who won't disclose a potential vulnerability (which may well not exist) to the manufacturer.

There is no obligation to disclose vulnerabilities. You are free to sell them to the highest bidder, including the government. The government has some damn good uses for them, see Stuxnet.

Of course, big tech companies would love for disclosure to be mandatory. That way they'd get the whole security community's labor for free and wouldn't have to pay bug bounties.


What's the objective behind Apple wanting to know the method then? It sounds like any of these could be possible, so it's not like there's a big mystery (especially given the Cellebrite Purchase Order found on some government website).


It could be something else. That’s the problem. Apple doesn’t know for sure and they would like to.


I think you're reading this completely wrong.

The FBI asked for access, Apple said no (because they knew the case was about precedent rather than capabilities). Apple knows the older phones had vulnerabilities (see this faux Apple commercial: https://youtu.be/zsjZ2r9Ygzw?t=15m50s).

This is the follow-up, with Apple saying "oh, you needed our help to crack it, huh? How did you suddenly find a way to do it on your own, without us, as soon as you realized public perception wasn't proceeding as you hoped?"

The EFF seems to think that the FBI is legally required to disclose the method (https://www.eff.org/deeplinks/2016/03/fbi-breaks-iphone-and-...) due to the government's Vulnerabilities Equities Process (VEP).


The VEP seems to be policy rather than legislation. Meaning that there is no legal obligation for the US Government to abide by it.

If anything, the question is whether the US Government is morally obliged to reveal the vulnerability, given that the risk of not doing so is much higher than the value the government gets from exploiting it as a tool against terrorism. That, I believe, is the EFF's strategy – getting public support and appealing to the government's moral obligation to protect its people.


So the same company that didn't want to help investigate somebody who killed 14 people because of privacy concerns believes the government has a moral obligation to help them debug their software / hardware platforms. Yup, that makes sense.


You seem to think holding both positions is incoherent, but I have no idea why.


> I think you're reading this completely wrong.

It may be the wrong opinion, but it's the popular opinion among many people I've spoken to. Namely, they think Apple has egg on its face and isn't as good at security as they claimed to be. Right and wrong don't always matter in the court of public opinion.


>Now, it's the FBI sending a giant "screw you" to Apple by not only letting them know they were able to hack into the phone without Apple's help

Everyone knew they could do this and they only tried to use this whole thing to get easy access to all phones from Apple.


...or the FBI didn't succeed in hacking the phone, and is just trying to justify why it won't proceed to trial.

By the way, since the FBI said this phone would allow them to arrest other terrorists... did we find anything relevant on the phone?


More likely they did manage to find a 3rd party that could unlock it and found nothing. This wasn't the suspect's personal phone (those were also recovered) but his work phone; the 6-week-old backups contained no useful information, the attack was planned more than 6 weeks in advance, and it is highly unlikely that he used his work phone to contact anyone involved in this.

They've probably spent six to seven figures on hiring a 3rd party for a bespoke service just to find what any analyst could've assumed - nothing.

That said, given the severity of the situation, even a 1-in-1,000 chance of getting some additional information would've been worth the king's ransom they must have had to pay to get it unlocked. In the end, think what you want of the FBI, but that's exactly what their duty should entail: forcing Apple to unlock it is a murky business, but investing resources in unlocking the phone themselves is perfectly within their rights, and even their obligation.

I don't understand why people are going apeshit about this. No one would bat an eye if the FBI picked a lock or broke into a safe, so how is an iPhone different?


> I don't understand why people are going apeshit about this. No one would bat an eye if the FBI picked a lock or broke into a safe, so how is an iPhone different?

I have no problem with the FBI breaking into one phone that they have both a warrant for and have in their physical possession. That is how it's supposed to work.

I did, however, have problems with several other issues around this case (none of these have really gone away, only been postponed for now):

1. The FBI potentially being able to force a company to assist and do work for an investigation, as well as betray the privacy of its users

2. The government moving, partially based on this case, to force vendors to create back doors for everything, weakening security and privacy for everyone

3. The slippery slope that all of this would create


> No one would bat an eye if the FBI picked a lock or broke into a safe, so how is an iPhone different?

Because it was not about that one iPhone, but about the planned precedent, with immense effects: using a court order to effectively "write" new law. The FBI started the whole thing with exactly that goal:

http://9to5mac.com/2016/02/26/fbi-apple-iphone-precedent-pol...


And? I've specifically stated that I don't think the FBI demanding that Apple either unlock the phone or reduce its level of security is appropriate; the FBI breaking into it themselves is perfectly fine.


If you were a lock or safe maker, wouldn't you want to know how people were able to crack your product so you could improve it in the next version? Or would you just keep trying to sell a version of the thing that people buy to keep their stuff safe with a known exploit? Good luck.


What lock do you have on your door? Something standard like this: http://goo.gl/cuIenh (image of a lock)? That can be opened easily in a matter of seconds, without damaging the lock, using at least 2 well-known techniques. Yet that is the lock installed on new doors on people's homes. They don't care; people don't want or need actual security, they just want to keep the idiot burglars out and be able to claim the value of their losses via insurance. Meanwhile, professionals will get in just fine via the front door without any trace of breaking and entering. And because the people don't care, the lock makers don't care. The ones who care the most are the professional burglars, who would like nothing more than for everyone to stay asleep and preferably for the techniques to be forgotten.

In short: lock makers continue to make exploitable locks, because they sell just fine. The moment they stop selling is the moment they'll fix the issues. Nobody is going to skip the next iPhone because the FBI allegedly managed to crack one; security is not a feature people care about (in general).


Where do you live, that new homes have locks like those? Because here new houses pretty much require class C anti-burglar doors (which roughly means they should withstand a break-in attempt for at least 10 minutes): they have non-standard keys and complicated locking mechanisms, the door frame is anchored into concrete, etc. Otherwise the cost of your insurance will go through the roof.


Where do you live? "Here" could mean South Africa, in which case you'd have a point. But it's also very much a special circumstance, as anyone from that part of the world would attest.


It's not that special. I live in Brooklyn, New York, and we're pretty keen on locks here too.


Where do you live such that those things are required to get a decent insurance rate? My insurance never mentioned a single one of those requirements.


I live in Sweden and I got a discount on my home insurance if my outside door is of a certain class (although apparently most reasonably modern outside doors you can buy cleared that bar so it can't have been that high).


Where do you live that your insurance company charges the same rate no matter how likely it is that they'll eventually need to pay out?


The locks I'm talking about are common throughout Western Europe.


The whole analogy is flawed anyway. If you can't unlock the door, just break a window. The stuff inside the house is the same in either case. We accept that there's a limit to the utility of a lock.


The amount of support Apple got in this whole affair would seem to indicate that security is a feature many people care about.


Sure, but you'd tuck your tail between your legs a bit after all the grandstanding.


I wouldn't consider lock makers a good example: on one hand they ask for feedback from locksmiths to improve their designs, and on the other they actually train licensed locksmiths in how to break locks.

The vast majority of safes also have a backup lock or code.


Then why did you use locks and safes as an example in the first place?


Locks and safes are a good example for my point; lock makers are a bad example for your argument, because they do cooperate openly with law enforcement.


Gosh...


There's a reasonable chance either way, but I'd assume you'll not hear about it for quite a while unless you happen to be an agent.


>And now, Apple is in panic mode, slowly realizing that they went from being the hero of privacy in the modern age to the company that wasn't able to secure its phone from the FBI. From the FBI. Of all things.

Some applications that I cracked (ahem, illegally) had a TOS or EULA with something like "you're not allowed to RE our app" or "if you find a vulnerability you must let us know about it right away". Doesn't Apple have something like that for their software? That would make it easier for them to legally force the FBI to reveal the method. Unless the FBI is lying, which could be the case, and they got the data from a different source, in which case iOS is still safe.


Such TOS/EULA terms are horseshit, and even if the FBI does nothing else right, they're right to ignore them.


Was there ever any proof that the FBI actually hacked the phone? There could be lots of not-exactly-exploits that could be applied to this situation and give the solution. For all we know, the company they hired said "you don't really lose anything by checking 1236 and 1478 first, so why not just do that?" and the first one worked.

Anything about successful / unsuccessful / advanced / trivial solution is just speculation at this point as far as I know. The only thing that FBI said is that they got the data they wanted - and that could be done in thousands of ways. And it doesn't even have to be something that can be prevented or fixed by Apple (what if they found a CCTV recording of the owner tapping the code in?)


You think Apple should have lied to the public?


I assume he means this would be the best strategy (if kept under wraps) for Apple's reputation; the public's interests notwithstanding. Regardless, grandstanding twice just looks foolish for everyone except the FBI.


You think this case makes the FBI look good? It took a month-long media circus, made Apple look like the good guys, and pissed off most of the tech community, all to unlock a single phone that likely has no useful data on it. If they had wanted to advertise their ability to unlock iPhones, they would have just done it and held a press conference about how they unlocked it and got lots of good information. Instead, they look like they were desperately scrambling before some "Canadian girlfriend" showed up at the last minute and saved them from looking even more foolish.

Sure, from Apple's side, it would have been better had it gone to court and been established that the government isn't allowed to do what it was trying to do, but... I don't see the FBI trying the same tactic again. And here's the kicker: if they can't establish the authority to force Apple to unlock phones without a Secure Enclave, then when they need a Secure Enclave-equipped phone unlocked, it's going to be a lot harder to use the All Writs Act to force Apple's hand.


> You think this case makes the FBI look good?

Right now, I'd say they are looking pretty good: they hacked into the phone of a terrorist without breaking any laws.


If only it were really about just that phone, and hadn't created a month-long media-circus power grab only to end with "uh.... nevermind."


Of course it makes them look bad to anyone who thinks their looking good looks bad.


If they had just hacked into the phone, and said "hey we hacked into the terrorist's phone", I think they'd look pretty good. They'd be competent, and did what they needed to do. But that's not what they were going for; they over-reached, and then weakly backed off. That's what doesn't look good.


I'm specifically talking about Apple asking for the hacking info. The FBI doesn't look like anything in particular (yet), because it's just the recipient of the request and AFAIK hasn't responded.


Why would they do it? Right?


>letting them know they were able to hack into the phone without Apple's help but at the same time, making a mockery of Apple's entire security claims.

What proof do you have that the FBI hacked the phone? The only piece of evidence we have is that the FBI withdrew the case. That could mean either 1) they were able to crack the phone (unsubstantiated, to my knowledge), or 2) they realized they had lost the debate regarding encryption and didn't want to risk setting a precedent.


> And now, Apple is in panic mode, slowly realizing that they went from being the hero of privacy in the modern age to the company that wasn't able to secure its phone from the FBI. From the FBI. Of all things.

> No wonder they are freaking out and they want to know how the FBI did it.

Not really sure how you reached that conclusion, as there wasn't even a single quote from an Apple employee or legal representative in the article. The only quote even related to the FBI giving the exploit method to Apple is from the product counsel at AVG Technologies.

So how exactly is Apple "freaking out"?


It's not working as intended for either party. They both come out looking bad.

Apple looks bad because they claimed that if they produced this software for the FBI, it would eventually get into the hands of international criminals and put their users at risk (and this was before the FBI tried compelling them, so it would not have set court precedent). So the FBI did an about-face and got someone else to come up with a method to achieve the same aim, so the end result for Apple users is the same, except Apple is in the dark as to what the vuln is. That's a position they'd rather not be in.

On the other hand, the FBI coming in and trying to compel Apple, after an initial rebuff, looks like it was unnecessary and makes them look less competent.

Now, given it took a third party to do this, the FBI can't very well disclose the third party's trade secret.

Lose-lose for both.

This should be titled "The iPhone was never as secure as we wanted to think it was"


> and claim that it's already fixed in the next version of the OS

And look like a lying fool when details eventually come out and reveal that it's a hardware security flaw, not something that can be patched in the OS.


Doesn't the federal government have a policy strongly encouraging reporting vulnerabilities discovered in computer systems?


Oh, you mean the password their agents set, AFTER the city unlocked the iPhone using the standard administration interface? The phone was a city phone, managed by the city, and had all the standard controls of a managed iPhone.

Well, ask the agent what they set and/or read their logs. This isn't rocket surgery.


The password that was reset was the iCloud password, which prevented the phone from automatically backing up until the new iCloud password was entered on the device. The unlock PIN was not changed.


So Apple mocked, FBI screwed. This round's winner: Cellebrite.



