iOS 11.4 to Disable USB Port After 7 Days: What It Means for Mobile Forensics (elcomsoft.com)
544 points by louis-paul on May 8, 2018 | 370 comments



I’m so unused to seeing a corporation act in the interests of its customers, explicitly counter to the wishes of law enforcement and the intelligence community, that I’m racking my brain trying to think of ulterior motives that explain why Apple might be doing this.

Either way, on the surface, I’m quite pleased by this development.


I think it's as simple as this: Apple's business is determined much more by end users than by government regulation, unlike telecoms or, increasingly, Google and Facebook. And this is amplified by Apple deciding that, since the others can't follow it, privacy is a good differentiator to invest in.


This is exactly the position I want from someone I need to trust. Supposed corporate altruism and any promise of "don't be evil" go out the window when enough money comes into play. However, if my interests and the monetary interests of the other party are aligned, then it compels the other party to behave in a way that helps my interests.


"Don't be evil" is a pretty low bar to start with, because it doesn't even give you any direction about what you should be instead, and "evil" can be interpreted in multiple ways.


You said it. It's so tepid.

In a world ruled by "chaotic evil", even "lawful neutral" is culpable.

"In order for evil to flourish, all that is required is for good people to do nothing."


I absolutely agree.

When you remove government actors from industry, except where absolutely needed (e.g., preventing monopoly) this becomes the default modus operandi.


My wife is Chinese and we go there quite often. Regulation of consumer goods and services is very light and easily circumvented, and as a result the consumer market is a cesspit of shoddy, dangerous products and deceptive practices. It's so bad that one of the most prized gifts you can take there is baby formula powder, because the local versions have frequently been found to be cut with dangerous chemicals. I’ve seen, and have family with experience of, a deregulated market, and it’s not at all pretty. Individual consumers just don’t have the resources to deal equally with big companies that have no reason to care about consumer interests, without the ability to exercise their collective power - which is what a representative government is.


I think what I’m hearing you say is that China’s government regulations are “light”... :)

China has 8 different agencies involved in food regulation.

A government that doesn’t allow its citizens freedom of speech is not “light” in anything.

The dangerousness of items produced in China is a symptom of the corruptibility that comes with a government monopoly on power, underscored by the fact that doing business in mainland China starts with payoffs to local government officials.

Meanwhile, in Florida, where people are free to package up food items for sale with no license of any kind and no commercial kitchen (up to a certain volume), I don’t hear about many cases of people receiving brain damage from lead poisoning after eating cookies from their local coffee shop.

People that don’t have a foot on their neck are typically not evil by default, because they don’t have to be in order to just survive. That’s why things just work here in the US.

People accustomed to oppressive control just don’t understand these things.


Totally off-topic, but I found "baby formula powder" to be endearingly amusing.


Poisoning babies with melamine is amusing?


No, the turn of phrase used to describe “baby milk formula” in powder form. “Baby formula powder” implies a kind of powder that produces a baby. Hence, the off-topic comment.


OK, I get it.

It is, in a way, a powder that produces babies. Larger babies, anyway.

And by the way, the legend about Gerber baby food in Africa is reportedly bullshit.[0]

0) https://www.snopes.com/fact-check/label-fable/


melamine?! Jesus, that's what my dinnerware is made out of.


That's the monomer, yes.

Simple desire for profit drove the addition of melamine to milk formula for babies: melamine cost less than milk powder, and it reacted like protein in the simple test that was commonly used.

Melamine is a (C-N)3 ring with an NH2 on each of the carbons, and all amino acids have C-NH2 on one end; thus the name. I gather that the test scored all C-NH2 moieties, so each melamine molecule looks like three amino acids, and contains less other stuff than amino acids do on average, making it a very efficient adulterant.
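Back-of-the-envelope arithmetic on this: crude-protein assays such as Kjeldahl estimate protein from total nitrogen, and melamine (C3H6N6) carries far more nitrogen per gram than real protein. A quick sketch using standard atomic masses (the comparison figures are illustrative, not from the comment above):

```python
# Why melamine defeats a nitrogen-based "protein" test:
# crude-protein assays estimate protein from total nitrogen,
# and melamine (C3H6N6) is far richer in nitrogen than real protein.

ATOMIC_MASS = {"C": 12.011, "H": 1.008, "N": 14.007}

def nitrogen_fraction(formula):
    """Mass fraction of nitrogen for a formula given as {element: count}."""
    total = sum(ATOMIC_MASS[el] * n for el, n in formula.items())
    return ATOMIC_MASS["N"] * formula.get("N", 0) / total

melamine = {"C": 3, "H": 6, "N": 6}
print(f"melamine: {nitrogen_fraction(melamine):.1%} nitrogen by mass")
# Typical food protein is only about 16% nitrogen (hence the Kjeldahl
# conversion factor of 6.25), so gram for gram melamine inflates the
# apparent protein content roughly fourfold.
```

Melamine comes out at roughly two-thirds nitrogen by mass, which is why a small amount of it makes watered-down milk pass the protein test.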


That is clearly an issue with the lack of political freedom, not just regulations.


Political freedom alone won't solve tainted milk in China. People don't have the time or energy to audit every food product they purchase; establishing minimum standards to prevent death or injury from harmful food is one of the least costly ways to deal with the root issue.


Political freedom can lead to electing officials who care about such issues and take actual measures to create and enforce regulations, instead of the current ones doing nothing or accepting bribes. A monopoly on politics is not optimal.


I don't see the FDA being disassembled anytime soon. Hopefully.


If only they had the political freedom to hold the government accountable for... what exactly, if not enforced regulation?


That's what I am saying. There can be no accountability of officials if you can't replace them, through due process, with people coming from several parties and not just one.


This is a pretty ludicrous statement. We wouldn't need any consumer protection laws, anti-fraud laws, or a really a majority of business law if the default modus operandi was pro-consumer. The reason Apple is different here is because their financial incentive aligns with the consumer. That is not the case with many businesses regardless of government involvement.


That's why we never had to do anything to address rampant fraud, cutting food with dangerous materials, straight up lying about what you're selling, con artists, snake oil salesmen, or irresponsible management of hazardous waste, right?


Not that I support the OP's absolutist statement, but courts could and do handle quite a few of those (fraud is handled almost entirely by the courts). Especially if the effort of the last century had been put into strengthening the law and property rights, instead of creating endless agencies, government economic power brokers, and pre-emptive hoops for companies to jump through, which encourage state-backed oligopolies to flourish at the expense of competition and any firm too small to afford a team of lawyers. Not to mention that measuring the efficacy and ROI of each individual agency involved in market intervention is largely absent once the agencies are in place.

Unless you're conflating 'removing government actors' with completely removing the justice system and law enforcement? Those are two things libertarians very much support being government responsibilities... Smaller government != no government.


Laws are regulations, the courts you’re praising didn’t pull the laws they enforced out of their asses. If you remove government agencies specialised in regulating specific domains, then that responsibility will just fall on general law enforcement or just go unenforced.

Of course there are costs to any system of regulation, but most consumer regulation is there to prevent companies doing things some of them absolutely did. Buses in London used to have “no spitting” signs. Now they don’t. Why the change?

In general you regulate because hard earned experience shows you have to, not because you just feel like it. If regulations become unnecessary, ok it’s time to revisit it, but managing this stuff is what we elect people for.


On what grounds? Caveat emptor was the rule for a long, long time. It was only later, when regulation and standards started coming into play, that this changed. The courts can't just declare a legal thing to be bad.


You can only be libertarian if a) you've never bothered to study history or b) it's a cover story for an ulterior motive.

The US was mostly a libertarian's paradise from 1850-1950. It didn't work. Federal agencies and government regulations were created because the courts were unable to adequately respond to ongoing problems. The proximate cause of death for libertarianism was the sequence of massive bank panics and depressions leading up to the final "Great Depression" in 1929, but there were many causes across wide-ranging areas of society.

To give just one example: The FDA was created (and later strengthened) in response to a long succession of disasters where well-established drug companies added known toxic (or lethal!) chemicals to their drugs, then placed them on the market without testing. Thousands of people were killed.

No legal decision can bring back the dead.

Most of these companies already had reputations to protect and judgements against them were expensive. Yet they continued to screw up royally.

We've tried libertarianism. It simply doesn't work. It has never worked. It will never work. It is always less efficient to force millions of consumers to extensively research every aspect of the products they buy, then seek redress in the courts after suffering injury. It will always be a net win to set basic safety standards (for established product categories) and force manufacturers to follow the standards.

If you're a rich oligarch who hates paying taxes then libertarianism is a convenient excuse to shrink government (regulations = expense) and/or foist as much of the tax burden onto others as possible.

Libertarianism is also attractive to people who have grown up in a sheltered society and so see regulations and standards as unnecessary restraints. They don't have any basis for comparison.

It's similar to anti-vaxxers: Vaccines were so successful that whole generations have grown up without watching their kids or friends die and be crippled by disease, so they don't value vaccines any more than they value oxygen in the air.

For that matter it is the same as the current Boomer generation's FYGM attitude: growing up in a post-war boom when the effective wage was ~$18/hr and college cost 1/4 as much of course it was easy to work part time while getting your degree. And with a growing population and society of course there will be plenty of jobs waiting for you. Like oxygen in the air or water in the sea such conditions are completely beneath their notice and thus later generations "must" be lazy moochers and "of course" they should just "work hard like I did".

Sometimes I think humans really are doomed. As a society every time we get a good thing going we completely forget the toil, blood, sweat, and tears required to get there.


> You can only be libertarian if a) you've never bothered to study history or b) it's a cover story for an ulterior motive.

While I somewhat agree, and don't consider myself a libertarian (I'm much more of a Thomas Sowell-style supply-side fiscal conservative and social liberal), I believe the smug, self-righteous tone, littered with broad absolute dismissals ("We've tried libertarianism. It simply doesn't work") throughout your post perfectly typifies the problem with US tribal politics, especially the left-leaning sort.

You could easily change the wording, throw in some similarly over-simplified examples, and say "We tried socialism and it completely failed" as a dismissal of modern big-government liberalism. Which would be ridiculous and unhelpful.

These endless, trite left-vs-right debates on the internet always seem to pigeonhole unique and complex historical moments (with distinct geography, historical circumstances, economic situations, cultural differences, broad incentives, birth rates, technological differences, etc.) into some ideal fantasy governments that never really existed, or even marginally fit the molds of the ideologies being questioned.

I mean... even scale is a huge difference-maker. I believe smaller countries' governments function far better (see: Canada, Scandinavia), as do "early-stage" countries in the growth phase after being up-ended. Applying some generic political-economic system broadly across every country (big or small, financially stable or not, old or young population, culturally distinct, etc.) is not as interesting or helpful as people seem to think.

Example: I love hearing people explain how "Iceland nationalized banks and look how great it worked," when Iceland has the total population of a small US city, roughly 0.1% the size of the US.

So it's entirely possible you're right. A purer form of libertarianism, which may have worked well in the past when the country was 5% its current size with immature industry, would likely be a disaster if imposed today. That doesn't mean it's not a good or superior model more fundamentally, as a guiding force when shaping current policy, or within the larger system in thousands of isolated situations (such as schooling, for example). Nor does it mean it wouldn't be ideal for a culturally or geographically distinct, smaller group of people, or for certain states within a heavily federated system.


A few days ago I was pondering what it would be like if we treated government like a software project. You can never address all the issues at once unless you're doing a rewrite (at which point the old project is effectively dead). You just have to refactor as you go. Practically, each section becomes organized in a way that more or less reflects the values and style of the author. Sometimes a codebase is able to maintain an overall style, but try as we might, you can't delete the programmer entirely.

So then I had an idea for a "single focus president". This would be someone who is entirely indifferent to everything except the one focus area they call out in their campaign, e.g. healthcare. It's not that progress wouldn't be made in other areas; it would just be entirely congressional and judicial. Once the president addresses their focus issue, they step down. There are probably anecdotes of how we've tried similar things and failed, but I know I would be open to considering a campaign on those kinds of grounds.


This argument can be used to justify any form of government, as long as some share of the extortions gets invested for the benefit of the extorted.

But I agree all of this is great if I only have to pay <$5K yearly but not so much otherwise. Not to mention having to emigrate to cancel "the services". That sucks too.


> The US was mostly a libertarian's paradise from [insert date range of the most prosperous human development in the US].

And the Great Depression was a direct result of government and bank collusion. The Federal Reserve tightened the money supply right after the market crashed. Take the federal government, which sanctioned the Federal Reserve, out of the picture, and you have a much smaller crash that weeds out all the idiots who fell for the securities fraud perpetrated by the Shenandoah Corporation, which precipitated the crash in the first place.

In fact, get rid of the Federal Reserve banking system that was created by the federal government in 1913, and you don’t have any of the major crashes of the ensuing 95 years.

Go one step further and remove the federal government’s control over the money supply in general, and you have a system of multiple currencies, all controlled by their constituent markets, many backed by silver and/or gold. And you don’t have a nationwide gold seizure by the federal government in 1933, whereupon our distributed sovereign wealth was gifted to internationalist bankers. You have instead a wealthy population descended from the colonists who conquered this country for us.

They don’t really teach this kind of stuff in state-funded “school”, now do they?


I think cryptocurrency is the best example of what happens in a wild-west, unregulated market -- lots of fraud and instability.


Yeah, and look what a failure cryptocurrency has become…

Combined cryptocurrency market capitalization is currently nearing 1% of the market capitalization of the entire planet.

IOW, great thesis.


I concur, and I would add that Apple is fundamentally different from the Facebooks and Googles in that Apple makes money off its hardware. They have a natural incentive to protect user data because it enhances the value of their products.


*Offer does not apply in China.


I wonder if it’s worth it to let corporations know when they do something we like.

They can’t do much about China. But they can get away with a lot in other countries.


They could choose not to sell in China


And who would benefit from that?


Like.. buying a product?


I'm not sure that's fine grained enough. Apple isn't perfect. But they do specific things that I like enough to keep buying Apple stuff.


This is why I am deeply skeptical of Apple's supposed security. As long as the company forfeits data to the Chinese government then there is no real data protection on their devices.

I need that to change, or where I see Apple's products heading is not going to be what I'm hoping. What I'm hoping: Apple devices become the computer. Totally secure. The only one that can access the information on it is myself with no exceptions. This device (watch, phone, glasses whatever) would carry every possible detail about myself and my life on it, making it the perfect form of digital identification. Accepted at hospitals, accepted as a drivers license, accepted at banks, accepted as a log in for websites. It doesn't need to provide identification if I don't want it to. But if I do, then there is a mathematical certainty that I am who I am claiming to be, and no one can claim otherwise.


So, you want to put that much power into Apple's hands? How do you expect them to remain on your side in that case?


That's why it's not going to happen with Apple's two faced position on privacy.

As for what I want Apple to be as a company, if their devices become the theoretical vault where I am the only one with the key, it wouldn't matter if they remain on my side or not. They fulfill the role of design and engineering, not data hoarder, and wouldn't be able to change that even if they wanted to. At least that is the image that they have been pushing recently.


From what I've read, enterprises certainly love iOS for security. And for better or worse, LEA provides well-reported test cases. So it's not that Apple wants to protect criminals, it's that criminals are canaries.

Indeed, the whole "going dark" vs privacy debate mostly misses the point.[0]

> The heightened tension produced by the introduction of encryption by default into an environment where terrorism has magnified the need for efficient law enforcement access (surveillance) supported by a newly-expanded CALEA framework is often framed as a contest between privacy and security. It is, however, more accurately framed as a security issue on both sides, one side which integrates traditional privacy concerns with the growing focus upon cybersecurity ... The cybersecurity and, incidentally, pro-privacy position rejects exceptional access as a dangerous fiction that would, among other things, create new attack surfaces, rendering networks more vulnerable to every form of predation, from financial crime and IP theft to cyber espionage, ultimately generating unacceptable risks to our national and economic security.

0) https://scholarship.law.unc.edu/cgi/viewcontent.cgi?article=...


Even simpler. Apple doesn't have "enterprise" products that it sells to governments. Therefore, they can't be forced to comply through any wallet-based incentive.

Having said all that... I'm not sure that the rights of criminals' privacy outweigh the rights of citizens to be protected by law enforcement. I wish there were a clear way for law enforcement to act on behalf of citizens without such risk of corruption/abuse.


One of the basic principles in the US justice system is the presumption of innocence. Therefore, Apple is actually protecting the rights of innocent people, until law enforcement can legally prove otherwise, at which point they usually don't need access to the phone anymore.

This is why it is impossible to sacrifice phone security for "criminals" while keeping the rest of us safe.


Yes, I was thinking about the Dells of the world, too. But it's bigger than that as, say, Dell hasn't sold any valuable consumer targets since the Venue smartphone in 2012. I suppose telecoms have enterprise products, but they're more interested in cooperating with the government where it doesn't cost them much (hey, consumers already hate them), in exchange for regulation/legislation that protects their business (hey, consumers have nowhere else to go now).

Valid point on how we let law enforcement do its job. The other aspect of Apple's success here is timing. We don't trust government or law enforcement as much as we used to, and a new balance has to be found, perhaps by lawmakers who have foresight of both technology and human rights.


I highly doubt it’s that simple. Most users don’t value these extra security measures let alone know about them. This leads me to believe that there’s a different reason why they’re upping their security more than other user oriented companies.


Apple has been clear from the beginning: they do not supplement revenue with personal-data sharing or ads. There is no hidden agenda here. Their major competition shares data with other companies and is very forthcoming with the government. Facebook, Google, Microsoft, and Samsung all sell access to their users' data. This is a winning strategy because Apple already charges a premium for its products; now more than ever, that added value is a win for the customer.

Just to be clear Apple does give information to the government, but they stopped short of unlocking devices for the government. I see this as a very big important precedent.


It could also be a product of their engineering teams and upper management. Most of the Apple development staff are in California and are highly technical. If a majority of the top dogs really believe in protecting privacy then it’s easy for them to get it done.


Isn't that the point, though? Touch ID was a major security feature (industry-changing even) that was added to the phone without changing the behavior of users. It's nearly a perfect scenario of adding an incredible amount of functionality for users without requiring them to do anything. Apple has always been about user security regardless of whether or not it was obvious to users.


Don't fool yourself. The NSA has a nice backdoor into iOS[0]. [0]: https://wikileaks.org/ciav7p1/


In no way does that article claim there is a backdoor. It claims the CIA and others have heavily invested in finding 0-day exploits to gain access to the iPhone; this news item is about Apple closing some of those.


I'm not engaged in international crime or fighting nation states, I just want cops to have to get a warrant and my lawyer to get a chance to argue with them in order to read my phone.


Here ya go: https://motherboard.vice.com/en_us/article/7xdxg9/fbi-hackin...

This is not the CIA hacking into Kim Jong-un's iPhone in North Korea; this is local cops using these advanced surveillance tools on US civilians.


Don't fool yourself. That's not possible with the current iOS configuration and the link you provided doesn't dispute that. If anything, Apple has closed any potential backdoors that could have been exploited.


I trust Wikileaks about as far as Assange ventures outside of the Ecuadorian embassy.


Don't they use Google Cloud for iCloud storage?


They use multiple types of cloud storage; the data is encrypted and Google isn’t processing it, just storing it.


A common misconception - parts of iCloud data are encrypted at rest, but a good chunk of it is not. They've indicated they want to get there at various points in the past, but unless I've missed an update it's not there currently.


The files are still encrypted at rest (using convergent encryption) to obscure their contents from the underlying storage service, but Apple holds the keys:

> Each file is broken into chunks and encrypted by iCloud using AES-128 and a key derived from each chunk’s contents that utilizes SHA-256. The keys and the file’s metadata are stored by Apple in the user’s iCloud account. The encrypted chunks of the file are stored, without any user-identifying information, using third-party storage services, such as S3 and Google Cloud Platform.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf (page 56)
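The content-derived ("convergent") key step quoted above can be sketched in a few lines. Note that deriving the AES-128 key by truncating SHA-256 is my assumption for illustration; the guide says the key is derived from the chunk's contents using SHA-256 but doesn't publish the exact derivation.

```python
import hashlib

def chunk_key(chunk: bytes) -> bytes:
    """Derive a 128-bit key from the chunk's own contents (SHA-256, truncated).

    Identical chunks always yield identical keys, so identical plaintext
    encrypts to identical ciphertext. That enables cross-user deduplication,
    but it also means whoever stores the keys (here, Apple) can decrypt."""
    return hashlib.sha256(chunk).digest()[:16]  # 16 bytes = AES-128 key size

a = chunk_key(b"the same photo chunk")
b = chunk_key(b"the same photo chunk")
c = chunk_key(b"a different chunk")
assert a == b            # content-derived: dedupable across users
assert a != c
assert len(a) == 16      # fits an AES-128 key
```

This is why "encrypted at rest" here protects against the third-party storage provider, not against Apple: the per-chunk keys live in the user's iCloud account, on Apple's side.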


>Apple holds the keys

Unless you live in China, in which case Apple and the Chinese government hold the keys.


We don't know this to be the case.

Unless you have some evidence to the contrary?


The Chinese government made Apple hand over control of iCloud infrastructure in China to a Chinese company. So those encryption keys stored in iCloud are now in the hands of a Chinese company subject to Chinese government control.

Not exactly an ideal arrangement, but it was likely that, or switching off iCloud in China, or pulling out of China completely, which, to be fair, Google actually did.


Again. There is no evidence that has been presented to date that indicates that hardware keys were given to the Chinese government.

We suspect it may have happened. But nobody actually knows.


Files encrypted at rest on Apple’s servers represent protection for Apple against external threats, not for the user.

These security schemes do not enhance the user’s privacy.

It’s cool that some companies are security-conscious enough to do this, but for the user’s privacy, remember: if it’s not end-to-end encrypted, it doesn’t matter for privacy, just for security, and those two notions are very different ;-)


According to this (https://support.apple.com/en-us/HT202303) everything is encrypted at rest on the server, except for mail.

Everything isn't end-to-end encrypted, is that what you are talking about?


Agreed. It's the main reason why iCloud isn't HIPAA compliant.


Using multiple storage providers makes iCloud non-HIPAA compliant? Or did you mean something else?


I believe so but that’s an implementation detail. They could switch to AWS or Azure.


They heavily use AWS, Google Cloud, and Azure.


Not anymore for Azure. Google Cloud replaced Azure for them.

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

Control (Command) + F and there's no mention of Azure anymore.


Looks like they use Google Cloud and AWS. I assume they’ve built their platform in a way that they can easily use many different providers.

https://www.theverge.com/2018/2/26/17053496/apple-google-clo...


No kidding! Everyone in my immediate circle has an iPhone, and I get the occasional friend with the “Android is better” line... but it’s things like this that make me remember Apple is a company that is very clearly in it for the money, which means for the most part they really need to deliver for the customer. Google, by contrast, is a company that tries to do everything it can to imply it isn’t about sales: it “gives away” this and that, it’s a “free and open” ecosystem, etc.

I’ll take upfront and honest, at the cost of more limited and expensive, any day.


Google is very clearly all about the money as well. It's just that they understand there is no money that will come from the "customer". Instead, it will all come from other corporations seeking to leverage the relationship Google has established with you. I think it is healthy to remember that companies aren't do-gooders no matter what it costs them. At the end of the day, all companies are trying to make money the best way possible. Frankly, I am glad that there is both Apple and Google. I think they both do well while making money.


No, it very clearly comes from the customer with Google.

However, with Google, the End User is almost never the customer.


The argument that the user isn’t the customer needs to die, please!

It’s an overly simplified statement and misleads any serious conversation on the related topics!

Thank you


Do you pay Google? No? Then you’re not their customer. Yes? Then you are. It isn’t a value judgement.


Still, saying you're not the customer but the product is wrong. If you watch TV, you're also not the product. I'd probably classify the TV audience as customers of the stations, even if the money is made with advertisement. But viewers are certainly not products.

Google isn't different from a TV station. You as a user are a customer and pay Google by consuming the ads they present you in various products.


To extend that argument: NPR listeners for example are also not the product just because they don't pay NPR (but instead its funded by taxes).


> NPR listeners for example are also not the product just because they don't pay NPR (but instead its funded by taxes).

NPR is mostly not funded (even indirectly) by taxes. It has both actual targeted advertising on its own digital platforms and “underwriting spots” from sponsors on its broadcasts, which, while regulated more tightly than traditional advertising, are (and have been acknowledged by NPR to be) a form of advertising by the sponsors, driven by the same factors and concerns that drive traditional advertising.


To be pedantic, even if you paid Google, they wouldn't treat you differently. A prime example is Google Apps for Your Domain customers...


Now that you mention it, I wonder whether the security thing plays into the reason why I need to remove and add apps I compile myself every n days? This limit is pretty much my only gripe against Apple at this point but if it is for privacy/security, I don't know what a good balance would be...

Sorry if I sound like a broken record [1]. I don't own an iOS device at the moment. I would love to get an iPad but it kind of sucks that I can't compile and run my own apps.

[1] Previously on HN https://news.ycombinator.com/item?id=16993157


That one is about platform control. It's one thing to build a platform that only executes signed code (that's good for security), but it's another thing to refuse to give users control over who the trust authority is for their system. Apple still maintains that under lock and key, effectively refusing to let you run your own software on your iPhone.


It's not that clear cut of a tradeoff. If Apple were to allow limitless sideloading of apps and that became the standard way that people installed apps on iOS devices, it would seriously limit their ability to keep the platform secure against malicious apps.


I'm not sure that's a valid concern given that Android has allowed that since its earliest days, and most people are content to just use the market.


But all you need is that one person to side load something malicious and everyone in their contacts may be compromised.


No no. I'm talking about keys to my phone alone. If I choose to trust my own certificate then that only impacts what software I can run on my phone. An app cannot "sideload itself onto all your contacts". This extends to the OS itself. Let me change the certificate governing the OS software. Only once you allow that do I truly own my phone.


2muchcoffeeman doesn’t say apps will migrate to other phones.

(S)he implies that your friend’s phone being compromised will leak data about you (email address, physical address, birth date, emails you sent, etc.).


If you give your friend that data, it's out of your hands. If you don't want your email address or birthdate to be known, don't give it to anyone.

I pretty much operate from the point of view that anything I send in an email or text is public, and temper what I write accordingly.


The problem here is social engineering. If someone says "click these buttons in your phone's settings, sideload our app, and you'll be able to do X" (where X is "pirate movies," "mine bitcoin," etc), a surprising number of people would follow the instructions, ignore any big red warnings, and end up with their device pwned. You'd need a way to make the certificate trust settings accessible only to those who know what they're doing.


Apple places their customers first and developers second. They build devices for humans, not the mutant lizard people from space who develop software.


I am a human, and I develop software. This is a moronic statement.


Gee, I was going for ironic, missed, and hit moronic. Close enough for horseshoes. :)

I've started referring to software developers (like myself) with intentional irony as "mutant lizard people" after hearing one too many times how a particular OS/tool/programming language needs to be designed "for humans" (i.e, n00bs with no prior exposure to it).


That would be a terrible idea. Isn’t that how JavaScript started?


Without that time limit, an entire black market of non-app-store software distribution could spring up, and Apple would not be able to protect little Bobby who wants an emulator from bad app actors.


The real limitation isn't "seven days", as that's not that annoying: it's that you can have at most three such apps. But frankly, that still doesn't provide much "protection" from bad apps. All it really does is prevent all of the apps on your phone from being pirated copies (capping that at three), which bounds the damage from adding this feature.


So? The entire promise of computers is programability. If you can't make it do what you want it to do, you might as well just get an old nokia.


This is simply not true. You’ve been lied to, and you don’t own an iOS device to check for yourself. I’m pasting something I shared in another thread here. I have no idea where this lie came from but it seems like someone is trying to keep people away from developing apps on iOS.

I’ve written a couple personal apps that sync with health kit data collected from my Apple Watch and nutrition apps to help me track things in a way that is useful for my fitness goals. In 2016-2017 I used these apps to put on 30 lbs and stay relatively lean but I’ve never wanted to publish them in the app store. I don’t pay for any developer program. Just need to download some (free) developer tools / SDK for XCode, build the source, and put it on my phone. The install is tethered, but afterwards it stays there until I delete it. Restarting the phone doesn’t delete the app either. Only time I needed to re-install was when my 5s crapped out and I had to get a new phone.


You can compile and run your own apps though. You would need an apple developer account and pay the fee to be able to side load your own apps. There is a process that allows you to do it if you're serious about it.

Edit: Disregard my comment. I clicked the link and understand the context of your comment now. Still, if you want to code and deploy your own apps you can do it but there are conditions.


> You would need an apple developer account and pay the fee to be able to side load your own apps

You don't need to pay to deploy to your own devices these days.


They're both doing the same thing. Just in Google's case, you and I aren't the customer. We're the user. Not the same thing.


Ok, but, if my buying choices are going to promote one company over another - I think I’ll choose to promote the one I’m the customer of.


Kind of a funny comment, given the sibling comment noting that this is Apple catching up to a long-existing Android security feature: file transfer over USB requires first unlocking the device.


No, iOS has done that for years, since like iOS 2 or 3 maybe. This is different - it locks itself down completely, even to channels that have previously been granted access, if it hasn't been unlocked for 7 days


My Android phone (and most recent Android phones I know) completely locks USB data, including from devices that had previously been granted access. You have to unlock, touch a notification and enable some file transfer option instead of "charging only". No 7 days limit, the access is denied immediately once it disconnects.


Apple is awful at providing a reasonable spread of devices though. There's no $50 iphone that does everything I need, I'd have to spend at least ten times that. Then I'd have to pay MORE to be able to develop and run my own programs?! At that point, why even have a microprocessor rather than using a simple fixed function ASIC?

Would I be happier if I could get a phone with the equivalent of BIOS and nothing else and vet and install my own software? You bet. If nothing else, I'd not have to install all those Play ____ apps I never use. But given the choices available, Apple is a bad one.


$50 is a pretty amazing price point for something as flexible as a smartphone. However, you can get a perfectly usable used 5S for close to that range. Spend $15-40 extra and you can outfit it with new glass, screen, and battery, and it will feel like new, and probably perform at least as well as a brand new $50 Android.

You can get a free developer account that lets you develop and run your own programs; this has been possible for a couple of years now.

I don’t know what you’re getting on about with ASICs, but I guess you may just need to accept being in a niche market segment.


No one would buy or make your low margin low cost phone, that’s why it doesn’t exist.

Almost like Apple has probably evaluated the business case of an open ecosystem low margin low cost phone, and then immediately went back to printing money.


The $50 price point being deliberately chosen seems to be a reference to Android Go, whose USP is that the max price of the phone, new, no subsidy, is $50. (Though in the US, ZTE actually sells them for $80.)

Of course, they're miserable little devices, but they are technically smartphones that technically run android.


and then immediately went back to printing money.

I love the phrase "printing money". It's time proven. But it just occurred to me that it doesn't do Apple justice.

Apple's free cash flow for the trailing 12 months was $44.6 Billion. That's 446 million $100 banknotes. How many printing presses does it take to print that many bills? How many sheets of cotton/linen paper?

Apple's financial numbers are astounding.


> I’m so unused to seeing a corporation act in the interests of their customers explicitly counter to the wishes of law enforcement and the intelligence community that I’m racking my brain trying to think of ulterior motives that explain why Apple might have this.

Apple started to assist the Chinese government and used their walled garden to ban all their users from using secure communications and VPNs. As soon as actual dollars are on the line, Apple, just like other corporations, chooses dollars. I'd be wary of trusting them too much.


But dollars are always on the line, and that’s why they’re protecting customers outside of China and complying with the authoritarian Chinese gov in China. Most countries don’t have draconian measures like China does to make sure that companies comply with their every demand.


Other "evil" companies like Google instead refused to do business with such regime. Apple actively helps the regime by banning VPN apps and preventing Chinese people from installing them from 3rd party sources.


I am not quite sure the cases are analogous: Google is a website, Apple sells physical objects. When Google "refused to do business" with China, they doubtless knew that users would instead use a VPN and access some other national version of Google. Meanwhile, if Apple doesn't do business in China, it's not like people in China would be getting genuine smuggled iPhones that work normally. They're either going to use other platforms or use untrustworthy knock-off iPhones, and all of those probably comply with government regulations too.

Still, your point above stands: a company as large as Apple or Google has too many individuals with their own personal interests for "trustworthy" to really be a meaningful term. I trust Apple mobile phones much more than Google mobile phones, but, for instance, I trust Google laptops much more than I trust Apple laptops. I am sure there are good people working on all four products, but for some reason Apple's mobile phone division and Google's laptop division have consistently put out better products.


When Google "refused to do business" with China, they doubtless knew that users would instead use a VPN and access some other national version of Google.

From the same vein that "we're not Google's customers", neither are those Chinese users using a VPN. By leaving China, they're cut off from Chinese businesses who could pay to advertise.


Why the inverse trust in companies regarding laptops?


Chromebooks have the same sort of solid design principles that iOS devices do - hardware-based boot attestation and isolated cryptographic coprocessor, an entire OS design that lends itself towards sandboxing, a team that clearly cares about pushing the state-of-the-art in sandboxing forwards, security support for many years, etc. And both products have consistently evidenced solid security design for several years.

Apple laptops and Android phones are much weaker on all of these fronts.

I don't have a good explanation for why this is so. I think there may not be a good one, other than that complex engineering organizations are very random systems with tons of inputs and changing engineering culture is very hard, and people early in both the Chromebook and iOS projects were able to set and maintain the right culture, and people in the Android and Mac projects (despite being skilled people who care about security!) weren't able to pull it off, essentially by random chance.


Google only "refused" to do business in China because it was a minuscule part of their revenue.


My point exactly. Apple forgot about their privacy values the first second dollars were in danger.


That is Apple complying with local laws - which is what I expect every company to do.


Then you should read an older post by Elcomsoft (1), about Apple consciously degrading several layers of security in iOS 11 (compared to iOS 10), essentially down to a PIN code. Currently, if you know the PIN and have the iPhone, you can extract everything out of it, out of its backups, and out of iCloud.

(1) https://blog.elcomsoft.com/2017/11/ios-11-horror-story-the-r...


I read the blog post; it shows that if you give up your device's passcode then you give up access to essentially everything on that device. It would be nice to have additional layers of security, but do you think this was done as an anti-user move to support law enforcement? I can't quite see that being the case.


GrayKey should have anyone security-conscious ensuring that they and all the people they care about are using custom alphanumeric passwords for their iOS devices.

Even "999999!" would be hard to guess if the domain space is unbounded.


I recently switched to a 15-character alpha-numeric/special characters passcode after reading an article by a security researcher.

A snippet from that article:

iOS estimated passcode cracking times (assumes random decimal passcode + an exploit that breaks SEP throttling):

4 digits: ~13 min worst (~6.5 min avg)
6 digits: ~22.2 hrs worst (~11.1 hrs avg)
8 digits: ~92.5 days worst (~46 days avg)
10 digits: ~9259 days worst (~4629 days avg)



These are easy to calculate, the iOS whitepaper[0] specifies that it uses a PBKDF2 iteration count tuned for 80ms.

The passcode is 'entangled' with a per-device 'UID' that only exists in silicon, not accessible by any firmware.

It seems that the current GrayKey attacks are closer to ~1s/guess.

My last post on the topic: https://news.ycombinator.com/item?id=16833802

[0] Page 15 https://www.apple.com/business/docs/iOS_Security_Guide.pdf
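The quoted figures fall straight out of the arithmetic. A minimal sketch, assuming the ~80ms-per-guess PBKDF2 tuning from the whitepaper and that SEP throttling is bypassed:

```python
# Reproduce the quoted brute-force estimates: 10**digits candidates
# for a random decimal passcode, at ~80ms per guess (the PBKDF2
# tuning from Apple's iOS Security Guide), no SEP throttling.
SECONDS_PER_GUESS = 0.08

def worst_case_seconds(digits: int) -> float:
    return 10 ** digits * SECONDS_PER_GUESS

print(f"4 digits: {worst_case_seconds(4) / 60:.1f} min worst")        # 13.3 min
print(f"6 digits: {worst_case_seconds(6) / 3600:.1f} hrs worst")      # 22.2 hrs
print(f"8 digits: {worst_case_seconds(8) / 86400:.1f} days worst")    # 92.6 days
print(f"10 digits: {worst_case_seconds(10) / 86400:.0f} days worst")  # 9259 days
```

At the ~1s/guess reportedly achieved by GrayKey, the same math puts a 6-digit passcode at about 11.6 days worst case.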


What is SEP throttling?


I believe SEP = Secure Enclave Processor - iOS has it throttle passcode input requests. Visibly this results in messages like "iPad is Disabled. Try again in 5 minutes".

I'm not sure how GrayKey bypasses this...


Ah OK, sure. Interesting that the rate limiting is done at the hardware layer and not the OS.


The type of login can be changed though, to anything you want; e.g. it can be a long text passphrase.


The point is - it is now a single point of failure and it is probably not very feasible to expect to enter long password every time you unlock the phone. And no, biometrics is not a replacement. Biometrics is a login, not a password. You can't change your compromised biometrics unlike passwords.


But you can disable the biometric logon automatically. Either screwing it up too many times (like touching the wrong finger) or by quick-tapping the power button a number of times - upon which it falls back to the stronger password.

If you're in a position where you know bad guys with guns are coming, you can just power the phone down.

If you know they're coming but don't have much time, or you're unsure, you just thwack the power button a few times.

If you don't know they're coming, you're kind of screwed, but then again you would be anyways.


Can we stop making the iPhone unlock scenarios involve guns? That's a super atypical case.

Snoopy spouse seems like a far more believable user of forced biometric unlocking (e.g. capturing your fingerprint for TouchID while you sleep).


If the bad guys with guns are coming and they want to decrypt your phone, they can beat it out of you - rubber hose decryption is remarkably effective.


What if we could?


"We're better at privacy" is a fairly obvious way for Apple to differentiate itself, especially when the business model of many of its competitors kind of depend on not being very good at privacy (contextual ads and so on). This fits that brand fairly well.


They are not counter to law enforcement. They are pro-user: their device works for the user, not for advertisers, law enforcement, or other external actors.


Android has literally had this feature for ages. By default plugging your device in puts it in "charging only" mode and you have to tap a notification and explicitly select MTP or PTP mode before it even attempts to talk to the computer.


That is not the same as disabling the entire USB data stack. iPhones have been charging by default, and prompting to sync, since time immemorial.

This is a new feature, that logically disables the USB port after 7 days without unlocking.


On Android no data connection is made; when a connection on the data pins is detected, one is offered, but as far as I can tell no device is even detected by the OS until after picking a mode - nothing to attack.

Let me know if I'm missing something, but I get no change in Device Manager/dmesg when plugging my phone in, indicating no data connection to me. It would appear the entire data connection is disabled until a mode is picked.


This is the normal behavior in iOS, and has been for years.

What Apple is doing is _additionally_ disabling the USB port on an even lower level. Currently, the port could still exchange data if it were tricked or hacked, but disabling the port on a controller level will prevent accessing the device entirely.

Or at least that's the theory. Since it can obviously be reconnected after entering a passcode, there are conceivably ways to get it to open up. But that will have to be tested.


Correct me if I'm wrong as I don't have any recent experience with them, but don't iOS devices expose an authentication interface even without unlocked interaction from the user?

Apple's own guide doesn't seem to indicate any form of interaction is needed to enable that interface. That interface is what's attacked by devices like GreyKey if I'm not mistaken. Android devices when not manually unlocked and toggled present no such interface.


Correct, unless you enable adb in developer mode.


That's different. iOS has the same "Trust this computer?" prompt before it attempts to pair. But in both cases the phone still has a data connection, which leaves it vulnerable to any kind of security compromise (such as GreyKey), plus if the computer has a "lockdown record" it gets to skip that "Trust this computer?" prompt anyway (a lockdown record is a thing a computer gets after pairing with the phone that lets the computer prove it's already trusted to talk to the device).

But what this article is talking about is after 7 days of not being unlocked, iOS 11.4 won't even enable the data channel on USB, which means computers with lockdown records still can't talk to it, and presumably devices like GreyKey can't compromise the device.


On Android no such pairing system exists (except for USB debugging); you have to explicitly allow it every time you want to mount the device, and no device shows up whatsoever - the USB data connection is disabled until you pick a choice other than charging. You must unlock the phone and explicitly enable the connection. I'd argue that's still better than the iOS implementation, where an authentication interface is still available for 7 days even when locked.


Correct for Android unless you went out of your way to enable adb in developer mode.


I assume that Apple is seeing some traction in their latest push to be the consumer privacy company and are putting more wood behind that arrow.


Well, the iPhone has always been a locked down device. Every iteration of it has plugged up holes where access can get around safeguards.

1. It protects the store monopoly by guaranteeing the device will only ever load signed code from Apple. This is a huge financial incentive for Apple to invest engineering time into it.

2. Tim Cook has successfully recast the lockdown as a user privacy issue and is winning in the court of public opinion on the matter. Again, another huge financial incentive as it is something Android cannot deliver on.

3. There are real consequences to enabling the wishes of US law enforcement, and I firmly believe the executives at Apple do not want to open that Pandora's box. Our government may truly want to solve real crimes, and this may frustrate them, but there are governments who very concretely want that same power to lock up opposition voices and conduct witch hunts.


In most circumstances it's not clear to me that a >48 hour timeout is really a major impediment to law enforcement in a major metro. Of course, if they arrest like 10,000 people on the same day from a major protest or something, it could create a capacity problem, but otherwise it seems like more of an irritant - a new time constraint - than a truly effective countermeasure to vulns in their LI/USB stack.

I would be much more impressed if they allowed users to wholesale disable all non-charging functionality.


The reason why this is annoying for law enforcement is because it effectively shuts down many forms of brute-force password cracking attacks, which take time to perform.


Because Tim Cook strongly cares about privacy. Furthermore, privacy can be a competitive advantage. It’s an extremely smart move to double down on protecting users given what we’ve seen from the Facebook-types.


Apple's direction on privacy and using their premium brand image to look after their customers first is refreshing.

I just wish there was some way they could also play fair with their tax responsibilities. Rather than having the EU force their hand and showing the Irish deal to be illegal for example.

Granted the governments that enter into such deals (or refuse to fix the loop holes) are also part of the problem.

Considering the pile of cash they sit on if anyone can set a good example it's Apple.


It makes them seem unlike Microsoft Windows and Google Analytics and win points with their customer base.


To kill the competition? Not saying that that is the case, only that it may be a possible motive


it is NOT about law enforcement.

Law enforcement could get this before 1 week (NSA) or force the company to add a backdoor (FBI)

This is more likely about reducing the market for stolen phones.


google does this ALL the time. you just don’t read about it in the press. for example, they encrypt the inter-data center links. for which they own the fiber. they do all kinds of stuff like that and are always doing new things. of course they don’t go 100% and encrypt email or docs but still they go to quite far lengths as long as it is within scope of their business model. (same as apple)

other companies do this kind of thing also. again, they just aren’t in the news cycle.

then there’s yahoo.


Google covering their _own_ ass and encrypting their internal data at-rest and over-the-wire has nothing to do with their approach to mobile device security. Any company worth their salt will encrypt traffic that transits any untrusted network after Snowden's revelations.


> google does this ALL the time. you just don’t read about it in the press. for example, they encrypt the inter-data center links

This has been well covered in the press as part of the NSA story. To the extent that the belief Apple getting more coverage is true, it’s because most people know tech companies protect themselves but Apple’s moves affect millions of people who otherwise would not have the resources to do so.


This seems like it’ll just make police departments go to a judge more often alleging probable cause immediately, and judges might be more inclined to grant warrants given the time pressure. Paradoxically, it might end up with more phones being opportunistically subject to warrants by the police, as the justice system would be given less time for due consideration. A “ticking bomb” tends to produce anti-civil-liberty behavior in the authorities.

They should have a setting to disable it almost immediately.

I almost never use the data connection on my iPhone's USB except for headphones - yet another downside of losing the audio jack :)


I'd rather have this playing out in courts proceedings which are part of public record, than entirely within the US's notoriously unaccountable police systems.


>part of public record

>entirely within the US

Court documents are only public in the US if the government wants them public.


IANAL, but I believe that the court documents are public by default, unless the judge decides that there is a good reason not to disclose them.


If you're really concerned about people exfiltrating data off your phone via usb, fill the lightning port with epoxy and just use wireless charging and bluetooth headphones.


A determined individual can remove epoxy without damaging the electronics.


Apple should allow you to do this in software. Make the lockdown mentioned in the link happen immediately when USB is disconnected, forcing you to retype the PIN every time.

Could be a cool feature for the privacy concerned crowd.


Unless I'm mistaken, this is the way that Android already works. If you plug in a locked phone, you must unlock it to get any USB data connection to the phone. When a phone with a data connection is unplugged, the permissions immediately reset and you have to unlock the phone again to restart a data connection.

Not only that, on my Nexus 5x, I have to manually switch from charging mode to data transfer mode every time (the setting doesn't stick).

Unless I've misunderstood the significance of this change, Android actually provides much more security since you can't download files off a locked phone, not even within 7 days.


Yes, the iPhone is the same. You get a ‘Trust this computer?’ prompt which forces you to unlock the phone first.

The trouble is, this is still a data path that has been opened via USB. This new change disables the data path altogether and just allows charging via the USB power pins only - like it's a USB battery or USB fan, simply power and nothing else. Which is pretty neat.


Why doesn't the iPhone disable the data transfer pins until unlocked?


Probably because, for any decent charging speed (beyond 500mA), you need to talk to the other side. At least per the standard.


How does Android work around this?


It communicates power needs but nothing else until the user unlocks the phone and enables data transfer.


Because then you can’t plug in the device and have it sync automatically. It’s a security/convenience trade off.

By the way they don’t use the same pins as USB of course because they have their own connectors. And disabling access to the audio dongle while the device is locked wouldn’t make sense.


Yes. I would like customized timeouts. 7 days is too long.


I think they could probably just open the case and wire a new port anyway.


I would love to know their strategy. I can barely remove lint from the lightning port without damaging the electronics!


Acetone softens many plastics nicely


Ahhhhhh interesting! I kind of want to try it on the stuff I've got kicking around here, just to see...


A hair dryer would likely work.


We might be talking about different types of epoxy... I have two part "high temperature" putty epoxy at home that is stopping a loose bracket from rattling on my furnace. Significantly hotter than a hair dryer in that spot. It was $5 at Home Depot.


There's also test pads to wire up to and/or build a new port.


To those considering doing this, please keep in mind that if you do this, you will not be able to "unbrick" or receive authorized repairs for your phone. That's probably exactly what's desired, but as "wireless charging" models are nearly a thousand dollars, epoxyer beware.


Or, you could make your device supervised via MDM or Apple Device Configurator, and disable "Allow devices to pair with other computers". This will generate a keypair on the machine running Device Configurator + the iPhone, and no other USB data connection will be allowed - you won't get prompted for "Trust" at all (except for CarPlay etc.).

On top of that, since iOS 11 (or maybe 10), trusting a USB device requires the passcode by default, so even having your phone ripped out of your hands unlocked does not lead to the ability to take a USB backup directly anymore.


> Or, you could make your device supervised via MDM or Apple Device Configurator, ...

I haven't looked into this so I apologize if it's an easily answered question but... is this (MDM/ADC) something an individual can do for their own devices? I know it can be done at the organizational level for, e.g., employer-owned devices, but how hard is this for Joe Blow to set up for, say, his kids' iPads and iPhones?


Yes. This is a very old article I wrote back in the iOS 9 days; it still mostly applies.

A few warnings:

1 - Changing an iOS device to "Supervised" will wipe it (so you can't supervise someone else's device and data).

2 - Backing up iOS devices over USB afterwards requires that you keep the keypair, and makes restoring to new devices much more complicated. I would not recommend relying on USB backups if using this method unless you play around and understand it well.

3 - Be sure you set the profile to be removable only by wiping the device, or using Device Configurator, otherwise, the profile could be deleted to enable USB.

4 - One could wipe the keys completely to be sure that NO machine has USB access to the phone, meaning it would need to be restored to factory defaults for USB to work again.

https://blog.rapid7.com/2015/11/26/reduced-annoyances-and-in...


Configurator is awesome, especially for VPN profiles. But do note that many of its features can only deploy to managed devices under an Apple enterprise cert.


The iPhone 8 starts at $700, the same price the iPhone has been for a long time.


Thanks for the correction. I can’t afford $700 any more than $1000, so I left my wording intact. Note that the exact price point is entirely irrelevant until it drops to “throwaway”, which $700 is not.


iPhone 7 started at $650


iPhone 7 doesn't have wireless charging.


They’re right. I guess Apple upped the base price by $50.


iPhone 8 starts at $950 in Europe. Other phones (e.g. first-tier Chinese brands) don't suffer the same insane markup. The 8 Plus is $1075 and the X is $1365. All for base-memory models.


I think it's articles like this that disprove the "insane markup" comments. There is a differentiator that justifies the increased cost. Increased privacy is not more expensive; decreased privacy is just cheaper.


You usually don’t even need that. 99% of the time someone posting flame bait like that will be comparing unlike things - slower CPUs, worse displays, etc. Equivalent hardware tends to be very close because markets usually do work efficiently.


I'm comparing the lowest models that meet my needs. Why should I have to pretend there's not cheaper, less powerful handsets that can fulfill my needs just to make Apple look better?

(Granted, my strict needs could be met with one of TI's sub-$0.50 microcontrollers and a seven-segment display since they're "make phone calls" and "generate two-factor authentication codes")


Because it’s misleading, just as it would be to say that a restaurant is overpriced because you wanted a takeout burrito.

If all you need is enough to run a TOTP app, you could get a dumb phone with a $10 token but you could also make an honest comparison with the $350 iPhone SE or $450 iPhone 6S rather than comparing the iPhone X/8 to devices which weren’t intended to compete with current generation flagship models.


Chinese brands aren't known from protecting your data. Usually the opposite.

Android handset makers just manufacture the handset. Google provides the OS, and the underlying services for the Android ecosystem.

Apple bears both the cost of handset manufacturing, and the entirety of costs for the smartphone's ecosystem. That cost is reflected in the price, making the markup less "insane."

The Chinese handset makers in question may provide their own software, but it's typically considered "bloatware" and nowhere near the quality of software that Google provides.


That only works if you have an iPhone X, and why should I destroy a phone permanently for this when better software design and hardening could fix it?


Or 8.


   police departments go to a judge more often
I can't see how that is a bad thing overall.


With a hard deadline looming the police might be able to use the exigent circumstances exception to search warrants.


They still need to have an available exploit. Now, they don’t have weeks or months to find one.


Google's Cached version if you're having issues accessing it:

http://webcache.googleusercontent.com/search?q=cache:https:/...


Your link isn't working either. Here is the text-only version: http://webcache.googleusercontent.com/search?q=cache:https:/...


Is it just me or is the google cache link timing out as well?


Click "text-only version" in the Google cache header.


It's happening to me as well.


One test I would carry out - well within the remit of both geeks and law enforcement - would be a femtocell/base station sending a time update (which mobiles accept blindly if you let them), forever keeping connected devices in a Groundhog Day loop.

That would certainly be the go-to test for many, I suspect - a tried and tested hack from days of old, brought into modern times.


They could just as easily use time since boot, which is usually a separate counter.


They could do a lot of things. The question is if they do.


One would hope they would have a better solution than DateTime.Now - dateOfLastUnlock.
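A toy sketch of the distinction being drawn here (the names and structure are entirely made up, not Apple's actual implementation): basing the countdown on a monotonic clock, which only moves forward, means a spoofed wall-clock time update from a rogue base station can't rewind it.

```python
import time

USB_LOCKOUT_SECONDS = 7 * 24 * 3600  # the 7-day window

class UsbLockout:
    """Hypothetical lockout timer. Uses a monotonic clock, which
    only moves forward, so setting the wall clock back (e.g. via a
    rogue base station's time update) can't reset the countdown."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock              # injectable for testing
        self._last_unlock = self._clock()

    def record_unlock(self):
        self._last_unlock = self._clock()

    def usb_data_allowed(self):
        return self._clock() - self._last_unlock < USB_LOCKOUT_SECONDS
```

With a fake clock injected, you can check that only monotonic elapsed time matters: advancing it past 7 days denies USB data, and recording an unlock re-enables it.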


A step in the right direction, but I'd like to see this interval reduced (12 hours? 1 hour?) or eliminated entirely (I should have the option to require an unlock before any connection is established). There's no reasonable use case where I would want to make a connection while not wanting to unlock the phone.


the quote from apple specifically mentions USB _accessories_. i would guess they have things like headphones in mind, and it's reasonable to expect that if i have music playing with the screen locked and i connect a pair of headphones, the playback would switch to the headphones.

7 days still seems much higher than necessary.


As someone who only uses wireless accessories (including a wireless charger), I’d love to be able to configure this interval or require it immediately.


How is this different from how Android works, where you have to unlock your phone and explicitly tell it to connect every time you want to use a USB connection for anything other than charging?


It already does that, but now after 7 days it won’t even open a data connection to check if the computer is trusted.

This also removes the last obvious channel for exploits. Sometimes there have been jailbreaks/unlocks which just need physical access to the lightning port, but those wouldn’t work after 7 days now.


Not true. Lightning headphones work without any authorization at all.


Just because you don't have to explicitly give your lightning headphones permission to receive audio from your phone does not mean that there isn't some sort of authorization going on.

In iOS 11.4, if you leave a phone locked for more than 7 days, you would need the unlock PIN in order to connect any headphones.

Obviously, Apple would recognize the device as an audio device, and it would be automatically added, but the connection wouldn't even be established until you entered your PIN.


The Apple driver creates some sort of cryptographic keypair between the device and the computer to "remember" that you accepted the connection and not prompt you to connect in the future.

If this file was recovered from a victim's computer, forensic software can sue it to bypass the prompt. According to the article, however there is now also a 7-day expiration on these cryptographic keypairs.


> If this file was recovered from a victim's computer, forensic software can sue it to bypass the prompt.

That typo gave me a good laugh


Not quite - after 7 days, the phone has to be PIN unlocked to reenable the existing keypairs.


Ah. This would make more sense and clear up some of my confusion. So far as I know, Android doesn't have this dubious feature. You have to manually give permission every time. So this change is essentially setting a time limit for anyone to recover your keys.


iOS already has the "Trust this computer?" dialog[1], but this sounds like it disables the port to the point where the dialog wouldn't even appear and no connection could get through, trusted or not.

[1] https://support.apple.com/en-us/HT202778


(The elcomsoft article is down for me, so I don't know the details but) I think it's to do with DFU ("Definitely Fucked-up") mode in iPhones that lets you install a new OS version when the main iOS won't boot-up successfully.


it's probably not talking about DFU mode

>To improve security, for a locked iOS device to communicate with USB accessories you must connect an accessory via lightning connector to the device while unlocked – or enter your device passcode while connected – at least once a week

"accessories" makes me think it's a "regular" connection, not the recovery one.


Do you know if Lightning headphones count? Aside from charging, it's the only thing I plug into my Lightning port.


If you're actively using your device, this wouldn't apply anyway. You unlock your device far more often than once a week.


Nope. What changes is that answering yes to the "Trust this device?" prompt will no longer make the iPhone trust that computer forever - only for seven days, unless you unlock the iPhone again.


Blog seems to be hugged to death, so I might be uninformed.

What exactly happens after the 7 days? My girlfriend's iPad got blocked on vacation (a bluetooth keyboard in a bag causing random inputs). To get it fixed, we needed to connect to a computer. Would this mean that if you don't get to a computer within 7 days it would essentially be bricked?


If the device is not unlocked for 7 days, the USB port stops working (for anything but charging.) It won't brick the device, which can still be unlocked any time, it just prevents someone from taking more than 7 days to hack open your phone, via the USB.

edit: so in your case, perhaps yes, you would have been hooped.


I would assume DFU mode would still work to reinstall the OS.



Maybe I missed this in the article, but does anyone know if this feature can be turned off? Or if it's enabled by default?

What happens in the scenario of a consumer having an old iOS device sitting around, they forget the passcode, but now can't reset it using iTunes?


As far as I understand, there's still the hardware key combination to put the device into DFU mode. In that mode it can still be connected to and a new firmware can be written, but no access to the data is possible.

So in order to un-brick an old device sitting around, you put it into DFU mode (the key combination varies from device to device) and restore it that way.

Of course you don't ever get your data back, but that's totally the expected behaviour.


Do you have any source for that information? I am very interested in that as well but couldn't find any information regarding that so far.

I have always thought (though without any source) that they re-flashed the iPhone by putting it into the DFU mode (and tricking the iPhone bootloader into accepting their key) and then just brute force the key.


There is no scenario which allows re-flashing a device from DFU while retaining user data. This only appears to work in typical user scenarios because iCloud or iTunes creates a backup from the unencrypted device as a first step before flashing it.


That makes sense that DFU would still work, since that would let the data continue to be protected and wiped.


The article mentions the ability to disable this feature on "managed phones" (presumably through an Apple configuration profile).

To answer your scenario though, you can always reboot a "locked-down" iPhone into Recovery mode or DFU mode and wipe the device without being able to recover the data.


I’m guessing you would have to restore the device.


Glad to see Apple is at least trying to protect their users.


Sounds like they just patched a glaring security hole, late in the game? I'm not even sure how useful the "7 days" thing is - it's certainly better than leaving "permanent" keys on various "once trusted" computers.

But it still leaves the window wide open for a sneak-and-peek to mirror the drive of a MacBook left in a hotel room, and then later acquire the phone.

So better than keeping the old behaviour which was obviously broken, I guess?


> But it still leaves the window wide open for a sneak-and-peek to mirror the drive of a MacBook left in a hotel room, and then later acquire the phone.

MacBooks have full disk encryption so you’d have to break that, too, but since we’re talking movie plots rather than real-world risks for most people, take a moment to think about what else they could do with physical access: install a camera to record passwords, drug your water glass, or simply have someone demand you unlock it or else. If you’re Jason Bourne you need more than a consumer OS gives you out of the box.


A disgruntled spouse swiping your phone and running up to a rented hotel room to hook it up to a trusted MacBook protected by your secret "1234" password isn't exactly Jason Bourne territory. I suppose they might know your phone passcode as well - all I'm saying is that a phone in general use will have been unlocked in the past few days, and is likely in proximity to a trusted device.

Now, we don't know how the GrayKey stuff works (AFAIK) - so maybe this will harden phones against imaging with such tools. And maybe not.

I actually have a hard time seeing how this is "aimed squarely at police". If you're picked up at a demonstration or traffic stop - is it really that common that they won't get to your phone within 7 days? They're only allowed to hold you for 48 hours or so anyway?

Don't get me wrong, 7 days is better than forever - but I'd like to see it made available as a user setting; e.g. never / 30 seconds / etc.

As for this not being available on Android; my impression was that given an encrypted, locked, Android phone with pin/pw lock and debugging disabled - you'd need to unlock before being able to access phone data via USB?

Now if debugging is activated, I believe a "trusted computer" (anyone who holds the keys) can gain access even if the screen is locked?


> A disgruntled spouse swiping your phone and running up to a rented hotel room to hook it up to a trusted MacBook protected by your secret "1234" password isn't exactly Jason Bourne territory.

Think about that from the perspective of a security threat model: what are the odds that your disgruntled spouse has access to your computer and knows your laptop password, but doesn't know your phone password? This is an extremely hard problem to solve since they have all kinds of sensitive information and access.

> I actually have a hard time seeing how this is "aimed squarely at police". If you're picked up at a demonstration or traffic stop - is it really that common that they won't get to your phone within 7 days? They're only allowed to hold you for 48 hours or so anyway?

What they're allowed to do and what they actually do are not necessarily the same. This prevents the case where, say, they've seized phones but haven't legally compelled the users to unlock them since it means that if an exploit is discovered in the future it won't be usable against any devices which were stolen/seized more than a week earlier.

It would also make it hard to do something like seize a bunch of protesters phones and then attempt to obtain the keys from each person's trusted home computer which would take more time to do at any significant scale.

It's not a huge game changer but it adds a layer of hardening against certain attacks. Given the limited downside, that seems like a good thing.


> Think about that from the perspective of a security threat model: what are the odds that your disgruntled spouse has access to your computer and knows your laptop password, but doesn't know your phone password?

Well, you might not have a password for your desktop - but might have a pin for your phone.

Or maybe it isn't your spouse, but your kid; they might share the computer - but not access to the phone?

<ed: i don't really disagree, but I also struggle with the 7 days (and not quite in the "perfect is the enemy of good"-sense: >

Either way, I'm not sure I understand how 7 days makes sense (it still seems too long).

Seems like either it should be ~14 hours to a day (sync at home every evening) - or it should be: phone has to be unlocked.


How is Apple "late in the game" with a feature that Android has never shipped?


Late in the sense that they prioritized convenience over security and circumvented their own device protection. I'm not even entirely sure for whom they prioritized convenience.


The obvious flaw in a time-based lockout is that it needs a trusted measurement of the current time.

If law enforcement wants to bypass this, the obvious approach would be to just remove the battery (to remove power from any internal RTC chip) and put the device in a Faraday cage (to block external time signals like GPS and the cell network). Then the shutdown clock would literally stop ticking until they turn it on again.


That drastically increases the difficulty of forcing access to a phone. Regional police departments likely don't have a Faraday cage and phone-disassembly resources on hand, while they may have been sold on GrayKey or some other automagic tool.

Sure, this might not stop the NSA or FBI, but if it limits the threat vector to three-letter agencies, it is still a win.


Hopefully they considered this and the condition is actually "7 days passed or total loss of battery power occurred"


Security isn't all or nothing. If something isn't "perfect", that doesn't mean it's useless. What you just described is a shit-ton of work, which most attackers will not be willing to do (how many police departments have Faraday cages lying around?).


>how many police departments have Faraday cages lying around?

Police departments use Faraday bags to isolate a device's communications once they take it into custody. I assume they do this to prevent remote-wipe commands from external entities, to preserve evidence.


If you shutdown or cut power to the phone the file system key is gone and you can’t unlock the file system without the pin. That would gain you nothing.


>the obvious approach would be to just remove the battery

Might this constitute 'tampering with evidence'?


    that it needs a trusted measurement of the current time.
That's easily generated by doing it internally. Sure, you can't easily have a precise, trusted measurement that doesn't suffer from drift - but in this case it doesn't practically matter.

So you test ~7days runtime or external clock says you've gone > 7days. Lock on either condition. You can mess with the latter as you say, but not the former. Well I suppose you can de-power the device but that doesn't really help you either. Am I missing something?
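A minimal sketch of that dual-condition check in Python (hypothetical names; nothing here is Apple's actual implementation):

```python
SEVEN_DAYS = 7 * 24 * 60 * 60  # seconds

def usb_should_lock(monotonic_since_unlock, wall_since_unlock):
    """Lock the port if EITHER clock says 7 days have passed.

    monotonic_since_unlock: seconds of runtime since the last unlock
      (can't be rolled back, but stops counting across power loss).
    wall_since_unlock: wall-clock seconds since the last unlock
      (survives reboots, but could be spoofed via a fake base station).
    Locking on either condition forces an attacker to defeat both.
    """
    return (monotonic_since_unlock >= SEVEN_DAYS
            or wall_since_unlock >= SEVEN_DAYS)

# Spoofed network time (wall clock frozen) still locks after 7 days of runtime:
assert usb_should_lock(SEVEN_DAYS + 1, 0) is True
# Device powered off for a week (no runtime) still locks on wall time:
assert usb_should_lock(0, SEVEN_DAYS + 1) is True
# Freshly unlocked an hour ago: port stays enabled:
assert usb_should_lock(3600, 3600) is False
```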


i was wondering about this as well...

so perhaps the only point of this "band-aid fix" is that if the user hasn't connected his iPhone to his computer for 7 days (i'm guessing most non-tech people rarely connect their phones to their computer), the chances are greatly increased that the lightning port is already disabled once it's captured by law enforcement/whoever.


It's not 7 days from when you last connect it to a computer. It's 7 days since the phone was last unlocked.


Hmmz, i'm not convinced. The quote in the article says:

“To improve security, for a locked iOS device to communicate with USB accessories you must connect an accessory via lightning connector to the device while unlocked – or enter your device passcode while connected – at least once a week.”


Within seven days of the last time you entered your pin, the USB port will function without requiring you to enter your pin.


Where did you get that info?

Because the quote says "enter your device passcode while connected"

= the device has to be connected while entering the pin.


If it's been seven days since you unlocked the phone you will need to enter the device passcode while connected to the device, the prompt coming up because you connected it.


Why 7 days? I'd like a feature to never allow any USB communication until I've unlocked my phone, and then to allow it only for as long as they remain continuously connected.

Or to activate this feature with 'Emergency Mode' (5 power button presses).


They can't because they deleted the headphone jack. Your earbuds would not function if Lightning was defaulted to off.


Sure they would, I'd just have to unlock after connecting them. Which seems like something I'd probably do anyways.

Except I use Bluetooth headphones so I don't care about Lightning audio at all. Misplaced the stupid dongle within days of getting a jack-less iPhone...


Not really. They could allow accessory protocols while disabling everything else. The only sensitive thing we care about is "usbmuxd" or whatever the protocol for iOS sync & backup is.


That's not true, there's the large attack surface of those other protocols to worry about too.


This is more or less what happens in android. The usb port is only enabled for charging, anything else requires you to unlock the phone.


I concur, would be great if this feature could be enabled with 'Emergency Mode'.


Can someone explain how it is better than android? When I plug in my android, only charging works. I need to unlock the phone and enable data to make the data connection work. There is never trust this computer prompt. I have to do this always, even when due to bad wire the connection is lost for a split second.

Some people mentioned that android never shipped this feature and that Apple is first, but it seems to me that android never had this problem in the first place.


There are Kiosk uses of iPads where this could be an issue. Often those devices are mounted 24/7 inside a secure housing and left on but communicate with external devices. Now someone will have to reset them once a week.

Edit: thanks for the clarification below. I had the implementation wrong in my head. And yes, I realize this is a fairly edge use case, just one that affects my industry.


If they're left on, they should be fine - it's when the device is locked and idle for 7 days that the port is shut down. Also, I'm not familiar with any kiosk-type implementations that use Lightning for data xfer except POS devices, and those would obviously be unlocked whenever they are in use - so again, no issue.


Kiosks should be fine - I think you've missed some details on what is going on.

The update won't lock iPads that are unlocked. If a kiosk iOS device is "always on" and never requires a passcode, then it won't ever lock, hence won't start the 7 day timer.


There's a way to disable the feature through MDM (Mobile Device Management), which you'd want on the kiosks anyway.


I wonder if just putting the power outlet on a timer would work. Cut power to the charger for a few minutes at night and you're good.

Edit: disregard, I misunderstood the issue


If the edge use-case you have presented is a drawback to only 0.000001% of users, I think it's worth it for the greater good, and obviously Apple does too.


The time deadline might have some unintended consequences. Maybe law enforcement will proactively image people's phones early, knowing it will be harder later - e.g. you are stopped at the airport. A judge may give quick search warrants since it's "now or never".


I'm really not a fan of the hardware design choices of apple devices recently, but the focus on security/privacy might pull me back in.


I wish I could just disable the data connection permanently

> Restricted USB Mode requires an iPhone running 11.3 to be unlocked at least once every 7 days. Otherwise, the Lightning port will lock down to charge only mode. The iPhone or iPad will still charge, but it will no longer attempt to establish a data connection. Even the “Trust this computer?” prompt will not be displayed once the device is connected to the computer


I wonder how iOS keeps track of the 7 days in question.

For an iOS device still connected to the internet or to a mobile phone network, I presume it will periodically make an NTP request or get the date/time from the mobile phone carrier, to adjust its clock. What if those requests are MITM'ed?


Though that's kind of a good question, GENERALLY speaking you should use "monotonic" timers for this sort of thing, which don't have that problem. They're based on a counter that only moves forward and isn't affected by wall-clock changes.

Of course, that's no guarantee.. this is done wrong in a lot of places all the time. But I would expect Apple to get this sort of basic thing right in their security code. Hopefully :)
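To illustrate the distinction in Python (the stdlib's time.monotonic is such a counter, while time.time is the adjustable wall clock):

```python
import time

# The wall clock can jump (NTP adjustment, carrier time, a user change);
# the monotonic clock only ever moves forward while the system runs.
start_wall = time.time()
start_mono = time.monotonic()

time.sleep(0.1)

elapsed_wall = time.time() - start_wall       # could be skewed by clock changes
elapsed_mono = time.monotonic() - start_mono  # immune to clock changes

# Only the monotonic measurement is guaranteed to reflect real elapsed runtime.
assert elapsed_mono >= 0.09
```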


Is it known how those greykey devices even work? AFAIK iOS blocks many consecutive attempts to enter pin, so brute force would take too much time. It seems that greykey device can bypass this restriction using USB. Why Apple didn't just patch this vulnerability instead of disabling USB?


To patch the vulnerability, they need to be aware of the vulnerability. An individual who possessed such a vulnerability would likely be more inclined to go into business for themselves (such as the hackers that helped the FBI crack the San Bernardino iPhone, or the Israeli firm Cellebrite) than hand it over to Apple for a one-time fee. Although I imagine that Apple would probably pay pretty well for it.

IIRC these sort of vulnerabilities don't totally bypass the phone's lock mechanism, but rather disable the pin code attempt delay and allow bruteforcing of the pin code via software.


One-time fee? Apple doesn't pay for such things. Possibly why there is a black market for iOS exploits.



The bounty security program (announced at BlackHat 2016) was created to deal with these kinds of scenarios. They will pay depending on the severity of the bug and the affected subsystem.

Of course, now that this mechanism exists, I'm just waiting for Apple to sue GreyKey and Cellebrite out of existence, confiscate all the devices, and charge the founders with aiding industrial espionage or overreach related to pursuing terrorism.

(I'd also like to see the same thing happen with the NRA, but alas that doesn't seem to be in the cards for the current circus in Washington)

The difference between more legit researchers and these guys is that they will work with anybody as long as they cut a check. Real R&D has more scruples than to do that.


As much as I dislike this Israeli firm, how have they done anything illegal? Hacking a device in your physical possession should not be illegal. Your comment about the NRA makes me think you're just being hyperbolic here?


Cellebrite is actually the best of those out there: while they do sell their data-acquisition terminals to LEO in bulk, their "unlock services" are done in person by their staff with a court order for each case (including multiple court orders in some jurisdictions when different datasets on the phone are protected separately by law).


It makes me extremely nervous to see that a third party can even create this capability. If I had a say, I would just tell these guys this level of access to low level code just isn't possible for third parties.

As odd as it might be, I trust Apple more because they don't want my data and aren't enabling methods for other people to acquire it.

On some level, this back and forth on encryption is an endless cat-and-mouse charade, but the fundamental assumption behind cryptographic security is absolute.

"You can't outlaw math"


What are you talking about? You want to make it so that a company like Apple can just draw arbitrary bounds and say "no messing around beyond this point" and have that be internationally, legally, enforced?

We got that with the DMCA and DRM modules, phone unlocking, and console rooting.


Companies can write nigh-any clause into their EULA or T&C and people generally have little recourse. There still seems to be enough wiggle room legally because they control the platform. Some places (think EU) fight this, but I don't think it's in any way settled at this point.

They've done this in the past in subtle ways - cautioning developers about using private API, which they reserve the right to change at any time, thus breaking applications. For a practical example, Google "Apple kext signing certificate". It's not simply a matter of paying $99 and off you go, the barrier to entry is higher.

There have also been not-so-subtle warnings - see Charlie Miller's blacklisting - that even a proof of concept for a bug is not allowed, because it could get out in the open and cause widespread damage.

> We got that with the DMCA and DRM modules, phone unlocking, and console rooting.

Record labels had little choice and needed to ditch these restrictions in order to have a viable business. TV studios, cellular providers, and console makers fight to this day to preserve these limits as a means of competitive differentiation.

I'm not saying I agree with it, but that is still largely the reality we have to deal with.


Encryption is useless if there is a way to brute-force passwords or acquire private keys which could be used to decrypt a device. In most ways, yes, physical access is 'game over' for security, so I could see Cook & co. making the case that these devices are so dangerous in that regard that they shouldn't be able to be used by anyone.

Tl;dr It's covering your butt when Apple can say to the FBI, 'We support your efforts and want to help you, but literally cannot because it would require dismantling our own security architecture.' No engineer would agree to that - they would quit in protest.

The NRA crack is more about them being the GOP's puppeteer. Under a competent (and probably Democratic administration), being labeled a domestic terror organization would kill their funding in 0.02s.


>I could see Cook & co. make the case that these devices are so dangerous in that regard that they shouldn't be able to be used by anyone.

OK but that is not a legal strategy. You're not providing any basis other than that Apple should have some magical power to prevent people from touching devices they legally have access to.

OK and if you got Planned Parenthood or the Humane Society listed as a domestic terror organization it would hurt their funding, too. What's your point?


How is the NRA vulnerable to these machinations?


The vulnerability that GrayKey exploits isn't known: https://www.forbes.com/sites/thomasbrewster/2018/03/05/apple...


This both takes care of that and any other similar attacks that haven't been publicized or discovered yet.


I fear this is really only going to have the reverse effect: instead of carefully examining whether or not 4th Amendment protections apply, "out of an abundance of caution", courts will immediately allow your phone to be seized and decrypted.

Not to nitpick, but I wish these things were opt-in. For instance, I don't really care if I've restarted my mac and have to use my password again to log in, I'd rather use my fingerprint. I just need to prevent casual attackers, there is _literally nothing_ on here that needs to be protected with fort-knox level security.


Two ideas:

What about paired hardware? Imagine buying an iPhone and pairing it with your charger and they share keys. Any other charger used would immediately wipe the phone. There could be settings to tweak this.

what about wiping the phone if it has not been logged into a certain amount of time with a certain password (not normal PIN)?

The current crop of phone busters completely bypasses the 10-wrong-PINs-and-wipe option. The idea is to immediately wipe the phone without using Find my iPhone (which is defeated with airplane mode).


What about paired hardware? Imagine buying an iPhone and pairing it with your charger and they share keys. Any other charger used would immediately wipe the phone. There could be settings to tweak this.

Open the phone, plug charger directly to battery.


Disable that function.


What function? I'm talking about physically opening the hardware and sticking two cables to the battery connections.


Can I still kick the device into recovery mode with a cable after 7 days with this mode? Or would I have to unlock the device to re-enable recovery mode?


DFU mode is a key combo on the device, so yes, you’d be fine.


Say a user drops their phone in a desk drawer and goes on an 8 day hiking trip.

He/she comes back and can't remember their passcode. Is the phone now a brick?


I would imagine that "disable USB port" doesn't mean it completely disables the power lines, only that it stops responding on the data lines. So when you come back from your 8-day trip you charge your phone, log in, and theoretically iOS would re-enable the USB port once successfully logged in.


Not sure why you're being downvoted, you're correct and it's stated in the article that if you unlock with a passcode after 7 days the USB port begins to respond again.

Apple isn't going to design a feature that bricks a device after 7 days. It really simple:

1) No unlock for 7 days = USB turned off.

2) Unlock phone any time after 7 days (lets imagine unlocking at 12 days) and USB turns on.

3) Charging remains active at all times.

Simple.
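Those three rules could be modeled in a few lines (a sketch only, with hypothetical names - not Apple's code):

```python
import datetime

LOCKOUT = datetime.timedelta(days=7)

def port_state(last_unlock, now):
    """Charging is always on; the data connection is only on
    within 7 days of the most recent unlock."""
    return {"charging": True, "data": (now - last_unlock) < LOCKOUT}

now = datetime.datetime(2018, 5, 8)
# 1) No unlock for 12 days: data off, charging still on.
state = port_state(now - datetime.timedelta(days=12), now)
assert state == {"charging": True, "data": False}
# 2) Unlocking at any point restarts the window: data back on.
assert port_state(now, now)["data"] is True
```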


>He/she comes back and can't remember their passcode.

>2) Unlock phone any time after 7 days

?!?


Pretty sure you can always wipe the phone.


I was out drinking with my friends and had watched my buddy "draw" his pass code so many times (5x5 grid) that I tried my hand at it. After watching me try and fail a few times, he said "alright, enough" and took his phone back.

At which point he totally forgot how to draw his own pass code after watching my attempts. We figured he'd remember it in the morning. But a day later he couldn't and had to wipe it. A reminder how fragile memory can be.

What gave us a laugh was when a Tinder notification appeared on his home screen from the girl that was going to come out with us, but he of course couldn't get into his phone.

By the time he reset his phone and reinstalled Tinder, she must've thought he stood her up: she'd already blocked him.


To me, it feels like Apple is trying to figure out how GreyKey and Cellebrite are getting in - and patching every vector they can think of in the meantime. I suspect that if law enforcement agencies are suddenly told they have to unlock new Apple devices within 7 days of acquisition, Apple will find out and can infer that the exploits have (e.g.) something to do with USB accessory access.


Can a device still be wiped when this happens? I'm wondering how to recycle or recover locked devices if the USB port is disabled...


The assumption is that the correct passcode will remove the USB shutoff. If you fail to enter the passcode within the allowed number of attempts, you get a wipe. Many of the law-enforcement ways to access these devices rely on the USB port being active to root the phone or reset it in a way that allows faster/more passcode attempts (or by simply letting it sit on a shelf for months or years until a known exploit allows access via USB).


Will this cause warrants to be rushed through and much more often? Just to get the phone unlocked in case something is in there, even if there may be no burden of proof. Better to overnight it to a facility with a tool to unlock it and sign off on a quick warrant.


Why seven days? How about 24 hours? Or even better if the device is locked I have to unlock it to use the port for anything besides charging (and it can then lock on its usual schedule)


This doesn't sound quite as amazing as it does at first glance, because this is not a full "data connectivity" kill. Data connectivity is always required whenever people use Lightning headphones, due to the headphone jack removal.


How do we know this feature doesn't also disable the headphones-over-usb if the phone hasn't been unlocked in a week? I don't think that would be a problem for most users either.


You can install the iOS 11.4 beta and find out one way or the other; it doesn’t seem like anyone has done so yet, and since no one thought to do so before today’s post, no one will know for 7 days, assuming they start today.


But you'd also need to unlock the phone to use the headphones... It is as it sounds.


I assume so. Otherwise GreyKey could masquerade as a pair of EarPods to get authorized simply by plugging in.


Well it could masquerade as a pair of EarPods and get access to the audio output yeah. But it still wouldn't get access to the iOS sync & backup/restore service (because normal EarPods have no reason to access that).


Not directly. But sending input masquerading as the vol up/dn and button on the EarPods is a potential attack surface.


Are you going to listen on headphones, for 7 days, with a locked phone?


403 Forbidden in Germany.



Same in Northeast US on university WiFi.


i suspect this will be a net negative. now that law enforcement has a time limit, graybox sales will flourish, and law enforcement will access your phone ASAP before collecting other evidence. then the phone evidence itself will give them the clues they want and they’ll get the warrant after the fact. or the court may even be complicit and issue a warrant without enough supporting evidence due to the risk of evidence destruction.


Why 7 days though? It should disable within 2 hours at most, and users should have the option to disable USB whenever the phone is locked.


Why 7 days? 24 hours should be enough - who connects devices but doesn’t unlock in that time? Can’t think of a scenario for that.


Like everyone else, I'm curious how you recover your forgotten pass code after 7 days.

Also, what happens if you don't use a passcode?


> I'm curious how you recover your forgotten pass code after 7 days

You don't. If you've lost your passcode, you can't get it back.


Is there any security reason someone would purchase a security-focused Android phone (Blackphone, Blackberry) over an iPhone?


Hardened kernel (e.g. CopperheadOS), custom open-source ROM to disable all telemetry and audit the code (among others), FOSS APK provider (F-Droid), disabling online tracking (AdAway, AFWall), full filesystem access. iPhone hardly has any of this.

Note that this is true of any bootloader-unlocked Android phone, not just security-focused ones.


Please bear with me, I'm not OS-savvy: are many of these features designed to protect against untrusted processes? If both phones have no additional apps installed, what would be the most likely way for the data within to be compromised? And is it fair to treat telemetry and security as separate concerns?


Very telling that this is downvoted.


In general HN has a strong bias toward Apple over Android RE: security. Many of the points are entirely valid, but it's also true that some important advantages on the Android side (such as those listed above) are often understated or overlooked here.

For me, a closed-source OS is a dealbreaker on its own, regardless of any other major HW or SW advantages.


Why not 7 hours or 7 minutes or immediately?


There's a slice of the population that needs accessibility devices to unlock their iDevices. It'd be unfortunate if they couldn't unlock their devices anymore due to a security feature. 7 days sounds like a compromise between usability and security.


Yeah, 7 days feels awfully long. When would you actually need that?

Edit: Okay, I suppose there might be an edge case where you break your screen and want to take a backup on an already paired device.


Well, why not 7 days?


The advantage of a smaller time period is obvious: if 7 days is secure, then 7 hours is more secure.

But presumably there's a good reason they chose that number, I'm just wondering why that might be.


It seems a reasonable default, but would be nice if there was a custom setting for a shorter time frame.


I assume it's a number they chose to strike a balance between security and eventualities that create support burdens.

It could probably be user-defined, but when it comes to setting a one-size-fits-all default and offering users an option, Apple usually selects the former.


I'm pretty sure they can just remove the chip and dump the contents.


Kind of amazing seeing this story right next to the Google Duplex story.


Why should it even be enabled if you're not logged on?


This is why I will never use a Google Pixel, though I will still be forced to use Gmail, Search, and YouTube because of their convenience. Hopefully something new will come out in the future: an open-source, decentralized, mathematically censorship-proof alternative.


If the device time is synced from elsewhere, maybe one could spoof an NTP server and provide a time in the past?


And/or spoof a cell tower.

If they are careful with their implementation, they could protect against it. The naive way is to store the time of last unlock and simply compare that against the current time, but there are other ways:

(1) Once 7 days elapses, set a flag that can only be cleared by unlocking the device, and check that flag in addition to the time.

(2) If there is an internal hardware clock that isn't synced to real time, just count relative to that clock. You don't need to know absolute time to check how long it has been since last unlock.
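A minimal sketch of both approaches combined (in Python for illustration; the class and constant names, and the 7-day threshold, are assumptions, not anything Apple has documented): measure elapsed time against the monotonic clock, and latch a sticky flag that only an unlock can clear.

```python
import time

USB_TIMEOUT_SECONDS = 7 * 24 * 60 * 60  # hypothetical 7-day window

class UsbLockout:
    """Sketch of approaches (1) and (2): monotonic elapsed time
    plus a sticky flag that only an unlock clears."""

    def __init__(self):
        self.last_unlock_mono = time.monotonic()
        self.locked_out = False  # approach (1): the sticky flag

    def record_unlock(self):
        self.last_unlock_mono = time.monotonic()
        self.locked_out = False

    def usb_data_allowed(self):
        # Approach (2): time.monotonic() is immune to wall-clock
        # changes, so a spoofed NTP update can't rewind the window.
        if time.monotonic() - self.last_unlock_mono >= USB_TIMEOUT_SECONDS:
            self.locked_out = True
        return not self.locked_out
```

Once `locked_out` is latched it stays set until `record_unlock()`, so even a later clock manipulation can't re-enable the port.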


> (2) If there is an internal hardware clock that isn't synced to real time, just count relative to that clock.

This is known as a monotonic clock, and it's been built into most hardware for exactly this reason. Mach, Linux, etc. encourage its use for anything where leap seconds or time changes aren't desirable.


Minor point of clarification, but I meant something slightly different from a monotonic clock, hence why I said "hardware clock."

For the approach I described, a clock would need to keep ticking while the system is powered off or in various power-saving modes. And it shouldn't get reset at boot time. Not all monotonic clocks have both these properties. (Obviously iPhone isn't Linux, but one example is that Linux's CLOCK_MONOTONIC seems to lack both properties, and its CLOCK_BOOTTIME seems to lack the second one.)

Though if you have a clock that resets at boot, you can work around that by disabling USB data on bootup and not enabling it until first unlock.
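The difference between the two Linux clocks can be observed directly (a Linux-only sketch; `time.CLOCK_BOOTTIME` is exposed by Python 3.7+ on Linux, and note that neither clock survives a reboot):

```python
import time

# CLOCK_MONOTONIC pauses while the system is suspended;
# CLOCK_BOOTTIME keeps counting through suspend. So on a machine
# that has ever suspended, BOOTTIME > MONOTONIC (equal otherwise).
mono = time.clock_gettime(time.CLOCK_MONOTONIC)
boot = time.clock_gettime(time.CLOCK_BOOTTIME)
print(f"monotonic: {mono:.1f}s  boottime: {boot:.1f}s")
```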


On Linux, I believe what you’re looking for is CLOCK_MONOTONIC_RAW but that’ll also depend on your particular hardware and its security.

On iOS, I’m not sure it matters for the reason you mentioned: at least on my devices I don’t see hotplug events until the device has been unlocked once.


> If there is an internal hardware clock that isn't synced to real time, just count relative to that clock.

There is. All modern operating systems, including iOS, have one. Without it, commonly used features like animations and media playback would misbehave during network time synchronization.


Not if they use the system's uptime as the counter, which is guaranteed to be monotonically increasing.


How is this relevant when law enforcement can buy a $15k device that unlocks the phone?


Because "GrayKey" is using USB which won't work if USB is ... turned off.


The article speculates that this breaks GreyKey. Presumably that was the direct motivation for the change.


Does this mean that companies will now try to exploit the USB/Lightning driver to gain access?

The cat and mouse game continues.


I don't think there would really be anything to exploit, because it sounds like the data lines in the actual USB connection are turned off and only the power lines remain enabled to allow for charging.


But they're still disabled at a software level.


But how will you influence that said software if there's no way for you to talk to the device? This closes one exploitable avenue after 7 days - now attackers need to find something else.


You could just pull the flash chip and image it. You would need to figure out how to get the key, but pulling the flash chip and reading it doesn't look too hard if you can use a heat gun. If you lived in Shenzhen you could go to the market and buy a flash reader.

Strange Parts is a YouTube channel where the guy does this.

https://www.youtube.com/watch?v=rHP-OPXK2ig


The filesystem is encrypted and the key is in the secure enclave. So pulling the chips and reading them directly no longer works.

I think this is how the FBI got into the bomber's iPhone after losing the court case to try and force Apple into releasing an iOS version that allows unlimited PIN tries.


> You would need to figure out how to get the key

I mean, that's the $10M question.

Literally. There's probably $10M in it for you if you can answer that (and have the right connections to build a company around it)


The PIN or any other user secret can't decrypt the contents of the flash chip. So turning the device off and removing the flash just makes it infinitely more difficult.


Why would that make it any more difficult? You can just put the chip back in place after you are done.


I guess because now you have no way of exploiting any bugs that might've been exploitable. Within 7 days the phone will still try to talk to others connected to its port, after 7 days it'll just charge. I assume that "power off" will also disable data over the port until the phone gets unlocked?


...

Every 6 days from point of collection: place the phone in a caged room, turn on your cell-network interceptor device, set the interceptor's network time to the device collection time, boot the phone, and wait for it to update its network time from the interceptor.

...

So many edge cases/ways to defeat this that need to be handled.


Type "uptime" on your computer. Change the date. Run "uptime" again and note that it is the appropriate amount of time longer than the first report, even though the date is different.

It's rarely safe to assume that engineers will miss even the most obvious workarounds.
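The counter `uptime` reads is exposed on Linux at `/proc/uptime`, and it keeps advancing no matter what the wall-clock date is set to (a Linux-only sketch):

```python
# /proc/uptime holds seconds since boot (first field) and aggregate
# idle time (second field). Changing the system date does not rewind
# the first field, which is why uptime survives a date change.
with open("/proc/uptime") as f:
    uptime_seconds = float(f.read().split()[0])
print(f"up for {uptime_seconds:.0f} seconds")
```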


Unless it's actually just got a counter that counts up from the last time it's been unlocked. Powering it down wouldn't save you, because it requires an unlock at boot anyway.


If I understand iOS correctly, the reboot will cause the data partition to be strongly encrypted -- so that's worse.

But maybe the device can be tricked into time passing reaaally slow?


Or it's checking the monotonic time, not using the synchronized time (which is probably recorded as an offset anyway)


I would like to read this article, but the website doesn't load. I guess it's not optimized for the HN front page.

May I suggest load-testing your website or article before posting?

https://ddostest.me/load-test/



