Tethered Jailbreaks Are Back (trailofbits.com)
143 points by dguido on Sept 29, 2019 | 117 comments



Yay!

I have fond memories of my friends (and eventually me, on the family iPad) jailbreaking our devices and doing stuff with them.

A lot of the things I saw from jailbreaks were incorporated into later iOS updates- I'm curious (and excited!) to see what develops out of this wave.


A proper firewall (not just Safari content filtering or DNS blocking) would be wonderful, one with a nice enough UI, like Hands Off or Little Snitch on the Mac.

There used to be Firewall iP and Protect My Privacy on Cydia, but neither seems to be maintained anymore.


What's the point now that we have Android?


GarageBand. iMovie. Both can have their projects opened in Logic Pro or Final Cut, an irreplaceable workflow for which there is no Android equivalent. Audiobus. Not Google. For me, ARKit is huge for some hobby projects.


If Apple would sell me one of their dev phones that gives me root, I'd switch from Android in a second.

I don't understand why this is such a big deal for them. They could even charge more for it, or exclude those phones from tech support, or whatever else they need to do to make money.

They'd bring all the hackers back to their platform. Do you realize how much effort is spent on Android mods? Can you imagine if those people were busy making iPhones do more things? Because that's what would happen if they sold me a rooted phone.


Meh, there’s a small but dedicated community for mods, but they can affect system stability and that’s not a compromise Apple wants to make.

Not having root has never affected my almost ten-year iOS dev career.



So far, only iOS can run on iDevices, which means if you want to use Apple hardware you have to use iOS.


Before this exploit there were other Bootrom exploits that allowed for running alternative operating systems (albeit on obsolete hardware).


I hardly see any compelling reason to stick to Apple hardware.


Well, it appears Apple is really working toward user privacy as their main selling point, whereas Android is locked into Google Play services. So while I don't personally have a preference, I do understand why some people choose Apple devices.


You can disable Google Play Services on a rooted Android. Besides, I'm pretty sure all baseband CPUs have backdoors the carriers can tap into at any time.


I keep hearing this argument, but if I'm buying an Android device I'm still supporting Google and Android. The thing is, I don't want to support them at all.


On an iPhone (and probably a good number of Androids) the baseband is just a USB peripheral.


Apple is marketing privacy while storing encryption keys in China and becoming a services business. The more their revenue shifts away from hardware the more they'll be compelled to collect data to improve their services. There's no way around this.


I don't even know where to start with this comment.

"Storing encryption keys in china" could mean anything, it could mean having edge services with private TLS keys for instance. Which means semantically you're correct, but it doesn't mean anything.

In fact I'd argue it's no different than hosting in the US or UK.

The UK for example has laws that forbid you from withholding encryption keys or passwords. And there is an equivalent of the NSL (National Security Letter) which forbids you from even telling anyone that you have complied with a government or law enforcement request.

So how is this relevant?


That would be an argument for apple software. The question above was, if you don't care about the software why would you buy apple hardware?


Android appears to be about to lose the ability to run downloaded executables (see Termux), but this also has nothing to do with the discussion.


Are you talking about this?

https://github.com/termux/termux-app/issues/655

Or is there some new development I [1] should be concerned about?

1. Just a regular user of Termux.


iOS doesn't send your every click to Google?

https://digitalcontentnext.org/wp-content/uploads/2018/08/DC...


As the paper notes, iOS itself doesn't send much to Google. It's primarily installed apps that send data to Google, and that's mostly advertising-related. Google apps will additionally send all your location data to Google.

Moral of the story: don't install any Google apps, and limit the number of apps you install.


I'm not going to ruin my phone experience just for that. Why should I care that my apps have telemetry for ads? It keeps them free.


Honestly, if there's a real security risk, I'm surprised Apple hasn't recalled the phones or offered to repair them. Unpatchable firmware flaws are (or should be) no different from hardware flaws in this respect.


> if there's a real security risk

There is, but it's not that great. You need physical access to the device and it won't be persistent (a reboot will clean it).


It certainly will _feel_ persistent if you're successfully attacked with this technique.

If your iOS software is swapped out for a version with a backdoor, then the attacker will have collected your passwords and authentication tokens to services you use. If you reboot to clear the backdoor (and let's be honest: no one reboots their phones), then you won't also "clear" your attacker's memory of all your passwords.


If your iOS version is swapped out with one that is backdoored, it won’t boot after you reboot it without using this boot loader exploit on a computer again.

This makes you ever so slightly more vulnerable to an evil maid attack, but we don't even have a jailbreak using this yet, so it remains to be seen how it all shakes out.


You don't need to modify the kernel or iBoot on disk to inject a patched OS (or malware). redsn0w didn't; it would boot over DFU every time.

It's totally possible to rootkit a phone and have it reboot just fine (with the rootkit removed).


I believe a reboot will cause you to boot stock iOS again.


I reboot my phone once in a blue moon, but my phone reboots itself roughly every other day (usually because I space on charging it). Am I that unusual, or is "the phone is rarely going to reboot" not really a reliable predicate for attackers?


I similarly reboot my phone rarely, but my phone never runs out of battery. I never charge it during the day (except if I'm using it for GPS in my car), and it's just a routine to charge it at night. I don't think my current phone, which I've had for about a year, has ever run out of battery.


Anecdotal, but my family members reboot slightly less frequently than iOS updates come out (they miss a couple).


Only on update, and only if the update requires it.


My reboot frequency is slightly more than 1.0 times per update, but less than 2.0 times per update.


Ha, I wonder if an attacker could use this bug to prevent or fake the rebooting process by changing the behavior of the lock/volume buttons when they’re held.

I know there’s also a “hard reset” you can do with volume up -> volume down -> power, not sure if that works at a lower level.


Yes, an attacker using checkm8 could do that if they had a separate exploit for persistence. That exploit would be in iOS and take over at a later point in the boot process, and it would be possible for Apple to patch it with an iOS update. Those bugs are hard to find, but there have been dozens discovered in the past.


I’m a little confused, which part of my comment would require persistence? I was suggesting a lulzy payload that would prevent the user from _actually_ restarting their device by changing the behavior of the power button. As a means of bypassing the “just restart your phone every time you use it” countermeasure.


Would not work, as this is indeed a lower-level function.


Seems likely that the hard reset works at a lower level, since it works even if the phone is hung.


> You need physical access

I don't understand why people keep downplaying this. The whole point of a secure phone is that the data can't be accessed even with physical access.


You need physical access AND the device PIN. None of these hacks allows you to decrypt the device without the PIN. The best you can do is load malware that grabs the PIN when the user types it in, so the defense is: if the government ever takes your phone for inspection, make sure to reboot it before typing in your PIN.


Even in the case of an evil maid attack, a device that has been out of your sight and then demands that you enter the passcode instead of allowing you to use biometrics is immediately suspicious.


Uh, this is the standard on iOS: after a certain amount of time, a reboot, or the power button x5 shortcut, iOS will demand your passcode instead of Touch ID/Face ID.


But if the phone has never left your area of trust during that time, there's no problem. If it has, then force a reboot before typing in your PIN. Say you have to walk into a place that demands you relinquish your personal device, and when it is returned to you it requests your PIN. The suggestion here is that you reboot your phone to help ensure this jailbreak wasn't done to you. It seems like a simple thing, and fairly painless in this case. Just because you're paranoid doesn't mean...


Good point - I always turn off my devices when entering a checkpoint/border.


Threat modelling. In most models, if someone has uninterrupted physical access to a device, it's theirs.

Phones are more important in that you want to protect the assets from thieves, so we do add non-destructive physical access to our scope, but with a higher bug bar. Someone being able to take your phone, compromise it, and then give it back to you so you can input new assets means that a vulnerability has to be severe to be as important as even a minor remote one.


The attack where you implant some kind of backdoor to capture the data is possible even without this exploit, it just makes it easier.


How would such an attack (without this exploit) be pulled off?


For example, you could implant a hardware backdoor that monitors the touchscreen inputs


To do that, you'd need to disassemble the phone to insert your implant. That might be hard to do in the field (i.e. not in a repair shop/lab setting with plenty of tools lying around). Not to mention the difficulties of designing and manufacturing an implant: how are you going to get it to fit? There isn't a lot of empty space inside a phone. And how many variants would you need to design and carry around? I'd imagine that the iPhone SE would need a different implant than the iPhone XS, for example.

A bootrom attack allows you to replace all of that with plugging your victim's device into your "hackbox" for 10 seconds. Vastly simpler for your typical goon/henchman to execute, and way less likely to get detected.


Agreed there is a substantial difference in difficulty between the attacks. I am only speaking to the parent's point about the phone somehow previously being secure and now not being secure. The only thing that's changed is the difficulty of the attack.

There are easier physical attacks too: for example just replace the whole device with an identical one you control. Replicate the target's lock screen in software and capture their inputs.


Anything electronic connected via the Lightning port has physical access, for example a charger. A charger could be programmed to let a device in a low-battery state run the rest of the way down to empty, causing a reboot, before starting to recharge. Not undetectable, but typical users would probably assume user error or a faulty charger before suspecting malware.


The exploit only works in DFU mode. The user would have to press a button chord in order to reboot into DFU for that to work, and it's not easy to do accidentally.


Are you saying if an evil government puts a backdoor on my phone while I'm at customs, can I just reboot twice to completely get rid of it reliably?

But this hack also allows exfiltration of data from your phone, doesn't it?


Apple isn’t really known for doing recalls until they absolutely have to


"Take the number of 'iPhones' in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don't do one."

A modified version of the movie quote to fit the discussion


"the movie quote" = Fight Club, since yes, there's people who have not seen that movie, but we don't talk about that.


Which makes sense, too. If Apple had been negligent in the security space and these flaws were common, then I would be all for a recall.

But Apple clearly has not been negligent here, and they really have put forth their best effort.


You're talking about hundreds of millions of devices; I can see why Apple would prefer not to.


With roughly 1 billion iOS devices in use that might be an expensive recall.


Interesting that the writers of this article are a company that sells a library to help developers detect their app running on jailbroken devices. https://blog.trailofbits.com/2017/10/12/ios-jailbreak-detect...


Trail of Bits sells a jailbreak detection app the same way Ikea sells Swedish Meatballs. It's good jailbreak detection, but it's hardly what Trail is about.


lol that's a great analogy! Yes, this is a small project for roughly one person at our company of 50. It's something we felt like we could contribute so we had fun with it. We already invested all the time elsewhere to master iOS. It's one of those 30 minutes + 15 years experience things.


Makes sense, given how they try to spin this as being good only for pirates (and researchers), as though those were the only people who would want a jailbroken device.


A decade of corporate brainwashing has convinced people that only criminals want to control the device they own.


Most people don't own their phones; they use payment plans.


And if a person loses the phone one month into their payment plan, they still have to pay the entire price of the phone. So you take ownership the moment you make an agreement with the company to make payments.


Payment plans seem like absolute insanity to me. Everyone I know on one is paying $100+ AUD per month perpetually, since as soon as they've paid a phone off they get a new one.

I have found that it is insanely cheap to just buy last years phone second hand. I picked up a pixel 2 recently for $300 AUD when it was about $1000 the year before.


If one is going to buy the phone anyway, payment plans with 0% interest make a lot of sense (from the buyer's perspective).


I suspect that for a lot of people, without the payment plan they would not buy anyway since they can't afford to buy these outrageously expensive devices outright.


To each their own. I very much respect your approach. But when I'm upgrading, I don't want last year's model. My phone is the single most used "possession" in my life, and after using one for two years, I like to treat myself.


The thing is, if you always sit a year or two behind the cutting edge, you still get to treat yourself, because every upgrade is a huge jump in technology from what you had before. Unless you're comparing yourself to others, it really doesn't matter, because you still get something better and new to you.


Indeed. To be honest it seems like madness to me that phone contracts don't include some kind of insurance as standard.

I've always bought my phones outright but my girlfriend recently had her phone stolen in London about a month after getting it and you can imagine how painful that was for her.


From a technical perspective, both users have (or lack) equal access to their hardware. So the actual method of ownership isn't really relevant.


Jailbreaking is a good way to let criminals control your phone.


Ah, yes, the bane of Apple engineers everywhere.


>library to help developers detect their app running on jailbroken devices

How does this work? I thought iOS apps are sandboxed to an extent where it shouldn't be possible to snoop around to determine which processes are running and such.


A jailbroken device allows apps to do things that a non-jailbroken device does not.

I maintain my company's in-house mobile app crash reporting system, and I had to remove jailbreak checks from our iOS SDK. It turned out that some of the checks were causing crashes themselves, due to buggy anti-jailbreak-detection code some jailbroken devices had in place: e.g. checking whether a file that iOS normally disallows could be accessed would end up causing a crash instead of just a permission error.

Instead, I just do some basic server-side detection: looking for libraries loaded into the app (e.g. Cydia) that are only present on jailbroken devices. Some jailbreaks don't even try to hide their presence.

I don't know what iVerify does. I hadn't heard of it before. I'm curious how it avoids crashes though... perhaps it avoids invoking any dynamic system calls.
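
For illustration, here's a minimal Swift sketch of the sort of naive client-side checks I mean. It's a sketch only, not any particular SDK's code; the paths are well-known jailbreak artifacts but the list is illustrative, and as noted above, buggy jailbreak-hider tweaks have been known to turn exactly these probes into crashes:

    import Foundation

    // Minimal sketch of naive file-based jailbreak detection.
    // The paths are illustrative, not exhaustive, and "jailbreak
    // hider" tweaks can defeat (or crash) all of this.
    func looksJailbroken() -> Bool {
        let suspectPaths = [
            "/Applications/Cydia.app",
            "/Library/MobileSubstrate/MobileSubstrate.dylib",
            "/bin/bash",
            "/usr/sbin/sshd",
            "/etc/apt",
        ]
        for path in suspectPaths where FileManager.default.fileExists(atPath: path) {
            return true
        }
        // Try writing outside the app sandbox; stock iOS refuses.
        let probe = "/private/jailbreak_probe.txt"
        if (try? "x".write(toFile: probe, atomically: true, encoding: .utf8)) != nil {
            try? FileManager.default.removeItem(atPath: probe)
            return true
        }
        return false
    }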


The proper way of doing things would be for access to jailbreak features to be controlled per app: by default nothing gets them, and you whitelist the ones that need it. I'm not sure if anything like this exists for iOS, but it should.


Depending on how much access you have, it's always possible for jailbreaks to hide their presence from applications (which are running at a lower privilege level). There are a couple of "jailbreak hider" implementations out there.


They also get tripped up by Apple internal devices, which as an Apple employee is quite annoying when I get locked out of a financial app.


I feel like Apple employees shouldn't be using special iPhones for personal use (out of good engineering practice; nothing special).


Who else would be the best to test new phones, software and features, and understand how they may create bugs with 3rd party apps?

In fact it’s critical to do so.


If that's your goal, shouldn't you be using a customer install?


Out of interest, why do you care if your users run your application on a jailbroken device? It’s been a question I’ve had for a while..


We don’t care but when a crash occurs we collect diagnostic data that we think will help us narrow down the source of the crash. Sometimes a crash happens only on jailbroken devices in which case we usually don’t spend time on it.


Same reason as why rooted Android devices get blocked by certain applications. Security for the user's sake. Usually it's financial applications (banking etc.) and stuff with sensitive user information.


As an Android user, this is super annoying: I rooted the device because I want to control it. Now some stupid app comes along and claims that, for my own protection (supposedly), they're going to break for me. It's insulting, really.


Jailbroken devices can attack application logic to cheat in multiplayer games, somewhat attack DRM systems for video content (though Fairplay isn't especially vulnerable here), gain access to chargeable features without paying for them (decompiled Spotify APKs that do not feature advertising without having to pay exist on Android and are a non-trivial revenue risk) etc etc.

In more open systems you are usually more able to run detection software for the above without sandboxing.


Piracy for Android doesn't require a jailbroken/rooted device.


It only requires one jailbroken device on iOS. Or, if you're using the right tools, none.


Companies (particularly financial institutions) that have enterprise apps deployed only to their employees care, because a jailbroken phone poses the risk of attackers MITMing traffic.


We discovered/developed a suite of side channels that let us indirectly read iOS system state from inside the sandbox. There are many different checks across unknown deviations, known jailbreak files and utilities, and runtime behaviors that help us narrow down whether your phone has been modified. It's not perfect, but it's the best you can do from within the Apple App sandbox.


I assume you make a new Mach-O file that hasn't been signed by Apple and try to exec into it. If it runs, the device is definitely jailbroken. Obviously a jailbroken phone could load a kernel module that detects this and stops it from running in this specific case, leading to a cat-and-mouse game between jailbreak developers and these apps. iBooks tried this once.
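
If that's roughly the mechanism, here's a rough Swift sketch of a simpler cousin of that check (spawning a binary that only exists, and is only allowed to run, on a jailbroken device); writing out and exec'ing a fresh unsigned Mach-O would be the stronger form of the same idea:

    import Darwin

    // Hedged sketch, not iBooks' actual check. On stock iOS this
    // fails: /bin/sh doesn't exist there and the sandbox forbids
    // spawning processes anyway. If posix_spawn succeeds, code
    // signing and sandbox policy are clearly not being enforced.
    func canSpawnShell() -> Bool {
        var pid: pid_t = 0
        let argv: [UnsafeMutablePointer<CChar>?] = [strdup("/bin/sh"), nil]
        defer { argv.forEach { free($0) } }
        guard posix_spawn(&pid, "/bin/sh", nil, nil, argv, nil) == 0 else {
            return false
        }
        kill(pid, SIGKILL) // don't leave a stray shell running
        return true
    }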


Couldn’t tell you; I know as much as that article says. But I’m guessing that’s part of their secret sauce.


I'm sure it's some combination of checking for files that shouldn't be accessible or exist, searching your own address space for things that shouldn't be there, and verifying that kernel calls produce the results that they should.
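
The address-space part might look something like this sketch (the substring list is purely a guess at what such a scan would look for):

    import MachO

    // Sketch: walk the list of images dyld has loaded into our own
    // process and flag names associated with injection frameworks.
    // The substring list is illustrative only.
    func hasSuspiciousImages() -> Bool {
        let suspects = ["mobilesubstrate", "substrate", "substitute", "cycript"]
        for i in 0..<_dyld_image_count() {
            guard let cName = _dyld_get_image_name(i) else { continue }
            let name = String(cString: cName).lowercased()
            if suspects.contains(where: { name.contains($0) }) {
                return true
            }
        }
        return false
    }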


Interesting that Apple forces you to use your device on their terms :)


Since the release of iOS 13 it has had multiple updates :)



I jailbroke my old iPhones but that was in an earlier less featureful iOS era. I wonder what hackers will be able to provide such that I'd do it on an SE. Curious now as much as I am skeptical.


All I've ever wanted is a mirrored CarPlay option with a cursor, for people who use a joystick instead of a touchscreen. I know this exists in the jailbreaking community (minus the cursor).


> We strongly urge all journalists, activists, and politicians to upgrade to an iPhone that was released in the past two years with an A12 or higher CPU.

This makes no sense. The data of these VIPs is not in (more) danger due to this new jailbreak appearing. It sounds like a cheap trick to make people buy new phones.


"checkm8 doesn't allow law enforcement to decrypt the phone, but it does allow them to rootkit it with 30 seconds of unattended access. Once it's unlocked by the user they'd get everything they need."

That sounds like something more than a little worrying to the listed groups of people, no?


So if the phone is out of your sight, reboot. And now you're clean again.


This will delight the one person in ten thousand who wants to jailbreak their own phone, and also the border police in Australia (mandatory phone scans on demand), or China, or stalkerware retailers, or repair shops who like to root around on customers' phones.

Guess which will be the more common use?


You can already assume that states are sitting on exploits that they've found or bought, and that they can compel companies to provide some form of access via NSLs or secret courts.


Only more advanced states.


Nah, anyone with a pile of money can buy exploits and hire professionals to discover them.

Kingdoms in the desert have had access to root certificates for almost a decade now.


Reboot your phone if you know you have given it to someone you don't trust.


This is a persistent compromise. That will not help.


No, it's exactly not that.


Even the title says "Tethered." It's far from persistent.


The title says tethered


If anything, this should be a boon to users: it allows them to fully use the devices they own. Honestly, it is inexcusable that Apple makes users hack their own devices. You should have the option, similar to enabling or disabling Secure Boot on your PC.


Are there potential disadvantages to “demotion” to enable JTAG? From what I understand the process is permanent (eFUSE?), but it seems like a fun thing to play around with.


Jailbreaking is software-based and doesn't actually void your warranty. Demotion will.


Oh, here's a business idea: build an exploit into the device, then when you've got something better/stronger/faster to sell, leak the exploit and let the press urge people to buy your latest.


As I see it, the effect of this is twofold. While it's bad for (at least some of) the iDevice users who carry sensitive data, it might also be just the thing that makes those users buy a new device. I guess that would be a "good" security issue in Apple's book.

And no, I'm not implying that Apple has designed this security flaw in order to sell more devices.




