Hacker News
Crouching T2, Hidden Danger (ironpeak.be)
226 points by xrayarx on Oct 7, 2020 | 110 comments



So to summarise:

- Requires physical device access.

- Cannot decrypt FileVault without you entering decryption key post-exploitation.

- Cannot access secrets in the Secure Enclave.

This isn't great, but it's not really a huge security issue? The main attack vector here is stolen Macs being re-sold as they can be "un-bricked".

I know part of the point of the T2 is to make physical access less interesting an attack vector, but it's typically doing so after a machine has been stolen. This makes it more attractive to do without stealing, but that's an attack style that is only really done by state level threats, and one that was still possible anyway, because hardware access is essentially game-over anyway at that point.

I feel like this is over-blown? Am I missing something? Is there another attack vector here that's worse for "regular users"?


> Requires physical device access.

For what it's worth, we're talking about mostly personal laptops, tablets, and smartphones here - not corporate servers. I wouldn't be so quick to dismiss the "physical access" requirement, although that does at least make it difficult for attackers to exploit the vulnerability at a large scale.


Sure, but as mentioned there are basically 2 versions: steal, or modify in place.

With the former, the device is already stolen. The only difference here is that the thief can re-sell it. That's not a huge win for the user who it was stolen from.

With the latter, you need ongoing physical access in order to install and maintain a presence on the device. That's a sophisticated attack, but sophisticated attackers already have ways of doing this.


> you need ongoing physical access in order to install and maintain a presence on the device

I don’t think that’s 100% true. Yes, you need persistent hardware to re-root the T2 on every reboot. But what about a one-and-done T2 attack, like a drive-by keylogger installer? Get credentials, then mount a more traditional attack to install an APT/RAT.


Yeah so the user should reboot if they believe someone has tampered with the device, such as after a TSA check.


Will that help for all cases, though? Wouldn't this allow someone to install a persistent keylogger that would survive reboots?

Also from my reading of that, the T2 chip doesn't reboot with the laptop -- it stays powered on at all times after it first starts up. If that's really the case, a reboot won't do much except kill non-persistent things that are running on macOS itself.


The article says you need a hardware device to maintain persistence and also recommends using Apple Configurator to “reinstall bridgeOS on your T2 chip”.

So, a) a T2 hack persists until such a reinstall, and b) you don’t need a persistent beachhead if you’ve taken the hill and maintain persistence there. This exploit opens the door to that, and you can install keyboard intercepts via the new System Extensions interface. Did it yesterday myself via an update to Karabiner. Notarization can be disabled by interrupting the network pathway, and you can definitely disable SIP after you capture the password. In essence, the T2 is the linchpin that prevents the defeat of other aspects of macOS security.


Or version 3: the powers that be ask you nicely to hand over your device. The Fourth and Fifth Amendments offer only very limited protection in the digital sphere, and current precedents on compelled decryption and access can easily be overridden by courts.


Yes, they mention that in their original post:

> This makes it more attractive to do without stealing, but that's an attack style that is only really done by state level threats, and one that was still possible anyway, because hardware access is essentially game-over anyway at that point.


Physically target the MacBooks of employees that have access to those servers, especially the enterprise device-management servers, and you can probably exploit this at a larger scale. You could do a pretty trivial amount of LinkedIn research to figure out your targets.


What I understood from previous articles is that this helps with brute-forcing passwords. Note that I do not know this for sure.

My thought was that the T2 chip is responsible for rate-limiting access to the Secure Enclave. Specifically, for rate-limiting attempts to derive a key from a user-supplied password and the secret in the Secure Enclave.

This allows for weaker passwords to be used, because brute-forcing requires the secure-enclave in the loop, and access to that enclave is rate-limited. Drop that rate-limiting, and the brute-force attack becomes a lot more viable.
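To make the stakes concrete, here's a back-of-the-envelope sketch of why keeping the enclave in the loop matters. The attempt rates below are illustrative assumptions, not measured T2 figures:

```python
# Rough comparison of brute-force feasibility with and without hardware
# rate limiting. All rates here are assumed for illustration only.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_exhaust(keyspace: int, attempts_per_second: float) -> float:
    """Worst-case time to try every candidate, in years."""
    return keyspace / attempts_per_second / SECONDS_PER_YEAR

# A weak 6-character lowercase-alphanumeric password: 36^6 candidates.
keyspace = 36 ** 6

# Hypothetical enclave-limited rate: ~12 attempts/sec, assuming each
# guess must round-trip through the hardware key-derivation step.
limited = years_to_exhaust(keyspace, 12)

# Hypothetical offline rate if the key material could instead be
# attacked on commodity GPU hardware: 1e9 guesses/sec.
offline = years_to_exhaust(keyspace, 1e9)

print(f"rate-limited: {limited:.1f} years")                    # years
print(f"offline:      {offline * SECONDS_PER_YEAR:.2f} seconds")  # seconds
```

Under those assumed numbers, even a weak password survives for years when every guess must go through the hardware, and falls in seconds when it doesn't — which is the whole argument for why removing the rate limit matters.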

As far as I know, this was at stake when the FBI wanted apple's help a while ago to decrypt an Iphone. However, I have no idea whether this is the same scheme used for MacOS devices. Nor do I know whether the T2 chip is actually responsible for rate-limiting.


I think the rate-limiting is done in hardware by the Secure Enclave itself. It cannot physically respond any faster than it does right now.


This is correct.


True that physical possession lowers the risk barrier considerably, but still this threat is problematic in a couple of ways.

a) An unattended laptop can be compromised in under five mins, all without any obvious indications. There are also loaner situations, at schools etc. where machines are shared.

b) no specialized hardware is necessary, so it's really cheap to execute.

c) this is probably the worst part - power cycling the laptop does not reset the T2 ("The T2 chip is fully booted and stays on, even if your Mac device is shutdown.") - so there's no practical way to ensure that your machine is not compromised after a brief interlude. For extra fun, if it turns out the attack can be executed by a malicious wall port, then things start to get really interesting.


Every time an Apple device is taken from you at the US border, the border patrol could use this vector to capture your password later and send it over the internet without your knowledge. Before, you could at least assume they couldn't infect you. You aren't safe until you reset the T2 — every time someone takes it from you.


But the same is true for any laptop - if someone untrustworthy takes your laptop and you’re at that level of concern then your next step is necessarily erasing the machine.


Since this is a tethered exploit, simply rebooting the device should prevent that.

They could also install a hardware keylogger, so a non-exploitable T2 doesn't really help here in the general case. We just have to hope that the TSA aren't at the level of your average state actor.


rebooting the mac doesn't reboot the T2


Right on the spot, but this also raises an ethical question! Let's say you identified a vulnerability or a hack, or found something that would nullify a product's security. What should the first step be?

1) Send it to Apple (or the respective company), and if they don't respond after follow-ups, publish it on the internet.

2) Publish it on the internet right away so it gains attention and forces the aforesaid company to do something.

In some cases like this one (requiring physical access) either is fine, but what if someone uncovers a potential remote hack for every Android or Apple phone out there? Which option would be ethical?


The right thing to do is to follow responsible disclosure:

https://www.bugcrowd.com/resource/what-is-responsible-disclo...


I'd say it depends on what that "security" is. If it is remotely exploitable and can harm user freedom, tell the companies quietly. On the other hand, if it can enable user freedom, keep it to yourself and those who you trust to be on your side.


Regular users don't care about security - that's just a matter of fact.

As for security for "other users" - users with high security requirements can be targeted, even with a locked system.

As an example: if someone messes with my laptop while I'm away, the TPM check will fail and Windows will demand I re-enter my BitLocker key on startup. If you can suppress that, that's a big issue.


> Regular users don't care about security

They don't care enough to be inconvenienced for it, but they do care when their identity is stolen, when they are blackmailed with photos, etc.

This is what's good about the T2, it secures those sorts of things for regular users in a very convenient way. This vulnerability does not appear to really change that for regular users (although it likely does for orgs with high security requirements, i.e. those considering an APT part of their threat model).


When you send your Mac to Apple Support, the assumption is that no technician can peep at your data, since the T2 provides encrypted storage on the SSD. With this loophole, your data is wide open, e.g. when your Mac dies before you can wipe it. Apple will often replace the whole Mac for you, including a new SSD. Those who skip FileVault might want to reconsider.


It's a good idea to assume that if physical access is compromised, you are screwed. This just confirms this continues to be the case.

Even without this, many people could be owned just by installing a key logger on their external keyboard which would likely be easier to access and hack than the T2/ laptop.


I think to persist the attack you need a persistent hardware device attached so a reboot doesn't reset things?

The thing is, can't you just stick a keylogger onto the machine at this point and get an equivalent compromise? I log your login and then use that to authenticate?


I agree with you. Question for the group, since hardware security isn't my area: will Apple be able to fix this vulnerability in a T2.1 or T3? Gives me a new reason for yet another MBP, I guess (just purchased the 16").


T2 is basically a bridge (EDIT: as others have pointed out, by this term I meant a temporary fix for non-ARM Macs) between Intel Macs and ARM Macs, allowing macOS 11 to work on both platforms with feature parity in things such as Siri.

It's probably a good idea to get an ARM Mac as your next MacBook Pro, because the T2 chip doesn't even need to be there anymore.


That’s not at all what the T2 is. It’s a multi-purpose processor that handles some security related services (like the Secure Enclave), acts as the SSD controller, handles the camera light, touchID, and other stuff.

It’s not there as a ‘bridge’ - plenty of macs without T2 processors run macOS 11 and Siri with full feature parity, and neither of those things relies on the T2.


What the GP is saying is that the T2 is effectively “the processor” of Macs it’s in, in the sense that it’s performing the function of a System Management Controller, managing early boot, etc. The Intel CPU in these systems is reduced to being an “application processor”—much like in ARM systems where boot and IO are handled through the GPU.

(This is why you can’t install OSes other than macOS on these devices. It’s not because of DRM or anything—though that’s an aspect of it. Rather, it’s because the T2 is in control of boot, and no other OS has had a second-stage bootloader written for it that will load onto the T2 chip to do hardware bringup from that side.)

Maybe “bridge” is the wrong word here—though I can see the meaning. A precise term, I think, would be “polyfill.” The T2 is the place where Apple is putting 1. its IP-core customizations that it can’t just plop into the Intel CPU, because it doesn’t control the Intel CPU; and 2. its top-level “BIOS-alike”, allowing them to bring up the hardware in a way they control, rather than having to convince the Intel CPU to do it in a way it understands.

The T2 and the Intel CPU acting together, roughly simulates an architecture akin to that of an ARM Mac (as with any other polyfill, where the old runtime and the polyfill acting together roughly simulate the capabilities of the new runtime.)

The interesting thing about this, is that it’s likely that because the polyfill roughly simulates the environment of an ARM mac (where these IP cores are accessible directly through the CPU, and where hardware bringup is always done in Apple’s preferred fashion without having to trick the Intel CPU into it), the drivers written for Intel-plus-T2 macOS, are also likely able to be run with only small changes, directly on ARM macOS.

That’s why the GP is calling the T2 a “bridge”: it’s a half-step across the chasm between the architectures; an intermediate architectural target that allows some but not all of the work of developing ARM macOS to be done before any Apple Silicon ARM platforms like the A14Z were actually available to be developed on.


Linux installs[1] and runs on T2 Macs. The experience varies depending on the model, but it mostly works. The only thing likely to never be supported is Wi-Fi, which is due to the lack of compatible firmware rather than anything to do with the T2 chip.

[1]: https://github.com/Dunedan/mbp-2016-linux


Thanks for articulating my viewpoint better than I could. Also, “bridgeOS” is the name of the operating system that runs on the T2.


> handles the camera light

That’s not true. The camera use indicator light is hardwired into the camera circuitry and turns on when the sensor is powered up. There is no “software” involved in turning on the camera use indicator light.


Danke - that was enlightening. Would never have known those - great quick analysis.


It's probably not something an average joe will have to worry about, but it'll definitely be interesting for state agents, secret services, etc.


It's the same Checkm8 exploit that was hyped up as the end of the world for iPhones late last year (before it turned out that it was really only useful to allow sideloading software on your own device).

https://arstechnica.com/information-technology/2019/09/devel...


This is a fairly uncharitable description of the exploit.


It just means that including the T2 chip adds locks that only half-work against criminals, but that fully work against casual users.


It's a huge security issue. Physical device access is a realistic scenario: stolen phones, people incarcerated, etc.

How difficult would it be to steal someone's phone, put illegal stuff on it, and call the police? Could you figure it out for 10k USD?


I'm continually reminded that an HN "HUGE!!" security issue is so rarely that.

I can think of so many other, much more significant security issues: remote root access to Windows machines, potentially intentionally flawed VPN security, management engines on Intel machines that are remotely exploitable and provide persistent access to the machine — and the list goes on and on.

There is something about Apple that makes folks blow up. Perhaps it's just they are one of the few with a bit of a reasonable reputation here.

Steal someone's phone? You realize iPhones from the XS forward are using A12+ chips; they don't even have the Intel/T2 combo being talked about, as far as I know. And then to get access you need to install a keylogger PHYSICALLY into the MacBook. All this is "possible", but you can probably plant a keylogger without breaking the T2 and get equivalent access at the end of the day.


I think a lot of people are looking for any reason to put Apple down. I'm guessing it's both because Apple and its users love to brag about security and because HN users hate the cost that you pay for that security: a device locked down from its own owner.


That is true: Apple is locking everyone out, including the person who paid for it. But that does make the iPhone relatively better from a security standpoint.

The overseas Android market is a security wasteland. No one even pretends; they don't even keep the phones updated.


> It's a huge security issue.

Which exists on likely 99% of people's setups, Mac, Windows, whatever. Keyloggers on external keyboards, keyboard hacks, bios hacks...

> Stolen phones, people incarcerated

This hack doesn't decrypt the drive, so it won't help if you just steal someone's laptop. You have to get physical access, hack the T2, then get it back into their hands so they can put their password in to decrypt the drive.

> How difficult would it be to steal someone's phone, but illegal stuff on it...

This hack affects MacBooks and Macs with the T2 security chip installed, not phones.


Buy a burner phone, put illegal stuff on it, stick it in their gym bag, call police. Costs 1% of 10k.


You can do exactly the same on a machine without a T2 chip though, so I’m not sure what makes this special vs any other time you give someone untrusted physical access to a device


It's important to note that the checkra1n team have vehemently criticized this article for inaccuracy.

https://twitter.com/axi0mX/status/1313665047768391680

https://twitter.com/su_rickmark/status/1313733383453732864


Does anyone else see only two replies for the first tweet? It says "THREAD" but then I don't see the thread anywhere. Twitter is terrible.


Nothing very interesting, but there are three replies I can see. You have to click on the body of the tweet to expand the replies, whereas clicking on the comments button starts your own reply.

Yeah, Twitter is terrible.


I highly recommend this extension to remove all the annoying parts of Twitter, including the unnecessary extra click when you open a tweet from elsewhere.

https://github.com/insin/tweak-new-twitter/


Ah, thank you!


The thread is in the body of the tweet. Here is the start: https://twitter.com/axi0mX/status/1313620262768635904


Claiming the T2 runs “SepOS” is a red flag right out of the box.


The first bit of stupid I noticed was “it’s ROM for security”.

Sigh


While as always, I love reading about hacking efforts on every front, this part:

>> This could be used to e.g. circumvent activation lock, allowing stolen iPhones or macOS devices to be reset and sold on the black market.

Is making me very sad. As someone who had things stolen in the past, thieves are the scum of the earth - I was hoping that the implementation of lock on apple devices would be a major roadblock to theft.


If you report the phone stolen to your cell phone carrier, they can usually add it to the blacklist.


Which doesn't really matter because carriers don't share those blacklists internationally - so stolen devices just end up in another country. Also, I believe this was specifically about MacBooks?


They don't even block them regionally. Across the two countries where I've tested the "shared blacklist", there's ALWAYS one ISP who doesn't bother implementing the sync.

There are also times when IMEIs are added to the list by accident, or duplicate IMEIs are issued for the same device (common with cheap phones). I can't imagine going through the unblocking process internationally.


Isn't Activation Lock still enforced server-side?

During initial setup the iPhone checks in with Apple (presumably providing its serial number, etc) and gets some client certificates back for things like iMessage, push notifications, etc.

So even if you circumvent activation lock client-side and get to the phone's home screen, you shouldn't be able to really use it for much (beyond a simple phone and web browser).


But this is specifically about the MacBooks, no? Meaning that if you can get the machine to boot, it will work as normal. Maybe the App Store won't work, but that shouldn't have much impact on the usability of the machine, right?


You wouldn't get updates and would probably have to blackhole all of the Apple domains. That may be enough to make it unresellable (or at least allow the person who bought it to file something with eBay or whoever to get a refund).


Well, if the activation lock is still enforced server-side, you also lose iMessage, FaceTime, Continuity between Apple devices like your iPhone, and maybe Xcode if it checks for activation during install. If you could get the MacBook to boot it would technically be a usable PC, but it would probably lose a bunch of what makes a MacBook special to begin with.


It is one of the manipulation techniques used to convince people that they should trade the freedom to use their devices however they want for a notion of security. Apple is known for opposing right-to-repair initiatives and would prefer that faulty devices be destroyed rather than repaired.


It's an interesting balance though. On one hand, I don't want a thief (or anyone) to be able to use my device after it's stolen; it should be literally impossible for anyone to unlock that device, ever. On the other hand, I'm not happy with Apple positioning themselves where they are, aggressively going after independent repair shops and not selling spare parts. Can we have it both ways? I don't know.


Apple have the keys that let them control hardware pairing and firmware updates, you can have those same keys in an open ecosystem with open hardware where you can be in control of those keys. You can absolutely have it both ways.


How do you believe that would work? Do you think the average person is capable of operating a robust PKI management system? I want to like the idea, but the most likely outcome is people keeping the keys on the very device that was stolen (because they don't have redundant storage in a separate location), a single compromise handing someone persistent control of all of your devices, and a spike in the kind of social-engineering attacks we saw in the desktop space for decades, where someone convinces the user to compromise their own security. 99.99% of people using computers do not have the experience needed to understand the security decisions they're being asked to make.


I think people should accept the fact of life that stolen goods will be used one way or another anyway. A good compromise is to make sure a phone reset wipes all data irrecoverably. Rather than fighting the effect, maybe we should address why people steal and why the government is so ineffective at catching them (but seems to be doing just fine catching students relaxing with the wrong kind of plant).


Now this poses the question: if they made such security claims regarding the T2 chip, and the T2 is effectively insecure and unpatchable, are people who have devices with the Apple T2 entitled to a refund, or at least compensation?

Didn't this become false advertising?


> Didn't this become false advertising?

I don't think this counts as false advertising. "False advertising" is the colloquial phrase but the actual law uses terms like "misleading" or "deceptive":

> The FTC Act prohibits unfair or deceptive advertising in any medium. That is, advertising must tell the truth and not mislead consumers. A claim can be misleading if relevant information is left out or if the claim implies something that's not true[1].

So maybe if Apple had made some very strong statement to the tune of "This T2 chip will absolutely never ever be hacked; it's impossible", that might be misleading. I suspect the Apple legal team is a little more clever than that, though.

General marketing fluff like "This car is good, you should buy this car" doesn't count as false advertising if your car breaks down. I think this falls into that category.

[1] https://www.ftc.gov/tips-advice/business-center/guidance/adv...


Point me at some of this 'false advertising' then, please?

No-one (at least, no-one reputable) claims that anything is 100% secure. You'll only hear that a product feature makes it 'more secure'. And that's still (slightly) true, since you could reasonably argue:

1) Even with an exploit, hackers have to do some extra work to bypass the feature.

2) The product was more secure for a time, until the exploit was discovered.


>It is one of the manipulation techniques used to convince people they should trade freedom to use their devices however they want for notion of security.

No, it absolutely is not. It is a 100% genuine trade off, and you are being deceptive out of your own personal interests if you lie and claim it isn't. The ability to do anything necessarily means the ability to do bad things, and the more power is available the more work, knowledge, and metaknowledge is needed to both make full use of it and avoid pitfalls.

Software lockdowns prevent power users like the typical HNer from doing useful and valuable things. But they also help ensure that non-power users (who, remember, may be anything from doctors to engineers to diplomats to farmers — brilliant experts in their own fields, key to society, just not in computers) cannot even be social-engineered into getting themselves too deep in trouble. And this is obviously, objectively a real problem, and one that, frankly, the tech community brought in part on itself with constant "the user is at fault, the user is stupid" stuff for decades. But why SHOULDN'T users just be able to easily go find anything they think looks interesting and give it a whirl, and have a reasonable expectation that it'll meet certain isolation/privacy/payment standards, and if not, that it'll get automatically removed and the developer banned? And even for power users, lockdowns can help shift the power balance away from developers and pool user buying power through a single point, so that devs are forced to obey certain basic standards whether they like it or not.

Of course, software lockdowns also destroy valuable innovation, and they create a single point of pressure that is in turn prone to abuse or (likely worse in theory and more prevalent in practice) pressure from even more powerful entities. The likes of China, the EU, or if things go bad enough the US can force censorship onto a vast array of people and devices via iOS in a way that they can't with traditional systems.

On the hardware side, lockdowns make repair more difficult and centralized. But they also prevent hardware hacks, and they have proven to make theft vastly less economical.

IMO, I'd like to see a single-point, buy-time option for an owner to load their own root keys for software, hardware, both, or neither. Personally, I'd go for software and not hardware: the devices are very reliable, I'm not a hardware hacker myself, and in my own threat model I'd prefer to run the risk of buying replacement kit rather than even think about hardware subversion. For my parents and grandparents I'd strongly push them to continue with "neither", which should be the default. I'm sure many on HN would like both. Maybe some (particularly in countries where official repairs are much harder) would even prefer to keep the software side of things locked down against malware but, as a practical matter, allow repairs.

But it's intensely frustrating to still see techies who apparently never dealt with family support or Help Desk or whatever in their lives blithely repeat 1990s/early-00s memes about PEBKAC/PICNIC/lusers etc. The BOFH was a ton of fun to read, but its real-world application has limits. I don't feel bad about not being skilled at small-engine work, or pharmaceuticals, or whatever.


I think you have not explained why you think lock-ins are good, apart from saying that they may prevent users from being subjected to something bad — i.e. the boogeyman invoked by companies that try to thwart the right to repair. In other words: "you cannot repair your own device, for your own good — now give us more of your money".


Indeed, it is always interesting to see these articles that never mention the positive side of these exploits, i.e. enabling freedom. I guess it shouldn't be that much of a surprise, given the authoritarian nature of the security industry (and most big corporations in general, including Apple.)


I had the opposite reaction. I thought -- great! We can return all this hardware that people have irresponsibly abandoned, and bring it back to successful reuse.


This was going to happen; the question was whether or not it would happen while the device was still popular.

IMO you're better off just using an encrypted filesystem and typing in a passphrase on boot than dealing with hardware this locked down.


That is exactly how FileVault 2 functions. It also adds defense-in-depth to the T2 hardware encryption.
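For anyone curious what "deriving a key from a passphrase" looks like in general, here is a minimal stdlib sketch using PBKDF2. The KDF choice, salt size, and iteration count are illustrative assumptions, not Apple's actual FileVault parameters:

```python
import hashlib
import os

# Generic sketch of passphrase-based key derivation, the idea behind
# FileVault-style disk encryption. Parameters here are illustrative.

def derive_volume_key(passphrase: str, salt: bytes,
                      iterations: int = 200_000) -> bytes:
    # A deliberately slow KDF makes each password guess expensive,
    # which is what hardware rate limiting complements.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)          # stored alongside the encrypted volume
key = derive_volume_key("correct horse battery staple", salt)
assert len(key) == 32          # 256-bit key for the volume cipher
```

The key never leaves the machine; it is recomputed from the passphrase at each unlock, which is why an attacker who can't observe the passphrase gets nothing from the disk alone.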


The important part about this is that even if 100% true, it's only exploitable with physical access or a compromised USB device of some kind—it's not remotely exploitable. (Exploits also would not last beyond a clean boot.)

So while the headline is strictly true based on the contents of the article, I would call it excessively alarmist, at best. This is not something that the average user ever needs to worry about.


> Good news is that if you are using FileVault2 as disk encryption, they do not have access to your data on disk immediately. They can however inject a keylogger

Overall I’m pretty happy here. A stolen laptop can't give up my secrets, either from the Secure Enclave or from disk, and this exploit prevents us from amassing iBricks because the hardware can always be recovered.


Previous discussion when the vulnerability got published a week ago: https://news.ycombinator.com/item?id=24636166


Are equivalent exploits possible on commodity/business PC laptops? i.e. do normal laptops have an always-on SMC these days which could potentially be exploited into a keylogger? This is a terrifying concept.


And the real meaning of this for actual people:

1) DRM on these machines is dead (yay).

2) "Anti-theft" features, which are really there just as an excuse to put a dedicated DRM chip into your box, are dead.

3) Physical access to your machine by a sophisticated attacker is game over (unless your data is encrypted and at rest).

I hope #3 isn't news to you, despite having an Apple logo on your machine (or Phone).


Media DRM isn’t dead, as it runs in the Intel IME (DRM is the only part of IME Apple haven’t disabled)


Nay! 'tis merely a flesh wound.


That's why I GPG really important stuff.


It could still run a firmware-level keylogger; that's the same technique it would use to bypass FileVault.


If "security" is just Apple marketing, can we safely say "privacy" is just marketing too?

Maybe today things seem fine, but with declining sales, desperate companies are likely to do whatever it takes to make money. Doing something anti-consumer is not new to Apple's core philosophy.


There was PRISM. Whenever people talk about Apple and privacy I wonder if they just... forgot?

With PRISM the NSA can reach directly into the live servers of American companies and get at the content, not merely the metadata, of conversations.

https://www.google.co.in/amp/s/amp.theguardian.com/world/201...


Probably. Apple aren't particularly pro-consumer, so why would they actually care about privacy enough to hurt their bottom line.

I'm sure they do care a bit, but it conveniently differentiates them and locks their customers in.

Hence, regulation. Lay down the rules, let the free market fight to the bone over who does it best.


Obligatory xkcd:

https://xkcd.com/538/

Seriously, if you're under the attention of "state actors" you need to worry about this. Otherwise, not so sure.


This impacts anyone who might have their iPhones stolen since ordinary thieves care too.


Actually this is specific to laptops.

Also, it's still better than Android phones, where, last time I checked, there was a software-based method to unlock.

Looks to me like Apple are at least trying.


How many times does this have to happen before everyone realizes that black-box hardware that runs proprietary software that has someone else's public keys hardcoded into it isn't really a "security chip"?

T2 chip is user-hostile. Hardcoded, non-overwritable trusted public keys are user-hostile. Simple stuff, really.
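To illustrate the structural complaint, here's a toy sketch of a boot-time trust anchor. Real secure-boot chains verify an RSA/ECDSA signature against a public key fused into the boot ROM; this sketch substitutes an HMAC purely to stay stdlib-only, and every name in it is hypothetical:

```python
import hashlib
import hmac

# Toy model of a boot-ROM trust anchor. Real chips check an asymmetric
# signature against a key fused into silicon; HMAC stands in here.

VENDOR_KEY = b"baked-into-mask-rom-at-fabrication"  # owner cannot replace this

def boot(firmware: bytes, tag: bytes) -> str:
    expected = hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()
    if hmac.compare_digest(expected, tag):
        return "boot: firmware accepted"
    return "halt: untrusted firmware"  # owner-built firmware lands here

official = b"vendor firmware image"
official_tag = hmac.new(VENDOR_KEY, official, hashlib.sha256).digest()
print(boot(official, official_tag))              # boot: firmware accepted
print(boot(b"owner-built image", official_tag))  # halt: untrusted firmware
```

The point being made above: `VENDOR_KEY` is fixed at fabrication, so only firmware blessed by the vendor ever boots, and there is no slot for the owner's own key.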


I genuinely don't understand this comment. A hacked T2 is currently less secure than it was a few weeks ago, but I think still more secure than a machine with no T2 at all, no?

That is, so far nobody has leveraged this to hack FileVault or SecureEnclave passwords generally, and until then, this is still marginally more secure than no T2 at all.

Once the above is no longer true, then a MacBook with T2 will be... exactly as secure as a machine with no T2 at all, right?

I guess I'm not seeing how this makes a MacBook less secure than alternatives. What am I missing?


It makes it less secure because it gives Apple more access to your hardware (I'd call that a backdoor with extra steps) than it gives you. Now, with this vulnerability, you get to have equal access.

But then again, I don't quite understand what problem the T2 solves that isn't solved by simply encrypting the file system with FileVault.


Do you mean that you believe Apple has active access to my hardware? Or do you mean that Apple has put a feature in my hardware that neither I nor anyone else can explicitly access without this hack?

Because I believe the latter is true, not the former, but I believe nearly every manufacturer of complicated electronics has put things into their products that I personally cannot explicitly access, so it doesn't bother me as much as it might bother you.


I mean that ordinarily, as intended by design, Apple has the private key they can use to sign the T2 firmware. You don't have the ability to run any code on the T2 whatsoever. And considering that it sits in the middle of most system buses and is more privileged than the main CPU, effectively controlling the entire thing at all times...


Annoys me to no end that people see this as security. The only security this provides is financial security for Apple and whoever pays them for using the DRM on this thing.


Yeah. Apple's "security" basically almost always boils down to "please trust us that we implemented this the way we claim we did", with many things trusting Apple servers unconditionally — like iMessage "end-to-end encryption" where an Apple server manages keys. Such end-to-end, much wow. But FileVault was at least reverse engineered enough to prove that it trusts the end user more than Apple.


This is infuriating. I spent over $5000 on a 16-inch MBP just this January.

If I'm a lawyer, CEO, or a human rights journalist (or just anyone) who professionally needs a reasonably secure device as the normal expectation, how can it be reasonable to be required to have your laptop with you at all times in order to maintain its security?

Is there precedent in consumer law that if security integrity of hardware is a normal feature of that product category and a computer model is fundamentally unfixable in this aspect, then you have the right to demand a refund or a replacement with a model not containing the same defect? (I know that this depends on your country. My country has strong consumer law.)

It's interesting to think about where the line is there. If someone really wants to compromise your device, they could open it up and plant a bug anyway. But this feels over the line, and grounds for calling it a manufacturer hardware fault, because attacking it would not require physically modifying the device but merely using it in the manner it already came from the manufacturer.


Man - Apple has the worst customers.

Your FileVault password has not been compromised by any T2 issue.

The secure enclave is rate limited.

If you are such a valuable target, then the keylogger needed to get your credentials can be installed with or without T2 issues. Once "they" have that, they can decrypt your drive.

The number of folks who are targeted at this level with physical direct access is relatively small. Even for state actors, REMOTE compromise is MUCH more appealing.
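The rate-limiting point above can be sketched with a toy model. This is purely illustrative: the escalating-delay schedule below is hypothetical, not Apple's actual Secure Enclave policy.

```python
# Toy model of escalating delays between failed passcode attempts.
# The schedule is hypothetical, chosen only to show the shape of the defense.
DELAYS = [0, 0, 0, 0, 0, 60, 300, 900, 3600]  # seconds before next attempt


def delay_for_attempt(failed_attempts: int) -> int:
    """Enforced wait before the next guess, given prior failures."""
    return DELAYS[min(failed_attempts, len(DELAYS) - 1)]


def brute_force_seconds(pin_space: int = 10_000) -> int:
    """Total wait time to try every 4-digit PIN under this schedule."""
    return sum(delay_for_attempt(i) for i in range(pin_space))
```

Even with this modest schedule, exhausting a 4-digit PIN space costs over a year of enforced waiting, which is why a rate-limited enclave changes the economics of a stolen device.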


Hmm, perhaps you should buy one of the competing cheaper laptops that have no built-in encryption then?


Except that lots of laptops have a hardware-based root of trust, encryption, and security management using industry standards.

It's Apple that does something exotic, breaking industry standards, and then calls what should be standard by some new name for marketing reasons. Same with how lots of people think a "retina" display is some wonderful Apple invention and not just a standard Samsung panel.


Having used many non-Apple laptops, "industry standard" means complete and unmitigated garbage to me. "Industry standard" laptops are awful to use and only get better once you approach Apple's price range - at which point why give up your Mac?

No, really, does anyone actually enjoy using cheap laptops? The only use case I think they would excel at is as a thin client for VMWare/Cisco virtualization solutions, or as a barebones terminal for Linux distros (at which point anything with a keyboard works for you).

Anyways, if laptops do security the same way laptops do touchpads, I would not be excited to depend on that. At all.


You mean something like the Intel TPM chip which was hacked last year?


Fury is going a bit far. I've always been taught that it is fundamental: if a device with secure data is not in a secure location, you have already lost. That's why the VA gets in such trouble for losing their laptops, no matter how well encrypted.

It would be completely irresponsible not to physically secure your hardware if you were in a position of trust.


I feel for you, but your expectations are out of whack with reality. Any Windows laptop, properly maintained, is a reasonably secure device.

You can "restore" "reasonable" security to your Mac, even in the almost unthinkable light of a possible actually available exploit that can reasonably be expected to affect you personally, by using a strong FileVault password. Maybe you want to add a tripwire (file integrity) check at boot time, or a manual check when you mount any drive.
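A minimal sketch of the tripwire idea, assuming you keep a SHA-256 manifest of the files you care about and re-check it at boot (the manifest handling here is illustrative only):

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def build_manifest(paths: list[Path]) -> dict[str, str]:
    """Snapshot the current digests of the watched files."""
    return {str(p): sha256_of(p) for p in paths}


def verify(manifest: dict[str, str]) -> list[str]:
    """Return paths that are missing or whose contents changed."""
    return [
        p for p, digest in manifest.items()
        if not Path(p).exists() or sha256_of(Path(p)) != digest
    ]
```

A real tripwire (like the tool of that name) also protects the manifest itself, e.g. by signing it or keeping it off the machine; an attacker who can rewrite both the file and the manifest defeats this check.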

No, the precedent you ask for does not exist. In fact, the opposite is true.


Actually, in my country the consumer culture is extremely different from America's. Consumers are not left high and dry if a manufacturer screws them over or was incompetent (such as design defects). Remedies are on sliding scales commensurate with the situation.

This is probably partially why I'm getting downvoted. Cultural differences. Americans are not aware of what's possible when things are actually fair for the consumer. They're used to 'tough luck' culture.

Upon further reading, I'm concluding this might not be a massive problem with other precautions in place, but the valid discussion point still remains. If a manufacturer designs a product which turns out to have a problem that breaches the consumer's reasonable expectations of its usability, and it needs repairing / recalling / replacing / refunding, many countries offer recourse to the consumer. Under this principle, I wonder about unpatchable hardware security defects which cause a major problem... it needs to be explored more.


Apple never sold you an unhackable laptop. It isn't cultural differences, it's simply that you weren't lied to and your hardware didn't stop being "fit for purpose".

From their ad copy -

"Every MacBook Pro is equipped with the Apple T2 Security Chip — our second‑generation custom Mac silicon designed to make everything you do even more secure. It includes a Secure Enclave coprocessor that powers Touch ID and provides the foundation for secure boot and encrypted storage capabilities. It also consolidates many discrete controllers, including the system management controller, audio controller, and SSD controller, into one."


Firstly, let's make it clear that we are now talking about broad concepts and not necessarily how it applies to the example of this situation at hand.

Under many jurisdictions' consumer laws, advertised features or promises by the manufacturer are not everything that they are legally held to. There is also statutory warranty, and other parts of consumer law, which can include rules on basic expectations of how that category of consumer item is expected to perform (I'm not talking CPU speeds, but major issues like a keyboard fundamentally not working at a reasonable success rate), how long it's reasonably expected to work without failing (for that category of item), and so on.

Very broad principles, but with some clear examples provided by consumer bodies to consumers, and it's reviewed on a case by case basis. You can bring it to the proverbial small claims court (or consumer complaint body), and they can review the claim.

I suppose I just shouldn't bring this matter up on HN again. It's too alien to the US consumer situation and doesn't apply to most readers here.


Consider off-device key storage.

Any decent smartcard has physical security no worse than the T2, but it will probably cost 100 times less, and it will at least allow you to choose a long enough password instead of a 4-digit PIN.


>Is there precedent in consumer law that if basic hardware security is a normal feature of that product category and a computer model has fundamentally unfixable hardware in this aspect, then you have the right to demand a refund or a replacement with a model not containing the same defect?

IANAL. You are.



