Also, things around physical access: if you steal my laptop, FDE prevents you from getting my data immediately, but if you install malware that takes over the boot process, you get that data as soon as I type in my password.
If the process changes so the hardware only loads signed firmware, which only loads a signed boot loader, which only loads a signed kernel, etc., that avenue of attack is closed. It also makes it possible to trust a used computer.
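As a toy illustration of that hand-off (not any vendor's actual implementation; real Secure Boot verifies signatures against enrolled keys, while this sketch just pins hashes), each stage refuses to transfer control unless the next stage matches what it trusts:

```python
import hashlib

def digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()

# Hypothetical boot components; names are made up for illustration.
bootloader = b"signed bootloader"
kernel = b"signed kernel"

# Hashes the previous stage "trusts" - standing in for signature checks.
trusted = {
    "bootloader": digest(bootloader),
    "kernel": digest(kernel),
}

def boot(bl: bytes, k: bytes) -> str:
    # Firmware verifies the bootloader before running it...
    if digest(bl) != trusted["bootloader"]:
        return "halt: bootloader tampered"
    # ...and the bootloader verifies the kernel before running it.
    if digest(k) != trusted["kernel"]:
        return "halt: kernel tampered"
    return "booted"

assert boot(bootloader, kernel) == "booted"
# An evil-maid swap of any stage breaks the chain:
assert boot(b"malicious bootloader", kernel) == "halt: bootloader tampered"
```

The point of the chain structure is that there is no stage at which unverified code runs before the verified code, so there is nowhere for a bootkit to insert itself.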
The problem is that other than Apple nobody has really been committed to doing it well - it’s begrudging lowest-bidder compliance and clearly not something many vendors are taking pride in.
Secure Boot with factory keys has never prevented this attack, by design. You can take a valid, signed OS image from your favorite vendor (Microsoft, Red Hat, whatever), write some userspace code for it that asks for a passphrase and looks exactly like the legitimate passphrase prompt, and configure the boot order to boot into it. It will pass the Secure Boot checks because it is completely valid. Secure Boot, as configured by default, never had userspace verification as a design goal.
There are at least two solutions:
1. Deploy your own Secure Boot keys and protect them with a firmware password or whatever mechanism your particular system has to lock down Secure Boot settings.
2. Use TPM-based security so that even knowing the passphrase doesn’t unlock FDE unless the PCRs are correct.
#1 is a bit of a pain. #2 is a huge pain because getting PCR rules right is somewhere between miserable and impossible, especially if you don’t want to accidentally lock yourself out when you update firmware or your OS image.
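For concreteness, here's a rough sketch of what #1 and #2 can look like on a systemd-based Linux distro. `sbctl` and `systemd-cryptenroll` are just one possible toolset, the device path and PCR list are made-up examples, and all of this needs adapting to your own system:

```shell
# --- #1: replace the factory Secure Boot keys with your own ---
# (assumes the firmware has been put into Setup Mode and sbctl is installed)
sbctl create-keys                      # generate your own PK/KEK/db keys
sbctl enroll-keys --microsoft          # enroll them; keep MS keys for option ROMs
sbctl sign /boot/EFI/Linux/linux.efi   # sign the kernel/UKI you actually boot

# --- #2: bind the LUKS key slot to TPM PCR state ---
# /dev/nvme0n1p2 and the PCR set are illustrative, not a recommendation.
# With --tpm2-with-pin, even the correct PIN won't unseal the key if the
# PCRs (i.e. the measured boot chain) don't match what was enrolled.
systemd-cryptenroll /dev/nvme0n1p2 \
    --tpm2-device=auto \
    --tpm2-pcrs=0+2+4+7 \
    --tpm2-with-pin=yes
```

Which PCRs to bind to is exactly the hard part mentioned above: too few and tampering goes unnoticed, too many and routine firmware or kernel updates lock you out.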
Of course, people break PCR-based security on a somewhat regular basis, so maybe you want #1 and #2.
#2 is also something that a security expert needs to audit, so that booting an extracted stock recovery ISO (which has the kernel signed by the same keys as the real system) does NOT unlock the FDE.
First, you need a recovery image to be rejected by the TPM rules.
Second, you need an updated image that you prepare yourself, or that the distro prepares, etc., that respects your security goals (e.g. does not allow you to boot it and copy files off) to be accepted.
Maybe a mainstream distro could distribute a UKI that will unlock a disk and run that disk's userspace, with no safe mode, recovery mode, etc. available without a password, but I've never seen such a thing.
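The reason a stock recovery ISO can be made to fail the TPM rules even though its kernel carries a valid vendor signature is that PCRs record *measurements* (hashes) of what actually ran, not signature validity. A minimal sketch of the extend operation, assuming SHA-256 PCRs and made-up stage names:

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM PCR extend: new = SHA256(old || SHA256(data)).
    # A PCR can only be extended, never set directly, so the final value
    # commits to every stage measured during boot, in order.
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def boot_chain_pcr(stages) -> bytes:
    pcr = bytes(32)  # PCRs reset to zero at power-on
    for stage in stages:
        pcr = extend(pcr, stage)
    return pcr

real = boot_chain_pcr([b"firmware", b"bootloader", b"real kernel + initrd"])
rescue = boot_chain_pcr([b"firmware", b"bootloader", b"stock recovery ISO"])

# Both kernels can be validly signed by the same vendor key, but they
# measure differently, so an FDE key sealed against `real` will not
# unseal when the recovery image boots.
assert real != rescue
```

The flip side is the fragility mentioned earlier: any firmware or kernel update changes the measurements too, which is exactly why writing PCR policies that don't lock you out is so painful.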
Yes, hence my agreement with sillywalk. In my original comment I was thinking of the category of PC-like things which are more geared towards work and phones which are more limited.
Phones don’t have as much contrast because there are several vendors approaching Apple’s level of security, whereas on the classic PC side it’s just a mess. ChromeOS is an excellent addition to the comparison since it’s more locked down than a PC but still productive for many workers and really shows that the problem is coordination. Google cares about security and their ChromeOS devices are more secure than most PCs despite having a lot in common because they don’t leave it to the whims of the hardware vendor.
> It also makes it possible to trust a used computer.
Thankfully all this complexity is not the only thing that allows you to trust a used computer. There are other options, like not letting any modifiable software (that is, software not stored in non-replaceable ROM) run before handing control to a bootloader loaded from external media.
> Also things around physical access: if you steal my laptop, FDE prevents you from getting my data immediately but if you install malware which takes over the boot process, you get that data as soon as I type in my password.
There's still a simple attack vector: installing a hardware keylogger on the keyboard wires.
do folks in the business really simply steal a laptop and try to pull all the data? or do they steal the laptop and wipe it and flip it...
if they wanted your data wouldn't they steal you, the human, too?
the signing method only buys more time before the inevitable data "breach" by a threat actor - it's the same time-buying as any and all encryption. the system can get too complex, and the underlying problems of humans will always exist (and are amplified by more points of failure): accidents, data breaches, exploits, etc. the system needs to be immutable, but also mutable at the same time (for updates, etc.) - and that's not exactly something easy to accomplish.
and with apple.. they try, yes, but it is forever a walled garden. we've already seen their secure enclave bootloader shenanigans get exploited on phones - and it was not fun for the people whose phones were compromised. apple suffers from us humans, too (we will never be perfect, nor will our software)
> do folks in the business really simply steal a laptop and try to pull all the data? or do they steal the laptop and wipe it and flip it... if they wanted your data wouldn't they steal you, the human, too?
Governments definitely worry about it, and I’d be shocked if e.g. banks didn’t also put it into requirements. Access can be temporary, too: imagine if you get 15 minutes alone in someone’s office, or they have a kiosk in the lobby, etc. – not enough time to open the case up but plenty to toss a USB drive in and reboot. Repeat for lost devices, or scenarios like the KnowBe4 attack disclosed yesterday, where some dude might not be able to explain cracking the case open.
> the signing method only buys more time before the inevitable data "breach" by a threat actor - it's the same time-buying as any and all encryption.
You have to think about cost, too. It appears to be safe to buy a used Mac because Apple employs competent cryptographic engineers and very few targets are worth involving a lab with truly serious hardware. This could be the case on the PC side too, but it’s undercut by vendors skimping on execution and until Secure Boot is pervasive and robust, nobody can easily tell whether hardware they’ve lost control of can be trusted. People have been getting malware on used computers for years and a trusted boot process makes it easier both to tell if that’s happened and to be confident that you’ve fully wiped a system.
i only chose those questions to pick on the concept of "stealing a laptop" - it's more the hypothetical case where the majority of users, given "my laptop got stolen", will never see their system again. folks in the business of stealing laptops will resell them if they can - a laptop in a random car in SF.. sounds real profitable to try to decrypt 2 TB of AES-encrypted data for a cat pic ;) secure boot has not guaranteed a password to access the BIOS in my experience - and not all BIOSes are created equal. it just makes it harder for data on the drive to be accessed (and certainly prevents my neighbor from putting a rootkit in my bootloader)
of course govts worry about data loss - and implanted rootkits; yes, we want to prevent those, but my point is there are many steps along the path where the complexity can get out of hand, and every step added to a system is another potential point of failure - and anything we invent will be vulnerable to human mistakes/errors/etc. (like we've literally seen). the problem is the firmware is mutable, the os is mutable, etc. the signed stages are a band-aid (not that i'm smart enough to solve the problem) and it's a matter of time before something like a cert leak happens (again). it's funny too, because we worry about thousands of folks' computers having a rootkit (one that needs physical access, when things like my-pc-looks-tampered-with are not considered), and then we let location data be gathered by literally every company, hmmm
the scenario with 15 minutes alone in somebody's office (this made me laugh actually - there's a countless amount of what-ifs): a company with any kind of compliance should never let an untrusted person be alone (especially with access to a computer); a smaller company, surely, we'd assume would be less of a target, but that's no guarantee - which is also why no company should leave its vault of raw cash open for anyone to access.
as far as used systems go: folks will always fall victim to that which they do not know. for a newly owned computer, a user should be fresh-installing the firmware and OS. but convenience has folks trained to plug-and-play with 0 downtime, 0 setup, 0 knowledge of options. with apple, of course, that cannot be done on the same level as on my non-apple system. and from what i remember, apple folks need proof of receipt for a used sale, and even then can still get trolled on a used sale by the find-my-mac lockout - maybe it's improved nowadays; i'll simply pass and rather buy new (not that i'm supporting apple)
> if they wanted your data wouldn't they steal you, the human, too?
As chilling as it may be to explore this line of thought, I think there are real, pragmatic considerations that make 'stealing' humans along with laptops less than ideal. Laptops get damaged, lost, and so on all the time. A missing laptop raises some, but minimal, suspicion and attention. With a human missing, whoever did the deed will likely have a difficult time moving around, assuming the LEOs in the area are competent.
there's a literal market for cloning device data. you don't even have to steal the device.
in the 90s the Israeli company Cellebrite made millions on govt procurement of ...i forgot the acronyms. but basically devices where you plug in a phone and it copies all the contacts and messages.