
It could help to compare with other makers for a minute: if you need to repair your Surface Pro, you can easily remove the SSD from its tray, send the machine in, and stick the SSD back when the machine comes back repaired (or replaced).

And most laptops at this point have removable/exchangeable storage. Except for Apple.




> remove the SSD from the tray, send your machine and stick it back when it comes repaired

Apple has full-disk encryption backed by the Secure Enclave, so it's not bypassable.

Sure, their standard repair questionnaire asks for your password when you submit the machine.

But you don't have to give it to them. They will happily repair your machine without it because they can boot their hardware-test suite off an external device.


I get your point, but we can also agree that "send us your data, we can't access it anyway, right?" is a completely different proposition from physically removing the data.

In particular, if a flaw were ever revealed in the Secure Enclave or the encryption, it would be too late to act on it for machines that have been sent in over the years.

To be clear, I'm reacting to the "Apple is privacy focused" part. I wouldn't care if they snooped on the bank statements on my disk, but as a system I see them as behind what other players in the market are doing.


> if a flaw was ...

I hear the point you're making and I respect the angle, it's fair enough, but...

The trouble with venturing into what-if territory is the same applies to you...

What if the disk you took out was subjected to an evil-maid attack?

What if the crypto implementation used on the disk you took out was poor?

What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years ?

The trouble with IT security is that you have to trust someone and something. Even with open source, you're never going to sit and read the code (of the program AND its dependency tree), and even with open hardware you still need to trust all those parts you bought that were made in China, unless you're planning to open your own chip fab and motherboard plant.

It's the same with Let's Encrypt certs; every man and his dog is happy to use them these days. But there's still a lot of underlying trust going on there, no?

So all things considered, if you did a risk assessment, is trusting Apple reasonable? Most people would say yes.


> even with open-source, you're never going to sit and read the code (of the program AND its dependency tree)

You don't have to. The fact that it's possible for you to do so, and that there are many other people in the open source community able to do so and share their findings, already makes it much more trustworthy than any closed Apple product.


THIS!

Back when I was new to all of this, the idea of people evaluating their computing environment seemed crazy!

Who does that?

Almost nobody by percentage, but making sure any of us CAN is where the real value is.


Jia Tan has entered the chat.


I hope you bring that up as an example in favor of open source, as an example that open source works. In a closed-source situation it would either never have been detected, or the discovery would never have reached the light of day.


In a closed source situation people using a pseudonym don't just randomly approach a company and say "hey can I help out with that?"

It was caught by sheer luck and chance, at the last minute - the project explicitly didn't have a bunch of eyeballs looking at it and providing a crowd-sourced verification of what it does.

I am all for open source - everything I produce through my company to make client work easier is open, and I've contributed to dozens of third party packages.

But let's not pretend that it's a magical wand which fixes all issues related to software development - open source means anyone could audit the code. Not that anyone necessarily does.


> What if the disk you took out was subjected to an evil-maid attack ?

Well, have fun with my encrypted data. Then I get my laptop back, and it's either a) running the unmodified, signed, and encrypted system I set up before, or b) obviously tampered with to a comical degree.
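(A minimal sketch of that tamper-detection idea, for the curious. This is not how Apple's secure boot actually works, which uses public-key signatures verified in hardware; the sealing key and "boot image" bytes here are made up for illustration, using only Python's stdlib.)

```python
import hashlib
import hmac

# Toy illustration: seal a "boot image" with a keyed MAC before handing the
# machine over, then verify it on return. Real secure-boot chains use
# hardware-verified signatures; this key and image are hypothetical.
SEALING_KEY = b"key-held-by-owner-or-enclave"

def seal(image: bytes) -> bytes:
    """MAC the owner records before sending the machine in for repair."""
    return hmac.new(SEALING_KEY, image, hashlib.sha256).digest()

def verify(image: bytes, expected_mac: bytes) -> bool:
    """Check, once the machine comes back, that the image is unchanged."""
    return hmac.compare_digest(seal(image), expected_mac)

original = b"bootloader v1.0"
mac = seal(original)

assert verify(original, mac)                          # untouched image passes
assert not verify(b"bootloader v1.0 + implant", mac)  # tampering is detected
```

Anything short of "obviously tampered with" still fails the check, as long as the attacker never had the sealing key.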

> What if the crypto implementation used on the disk you took out was poor ?

I feel like that is 100x more likely to be a concern when you can't control disk cryptography in any meaningful way. The same question applies to literally every encryption scheme ever made, and if the feds blow a zero-day to crack my laptop, that's a victory through attrition in anyone's book.

> What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years ?

What if aliens did it?

Openness is a response to a desire for accountability, not perfect security (because that's foolish to expect from anyone, Apple or otherwise). People promote Linux and BSD-like models not because they cherry-pick every exploit like Microsoft and Apple do, but because deliberate backdoors have to survive a hostile environment. Small patches will be scrutinized line by line; large patches will be delayed until they are tested and verified by maintainers. Maybe my trust in the maintainers is misplaced, but no serious exploit developer is foolish enough to assume they'll never be found. They are publishing themselves to the world, irrevocably.


What if the disk could be removed, put inside a Thunderbolt enclosure, and used on another machine while waiting for the repair? That's what I did with my Framework.

Framework has demonstrated in more than one way that Apple's soldered/glued-in hardware strategy is not necessary.


> Apple has full-disk encryption backed by the secure enclave so its not by-passable.

Any claims about security of apple hardware or software are meaningless. If you actually need a secure device, apple is not an option.


> Any claims about security of apple hardware or software are meaningless. If you actually need a secure device, apple is not an option.

I don't think this is precise; the constraints seem a bit vague to me. What do you consider to be on the list of secure devices?


I'm not even here to troll; if you can give details on the list and why, that'd be awesome.


Seconded


It's also possible to say "nothing" and just leave it at that. A lot of people are desperate to defend Apple by looking at security from a relative perspective, but today's threats are so widespread that arguably Apple is both accomplice and adversary to many of them. Additionally, their security stance relies on publishing whitepapers that, to my knowledge, have never been independently verified, and on perpetuating a lack of software transparency on every platform they manage. Apple has also attempted to sue security researchers for enabling novel investigation of iOS and iPadOS, something Google is notably comfortable with on Android.

The fact that Apple refuses to let users bring their own keys, choose their disk encryption, and verify that they are secure makes their platforms no more "safe" than BitLocker, in a relative sense.


I do not believe I understand your comment.

Earlier, you mentioned people defending Apple security in a relative sense.

Later, you mentioned that Apple refusing to let users verify security makes them no more safe in a relative sense.

Are you just talking about Apple employing security by obscurity?

I just want to understand your point better, or confirm my take is reasonable.

And for anyone reading, for the record I suppose, I do not consider much of anything secure right now. And yes, there are degrees. Fair enough.

I take steps in my own life to manage risk, and keep anything that needs to be really secure and/or private off electronics, or at the least off networks.


Using fully open hardware and software, I guess?


So why the hell do they ask for it then?


> So why the hell do they ask for it then?

I suppose so they can do a boot test post-repair or something like that. I have only used their repair process maybe twice in my life, and both times I automatically said "no" and didn't bother asking why. :)

With Apple FDE, you get nowhere without the password; the boot process doesn't pass go. This catches people out when they reboot a headless Mac: the password prompt comes before boot, not after, even if the GUI experience makes you feel otherwise.


The counterpoint is wiping the device and restoring from local backups when it is returned.


You need to trust the erasure system, which is software. It also requires you to have write access to the disk whatever the fault is; otherwise your trust rests on the encryption and on nobody having the key.

That's good enough for most consumers, but it's a lot more sensitive for enterprises IMHO. It usually gets a pass by having the contractual relationship with the repair shop cover the risks, but I know some roles that don't get MacBooks for that reason alone.
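(For anyone unfamiliar with why "the encryption and nobody having the key" is the whole ballgame: here's a toy sketch of crypto-erasure. This is NOT real cryptography, just a stdlib keystream built from SHA-256 to show the principle; a real disk would use AES, and the key would live in a secure element.)

```python
import hashlib
import secrets

# Toy keystream cipher (illustration only - never use this for real data).
# The point: destroying the key is equivalent to erasing the disk, even if
# the repair shop has full read access to every ciphertext byte.

def keystream(key: bytes, n: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = secrets.token_bytes(32)        # held by the owner / secure element
plaintext = b"bank statements"
ciphertext = xor(plaintext, key)     # what actually sits on the disk

assert xor(ciphertext, key) == plaintext   # with the key: readable
key = None                                 # "crypto-erase": destroy the key
# Without the key, only ciphertext noise is recoverable from the disk.
assert ciphertext != plaintext
```

Which is exactly why the earlier objection matters: the scheme is only as good as the cipher, the key storage, and your inability to get the key back once it's gone.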



