It's a tragedy no nation should face alone. There should be no "looking bad" here, just opportunities to help. Why not embrace that and do whatever you can? Any city would face horrible losses in this situation. Heck, Miami lost half a building on a normal day...
China's exit, as happened 70-80 years ago, would force us to do more of the work ourselves. That might not be such a bad thing, though I doubt it will happen to the same degree. It's still fresh in everyone's mind.
Case by case. Generally, if they commit to no violence then they should be supported, even protected by police. Uncomfortable conversations should be allowed to happen.
It wasn’t device scanning. A fingerprint of every photo was uploaded to the cloud as metadata for each photo. It’d be like being spooked because the title of each photo is uploaded to the cloud and checked against known CSAM titles.
And to make things more secure, the only way Apple could read those fingerprints was if you uploaded five CSAM photos.
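For the curious, the "five photos" bit refers to threshold secret sharing: each flagged photo's voucher carries one share of a key, and fewer shares than the threshold reveal nothing. Here's a toy Shamir-style sketch in Swift; it is not Apple's actual construction (which wrapped the shares in cryptographic vouchers), and the field size, threshold, and names are all illustrative:

```swift
import Foundation

// Toy Shamir-style threshold sharing over a small prime field.
// Illustrative only: real schemes use large fields and secure randomness.
let p = 2_147_483_647  // Mersenne prime 2^31 - 1, fine for a demo

func modPow(_ base: Int, _ exp: Int, _ m: Int) -> Int {
    var result = 1, b = base % m, e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % m }
        b = b * b % m
        e >>= 1
    }
    return result
}

// Modular inverse via Fermat's little theorem (m is prime).
func modInv(_ a: Int, _ m: Int) -> Int { modPow(a, m - 2, m) }

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    // Random polynomial f with f(0) = secret and degree threshold - 1.
    let coeffs = [secret] + (1..<threshold).map { _ in Int.random(in: 1..<p) }
    return (1...count).map { x in
        var y = 0, xPow = 1
        for c in coeffs {
            y = (y + c * xPow) % p
            xPow = xPow * x % p
        }
        return (x, y)
    }
}

// Lagrange interpolation at x = 0 recovers f(0) = secret.
func reconstruct(_ shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, si) in shares.enumerated() {
        var num = 1, den = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * (p - sj.x) % p                 // (0 - x_j) mod p
            den = den * ((si.x - sj.x + p) % p) % p    // (x_i - x_j) mod p
        }
        secret = (secret + si.y * num % p * modInv(den, p)) % p
    }
    return secret
}

// One share rides along with each matching photo's voucher; the server
// learns nothing until it has collected `threshold` of them.
let key = 123_456_789
let shares = makeShares(secret: key, threshold: 5, count: 10)
assert(reconstruct(Array(shares.prefix(5))) == key)  // 5 shares: key recovered
// With 4 or fewer shares, every possible key is equally consistent.
```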
How is that worse than Google where engineers can look at every photo you upload regardless of the content as part of their server side scanning?
I took their argument to be that scanning at all was the problem. I may have misunderstood. I've made way too many comments today on HN, and I should most probably step away from the keyboard. In fact, I'm going to do that for real, my daughter just got home from school and I need to remember priorities.
Scanning on iCloud means that Apple can see the content of all your scannable data in iCloud. Scanning on-device is compatible with Apple never having access to your data in an unencrypted format. If Apple has a legal obligation to ensure that iCloud does not store CSAM/etc. then either you have to scan on device before upload _or_ you have to store iCloud data without E2E encryption. From a privacy perspective, on-device scanning before upload is obviously better.
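To sketch the ordering this implies (SHA-256 standing in for the perceptual hash, CryptoKit for the encryption; none of this is Apple's real pipeline):

```swift
import CryptoKit
import Foundation

// Hypothetical client-side ordering: fingerprint first, then encrypt,
// so the server only ever receives ciphertext plus a matching voucher.
func prepareForUpload(photo: Data, deviceKey: SymmetricKey) throws -> (ciphertext: Data, voucher: Data) {
    // 1. Scan locally, while the plaintext still exists.
    let fingerprint = Data(SHA256.hash(data: photo))

    // 2. Encrypt with a key the server never holds.
    let sealed = try AES.GCM.seal(photo, using: deviceKey)

    // 3. The server stores the ciphertext and checks only the voucher.
    return (sealed.combined!, fingerprint)  // combined is non-nil for the default nonce
}
```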
> Scanning on-device is compatible with Apple never having access to your data in an unencrypted format.
Only if you exclude the subsequent network transmission, which is the easy part and needs no special code. The privacy concern comes from those two things together. Sure, if you take away one half of a bad thing so that the bad thing is no longer possible, it's not bad anymore. The concern is the whole process, with on-device scanning being the key not-yet-implemented component.
> If Apple has a legal obligation to ensure that iCloud does not store CSAM/etc. then either you have to scan on device before upload _or_ you have to store iCloud data without E2E encryption.
Apple does not have that legal obligation. If they can't decrypt the content on their servers, then their only response to a government-issued warrant would be to hand over encrypted data.
Also, CSAM is not the concern. The concern is that this would be used against dissidents in authoritarian countries. On-device scanning takes us a step toward becoming one and further empowers the existing ones.
This doesn't necessarily follow. Law enforcement having near-realtime access to everything you ever photographed is a worse situation than them having to know in advance which things you might be storing, so they can add a fingerprint to a database and then wait until your device matches and uploads it.
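A minimal sketch of that constraint, with a hypothetical database; the point is that matching is exact membership against fingerprints known in advance:

```swift
import CryptoKit
import Foundation

// The device ships with a fixed fingerprint set, published in advance.
// Anything not already in it can never match, so there is no realtime
// access to novel content. Database contents here are hypothetical.
let knownFingerprints: Set<Data> = []  // would be populated from the shipped database

func shouldFlag(_ photo: Data) -> Bool {
    let fp = Data(SHA256.hash(data: photo))
    return knownFingerprints.contains(fp)  // exact membership; nothing else leaves
}
```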
CSAM is a serious problem and you're ignoring Apple's moral or even reputational obligation to try to address it. However poorly.
We shouldn't be hyperbolic and just flatten levels of badness, or we'll never find a compromise (which is, I suspect, just how you want things).
" If Apple has a legal obligation to ensure that iCloud does not store CSAM/etc"
My understanding is in the USA companies like Apple cannot be legally obligated to ensure that iCloud does not store CSAM. Something about the US Constitution, but I can't remember what. Apple is legally obligated to report CSAM if they come across it themselves though.
> Something about the US Constitution, but I can't remember what.
The First Amendment comes into play. The government cannot compel Apple to write software in a certain way, such as "write your encryption so you have keys that access all of users' data". That would be "compelled speech". So if the government serves Apple with a warrant, Apple can only provide encrypted data or whatever metadata they have, not decrypted content.
You'd have a hard time verifying that said on-device scanning would only have been run on iCloud-uploaded content.
Once the feature exists your local data might become accessible to a government warrant, which would make the iPhone the opposite of a privacy oriented device.
If it's only for iCloud uploaded data they can simply do the scanning there. There's no reason to use customer's CPU/battery against them.
This necessitates a workflow where both the photos and the decryption keys are accessible to the same server, and a security workflow to request the user's decryption keys without the user being involved.
This is specifically what Apple is trying to avoid - they are intentionally pushing an environment where the user must be involved to get the keys, by way of their account password and/or other enforcement mechanisms designed to ensure only the real user can access such keys.
All of the facial recognition, object identification, etc. is done on-device for the same reason. By contrast, Google can and does do this in the cloud, and there are Google servers that intentionally have access to decrypt your photos.
iCloud backup was previously a vector that bypassed this, however they also today announced they are fixing that: https://news.ycombinator.com/item?id=33897793 "Apple introduces end-to-end encryption for backups"
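Mechanically, "the user must be involved" looks something like deriving the key from a user secret on device. A hedged sketch (HKDF here is illustrative, not a proper password KDF, and the names are made up):

```swift
import CryptoKit
import Foundation

// Sketch: the key is derived on device from a user secret, so no
// server-side workflow can recreate it without the user.
// NOTE: HKDF is not a password-hardening KDF; a real design would use
// PBKDF2/scrypt plus hardware-backed enforcement. Illustrative only.
func deriveAccountKey(password: String, accountSalt: Data) -> SymmetricKey {
    let ikm = SymmetricKey(data: Data(password.utf8))
    return HKDF<SHA256>.deriveKey(
        inputKeyMaterial: ikm,
        salt: accountSalt,
        info: Data("icloud-photos-key".utf8),  // hypothetical context label
        outputByteCount: 32
    )
}
```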
Sure, from a technical perspective, it is a nice solution to a set of problems.
But there are many serious problems with it. I'll isolate one: an on-device content surveillance mechanism is a slippery slope to a bad, bad place.
As the saying from the 90s goes, "child porn [CSAM] is the root password to the US Constitution." It is bad enough that people are willing to suspend their better judgement to do something about it.
But after you already have the mechanism accepted and in place, it is far easier to add to it. Pick your boogeyman: you're giving all of them a lovely tool to address their desires. Dissent suppression and Winnie-the-Pooh detection for Xi, Erdogan gets to sniff out Gollum memes, and choose-your-own-horror for the coming dictator of the US.
It is far harder to tell a sovereign, "we could easily do that, but will not" than "we don't have a mechanism to do that." And pretending that it won't happen doesn't pass the laugh test - we have seen this show many, many times. But if you want to argue, start by explaining how Apple's jumping to implement the 10-minute-max sharing limit shows how they'd stand up to China about this.
I agree there is a slippery slope concern, and Apple has themselves made related arguments such as the FBI case and not wanting to create a software update to decrypt the contents of a phone.
However, that is contrasted with a very real and much more practical concern: malicious parties getting access to your cloud-stored photos.
It would help to note Apple also today announced "Advanced Data Protection" in iCloud, which closes the hole where iCloud Backups, iCloud Photos, and various other bits of data were technically decryptable by Apple. It's opt-in, to balance the average user losing all their photos against other users' desire to be secure even if it means losing their data. Details: https://support.apple.com/en-us/HT202303#advanced
However, even without "Advanced Data Protection", what I said about there being no workflow for any Apple server to "normally" request both the keys and the photo data still stands as good security.
To detect Winnie-the-Pooh, it would require a code push of a new database to all clients. If that's the bar, then a corrupt Apple could also push a software update tomorrow that enabled such scanning, whether this scheme was implemented or not.
> If it's only for iCloud uploaded data they can simply do the scanning there.
This is what Apple was trying to avoid. Scanning on iCloud also requires that Apple can see your photos.
If the scanning is done on device, Apple could encrypt the photos in the cloud so that they can't decrypt them at all. Neither could the authorities.
> There's no reason to use customer's CPU/battery against them.
The amount of processing all phones already do on photos is crazy; adding a simple checksum calculation to the workflow does fuck-all to your battery life. Object detection alone is orders of magnitude more CPU/battery heavy.
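For a sense of scale, you can time one checksum yourself. A quick sketch (results vary by device, and this is obviously not Apple's scanning code):

```swift
import CryptoKit
import Foundation

// Rough cost of one fingerprint: hash a photo-sized (~3 MB) buffer.
let photoSized = Data(count: 3_000_000)
let start = Date()
_ = SHA256.hash(data: photoSized)
let elapsedMs = Date().timeIntervalSince(start) * 1000
print("one checksum: \(elapsedMs) ms")  // typically a few ms on recent hardware
```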
> Once the feature exists your local data might become accessible to a government warrant, which would make the iPhone the opposite of a privacy oriented device
Why does that dystopia require on-device scanning? Why couldn't they just do it with an OS update today? It's not a realistic slippery slope, given the actual mechanics of how the CSAM system was designed (perceptual image hashes, not access to arbitrary files).
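To make the perceptual-hash point concrete, here's a toy average hash in Swift. NeuralHash was a neural-network embedding, not this, but it shows the shape of the mechanism: a short code per image, not a file reader.

```swift
// Toy 64-bit average hash over an 8x8 grayscale thumbnail. Real systems
// (e.g. NeuralHash) use learned embeddings, but the output is the same
// kind of thing: a short perceptual code, not access to the file itself.
func averageHash(gray8x8 pixels: [UInt8]) -> UInt64 {
    precondition(pixels.count == 64)
    let mean = pixels.reduce(0) { $0 + Int($1) } / 64
    var hash: UInt64 = 0
    for (i, p) in pixels.enumerated() where Int(p) > mean {
        hash |= UInt64(1) << i  // one bit per brighter-than-average cell
    }
    return hash
}

// Matching compares short codes by Hamming distance, not pixels.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}
```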
> There's no reason to use customer's CPU/battery against them.
That's the better argument, but still not super strong. On-device scanning means you know and can verify what hashes are being scanned for, who is providing them, and when they change. Cloud scanning is a complete black box. None of us would know if Google was doing ad-hoc scans of particular users' photos at the behest of random LEOs.
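Concretely, "verify what hashes are being scanned for" could be as simple as comparing the shipped database's digest against a published audit value. A sketch, with a hypothetical path and digest:

```swift
import CryptoKit
import Foundation

// Check the digest of the on-device database against a value published
// for audit. The URL and published digest here are hypothetical.
func databaseMatchesPublished(at url: URL, publishedHex: String) throws -> Bool {
    let db = try Data(contentsOf: url)
    let hex = SHA256.hash(data: db).map { String(format: "%02x", $0) }.joined()
    return hex == publishedHex  // a silently swapped list would fail this check
}
```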
> None of us would know if Google was doing ad-hoc scans of particular users' photos at the behest of random LEOs.
Not your device, not your software. You should assume anything you upload unencrypted is scanned. This distinction was clearly voiced by the majority during the debacle of Apple's on-device scanning proposal. They basically said, "Scanning in the cloud is [choose one: fine, skeezy], but we draw the line at doing on-device scans. I don't want that software on my device."
The point of the feature is to use the data in court cases, which are public record. So word would get out, via journalists, whistleblowers, etc. They had to make the proposal public before implementing it.
Either commit and say “scanning for CSAM is the top priority” and scan on device or say “privacy is the top priority” and don’t scan at all.
I’m literally saying this halfway path is the worst of both worlds: people still lose privacy, but very little meaningful progress against CSAM will be made.
> Doing it on device is probably preferable to on cloud and at worst no different.
I own my device, I don't own their cloud. That's a big difference. Don't co-opt my property to do work you want done. Data stored on your servers is your business, so doing the checking there is fine, as long as using them isn't mandatory.
They can't do the check there because they were clearly already working on encrypting the data end to end, which makes that impossible. So the middle ground was end-to-end encryption with on-device scanning, which is a step up from no encryption. Somehow we ended up with the best option, no scanning at all, which is nice.
Are you trolling? What Federighi proposed before was scanning "for CSAM" on device [1]. Same angle.
> Doing it on device is probably preferable to on cloud and at worst no different.
Please elaborate. How is it better to force users to run software they don't want than to let them decide whether or not to have their photos scanned when they choose to upload them to the cloud?
Anyway it's a false dichotomy. Apple isn't doing on-device scanning, and now they've announced they won't do it in the cloud either.
Well, the scanning was allegedly supposed to take place only when uploading. If a user chose not to opt in to the cloud photo library, the device scanning was allegedly supposed to be turned off.
So yeah, allegedly no worse.
You might notice I used the word "allegedly" a lot; that's because we are speculating about a feature that was never actually rolled out and that, to my knowledge, nobody audited externally. If you don't trust Apple, then this argument doesn't apply and you are probably better off not using an iPhone.
Nonetheless, it's still no worse than Google Cloud's actually-rolled-out CSAM scanning feature, which has already had major adverse effects on users. So you should trust Google even less, and you definitely shouldn't use a stock Android device.
iOS is closed source. Literally every part is unverifiable and "forced". I have no way of proving that my iPhone isn't, and hasn't always been, scanning my photos. But I don't have the time or energy to care about that; I've decided that using an Apple product is safer than an Android I didn't audit (which would be essentially impossible anyway). Scanning on device enables the possibility of end-to-end encryption, which reduces the risk of a hack or bug exposing my photos.
Yes, this whole thing is just another public relations exercise of "Apple cares about your privacy" bullshit, when they are actually saying that they still plan to scan your device for CSAM. "End to end" encryption of iCloud backups is also a joke when they are going to store the encryption keys on iDevices on which you can run no system software apart from the closed-source software provided by Apple.