Hacker News

So why wouldn't that same whistleblower complain if Apple expands their CSAM detection system to other use cases?

And iOS is a modular operating system. They could easily swap out the Photos.framework for different state actors and support that in perpetuity. They were already doing this when cross-building for ARM/x86.




> So why wouldn't that same whistleblower complain if Apple expands their CSAM detection system to other use cases?

I assume Apple could make it very difficult, if not impossible, to detect what they're searching for when they are using hashes created and transmitted by all their own hardware and software.

But, even if they did publish the hashes and those were somehow verified in free-press countries by a trusted 3rd party, that does nothing for countries with no free press. Such places would have no knowledge of what's being searched for, and that's the whole point. I won't support an American company that helps oppressive countries stymie what little freedom their people have left to connect via the internet. To the extent they are successful, the results of those tools will eventually be aimed at us, either via uninformed people or by using the tools themselves on us.

> And iOS is a modular operating system. They could easily swap out the Photos.framework for different state actors and support that in perpetuity. They were already doing this when cross-building for ARM/x86.

Sure. And I expect that if there were something nefarious in there, working on behalf of foreign governments, we would eventually hear about it one way or another. It's a terrible idea that would be abused, and humans are natural pattern recognizers.


> I assume Apple could make it very difficult, if not impossible, to detect what they're searching for when they are using hashes created and transmitted by all their own hardware and software.

Correct, it would be easy to slip in additional hashes without the team knowing what those hashes represented.

HOWEVER, as soon as these additional hashes match something, the first person to see them will be an Apple employee performing manual review. When they see a picture of Winnie the Pooh or a photograph of some classified spy plane, they're going to know that the CSAM system is being used for purposes other than CSAM.


Very naive to assume those hashes won't be treated differently on the backend. The most logical thing would be to send those directly to the CPC/NSA, since Apple's human review is clearly a smokescreen at the point where non-CP hashes are added.


But someone has to write code to hold multiple sets of hashes. And someone has to write the code which treats reports differently. It all has to be written and maintained. Thus developers at Apple will still know that the system is being used for something other than CSAM.
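The commenter's argument can be made concrete with a hypothetical sketch: supporting more than one hash database forces a visible, maintained routing layer somewhere in the codebase. Every name below is invented for illustration; this is not Apple's implementation.

```python
# Hypothetical illustration: once the system holds multiple hash sets,
# some developer-maintained table must say where each set's reports go.
# All identifiers here are invented, not Apple's.

HASH_SOURCES = {
    "ncmec_csam": "route_to_apple_human_review",
    "extra_db":   "route_elsewhere",  # any additional set is plainly visible here
}

def route_match(source: str) -> str:
    # A developer maintaining this table can see every database the
    # system matches against and where each kind of report is sent.
    return HASH_SOURCES[source]
```

The point is not the code itself but that such a table, or its moral equivalent, has to exist and be maintained, so its presence is hard to hide from the engineers who work on it.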


> developers at Apple will still know that the system is being used for something other than CSAM.

Will the next generation's developers call them out for that? Or will they be given justification to accept it?

We're inching towards 1984 with these big tech monopolies. It was one thing for Snowden to reveal the secret agreements the government imposes upon tech companies. It's entirely another for privately run businesses to capitulate, and thus excuse politicians from needing to make intelligence-gathering a public issue.

Whatever backroom discussions are occurring about this topic need to come into public view. This just doesn't make sense on the surface. The government can't have access to secretly monitor everything on the internet. It's too much power for too few, ripe for abuse by bad actors, etc. There must be another way that involves an informed citizenry. I don't care how uninformed we've shown ourselves to be in the last decade. We should press forward on informing regardless.


Is the hash matching being done on device or off device?

Up thread it was said that the device will hash the picture, then send the hash off for matching.

If that is the case, then the hashes coming off your device can be intercepted and checked against other databases.


Hashing is done on device, and matching is also done on device. In the event of a match, a "safety voucher" is generated and uploaded to iCloud. Multiple safety vouchers are required for your account to be flagged, at which point the contents of these vouchers (which contain metadata and a grayscale thumbnail of the photo) can be viewed by Apple.
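The flow described above can be sketched in a few lines. This is a minimal illustration of the control flow only: the real system uses a perceptual hash (NeuralHash) plus threshold cryptography so the server learns nothing until enough vouchers accumulate, none of which is modeled here. All names and the threshold value are invented for the sketch.

```python
# Sketch of the described flow: hash on device, match on device, and
# upload an opaque "safety voucher" only on a match. Names and the
# threshold are illustrative; the cryptography is deliberately omitted.

import hashlib

THRESHOLD = 3  # hypothetical: account is flagged only after this many matches

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash; a cryptographic
    # hash is used here purely so the example runs.
    return hashlib.sha256(image_bytes).hexdigest()

def device_scan(photos, known_hashes):
    """On-device: hash each photo and emit one voucher per match."""
    vouchers = []
    for photo in photos:
        if image_hash(photo) in known_hashes:
            vouchers.append({"thumbnail": b"<grayscale>", "meta": {}})
    return vouchers

def server_flag(vouchers) -> bool:
    """Server side: vouchers stay sealed until the threshold is met."""
    return len(vouchers) >= THRESHOLD  # True => eligible for human review
```

The key design property the commenter describes is that a single match does nothing visible server-side; only crossing the threshold exposes the voucher contents to a human reviewer.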


> Multiple safety vouchers are required for your account to be flagged

I don't see how that makes any difference. What if someone plants bad data on your device? That would of course be a concern for cloud-scanning too.

I don't care how secure Apple says their devices are. There are companies that can crack them, and you can bet some unscrupulous people will use that against their opponents. Politicians and other influential people should be as concerned about this as everyone else. Didn't the Saudis crack Bezos' phone to reveal his affair? With this tech they could make up worse stories. I believe our justice department could, most of the time, tell the difference between a hack and someone who actually harbors bad data, but I don't like relying on that.


> What if

Given that a functionally identical system has been implemented by Google for years, we should already know what will happen. So let me ask. Is this already happening to people with Android devices? In terms of opportunities for framing someone, how is what Google does any different?


Google's system doesn't do on-device scanning, and I gave an example above of something like this happening. Security is a constant race between good and bad actors. If you weaken your system you're scoring for the other team.


Not to worry, those are just the "terrorist" hashes.


If the human reviewer doesn't see a photograph of a sex act with a prepubescent (NCMEC classification "A1") then it will be rejected.


> the first person to see them will be an Apple employee performing manual review

In China, iCloud is already run by the government.



