Hacker News

> Sure, they can’t see the images - but they’ll send us hashes of illegal images, and we’ll turn your images into hashes, check them against each other, and report you to them if we find enough.

> I fully expect that Apple Intelligence will have similar system defects

Being able to scan devices for CSAM at scale is a "defect" to you?



Yes, it is a defect, for many reasons:

- it's anti-user: a device spying on you and reporting back to a centralized server is a bad look

- it's a slippery slope: merely announcing the feature prompted governments to ask Apple to consider including "dissident" content in the hash database

- it's prone to abuse: within days of the announcement, the proposed hashing mechanism (NeuralHash) was reverse engineered, and researchers produced innocent-looking images that collided with target hashes

- it assumes guilt across the population: what happened to the presumption of innocence?

And yes, CSAM is a huge problem. And btw, Apple DOES already scan for it: if you share an iCloud album (which decrypts it), the contents are scanned for CSAM.
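To see why false positives are structurally possible here: these systems use *perceptual* hashes, which are designed to match approximately so that resizing or recompressing an image doesn't evade detection. That approximate matching is exactly what adversarial collisions exploit. Below is a minimal toy sketch (a simple "average hash" over an 8x8 grayscale grid with a Hamming-distance threshold) to illustrate the idea; this is not Apple's NeuralHash algorithm, and the function names and threshold are illustrative assumptions.

```python
# Toy perceptual hash: NOT NeuralHash, just an illustration of the
# approximate-matching property that makes collisions possible.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) to a 64-bit int.
    Each bit records whether that pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches(h1, h2, threshold=5):
    """Perceptual hashes match approximately: a small bit difference still
    counts as a hit. The threshold is an assumed, illustrative value."""
    return hamming(h1, h2) <= threshold

# Two *different* images whose hashes land near each other still "match":
img_a = [200] * 32 + [50] * 32           # bright top half, dark bottom half
img_b = [200] * 31 + [60] + [50] * 32    # one perturbed pixel
h_a, h_b = average_hash(img_a), average_hash(img_b)
print(hamming(h_a, h_b))   # differs in only 1 bit
print(matches(h_a, h_b))   # True: counted as the "same" image
```

Because matching is fuzzy rather than exact, an attacker who can compute the hash function can search for an innocuous image whose hash lands within the threshold of a flagged one, which is what the reverse-engineering demonstrations showed.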



