No, I'm relaying the fact that scanning does happen, and has been happening for the past decade.
>The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)
https://www.nbcnews.com/technolog/your-cloud-drive-really-pr...
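To make the matching concrete, here is a minimal sketch of that signature-matching step. The signature list and the exact-match SHA-256 digest are stand-ins: real systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the actual signatures come from NCMEC, not a hardcoded set.

    import hashlib
    from pathlib import Path

    # Hypothetical signature list; in practice these are perceptual
    # (PhotoDNA-style) hashes distributed by NCMEC, not SHA-256 digests.
    KNOWN_SIGNATURES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def file_signature(path: Path) -> str:
        """Exact-match stand-in for a perceptual hash."""
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def matches_known_image(path: Path) -> bool:
        """True if the file's signature appears in the known-image list."""
        return file_signature(path) in KNOWN_SIGNATURES

    # Example (hypothetical downstream step):
    # if matches_known_image(Path("upload.jpg")):
    #     queue_for_review("upload.jpg")

The key design point is that the provider never needs to look at the image content itself, only compare its signature against a list of signatures of already-known material.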
>a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account
https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...
Apple declined to implement this kind of server-side scanning until it had designed what it considered a more private approach.
Only photos you upload to iCloud are scanned, and nothing happens unless multiple images match known child-abuse images. In that case, a human review is triggered to make sure you didn't just hit several false positives at once.
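A rough sketch of that threshold-then-review flow, assuming a simple per-account match counter. The names are illustrative, and the threshold of 30 reflects the approximate figure Apple cited publicly, not its actual code:

    from dataclasses import dataclass

    MATCH_THRESHOLD = 30  # Apple publicly cited roughly 30 matches; illustrative

    @dataclass
    class Account:
        matched_uploads: int = 0

        def record_upload(self, is_match: bool) -> None:
            """Count uploads whose signatures matched the known-image list."""
            if is_match:
                self.matched_uploads += 1

        def needs_human_review(self) -> bool:
            # Nothing happens below the threshold; above it, a reviewer
            # confirms the matches before any report is filed.
            return self.matched_uploads >= MATCH_THRESHOLD

The threshold exists precisely to absorb isolated false positives: one stray hash collision does nothing, and only a pattern of matches ever reaches a human.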