Apple will soon scan all iCloud Photos images for child abuse

Update: Apple has a Child Safety page that describes this new feature and how it works.

TechCrunch has confirmed that Apple will soon roll out a new technology to scan photos uploaded to iCloud for child sexual abuse material (CSAM). The rollout will happen later this year as part of a collection of technologies meant to make its products and services safer for children to use.

Most cloud services already scan images for material that violates their terms of service or the law, including CSAM. They can do this because, while the images may be stored encrypted, the companies hold the encryption keys. Apple encrypts photos in transit and at rest, but keeps the keys so it can decrypt them when necessary, for example to produce data stored in iCloud under subpoena or to make your iCloud photos available in a web browser.
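To see why key custody matters, here is a short, generic illustration in Python using the third-party cryptography package. It is not anything Apple-specific; it simply shows that whoever generates and keeps the key can decrypt the stored data at will.

```python
# Illustration of the general point above, not Apple's actual key
# management: if the service provider holds the encryption key, it can
# decrypt stored content whenever it needs to (e.g. under subpoena).
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

# Key generated and kept by the provider, not the user.
provider_key = Fernet.generate_key()
cipher = Fernet(provider_key)

# Data is encrypted in transit and at rest...
stored_blob = cipher.encrypt(b"user photo bytes")

# ...but because the provider holds the key, it can always decrypt.
original = cipher.decrypt(stored_blob)
assert original == b"user photo bytes"
```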

To help preserve user privacy, the company is relying on a new technology called NeuralHash that will check images as they are uploaded to iCloud Photos, looking for matches against a database of known child abuse imagery. It works entirely on your iPhone, iPad, or Mac by converting each photo into a unique string of letters and numbers (a “hash”). Normally, any slight change to a photo results in a completely different hash, but Apple’s technology is designed so that small alterations (like a crop) still produce the same hash.
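For a rough sense of how a perceptual hash differs from an ordinary cryptographic hash, here is a toy “average hash” in Python. It is not NeuralHash, which is far more sophisticated, and the file names are placeholders, but it shows why a small edit can leave this kind of hash unchanged while a cryptographic hash like SHA-256 changes completely.

```python
# Illustrative only: a toy "average hash," not Apple's NeuralHash.
# It shows the general idea of a perceptual hash: small edits to an
# image (a crop, a resize, light compression) should yield the same
# or a very similar hash, unlike a cryptographic hash such as SHA-256.
import hashlib
from PIL import Image  # pip install Pillow

def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink the image, grayscale it, and set one bit per pixel
    depending on whether that pixel is brighter than the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance means 'visually similar'."""
    return bin(a ^ b).count("1")

if __name__ == "__main__":
    # A cryptographic hash changes completely if even one byte differs...
    print(hashlib.sha256(open("photo.jpg", "rb").read()).hexdigest())
    # ...while a perceptual hash of a lightly edited copy stays close.
    original = average_hash("photo.jpg")
    cropped = average_hash("photo_cropped.jpg")
    print(hamming_distance(original, cropped))  # typically a small number
```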

These hashes are matched on-device to a database of hashes for images of child sexual abuse. The hashes can be matched invisibly, without knowing what the underlying image is or alerting the user in any way. The results of the matches are uploaded to Apple if a certain threshold is met. Only then can Apple decrypt the matching images, manually verify the contents, and disable a user’s account. Apple will then report the imagery to the National Center for Missing & Exploited Children, which then passes it to law enforcement.
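To make the threshold idea concrete, here is a minimal sketch in Python of the logic described above. It is purely illustrative: the hash database and threshold value are placeholders, and in Apple’s actual system the per-photo results are protected cryptographically rather than being visible as a plain counter.

```python
# A minimal sketch of the threshold logic described above, NOT Apple's
# actual protocol. The hash values, database, and threshold are all
# placeholders; in the real system individual match results are hidden
# and only become readable once the account-wide threshold is crossed.
KNOWN_HASH_DATABASE = {0x1A2B3C, 0x4D5E6F}  # stand-in for the known-CSAM hash set
MATCH_THRESHOLD = 30                         # placeholder; Apple has not published the value

class AccountScanner:
    def __init__(self) -> None:
        self.match_count = 0

    def check_photo(self, photo_hash: int) -> None:
        """Compare one uploaded photo's hash against the known database."""
        if photo_hash in KNOWN_HASH_DATABASE:
            self.match_count += 1

    def should_escalate(self) -> bool:
        """Only after the threshold is met would the matching images be
        decrypted and sent for manual review."""
        return self.match_count >= MATCH_THRESHOLD
```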

In other words, it’s extremely unlikely that Apple could simply look at whichever images it wants on your iPhone. According to TechCrunch, Apple says there is a one in one trillion chance of a false positive, and there will be an appeals process for anyone who believes their account was flagged by mistake. The technology is only partly optional: you don’t have to use iCloud Photos, but if you do, you cannot disable the feature.

Apple has published a technical paper detailing this NeuralHash technology. This new technology will roll out as part of iOS 15, iPadOS 15, and macOS Monterey this fall.
