Apple has announced plans to scan iPhones for images of child abuse, raising immediate concerns about user privacy and surveillance.

Apple says the system is automated and doesn’t scan the actual images themselves; instead, it uses a form of hash matching to identify known instances of child sexual abuse material (CSAM), and it includes fail-safes intended to protect privacy.

Privacy advocates warn that, now that it has created such a system, Apple is on a rocky road toward an inexorable extension of on-device content scanning and reporting that could, and likely will, be abused by some nations.

What Apple’s system is doing

There are three main elements to the system, which will lurk inside iOS 15, iPadOS 15 and macOS Monterey when they ship later this year.

1. Scanning your images

Apple’s system scans all images stored in iCloud Photos to see whether they match the CSAM database held by the National Center for Missing and Exploited Children (NCMEC).

Images are scanned on the device using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

When an image is stored in iCloud Photos, a matching process takes place. If an account crosses a threshold of multiple matches against known CSAM content, Apple is alerted. The flagged data is then manually reviewed, the account is disabled and NCMEC is informed.
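
To make that flow concrete, here is a minimal sketch of threshold-based hash matching. The type names, the plain SHA-256 lookup and the threshold value are stand-ins for illustration; Apple’s actual system relies on its NeuralHash perceptual hash and cryptographic private set intersection, which are not reproduced here.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: Apple's real system uses NeuralHash and private
// set intersection, not a plain SHA-256 lookup, and the threshold is a guess.
struct CSAMMatcher {
    let knownHashes: Set<String>   // hypothetical on-device database of known-image hashes
    let threshold: Int             // an account is flagged only after this many matches
    var matchCount = 0

    // Returns true once the account has crossed the reporting threshold.
    mutating func process(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        if knownHashes.contains(hex) {
            matchCount += 1
        }
        return matchCount >= threshold
    }
}

var matcher = CSAMMatcher(knownHashes: ["<known CSAM hash>"], threshold: 30)
let shouldReport = matcher.process(Data([0x00, 0x01]))
print(shouldReport)  // false: this dummy image matches nothing
```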

The system isn’t perfect, however. The company says there’s a less than one-in-one-trillion chance per year of incorrectly flagging an account. Apple has more than a billion users, so that works out to roughly a one-in-a-thousand chance each year that someone, somewhere, is incorrectly flagged. Users who feel they have been mistakenly flagged can appeal.
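
As a rough sanity check on that figure, assuming exactly a one-in-a-trillion per-account rate and one billion accounts:

```latex
% Expected number of wrongly flagged accounts per year, assuming a
% 10^{-12} per-account error rate across 10^{9} accounts:
\[
  10^{9} \times 10^{-12} = 10^{-3},
\]
% i.e. roughly a 1-in-1,000 chance that some account is wrongly flagged in a given year.
```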

2. Scanning your messages

Apple’s system uses on-device machine learning to scan images in Messages sent or received by minors for sexually explicit material, warning parents if such images are identified. Parents can enable or disable the system, and any such content received by a child will be blurred.

If a child attempts to send sexually explicit content, they will be warned and their parents can be notified. Apple says it does not get access to the images, which are scanned on the device.
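
For illustration only, here is one way the blur-and-warn decision could be structured. The type names, the parental-notification setting and the classifier stand-in are assumptions, not Apple’s actual implementation, which is not public.

```swift
// Hypothetical sketch of the Messages flow described above; Apple's real
// on-device classifier and parental-control plumbing are not public API.
enum ExplicitImagePolicy {
    case show
    case blurAndWarn(notifyParents: Bool)
}

struct CommunicationSafetyCheck {
    let enabledByParents: Bool        // parents can enable or disable the feature
    let parentNotificationsOn: Bool   // hypothetical setting for alerting parents

    // `isSexuallyExplicit` stands in for the on-device ML classifier's verdict.
    func policy(isSexuallyExplicit: Bool) -> ExplicitImagePolicy {
        guard enabledByParents, isSexuallyExplicit else { return .show }
        return .blurAndWarn(notifyParents: parentNotificationsOn)
    }
}

let check = CommunicationSafetyCheck(enabledByParents: true, parentNotificationsOn: true)
print(check.policy(isSexuallyExplicit: true))  // blurAndWarn(notifyParents: true)
```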

3. Watching what you search for

The third part consists of updates to Siri and Search. Apple says these will now provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when people make what are deemed to be CSAM-related search queries, explaining that interest in this topic is problematic.
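
As a purely illustrative sketch of that kind of intervention (the flagged-term list and response text below are placeholders, not Apple’s actual logic):

```swift
// Purely illustrative: the flagged-term list and response text are placeholders.
let flaggedTerms: Set<String> = ["<csam-related term>"]

// Returns an intervention message for flagged queries, or nil to search normally.
func searchIntervention(for query: String) -> String? {
    let words = query.lowercased().split(separator: " ").map(String.init)
    guard words.contains(where: { flaggedTerms.contains($0) }) else { return nil }
    return "Interest in this topic is harmful. Here are resources for getting help."
}

print(searchIntervention(for: "weather today") as Any)  // nil: query passes through
```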

Apple helpfully informs us that its program is “ambitious” and the efforts will “evolve and expand over time.”

Ref: https://www.computerworld.com/article/3628454/apples-plan-to-scan-us-iphones-raises-privacy-red-flags.html