
Apple to Scan iPhones for Child Sex Abuse Images


Fri 06 Aug 2021 | 03:00 PM
Ahmad El-Assasy

Apple has released information about a system that will detect child sexual abuse material (CSAM) on US users' devices.

Before an image is saved to iCloud Photos, the technology checks it for matches against known CSAM. If a match is found, the company says, a human reviewer will assess the case and report the user to law enforcement.

However, privacy concerns have been raised about the technology's potential to be used to scan phones for forbidden content or even political speech. Experts are concerned that authoritarian governments could utilise the technology to spy on their citizens.

Apple said that new versions of iOS and iPadOS, due out later this year, will include "new cryptography applications to help prevent the spread of CSAM online while providing for user privacy."

The technology compares images to a database of known child sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations.

These photos are converted into "hashes", numerical codes that can be "matched" against an image on an Apple device.

The technology, according to the company, will also detect edited but visually similar versions of the original images.
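In broad terms, this kind of hash matching can be illustrated with a short sketch. The example below is not Apple's NeuralHash system; it uses a simple "average hash" computed with the Pillow imaging library as a stand-in, and the database of known hashes is a hypothetical placeholder. Because the comparison tolerates a few differing bits, lightly edited copies of a known image can still be flagged.

```python
# Minimal sketch of hash-based image matching (assumption: Pillow is installed).
# This is NOT Apple's NeuralHash; a simple "average hash" stands in for it
# to illustrate how photos become numerical codes that can be compared.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Convert an image into a 64-bit perceptual hash."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        # Each pixel above the mean brightness contributes a 1 bit.
        bits = (bits << 1) | (1 if pixel >= mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known images (placeholder values only).
KNOWN_HASHES = {0x1234567890ABCDEF, 0x0F0F0F0F0F0F0F0F}

def matches_known_image(path: str, max_distance: int = 5) -> bool:
    """Flag an image whose hash is identical or nearly identical to one in
    the known database, so slightly modified copies can still be detected."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)
```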