Apple to scan iPhones for child pornography

Apple has just announced three new features designed to protect children from abuse and from sexual predators. Among them, a child pornography detection technology will scan images stored on iPhones.

The new measures will be rolled out on iOS 15 © Apple

In a technical press release, Apple has just announced three new features intended to protect children and to prevent abuse by sexual predators. These measures will arrive with iOS 15, iPadOS 15, and macOS Monterey, the operating systems due this fall. One of them, which remains highly controversial, allows Apple to scan the iPhone Photos library for child pornography images, a process Google has been using since 2008.

Three measures to fight child pornography

The first measure concerns Apple accounts set up for Family Sharing. On these devices, the Messages app will warn young users when received content is potentially sensitive, such as sexually explicit images sent by predators. These images will be blurred and an alert will be shown on the child's device. If the child chooses to view the image anyway, the parents will be notified immediately. Apple says it has no access to these messages.
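For readers who want a concrete picture, here is a minimal sketch of that decision flow in Swift. Apple has published no API for this feature, so every type and name below is hypothetical and purely illustrative.

```swift
// Hypothetical sketch of the Messages flow for a child account in Family
// Sharing. Apple exposes no public API for this feature; the types and
// names here are illustrative only.
enum IncomingImageAction {
    case showNormally              // nothing sensitive detected
    case blurAndWarn               // image hidden behind an on-device warning
    case blurWarnAndNotifyParents  // child viewed it anyway; parents are alerted
}

func handleIncomingImage(isSensitive: Bool, childChoosesToView: Bool) -> IncomingImageAction {
    guard isSensitive else { return .showNormally }
    // Sensitive content is blurred and the child is warned on-device.
    // Only if the child views it anyway are the parents notified.
    return childChoosesToView ? .blurWarnAndNotifyParents : .blurAndWarn
}
```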

The second measure is the child abuse image detection technology currently causing a stir, which Apple calls NeuralHash. It applies to iOS and iPadOS, and only when an iCloud Photos account is active. If the device detects an image in the Photos library related to child sexual abuse, Apple will be able to alert the National Center for Missing and Exploited Children (NCMEC), as well as other child protection groups.

How does it work? Before an image is uploaded to the iCloud library, NeuralHash analyzes it and converts it into a string of numbers. That string is then compared against a database of hashes provided by NCMEC and other child protection groups.

If one of the user's images matches a recorded child pornography image, that is, if the numbers match, Apple will carry out a manual review before the iCloud account is deactivated and a report is sent to NCMEC, which will take the necessary action.
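To make the hash-matching step concrete, here is a minimal sketch in Swift. It is purely illustrative: NeuralHash is a perceptual hash that Apple has not published, so SHA-256 from CryptoKit stands in for it here, and the type and method names are hypothetical.

```swift
import CryptoKit
import Foundation

// Minimal, hypothetical sketch of the matching step described above.
// SHA-256 is only a stand-in for NeuralHash so the example is self-contained.
struct HashMatcher {
    // Digests of known abuse images, as supplied by NCMEC and partner groups.
    let knownHashes: Set<String>

    // Stand-in for NeuralHash: converts image bytes into a digest string.
    func digest(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Before upload, the image's digest is compared with the database;
    // a match would be escalated to manual review rather than acted on blindly.
    func matchesKnownImage(_ imageData: Data) -> Bool {
        knownHashes.contains(digest(of: imageData))
    }
}
```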

On the privacy front, Apple promises an "extremely high" level of vigilance in the matching process so that accounts are not reported incorrectly (over pictures of one's own child taking a bath, for example). Apple specifies that it will gain access to these images only if a match is made, and not to images that do not match.

The third and final measure will offer links for reporting cases of sexual exploitation of minors directly from Apple devices, when the user requests them.

The use of such a technique may seem surprising coming from Apple, which has always defended the privacy of its users (recently, for instance, with iOS 14, apps like Facebook can no longer track you without your consent). Cupertino nevertheless says it has designed each of these new features to preserve everyone's privacy while fighting the online exploitation of children.

For now, these measures will be limited to the United States; there is no word yet on their deployment in France. For more information, see Apple's dedicated web page as well as the technical documentation detailing the procedures.
