How Apple’s software for detecting child pornography on iPhones works and why it is so controversial


Apple announced details of a system to detect child sexual abuse material on the devices of its US customers.

Before an image is stored in iCloud Photos, the system will look for matches between the images on the device and known child abuse images.

Apple said that if a match is found, a human reviewer will assess the case and report the user to the police.

However, the announcement has raised fears that the technology could be expanded to scan phones for other prohibited material, including political content on dissidents’ devices.

Privacy experts expressed concern that authoritarian governments could use the technology to spy on their citizens.

New applications

Apple announced that the new versions of iOS and iPadOS, to be released later this year, will include “new applications of cryptography to help limit the spread of child sexual abuse material online, while keeping user privacy in mind”.

The system works by comparing photos to a database of known child sexual abuse images compiled by the US National Center for Missing & Exploited Children (NCMEC) and other child protection organizations.

Those images are translated into numerical codes that can be matched against the code of an image on an Apple device.

The company added that the system will also catch edited but visually similar versions of the original images.
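
Apple has not published the full internals of its matching algorithm (which it calls NeuralHash), but matching that tolerates edits is typically done with perceptual hashes: numerical codes that change only slightly when an image is resized, recompressed, or lightly cropped, and that are compared by counting differing bits (the Hamming distance). The Swift sketch below illustrates that general technique with made-up hash values and a made-up threshold; it is not Apple’s implementation.

```swift
import Foundation

// Hypothetical 64-bit perceptual hashes standing in for the database of
// numerical codes described above; real systems derive these from images.
let knownAbuseHashes: Set<UInt64> = [0xA5F3_0C21_9B44_E07D]

// Hamming distance: the number of bits in which two hashes differ.
// Perceptually similar images produce hashes with small distances.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Returns true if the photo's hash is within `threshold` bits of any known
// hash. A threshold of 0 would catch only exact duplicates; a small positive
// threshold also catches lightly edited copies.
func matchesKnownImage(photoHash: UInt64,
                       against database: Set<UInt64>,
                       threshold: Int = 4) -> Bool {
    database.contains { hammingDistance(photoHash, $0) <= threshold }
}

// An edited copy whose hash differs by two bits from a known hash still matches.
let editedCopyHash = knownAbuseHashes.first! ^ 0b0101
print(matchesKnownImage(photoHash: editedCopyHash, against: knownAbuseHashes)) // true
```

In Apple’s published design, the device itself cannot learn whether a match occurred, because the on-device copy of the database is blinded; the comparison above is simplified to make the matching idea visible.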

Cellphone with iCloud logo on the screen
Before an image is stored in iCloud, the system will look for matches between images on the device and known child abuse images. (Photo: Getty Images)

“High level of precision”

“Before an image is stored in iCloud, that image will be checked against the numerical codes of known child abuse photos,” Apple said.

The company noted that the system has an “extremely high level of accuracy” and ensures “less than a one-in-a-trillion chance per year of incorrectly flagging a given account.”

Apple said it will manually review each report to confirm there is a match.

If a match is confirmed, Apple could then disable the user’s account and inform the authorities.

Apple said the new technology offers “significant” privacy benefits over existing systems, as the company will only analyze users’ photos if they have a collection of known child sexual abuse images in their iCloud account.
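
Apple’s technical documentation also describes an account-level threshold: matches are uploaded as encrypted “safety vouchers,” and only once the number of matches crosses a threshold can any of them be decrypted and escalated to human review (a technique called threshold secret sharing). The Swift sketch below shows only that counting logic, with a hypothetical threshold value and none of the real cryptography.

```swift
// Simplified sketch of the account-level threshold described above.
// In the real system the matches are encrypted and unreadable below the
// threshold; here a plain counter stands in for that machinery.
struct AccountScanState {
    var matchCount = 0
    let reviewThreshold: Int  // hypothetical value, for illustration only

    // Records one scanned image; returns true once the account has
    // accumulated enough matches to be escalated for human review.
    mutating func record(imageMatched: Bool) -> Bool {
        if imageMatched { matchCount += 1 }
        return matchCount >= reviewThreshold
    }
}

// Example: with a hypothetical threshold of 3, isolated matches stay below review.
var state = AccountScanState(reviewThreshold: 3)
for matched in [true, false, true, true] {
    if state.record(imageMatched: matched) {
        print("Threshold reached (\(state.matchCount) matches): escalate to human review")
    }
}
```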

However, some privacy experts have raised concerns.

“Regardless of Apple’s long-term plans, they have sent a very clear signal: in their (very influential) opinion, it is okay to build systems that scan users’ phones for banned content,” said Matthew Green, a cryptography expert at Johns Hopkins University.

“It doesn’t matter if they are right or wrong on that point. This is equivalent to ‘opening the floodgates’; governments will ask for this technology.”
