Apple delays the rollout of its tool to fight child sexual abuse

NeuralMatch, the tool meant to detect child sexual abuse material on the iPhone, will not arrive right away after all. Facing strong criticism from privacy specialists, the American company has announced that it is delaying its rollout.

The controversy got the better of Apple’s plans. Criticized for several weeks over a controversial tool to fight child sexual abuse, the American company behind the iPhone and the iPad has decided to postpone the deployment of its service to a later date.

As explained in a The Verge article dated September 3, 2021, Apple believes it needs to “take more time in the coming months to improve the system” before deploying it on its devices. The company says it has listened to “its customers, privacy groups and subject-matter specialists” in order to build a tool more respectful of users’ personal data. The page detailing these new features has been updated accordingly.

An impossible equation

Scheduled to arrive with the next version of iOS, the tool nicknamed NeuralMatch is designed to scan photos uploaded to iCloud and compare them against a database of known child sexual abuse images compiled by specialized organizations. The goal? To detect possible copies of these images on an iPhone, since their possession and distribution are illegal in many countries around the world, including France and the United States.
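To make the matching step concrete, here is a minimal sketch in Swift of what comparing uploads against a signature database could look like. It is an illustration under loose assumptions, not Apple’s implementation: the real system relies on a perceptual hash (NeuralHash) that tolerates resizing and recompression and is not public, so a plain SHA-256 digest stands in for it, and `knownSignatures` is a hypothetical placeholder for the database compiled by child-protection organizations.

```swift
import Foundation
import CryptoKit

// Stand-in fingerprint function. Apple's real NeuralHash is a perceptual hash
// that tolerates resizing and recompression; SHA-256 only matches byte-identical
// files, but it keeps this sketch self-contained and runnable.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Hypothetical database of known signatures, standing in for the list
// compiled by specialized child-protection organizations.
let knownSignatures: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
]

// A photo is flagged when its fingerprint appears in the database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownSignatures.contains(fingerprint(of: imageData))
}
```

The point of this design is that only signatures are compared, never the photos themselves.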

Despite its many efforts to explain how the system was designed to respect customers’ privacy, the American brand remained under fire from critics. Privacy experts saw the tool as a kind of “back door” that an authoritarian government could use to spy on its population.

The data decryption mechanism proceeds in three secure steps, according to Apple // Source: Apple

For Apple, which touts itself as a company at the forefront of privacy and confidentiality, the situation was uncomfortable. On one side weighs the obvious need to fight child sexual abuse; on the other, the importance of doing so without harming its users. The system is based on cryptographic analysis of photos and is supposed to raise an alert only after several potentially illegal photos have been detected on the same iPhone, in order to limit the risk of false positives and, therefore, of wrongful flags.
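The “several detections before any alert” logic can be illustrated with a simple counter, sketched below. This is only a behavioral approximation: in the system Apple described, the threshold is enforced cryptographically through threshold secret sharing, so that nothing can be decrypted server-side until enough matches exist, and the threshold value used here is an arbitrary placeholder, not Apple’s actual number.

```swift
// Behavioral sketch of threshold-gated alerting: isolated matches trigger
// nothing; only repeated matches on the same device can raise an alert.
// Apple's described design enforces this cryptographically (threshold secret
// sharing) rather than with a plain counter.
struct ThresholdGate {
    let threshold: Int               // arbitrary placeholder, not Apple's number
    private(set) var matchCount = 0

    init(threshold: Int) {
        self.threshold = threshold
    }

    // Records one scan result and reports whether an alert may now be raised.
    mutating func record(isMatch: Bool) -> Bool {
        if isMatch { matchCount += 1 }
        return matchCount >= threshold
    }
}

var gate = ThresholdGate(threshold: 10)
let alert = gate.record(isMatch: true)   // false: one match stays below threshold
```

Gating on repeated matches means a single unlucky collision between an innocent photo and a database signature cannot by itself trigger a report.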

But the heart of the tool, which scans the binary signature of a photo to compare it to signatures provided by child-protection organizations, worries several privacy specialists, who fear that deploying such a tool would be the start of a slippery slope along which ever-greater demands could be made, for example concerning terrorist content. Caught in this crossfire, Apple is playing for time, reaffirming in the meantime its intact intention to “deploy these vital tools for the safety of children.”
