
Apple could scan photos stored on iPhone to fight child sexual abuse

Apple could scan photos stored on iPhone and in iCloud to detect child sexual abuse images. According to the Financial Times, this tool, called neuralMatch, would be deployed initially in the United States.

To fight child sexual abuse, Apple could set up a new system for analyzing photos stored on iPhone and in iCloud, the Financial Times reports. The Cupertino company had already indicated in 2020 that it performed certain analyses of this kind on pictures stored in iCloud. The Financial Times article reveals, however, that Apple plans to perform a more in-depth scan to detect child sexual abuse images that may be on iPhones or in iCloud.

According to the American outlet, the system, called neuralMatch, would be able to "proactively alert a team of human reviewers if it believes it has detected illegal imagery. If that turns out to be the case, the team would then alert the police." neuralMatch is expected to be deployed in the United States first. According to the Financial Times, it would assign a sort of "security score" to each photo. If several photos were categorized by the tool as suspicious, neuralMatch would then send an alert.
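Apple has published no technical details, but the scheme the Financial Times describes — a per-photo score, with an alert only once several photos are deemed suspicious — can be illustrated with a minimal sketch. All names and thresholds below are invented for illustration; nothing here reflects Apple's actual implementation.

```python
# Hypothetical sketch of threshold-based flagging as described by the FT:
# each photo receives a score, and an alert fires only once several
# photos exceed a per-photo cutoff. Both constants are invented.

SUSPICIOUS_SCORE = 0.9   # invented cutoff for a single photo
ALERT_COUNT = 5          # invented number of matches before alerting

def should_alert(photo_scores):
    """Return True if enough photos score above the per-photo cutoff."""
    suspicious = [s for s in photo_scores if s >= SUSPICIOUS_SCORE]
    return len(suspicious) >= ALERT_COUNT

print(should_alert([0.95, 0.2, 0.97]))  # only two matches, no alert
print(should_alert([0.95] * 6))         # six matches, alert
```

Requiring several matches rather than one is a common way to reduce the impact of individual false positives before a human reviewer is involved.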

Other tech companies conduct similar analyses

Apple has not yet commented on the subject, but the FT says the company could officially unveil more details about the system in the coming days. According to the outlet, the company has reportedly contacted various American academics. Several academics worry that such a tool could be misused by authoritarian states to detect activity that has nothing to do with child abuse and to monitor their populations.

Apple would not be the first tech company to perform this type of analysis. In 2014, Google reported to the authorities an Internet user who had sent child sexual abuse photos via Gmail. "Unfortunately, all Internet companies are confronted with cases of child sexual abuse. This is why Google actively removes illegal images from its services, including the search engine and Gmail, and immediately sends a notification to the National Center for Missing & Exploited Children," the group said at the time.

Google had specified that this center operates a tool called CyberTipline, which allows Internet service providers to relay information on cases of child sexual abuse spotted online to law enforcement. "Each image depicting child sexual abuse is assigned a unique digital fingerprint that allows our systems to identify those photos, even in Gmail," Google explained. The Mountain View giant specified that this tool was used only to detect and fight child sexual abuse imagery. This technology, the company said, was not used "on other email content that could be associated with illegal activity, for example, using email to plan a burglary."
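The "unique digital fingerprint" Google describes is a hash: each known image is reduced to a short identifier, and new images are checked against a database of those identifiers. As a rough illustration only — production systems use perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, not a plain cryptographic hash — a sketch of exact-match fingerprinting might look like this; the database contents here are invented placeholders.

```python
# Illustrative fingerprint matching: hash an image's bytes and look the
# digest up in a set of known fingerprints. SHA-256 only matches
# byte-identical files; real systems use perceptual hashing instead.
import hashlib

# Invented placeholder database of known fingerprints (hex digests).
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint (hex digest) of an image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known(image_bytes: bytes) -> bool:
    """Check an image against the database of known fingerprints."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

print(is_known(b"example-known-image-bytes"))  # True
print(is_known(b"ordinary holiday photo"))     # False
```

Because only digests are compared, the matching side never needs to store or view the original images, which is why this design is standard for reporting pipelines like CyberTipline.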
