Apple removes all traces of its controversial anti-child pornography system

First presented in August 2021, Apple's system for detecting child sexual abuse material (CSAM) is now being scrubbed from the company's communications: Apple has removed all mention of the feature from its website. The Cupertino company nevertheless still intends to launch the system eventually, once it has made some changes.


In August 2021, Apple unveiled new measures to fight child sexual abuse. The Cupertino company planned to integrate a feature capable of detecting child sexual abuse material (images and videos), known in the US as CSAM.

To do this, the system computes, using several algorithms, a unique digital signature for each image stored in iCloud or sent via iMessage. That signature is then compared against the signatures of millions of known child sexual abuse images held in the database of the National Center for Missing and Exploited Children (NCMEC).
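In simplified terms, the matching step boils down to computing a signature for each image and testing it for membership in a set of known signatures. The sketch below illustrates the idea with an ordinary cryptographic hash; Apple's actual system uses a perceptual hash (NeuralHash) so that resized or recompressed copies still match, and the comparison is performed under cryptographic protections on-device. All names and data here are illustrative, not Apple's implementation.

```python
import hashlib

# Hypothetical database of known-image signatures. In reality, NCMEC's
# database holds perceptual hashes of known material, not raw bytes.
KNOWN_SIGNATURES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def signature(image_bytes: bytes) -> str:
    """Compute a digital signature for an image (here: SHA-256)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Return True if the image's signature is in the known database."""
    return signature(image_bytes) in KNOWN_SIGNATURES

print(matches_database(b"example-known-image-bytes"))  # True
print(matches_database(b"some-other-photo"))           # False
```

Note that a cryptographic hash like SHA-256 changes completely if a single pixel changes, which is precisely why a perceptual hash is needed in practice: it is designed so that visually similar images produce similar signatures.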

On paper, Apple's intentions are commendable. However, the measure quickly met with hostility from many privacy advocates and whistleblowers, starting with Edward Snowden, who warned that the feature risked being hijacked and turned into a "surveillance and censorship infrastructure".

Apple clears traces of controversial anti-child pornography system

Faced with the general outcry, Apple decided to suspend the deployment of this feature until further notice. Now, our colleagues at The Verge report that the manufacturer has just updated the web page describing its child safety features.

Indeed, Apple has removed all mention of this controversial feature: references to the CSAM detection system have disappeared. Nevertheless, Apple insists it is not burying the project. "Based on feedback, we have decided to take more time over the coming months to gather input and make improvements before launching these critically important child safety features," a company spokesperson told the American outlet.

Note that while CSAM detection has fallen by the wayside, Apple has deployed two of the child-protection features presented in August 2021. With iOS 15.2, minors receive a warning when they receive or send an intimate photo; Apple had also planned to alert parents, but ultimately changed its mind. The second feature provides additional information when searches related to child exploitation are made through Siri, Spotlight or Safari Search.
