Controversy over Apple’s child pornography detection system continues



Since Apple announced its detection of child sexual abuse material for iOS, iPadOS and macOS earlier this month, there has been a great deal of debate, not just among security experts but also among Apple's own employees, who have asked the company not to implement it.

The latest to join the backlash against this system are more than 90 civil rights groups, which have written an open letter to Apple asking it to abandon its CSAM (Child Sexual Abuse Material) plans. The reason they cite is that this system could be exploited for other purposes.

What is CSAM?



CSAM stands for Child Sexual Abuse Material. In this context it refers to a catalog of known child sexual abuse photographs, compiled and periodically updated by different organizations, whose content is managed by the National Center for Missing and Exploited Children (NCMEC).

Each photograph stored in this database has a unique digital signature, a signature that will be compared against the photos stored in users' iCloud accounts. If a match is detected, the user's account will be blocked and the authorities will be notified.

Google, Dropbox and Microsoft have been running this kind of image scanning on user accounts for some time, but Apple has gone a step further with a new system called NeuralHash, which analyzes the user's encrypted cloud library, in theory, looking for these types of images, and to which not even Apple itself has access.
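To make the matching idea concrete, here is a minimal sketch in Swift. The helper names are hypothetical, and an ordinary SHA-256 digest stands in for NeuralHash, which in reality is a perceptual hash designed to survive resizing and re-encoding; the real hashing scheme and database format belong to Apple and NCMEC and are not reproduced here.

```swift
import Foundation
import CryptoKit

// Simplified stand-in: a cryptographic digest only matches bit-identical files,
// unlike a perceptual hash, but it is enough to illustrate the matching flow.
func imageDigest(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// `knownHashes` stands in for the NCMEC-derived database of signatures.
func matchesKnownDatabase(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageDigest(imageData))
}
```

Apple's published design also adds private set intersection and a match threshold before anything is surfaced for review, details this sketch deliberately leaves out.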

Backlash from civil rights organizations

Signatories of the letter include the American Civil Liberties Union, the Canadian Civil Liberties Association, the Australian organization Digital Rights Watch, the UK's Liberty and Privacy International, among others.

The letter begins by highlighting the capabilities of NeuralHash, stating that:

While these capabilities are intended to protect children and reduce the spread of Child Sexual Abuse Material (CSAM), we are concerned that they will be used to censor protected expression, threaten the privacy and safety of people around the world, and have disastrous consequences for many children.

Once this capability is incorporated into Apple products, the company and its competitors will face enormous pressure – and potentially legal requirements – from governments around the world to scan photos not just for CSAM, but also for other images that a government considers objectionable.

Those images may be of human rights abuses, political protests, images that companies have labeled as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them.

These 90-plus organizations argue that the scanning could be extended to images stored on the device, not just those stored in iCloud, and that Apple may therefore have created the basis for censorship, surveillance and persecution worldwide.

It is worth remembering that when governments such as China or Russia have pressed Apple to grant access to its users' content, Apple has bowed its head and complied with their demands. Who is to say that Apple will not follow the same policy with other countries?

The letter goes on to state that this system will put children at risk:

The system that Apple has developed assumes that the “parent” and “child” accounts involved actually belong to an adult who is the parent of a child, and that these individuals have a healthy relationship.

This is not always the case; an abusive adult may be the account organizer, and the consequences of notifying parents could threaten the safety and well-being of the child. LGBTQ+ youth in family accounts with unsympathetic parents are especially at risk.

The letter ends by stating that, despite Apple's efforts to reduce child abuse, the company must remain firm on the privacy commitments it has built up in recent years.

We support efforts to protect children and strongly oppose the proliferation of CSAM. But the changes that Apple has announced put children and other users at risk, both now and in the future. We urge Apple to abandon those changes and reaffirm the company’s commitment to protecting its users with end-to-end encryption. We also urge Apple to consult more regularly with civil society groups and vulnerable communities that may be disproportionately affected by changes to its products and services.

Messages


Apple will introduce this new feature with the release of macOS Monterey, iOS 15 and iPadOS 15, and it will be accompanied by a system that detects the dissemination of child sexual material through the Messages application and notifies the parent or guardian if a minor receives images classified as sexually explicit.

These images will initially appear blurred, and a message will explain to the minor (if they are 12 years old or younger) that the image is not suitable for them. If they choose to view it anyway, their parents will receive a notification along with the image. Arguably, it would be more sensible for the parents to give the go-ahead before the minor can access the image.

Images received through the Messages application will be scanned on the device, and that information will never leave it. Neither the authorities nor Apple will be aware of the event.
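As a rough illustration of the flow just described, the following Swift sketch encodes the decision logic; the type names and the on-device classifier flag are hypothetical and do not correspond to Apple's actual API.

```swift
// Hypothetical sketch of the Messages flow: flagged images are blurred and the
// minor is warned; parents are only notified if a child aged 12 or under
// chooses to view the image anyway.
struct ReceivedImage {
    let isFlaggedExplicit: Bool   // result of the on-device classifier (assumed)
}

enum MessageAction {
    case showNormally
    case blurAndWarn
    case notifyParentsAfterViewing
}

func handle(_ image: ReceivedImage, childAge: Int, childChoseToView: Bool) -> MessageAction {
    guard image.isFlaggedExplicit else { return .showNormally }
    if childAge <= 12 && childChoseToView {
        return .notifyParentsAfterViewing
    }
    return .blurAndWarn
}
```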

Siri



Siri has also joined the fight against child sexual abuse material. With the release of iOS 15, iPadOS 15 and macOS Monterey, users who search for this type of content will receive a notification informing them that they are searching for material deemed illegal, along with information on where to find help and how to report such content.

This process, like the analysis of images received through the Messages application, will be carried out internally on the device, without the knowledge of Apple or the authorities.
