
Facebook and Instagram want to put an end to child pornography on the Internet

Facebook and Instagram have just taken a new measure to combat the child pornography flooding their social networks, and it should also help other sites defend themselves.


A new online tool aims to give some control back to teenagers, and to adults who were once minors, by removing explicit images and videos of them from the Internet. Called Take It Down, the tool is managed by the National Center for Missing and Exploited Children (NCMEC) and funded in part by Meta Platforms, the owner of Facebook and Instagram.

It is essentially a database that allows users to submit a "digital fingerprint" of known child pornography material: a digital code derived from an image or video rather than the file itself. This code is then stored and shared with the other participating platforms so they can detect the same image or video being shared elsewhere online. Matching images can then be removed and blocked so that they do not proliferate.
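The mechanism described is hash-based matching: platforms compare the fingerprint of uploaded content against the submitted codes rather than comparing the files themselves. The sketch below illustrates the general principle in Python; the file names, the shared-database set, and the use of a plain SHA-256 hash are illustrative assumptions, since the article does not specify the service's actual fingerprinting method.

```python
import hashlib

# Illustrative sketch only: Take It Down submits a "digital fingerprint"
# (hash) of an image or video rather than the file itself. The exact hashing
# scheme is not described in the article, so a plain SHA-256 file hash is
# used here purely to show the general idea of hash-based matching.

def fingerprint(path: str) -> str:
    """Compute a hexadecimal digest of a file without retaining its contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical shared database of fingerprints submitted through the tool.
known_fingerprints = {fingerprint("reported_image.jpg")}

# A participating platform can check each new upload against the database
# and block it when the fingerprint matches known material.
if fingerprint("new_upload.jpg") in known_fingerprints:
    print("Match found: block the upload and flag it for review.")
```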


Meta wants to help remove intimate images of minors from the internet

Meta clarifies that the program is not only for those under 18: parents can act on behalf of a child, and adults can request the removal of images taken of them when they were minors.

"Having a personal intimate image shared with others can be frightening and overwhelming, especially for young people. It can be even worse when someone tries to use these images as a threat to obtain other images, sexual contact or money, a crime known as sextortion," said Antigone Davis, Meta's global head of safety.

To create a hash of an explicit image, a teenager can go to TakeItDown.NCMEC.org and install software on their device. The anonymized number, not the image, is then stored in a database linked to Meta, so that if the photo is ever posted on Facebook or Instagram, it can be matched against that hash, reviewed, and removed.

For its part, the NCMEC warns that the platforms have "limited capacities to remove content that is already online," but that the tool can still help mitigate or reverse the damage caused by unwanted sharing. In France, a law aimed at "guaranteeing respect for children's image rights" could soon be passed. The bill, tabled by Bruno Studer (Renaissance MP for the Bas-Rhin department), will be examined in March 2023.
