Instagram wants to fight sexual harassment online with a new tool designed to automatically detect and block nude photos sent in private messages. This should make browsing the network safer, especially for women.
Online harassment is a problem for many women, who find their inboxes filled with unsolicited photos, often of a sexual nature. Instagram is looking to create a tool to stem the problem.
Developer Alessandro Paluzzi spotted an interesting detail in the app’s code: a new feature that would automatically block sexually explicit images. It works using an AI that analyzes the photos.
Instagram wants to fight online harassment
On paper, this is the perfect tool to fight online sexual harassment. When a photo is sent in a private message, an AI determines whether or not it is a nude shot. If it is, the photo is still delivered, but blurred, and the recipient can then choose whether to un-blur it. Important clarification: Instagram analyzes the photos but does not have access to them.
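The flow described above can be sketched in a few lines. This is purely illustrative: Instagram’s actual model, APIs, and threshold are not public, so the `Photo` type, the classifier callback, and the 0.8 cutoff below are all assumptions, not the real implementation.

```python
# Hypothetical sketch of the blur-on-detect flow described in the article.
# The classifier is a stand-in for an on-device ML model: the analysis is
# assumed to run locally, so the server never needs to see the image.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Photo:
    pixels: bytes
    blurred: bool = False

def deliver(photo: Photo,
            classifier: Callable[[Photo], float],
            threshold: float = 0.8) -> Photo:
    """Run the (assumed on-device) nudity classifier before delivery.

    If the score crosses the threshold, the photo is still sent but
    flagged as blurred; the recipient can later choose to un-blur it.
    """
    if classifier(photo) >= threshold:
        photo.blurred = True  # delivered, but obscured by default
    return photo
```

A caller would plug in a real model where the `classifier` callback goes; here a lambda returning a fixed score is enough to exercise the decision logic.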
Speaking to The Verge, Meta confirmed it is working on such a feature:
We are working closely with experts to ensure that these new options preserve users’ privacy, while giving them full control over the messages they receive.
At the moment, the feature is still under development and may not arrive for many months. Even so, it is encouraging to see Meta starting to take action on these delicate subjects. The American company has long been singled out for its inaction in this area, which has made Instagram a veritable jungle when it comes to private messages.
The social network already took a step toward protecting its users a few weeks ago by automatically hiding sensitive content for anyone under the age of 16. Blocking private messages of a sexual nature is the logical next step.
Source: The Verge