
Facebook labels a video of black men as ‘primates’

What has happened?

A video from the Daily Mail was the trigger for this controversy. The outlet published the clip on the platform with a clear intention of denouncing racism: it showed a white man calling the police during a confrontation with a Black man at a marina. What no one expected was that the most glaring act of racism would come not from the recording itself but from an automated recommendation generated by the platform: Facebook.

After finishing the video, users of the social network received the usual automatic prompt asking whether they wanted to see more related posts. The problem was what it suggested: to “continue watching videos about primates.”

A Twitter user drew attention to the incident on her account with a screenshot showing exactly what we have described.

Facebook’s apologies

Although the video is a year old (it was uploaded by the British outlet in 2020), it appears to have gained renewed visibility on Facebook, appearing alongside the offensive prompt, as several users have complained of having seen it.

The social network has had little choice but to own up and issue a statement apologizing for the situation, which it attributes to the automated nature of its artificial intelligence:

This was clearly an unacceptable mistake. […] As we have said, although we have made improvements to our AI, we know that it is not perfect and we have more progress to make. We apologize to anyone who has seen these offensive recommendations.

In addition to the statement, Facebook says it has disabled the recommendation feature until it can pinpoint the cause of the error and fix it.

This is not the first time a large platform’s artificial intelligence has made such a mistake. As The Verge recalls, Google also had to apologize a few years ago when its photo app labeled Black people as “gorillas.” It was a huge and insulting error for a large part of the population, and it shows that artificial intelligence still needs significant adjustments to its algorithms when it comes to identifying people of color.


The US Federal Trade Commission (FTC) has already warned that this kind of AI error can violate certain consumer protection laws, telling companies that have been involved in such problems that they must take responsibility and improve their systems or the FTC will be forced to act.

Beyond any financial penalties, this is a serious insult to many people, and one that these platforms should address urgently and immediately.
