
On Facebook and Instagram, some celebrities are entitled to relaxed moderation

A Wall Street Journal investigation sheds light on the existence of a more accommodating moderation regime for celebrity accounts. The social network justifies the program by saying it is meant to avoid mistakes. But abuses have been observed, with Facebook proving more lenient toward these accounts than toward the rest of its members.

"Depending on whether you are powerful or wretched, the judgments of the court will make you white or black." This famous line from a La Fontaine fable seems to find an echo today in the way Facebook applies its rules to its community. That, at least, is what emerges from an investigation by the Wall Street Journal, published in its columns on September 13, 2021.

The American business daily reports, on the basis of internal documents it was able to consult, that the social network gives a small portion of its audience preferential treatment when it comes to content moderation. These privileges apply to users with a certain media profile – in short, to celebrities, whether from politics, show business, sport or the Internet.

5.8 million privileged profiles

This program for the stars is called "Cross Check" or "XCheck" and, according to 2020 data, it covered approximately 5.8 million profiles – a considerable number in absolute terms, but modest compared with the total population of the site. Remember that Facebook passed the 2 billion user mark back in 2017.

Accounts enrolled in XCheck are not exempt from moderation, but they are entitled to more consideration. When they are reported, their cases are handled much more carefully, sometimes take longer, and are reviewed several times – hence the name "cross check". The majority of members receive no such consideration.

These checks, which grow more meticulous with the person's notoriety, have a single goal: to prevent Facebook from becoming the target of a media storm triggered by a "VIP" – for example a politician whose questionable post, if removed, would lead to cries of censorship, or a web star who could whip up a backlash by mobilizing their community.

Footballer Neymar is cited as one of the personalities covered by the XCheck program. // Source: Wikimedia Commons / CC / Antoine Dellenbach

In short, the XCheck program amounts to sparing Facebook hassle and media attention by being more flexible toward those of a certain standing. The system reportedly covers not only the social network itself but also its well-known subsidiary Instagram, which is very popular with personalities from entertainment and sport, as well as with influencers.

The fact remains that the system has sometimes been too lax or too slow to act, leaving up content that would normally have been deleted, and deleted more quickly.

Our colleagues give two examples. The first concerns the footballer Neymar, who, after being accused of rape by a woman, published nude photos of her. Facebook eventually deleted the photos, but they stayed online long enough to be seen by his fans; Neymar is followed by tens of millions of people around the world.

The second does not name anyone in particular, but notes that statements deemed false by fact-checking programs – notably those run in partnership with the press – were left up on accounts covered by XCheck. Examples include claims that vaccines are deadly, that Hillary Clinton is linked to a pedophile network, or that Donald Trump called refugees animals.

Facebook defends its program

The revelations around XCheck prompted Facebook to dispatch a spokesperson on Twitter to point out that the program is not new – it was publicly mentioned in 2018 on the social network's own pages. He also defended its purpose and disputed parts of the Wall Street Journal article, while suggesting that it is probably unrealistic to expect flawless moderation from Facebook.

"There are not two systems of justice; it is an attempt to protect against error," Andy Stone wrote on Twitter. "At the center of this story is Facebook's own analysis that we need to improve the program. We know that our enforcement is not perfect and that there are tradeoffs between speed and accuracy," he admitted.

"There are not two systems of justice; it is an attempt to protect against error."

These cross-checks, he continued, recalling the objectives stated when the system was presented three years ago, target content published by celebrities, governments, or pages where enforcement errors have been made in the past. They also benefit the media, so that content is not removed, or left up, by mistake.

Andy Stone concluded by noting that the Wall Street Journal article relies on Facebook documents highlighting the need for change. Those changes, however, "are in fact already underway within the company. We have new teams, new resources and an overhaul of the process," he asserted. In short, the criticisms are, in his view, already out of date.

One problem remains, however: what about moderation for everyone else, the non-"premium" users? The Wall Street Journal article reports that in 2018, the social network's founder, Mark Zuckerberg, estimated that roughly 10% of the site's content-removal decisions were errors. One case in ten is considerable for a site of this size, especially since users were reportedly sometimes never told which rule they had broken and never given the opportunity to appeal. Things are changing with new mechanisms, but probably not fast enough.
