Fake news and toxicity: by trying too hard to drive reactions, Facebook promoted violent content

In 2018, Facebook deliberately changed its news feed algorithm to highlight the content generating the most engagement. According to an investigation by the Wall Street Journal, this change had the effect of promoting the most violent posts.

In 2018, nearly every media outlet on Facebook experienced a sudden, dramatic drop in audience. From Le Monde to Buzzfeed to the Finnish daily Helsingin Sanomat, all went through the same, sometimes brutal, decline: Le Monde's Facebook page saw its traffic fall by 30%.

This drop did not happen by chance: it was caused by a change to Facebook's algorithm, which selects the content shown in users' news feeds. The news feed is the part of the site where recent posts and photos shared by liked pages or by users' friends are displayed; it is also the part of Facebook where users spend the most time. This algorithm change profoundly transformed the social network, turning Facebook into a place where "fake news, toxicity, and violent content were overly shared," according to Facebook researchers whose notes the Wall Street Journal was able to read.

"Lower-quality" content

The American newspaper has published a new installment of its Facebook Files, a series of investigations into the social network. After covering the preferential moderation treatment given to the platform's stars, and the toxicity of Instagram for its youngest users, the Wall Street Journal's latest article examines the mechanics of Facebook's news feed.

Facebook's news feed is its most popular feature // Source: Eric McLean / Unsplash

Journalists had access to memos, emails, notes, and many other internal documents spanning several years, which detail the evolution of the news feed algorithm. Over the years, Facebook has made many small updates to the news feed, but the 2018 update was by far one of the most significant. One of its main changes was to give more weight to posts from friends, and less to posts from media outlets.

The aim of this change was to highlight the posts most likely to generate engagement, rather than content from businesses that encouraged more passive behavior. Facebook announced at the time that an algorithm would "determine which posts you are most likely to interact with your friends about, and make them appear higher in [the] news feed."

However, the algorithm worked in a way that over-emphasized the most negative content. Facebook's own researchers repeatedly warned that the new algorithm surfaced "lower-quality" content. It awarded 5 points each time an "angry" reaction was left on a post, while a "like" was worth only one point. The "love" reaction was also worth 5 points, but since the most divisive posts drew the most reactions overall, they were the ones pushed to the top. In an email sent to Facebook, Buzzfeed CEO Jonah Peretti argued that the new algorithm "did not reward the most relevant interactions." And the research results of Facebook's own employees proved him right.
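To make the weighting concrete, here is a minimal sketch in Python of how such reaction-weighted scoring could play out. Only the point values (5 for "angry" and "love", 1 for "like") come from the reporting; every name here (REACTION_WEIGHTS, engagement_score) and the overall structure are illustrative assumptions, not Facebook's actual code, and the real ranking system is far more complex.

```python
# Hypothetical sketch of reaction-weighted post scoring.
# The 5x weight for "angry"/"love" vs. 1 point for "like" comes from
# the article; all names and structure here are assumptions.

REACTION_WEIGHTS = {
    "like": 1,
    "love": 5,
    "angry": 5,
    # The real system also weighted comments, shares, etc.,
    # with values not detailed in the reporting.
}

def engagement_score(reactions: dict[str, int]) -> int:
    """Sum the weighted reaction counts for a single post."""
    return sum(REACTION_WEIGHTS.get(kind, 0) * count
               for kind, count in reactions.items())

# A divisive post with many "angry" reactions outranks a post
# with far more "likes":
divisive = {"like": 100, "angry": 300}   # 100*1 + 300*5 = 1600
friendly = {"like": 1000, "love": 50}    # 1000*1 + 50*5 = 1250
assert engagement_score(divisive) > engagement_score(friendly)
```

The toy comparison at the end illustrates the dynamic the researchers described: even though "angry" and "love" carry the same weight, a divisive post that provokes many reactions can outrank a post with ten times as many "likes".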

Outrage and sensationalism

Facebook researchers soon found that media outlets and some political parties had adjusted the tone of their posts in order to generate more reactions, and, to that end, tended to be more "sensationalist and outrageous," writes the Wall Street Journal. These decisions had direct effects on political positions: politicians reportedly told Facebook that they had changed the tone of their rhetoric, in part to have more impact on the social network.

Wall Street Journal reporters note that political parties in Poland, Spain, Taiwan, and India were influenced by the Facebook changes. "Many political parties, including those that changed their rhetoric, worry about the long-term effect [of this change] on democracy," explained an internal Facebook report that the Wall Street Journal was able to consult. "Our approach had very negative effects on political content," other researchers wrote in another memo on the subject.

Facebook CEO Mark Zuckerberg was aware of the implications of the algorithm change, but for a long time he refused to take corresponding measures. A team of researchers found that tweaking one aspect of the algorithm helped limit the virality of certain fake news; the change was applied to health-related posts in the spring of 2020. But when the team proposed to Mark Zuckerberg that it be rolled out to all types of content, he reportedly refused, fearing the impact on user engagement, writes the Wall Street Journal.
