Facebook is one of the most widely used social networks today. Five years ago it gave its users a new way to interact with posts, adding five new reactions alongside the classic “Like”: “Love,” “Haha,” “Wow,” “Sad,” and “Angry.”
We now know that Facebook’s News Feed algorithm was also changed after that launch, promoting the content that generated the most reactions, whether love or anger, over content that earned traditional Likes. In other words, since 2017 the social network has given more reach to posts that generated interaction through emoji reactions than to posts that earned likes.
The theory was simple: Posts that elicited a lot of reaction emojis tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business. – Will Oremus, Washington Post
However, this change brought with it some “unforeseen” consequences, chiefly favoring “controversial” posts, including those that made users angry, a significant number of which were fake news and spam.
The consequences of the algorithm change
In 2019, Facebook’s data scientists confirmed that a large portion of posts receiving “Angry” reactions contained false, misleading, toxic, or low-quality content, which forced Facebook to implement a new plan to rein in the algorithm.
Unfortunately, for three years the social network’s moderators were forced to fight an uphill battle against the toxic and harmful content on the platform.
However, this story was far from over. On Monday, October 25, 2021, whistleblower Frances Haugen, a former Facebook employee, accused the social network before the British Parliament of promoting the spread of harmful content at the expense of its users’ wellbeing: “Anger and hatred is the easiest way to grow on Facebook,” Haugen said.
Red flags and complicated regulation
Documents displayed during the hearing show Facebook employees on its “integrity” teams raising red flags about the human costs of specific elements of the ranking system, warnings that executives sometimes heeded and sometimes seemingly ignored.
But is anger necessarily a negative emotion? One of the most debated points in the hearing was that human emotions can be put to very different uses. The defense pointed out that posts that generate anger can be essential, for example to protest movements against corrupt regimes.
Quick question to play devil’s advocate: Will putting Reactions 5 times stronger than Likes lead to News Feed having a higher proportion of controversial content than enjoyable?
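The weighting that question refers to can be sketched in a few lines. This is a hypothetical illustration, assuming the reported 2017 scheme in which each emoji reaction counted five times as much as a Like; the function and weights here are illustrative, and Facebook’s real ranking system is far more complex.

```python
# Reported 2017 weighting: any emoji reaction (Love, Haha, Wow, Sad,
# Angry) counted five times as much as a traditional Like.
# These constants and this function are illustrative, not Facebook's code.
REACTION_WEIGHT = 5
LIKE_WEIGHT = 1

def engagement_score(likes: int, reactions: int) -> int:
    """Return a simple weighted engagement score for a post."""
    return likes * LIKE_WEIGHT + reactions * REACTION_WEIGHT

# A post with 100 Likes scores the same as one with only 20 Angry
# reactions, which is how provocative posts could outrank pleasant ones.
print(engagement_score(likes=100, reactions=0))  # 100
print(engagement_score(likes=0, reactions=20))   # 100
```

Under this toy model, a post only has to provoke a fifth as many reactions as a rival post earns Likes to reach the same score, which is the dynamic the quoted question worries about.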
For her part, Facebook spokesperson Dani Lever said:
We continue to work to understand what content creates negative experiences so that we can reduce its distribution. This includes content that has a disproportionate amount of angry reactions, for example.
Subsequent adjustments to the social network eliminated the extra weight of the “Angry” reaction, greatly reducing the reach of posts containing misinformation, disturbing content, or violence.
With information from The Washington Post.