This is shaping up to be a very delicate week for Facebook. The Wall Street Journal has been stringing together revelations about Mark Zuckerberg’s firm after obtaining a series of rather damning internal reports.
The latest investigation concerns a change made in 2018 to the news feed’s recommendation algorithm, which selects the content each user is shown. At the time, these changes were clearly announced by the tech giant’s executives.
The idea was to limit interactions with content produced by “professionals” in order to show more posts from friends and family, a change justified on mental health grounds. According to an internal memo our colleagues were able to consult, it was in fact a strategy to counter a decline in use of the platform.
“Violent content is abnormally prevalent”
The consequences were highly problematic since, according to the researchers’ observations, the effect was the opposite: inflammatory posts were given greater prominence.
Misinformation, toxicity and violent content are abnormally prevalent in re-shared content. (…) Our approach had unhealthy side effects on important slices of content, especially in politics and news. Our responsibility is growing. Many interlocutors told us they feared the long-term negative effects this algorithm could have on democracy.
The consequences were significant for certain political organizations and media outlets, which revised their strategies toward sensationalist and outrage-driven communication to boost engagement. We unfortunately know the rest of the story, notably the proliferation of online disinformation campaigns.
Questioned by the American media, Facebook did not fail to react. Quoted by Business Insider, a spokesperson defended the company:
Is a ranking change the source of the world’s divisions? No. Research shows that some partisan divisions in our society have been growing for many decades, long before platforms like Facebook existed. It also shows that meaningful engagement with friends and family on our platform is better for people’s well-being than the alternative.
Recall that this week, the Wall Street Journal published two other investigations into Mark Zuckerberg’s firm. The first concerns the existence of a differentiated moderation program reportedly applied to 5.8 million VIP users. The second concerns Instagram, which reportedly has very negative effects on the mental health of adolescent girls.