SAN FRANCISCO, Sept 8 ― Facebook is back in the spotlight following a study highlighting the popularity of fake news on the platform. Misinformation, it seems, generates more clicks and a higher rate of engagement than verified information, especially among the right-wing electorate.

By analysing a sample of more than 2,500 pages, researchers from New York University in the United States and the Université Grenoble Alpes in France discovered that misinformation content generated six times more engagement on Facebook than verified news stories. The study was conducted between August 2020 and January 2021, during the US presidential election period.

The study findings, reported in The Washington Post, show that the pages publishing the most fake news receive more frequent “Likes,” and generate more shares and comments. According to Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the researchers' findings, the study “helps add to the growing body of evidence that, despite a variety of mitigation efforts, misinformation has found a comfortable home ― and an engaged audience ― on Facebook.”

Leaning to the right

With the help of NewsGuard and Media Bias/Fact Check, the researchers looked closely at thousands of Facebook publishers from across the political spectrum and their propensity to share reliable or unreliable content. The study then compared interactions on posts from pages known for fake news, from Occupy Democrats (more left-leaning) to the political commentator Dan Bongino (more right-leaning) and the conservative media outlet Breitbart. The results highlighted the more viral nature of far-right and far-left content compared to more factual political content. However, fake news is more likely to circulate on the right than in other political categories. This is a significant finding, since the study was conducted during the American presidential election in November 2020.

The findings weren't to the taste of Rafael Rivero, the co-founder and president of Occupy Democrats: “We occasionally get small things wrong ― and immediately issue corrections ― but we would never deliberately mislead our readers,” he said in a statement to The Washington Post.

This isn't the first time that questions have been raised about Facebook's algorithm, yet the social network's spokesperson deplored the fact that the study did not take into account the posts' impressions, i.e., the number of times a user sees the post displayed on their feed. “This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook,” Facebook spokesman Joe Osborne told The Washington Post. “When you look at the content that gets the most reach across Facebook, it is not at all like what this study suggests.”

Recently, Facebook's much-discussed report on the platform's most viewed content, covering the first quarter of 2021, showed that an article from the Chicago Tribune linking the death of a doctor to Covid-19 vaccination topped the list of most-viewed content. It proved a hard blow for Facebook, which has been accused by President Joe Biden of “killing people.”

While Facebook's algorithm doesn't necessarily favour fake news or specific parts of the political spectrum, misinformation content seems to be more appreciated by users, which in turn makes it viral on the social network. According to the Washington Post: “Among publishers categorized as on the far right, those that share misinformation get a majority ― or 68 per cent ― of all engagement from users.” ― ETX Studio