PARIS, Oct 26 — The news may not be surprising, but it’s no less disappointing. Indeed, social media sites’ moderation techniques still do not seem to be wholly effective, recent research suggests, with TikTok and Facebook coming under fire in an investigation into their ability to detect election disinformation.

Ahead of the upcoming midterm elections in the United States, a study set out to test the ability of social networks to moderate content containing election disinformation, particularly when featured in advertisements. The research found that TikTok and Facebook aren’t always up to speed when it comes to moderation.

The investigation was helmed by the organisation Global Witness, in partnership with the Cybersecurity for Democracy (C4D) team at New York University. The researchers submitted 20 advertisements, 10 in English and 10 in Spanish, either containing false information about the upcoming US elections or aimed at delegitimizing the electoral process and its result. For its investigation, the team selected three of the most popular social networking platforms in the United States: YouTube, Facebook and TikTok. The objective was to determine whether, on the eve of the midterm elections, disinformation could once again play a role, as it did in 2020.

Variations between countries


When these ads, all of which clearly violate the rules of each platform, were submitted, the results were mixed. YouTube passed the test with flying colours: the video-hosting platform not only blocked the ads from running, but also suspended the channel carrying them. However, YouTube had approved ads containing disinformation in a similar test conducted by Global Witness in Brazil in August 2022.

On Meta’s platform, Facebook approved two of the English ads and five of the Spanish ones. Meta’s policy also seems to apply differently in different countries. According to the study, 20 per cent of English ads and 50 per cent of Spanish ads were approved in the US, compared to 30 per cent of English ads and 20 per cent of Spanish ads in the UK. As a result of this experiment, one of the three accounts through which the group published the test ads was removed from the platform. The other two are still active.

TikTok was also found wanting in terms of moderation, despite the many measures it has announced. The Chinese social network approved almost all of the ads, except for one in each language, both of which indicated that Covid-19 vaccination was required to vote: “Ads containing the wrong election day, encouraging people to vote twice, dissuading people from voting, and undermining the electoral process were all approved. The account we used to post the election disinformation ads was still live until we informed TikTok,” the report explains.


Misinformation on social networks remains a particular issue during elections, especially in the United States. With 247 million users on YouTube, 226 million on Facebook and an estimated 85 million users on TikTok, these platforms can be a major breeding ground for false information. — ETX Studio