KUALA LUMPUR, Aug 5 — Meta, the parent company of Facebook and Instagram, said in a recent report that it had identified and removed over 600 accounts across its social network platforms for violating its policy against “coordinated inauthentic behaviour”, with most of them alleged to be part of a “troll farm” seeking to corrupt or manipulate public discourse using fake accounts.

Meta claimed in its Quarterly Adversarial Threat report released yesterday that this network of fake accounts posted memes in the Malay language in support of the current government coalition, attempted to paint its critics as corrupt, and promoted the police.

“Typically, their posting activity accelerated during weekdays, taking breaks for lunch. Their fake accounts were fairly under-developed and some of them used stolen profile pictures.

“Some of them were detected and disabled by our automated systems,” the company said in the report.


To date, it said it has removed 596 Facebook accounts, 180 pages, 11 groups and 72 Instagram accounts. Meta said its investigation found that these accounts were linked to the Malaysian police force.

“We found this network after reviewing information about a small portion of this activity initially suspected to have originated in China by researchers at Clemson University. Although the people behind it attempted to conceal their identity and coordination, our investigation found links to the Royal Malaysia Police.”

Malay Mail is seeking comments from the police over the accusation.


The bogus pages and accounts were said to have some 427,000 followers, while about 4,000 accounts joined one or more of the groups and about 15,000 accounts followed one or more of the Instagram accounts.

The network was also said to have spent up to US$6,000 (about RM26,739) on ads on Facebook and Instagram, paid for primarily in ringgit.

Meta described “inauthentic behaviour (IB)” as an effort to mislead people or Facebook about the popularity of content, the goal of a community (through groups, pages or events), or the identity of the people behind it.

“It is primarily centred around amplifying and increasing the distribution of content, and is often (but not exclusively) financially motivated,” it said.

“IB operators typically focus on quantity rather than the quality of engagement. For example, they may use large numbers of low-sophistication fake accounts to mass-post or like their content — be it commercial, social or political,” the company added.

The tactics often used are similar to those of other large-scale online activities such as spam, Meta said.

IB is different from “coordinated inauthentic behaviour (CIB)”, in which operators invest in mimicking human social activity as closely as possible.

Meta’s quarterly public threat reporting began about five years ago, when it first shared findings about purported CIB by a Russian influence operation.

Since then, the company said it has expanded its ability to respond to a wider range of what it termed “adversarial behaviours”.

“To provide a more comprehensive view into the risks we tackle, we’ve also expanded our regular threat reports to include cyber espionage, inauthentic behaviour, and other emerging harms,” it said.