Meta, the parent company of Facebook, has identified and removed over 7,700 suspicious accounts and 990 pages on Facebook linked to Chinese law enforcement. Meta has dubbed the operation the “largest known cross-platform covert influence operation in the world.” The accounts were also found to be active on various other platforms, including Twitter, Reddit, TikTok, and Medium. Meta’s Q2 Adversarial Threat Report reveals that this extensive network of accounts was connected to a previously identified pro-China online operation known as “Spamouflage.”
The accounts in question focused on promoting positive commentary about Chinese policies while posting negative content about the US. Meta traced the origins of these influence operations to a collection of accounts in China that received content guidance from a central source. The content shared by these accounts included spammy links, political memes, and text posts, all aimed at enhancing China’s image, particularly regarding the Xinjiang region, and criticizing Western foreign policy.
Many of the posts contained typos and lacked coherence, with headlines questioning the origin of the Covid-19 pandemic, often suggesting that the US was responsible. The accounts also targeted individuals such as Steve Bannon, Yan Limeng, and Jiayang Fan. Furthermore, the influence operation funded Facebook ads using various currencies, including Chinese yuan, Hong Kong dollars, and US dollars.
Despite the large number of accounts involved, the operation did not gain substantial momentum. Meta’s automated systems played a significant role in thwarting it, detecting and removing most of the fake accounts quickly. Even so, Meta acknowledges the importance of continued vigilance in reporting and acting against such attempts.
The Spamouflage operation stood out for its vast size and extensive reach across multiple platforms. The accounts not only posted on Facebook but also shared pro-China audio on SoundCloud, cartoons on Pinterest and Pixiv, and even authored posts on Quora, typically publishing 5 to 10 posts per day.
This incident highlights how internet platforms offer intelligence agencies and government operators a convenient avenue for subtly manipulating narratives. Similar influence operations have been uncovered in the past, including one that promoted pro-US narratives to audiences in Russia, China, and Iran. It remains imperative to address and prevent these covert influence operations, while recognizing their limited ability to reach authentic audiences.
– Meta’s Q2 Adversarial Threat Report