Meta Disrupts Covert Influence Operations Across Multiple Countries
On Thursday, Meta disclosed that it had disrupted covert influence operations traced back to three countries: Iran, China, and Romania. The findings were published in the company's quarterly Adversarial Threat Report, which documents its ongoing efforts to counter coordinated misinformation on its platforms.
Overview of Disrupted Operations
In the latest report, Meta said it detected and dismantled the operations before they could build genuine followings on its platforms, part of the company's stated effort to protect users from disinformation and manipulative practices.
The Romanian Campaign
One prominent operation involved a network based in Romania, comprising 658 fake Facebook accounts, 14 Pages, and 2 Instagram accounts. The campaign targeted audiences within Romania and extended to other platforms, including TikTok, X (formerly Twitter), and YouTube. One of the Facebook Pages linked to the campaign had roughly 18,300 followers.
The fake accounts were used to manage Facebook Pages, drive users to off-platform websites, and comment on posts by public figures and news organizations. Posing as local residents, they shared content about sports, travel, and local news in an effort to blend into genuine community discourse.
Manipulation Tactics and Engagement Efforts
Notably, much of the content posted by these fictitious personas failed to gain traction with authentic audiences. To bolster their credibility, the operators also maintained a presence on other social media platforms. Meta observed that the campaign practiced strict operational security to conceal its origins, including the use of proxy IP infrastructure and careful scheduling of posts.
The posts were predominantly in Romanian and discussed local news and key events, including upcoming elections, illustrating how precisely these groups tailor manipulated information to a target audience.
Iranian Influence on Social Media
Another network identified in the report originated in Iran and targeted Azeri-speaking audiences in Azerbaijan and Turkey. It consisted of 17 Facebook accounts, 22 Pages, and 21 Instagram accounts. The counterfeit profiles were used primarily to post in groups, manage Pages, and artificially amplify engagement by commenting on the network's own content.
Accounts in this network commonly posed as female journalists and pro-Palestinian activists. They exploited trending hashtags such as #palestine, #gaza, and #starbucks to insert their narratives into wider public discussions. Their posts addressed current events, including the Paris Olympics, military actions, and criticism of U.S. leadership over the Israel-Palestine conflict.
Meta attributed the operation to a known cluster of malicious activity tracked as Storm-2035. Microsoft has previously described this Iranian network as a significant source of divisive messaging aimed at U.S. voter groups.
Chinese Operations Targeting Southeast Asia
Separately, Meta disclosed the removal of 157 Facebook accounts, 19 Pages, one Group, and 17 Instagram accounts belonging to campaigns targeting Myanmar, Taiwan, and Japan. These Chinese-origin operations generated fake accounts at scale, using AI tools to create convincing profile photos and coordinating activity in a pattern reminiscent of an "account farm."
The activity spread disinformation in multiple languages, including English, Burmese, Mandarin, and Japanese. In Myanmar, the operation promoted narratives calling for an end to the ongoing conflict while discrediting civil resistance movements and backing military governance. In Japan, the accounts sought to undermine trust in the government and its military alliance with the U.S. In Taiwan, they posted claims of corruption among political leaders while posing as anonymous users to create the illusion of organic public debate.
With these takedowns, Meta reaffirmed its commitment to combating coordinated misinformation campaigns operating at a global scale, underscoring the challenges of maintaining a trustworthy online environment.