Gangs linked to Iran and China join Putin’s disinformation war

Pro-Beijing and pro-Iran operatives are using the war in Ukraine to spread disinformation that supports those countries’ political interests — namely, advancing anti-Western narratives — according to threat intelligence experts at Mandiant.
Additionally, Iran’s cyber campaigns are using Russia’s invasion of its neighbor to target Saudi Arabia and Israel, the researchers found.
In a new report released today, Mandiant’s Alden Wahlstrom, Alice Revelli, Sam Riddell, David Mainor and Ryan Serabian analyze several information operations the team observed while responding to the conflict in Ukraine. The report also attributes these campaigns to actors that threat researchers believe operate in support of nation states including Russia, Belarus, China and Iran.
“They are leveraging the invasion opportunistically to pursue known campaign goals,” Wahlstrom, principal analyst at Mandiant, told The Register. “It’s a bit of a window into how other actors can use a major event to pursue their own interests on the world stage.”
Death to the Great Satan
In the case of Iran, that means adapting existing campaigns, some of which the Mandiant team has been following for years, to tie into the Russian invasion of Ukraine while continuing to push messaging against the US and its allies. This includes the pro-Iran Liberty Front Press (LFP) campaign, which Mandiant began tracking in 2018, and the Iran-aligned Endless Mayfly influence campaign that Citizen Lab reported on in 2019.
In this research, Mandiant for the first time names a pro-Iran group “Roaming Mayfly,” so called because of its similarities to the Endless Mayfly campaign.
“The campaigns aren’t new, but the activity and purpose are,” Revelli, a senior analyst at Mandiant, said in an interview. “They took advantage of stories relating to the invasion to target the usual adversaries.”
Some of the campaigns pushed messages aimed at Arabic-speaking audiences claiming that the United States has abandoned Ukraine as it did Afghanistan in 2021. “So it’s really focused on the United States and how it abandons its allies rather than the invasion per se,” Revelli added.
In other messages alleging Western hypocrisy and racism, pro-Iran campaigns used the invasion of Ukraine to contrast how the United States and its allies responded to Russia with how they treated Saudi Arabia after the Saudi-led intervention in Yemen.
Roaming Mayfly, meet Dragonbridge
Mandiant also documented Roaming Mayfly targeting the Russian public on the eve of the invasion in what security researchers say was an attempt to heighten tensions between Russia and Israel. Specifically, this effort used a fake Twitter account that posed as Russian journalist Fyodor Lukyanov to suggest that Israeli intelligence services supported Ukraine and had also supported earlier violent protests in 2000, 2004 and 2014.
Also in the research, Mandiant links a pro-Chinese-government effort called “Dragonbridge” to an ongoing disinformation campaign alleging that Pentagon-linked labs are conducting bioweapons research in Ukraine.
While it’s hard to gauge the influence or reach of these types of disinformation campaigns, the general chaos and distrust they sow is in itself a successful outcome, Wahlstrom said.
“Creating an influx of questionable information in a conflict zone at least raises questions or concerns about reliable sources of information – that greater possibility of confusion, of not knowing exactly when you can trust whether something posted on a website is real,” he said.
New Russian campaign details
In addition to Chinese and Iranian campaigns related to the Ukraine War, the research includes information on new Russian information operations.
This includes a Russian influence campaign known as Secondary Infektion, which began operating before the ground invasion and has spread misinformation about Ukrainian President Volodymyr Zelenskyy. Mandiant linked the operation to a false claim in March that Zelenskyy had died by suicide in a Kyiv military bunker.
A more recent Secondary Infektion campaign, published in both Ukrainian and Russian, falsely claimed that the Ukrainian and Polish governments were seeking to allow Polish troops to deploy to western Ukraine. And an early-April operation claimed that Poland had tried to use an alleged “provocation,” staged by Ukraine, as a pretext to station Polish troops in the country.
Additionally, a new Ghostwriter operation, which Mandiant publicly attributes for the first time, published fake content about Polish criminals harvesting organs from Ukrainian refugees.
“Obviously, this is an incredibly concerning narrative that has spread, with the potential to create mistrust between Ukrainian refugees and the places they are fleeing to,” Wahlstrom said.
“But in the context of viewing this as a Ghostwriter operation, it fits very well with the campaign’s established goals, which include fomenting mistrust or tension between different countries in the region, including Poland and Ukraine.”
Reminder: Ghostwriter, a crew suspected of being linked to Russia’s military intelligence service, the GRU, traffics in both disinformation and destructive cyberattacks. The US government blamed Ghostwriter for WhisperGate, the destructive malware strain used against Ukrainian government and private-sector networks on January 15 – before the ground and air bombardment began.
This same group also tried to dupe Facebook users with a fake YouTube video claiming to show Ukrainian soldiers emerging from a forest waving white flags. And Ghostwriter attempted to hack “dozens” of Ukrainian military personnel’s Facebook accounts, according to Meta.
When misinformation, destructive attacks collide
This gang in particular highlights the intersection of disinformation and destructive attacks that Russian-backed miscreants have employed against Ukraine since the conflict began.
While Mandiant addresses this alignment in its new report, identifying info ops that occurred alongside disruptive or destructive malware (such as the wiper attacks early in the war), the authors say they cannot definitively link these types of operations because of the limited pattern of overlap observed.
This suggests that some of the crews behind these info ops “are at least tied to groups with broader capabilities,” Wahlstrom added. “And that means the skills required to conduct a basic information operation differ from what you might need to develop and deploy malware.”
However, the two types of threats complement each other in a warzone, he added. “If one of the goals is to create fear and uncertainty among the population, then having both a very visible statement, like a website defacement, and then potentially some other activity that has a more destructive aspect and could make headlines has a double impact.” ®