Taking Action Against Coordinated Inauthentic Behavior in Moldova
As part of our regular updates on notable threat disruptions, we’re sharing our findings on coordinated inauthentic behavior (CIB) targeting Moldova that we disrupted early in Q3 of this year. We’re also sharing threat indicators linked to this activity to support the security community’s efforts to detect and counter malicious activity across the internet.
As a reminder, we view CIB as coordinated efforts to manipulate public debate for a strategic goal, in which fake accounts are central to the operation. In each case, people coordinate with one another and use fake accounts to mislead others about who they are and what they are doing. When we investigate and remove these operations, we focus on behavior, not content — no matter who’s behind them, what they post or whether they’re foreign or domestic.
Here is what we found:
We removed seven Facebook accounts, 23 Pages, one Group and 20 accounts on Instagram for violating our policy against coordinated inauthentic behavior. This network originated primarily in the Transnistria region of Moldova and targeted Russian-speaking audiences in Moldova. We removed this campaign before it was able to build authentic audiences on our apps.
This operation centered on about a dozen fictitious, Russian-language news brands posing as independent entities with a presence on multiple internet services, including ours, Telegram, OK (Odnoklassniki) and TikTok. Its brands included Tresh Kich, Moldovan Mole, Insider Moldova and Gagauzia on Air.
The individuals behind this activity used fake accounts – some of which were detected and disabled prior to our investigation – to manage Pages posing as independent news entities, post content, and drive people to this operation’s off-platform channels, primarily on Telegram. Some of these accounts went through significant name changes over time and used profile photos likely created using generative adversarial networks (GANs).
They posted original content, including cartoons, about news and geopolitical events concerning Moldova. This content included criticism of President Sandu, pro-EU politicians, and the close ties between Moldova and Romania. They also posted supportive commentary about pro-Russia parties in Moldova, including a small fraction referencing exiled oligarch Shor and his party. The operators also posted offers of money and giveaways, including food and concert tickets, to people in Moldova who would follow them on social media or make graffiti with the campaign’s brand names.
This campaign frequently posted summaries of articles from the legitimate news site point[.]md, but with an apparent pro-Russia and anti-EU slant added by the operators. They also amplified the Telegram channel of the host of a satirical political show in Moldova that is critical of pro-European candidates. One of this operation’s branded Telegram channels was promoted by a Page we removed last quarter as part of a Russia-origin CIB network (case #3 in the Q2 2024 report).
We found this network as part of our internal investigation into suspected coordinated inauthentic behavior in the region. Although the people behind this activity attempted to conceal their identities and coordination, our investigation found links to individuals from Russia and Moldova operating from the Transnistria region, including those behind a fake engagement service offering likes and followers on Facebook, Instagram, YouTube, OK, VKontakte, X and the petition platform Change.org. We also found some limited links between this CIB activity and a network from the Luhansk region in Ukraine that we removed in December 2020.
- Presence on Facebook and Instagram: 7 Facebook accounts, 23 Pages, 1 Group and 20 Instagram accounts.
- Followers: About 4,200 accounts followed one or more of these Pages, no accounts joined this Group, and around 335,000 accounts followed one or more of these Instagram accounts. The vast majority of these followers were outside of Moldova, which suggests the use of inauthentic engagement tactics to make these efforts appear more popular than they actually were.
- Ad spend: About $4,000, paid for mostly in US dollars.
Threat indicators
This section details unique threat indicators that we assess to be associated with the malicious network we disrupted. It is not meant to provide a full, cross-internet, historic view of this operation. It’s important to note that, in our assessment, merely sharing this operation’s links or engaging with its content would be insufficient to attribute an account to this campaign without corroborating evidence.
To help the broader research community study and protect people across different internet services, we’ve collated and organized these indicators according to the Online Operations Kill Chain framework, which we use to analyze many types of malicious online operations, identify the earliest opportunities to disrupt them, and share information across investigative teams. The kill chain describes the sequence of steps that threat actors go through to establish a presence across the internet, disguise their operations, engage with potential audiences, and respond to takedowns.
As part of our next threat reporting cycle, we’ll be adding these threat indicators to our public repository on GitHub.
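For researchers who prefer a machine-readable view, the sketch below shows one possible way to lay out the indicators from the following sections, keyed by kill-chain phase and tactic. This structure and its field names are illustrative choices of our own and are not the schema of the GitHub repository mentioned above; the sample entries are drawn from the indicators listed below.

```python
# Illustrative sketch only: a minimal, machine-readable layout for the
# indicators listed below, keyed by kill-chain phase and tactic. The field
# names and nesting are our own choices for this example, not the schema of
# the GitHub repository mentioned above.
INDICATORS = {
    "Acquiring assets": {
        "TikTok accounts": [
            "tiktok[.]com/@trech_kich6",
            "tiktok[.]com/@moldova_acum",
        ],
        "Other social media assets": [
            "ok[.]ru/group/70000005349948",
        ],
    },
    "Disguising assets": {
        "Fictitious news outlets": [
            "Trech Kich",
            "Moldova Online",
            "Moldovan Mole",
            "Fluieras",
            "Kishinev",
            "Real Chisinau",
            "Moldova Acum",
            "Gagauzia on Air",
            "Beltsy 24",
        ],
    },
}

# Flatten the mapping into (phase, tactic, indicator) rows, e.g. for a CSV
# export or for loading into a detection pipeline.
rows = [
    (phase, tactic, indicator)
    for phase, tactics in INDICATORS.items()
    for tactic, items in tactics.items()
    for indicator in items
]

for phase, tactic, indicator in rows:
    print(f"{phase} | {tactic} | {indicator}")
```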
Phase: Acquiring assets
Tactic: Acquiring Facebook accounts, Pages, Groups, Instagram accounts
- Threat indicators: 7 Facebook accounts, 23 Pages, 1 Group and 20 Instagram accounts.
Tactic: Acquiring TikTok accounts
- tiktok[.]com/@trech_kich6
- tiktok[.]com/@moldova_acum
Tactic: Acquiring Telegram channels
Tactic: Acquiring other social media assets
- ok[.]ru/group/70000005349948
Phase: Disguising assets
Tactic: Creating fictitious news outlets
- Треш Киш – Trech Kich
- Молдова онлайн – Moldova Online
- Молдавский Крот – Moldovan Mole
- Флуераш – Fluieras
- Кишинев – Kishinev
- Реальный Кишинев – Real Chisinau
- Молдова сейчас – Moldova Acum
- Гагаузия в эфире – Gagauzia on Air
- Бельцы 24 – Beltsy 24
Tactic: Adopting visual disguise
- Threat indicators: Using profile photos likely generated with artificial intelligence techniques such as generative adversarial networks (GANs)
Phase: Evading detection
Tactic: Camouflaging content
- Threat indicators: Frequently posting summaries of articles from the legitimate news site point[.]md, but with an apparent pro-Russia and anti-EU slant added by the operators.
Phase: Targeted engagement
Tactic: Running Ads
- Threat indicators: About $4,000 in ad spend on Facebook, paid for mostly in US dollars
Tactic: Engaging with users outside the operation
- About 4,200 accounts followed one or more of these Pages.
- About 335,000 accounts followed one or more of these Instagram accounts. However, the vast majority of these followers were outside of Moldova, suggesting the use of inauthentic engagement tactics to make these efforts appear more popular than they actually were.
Tactic: Engaging with specific audience
- Threat indicators: Targeting Russian-speaking audiences in Moldova
Tactic: Directing online traffic
- Threat indicators: Using fake accounts to drive people to this operation’s off-platform channels, including Telegram channels
Tactic: Posting about individuals or institutions
- Posting original content that included criticism of President Sandu, pro-EU politicians, and close ties between Moldova and Romania.
- Posting supportive commentary about pro-Russia parties in Moldova, including a small fraction referencing exiled oligarch Shor and his party.
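A note on the URL-style indicators above: they are written in defanged form, with dots replaced by “[.]” so they are not clickable. Researchers comparing their own data against this list would typically refang them first. The sketch below shows one way to do that; the helper names are hypothetical, and, as noted earlier, a URL match on its own is not sufficient to attribute an account to this campaign.

```python
# Illustrative sketch only: refang the defanged indicators above and check
# whether an observed URL contains one of them. The function names here are
# hypothetical, and a match on its own is not enough to attribute an account
# to this campaign without corroborating evidence.
def refang(indicator: str) -> str:
    """Turn a defanged indicator like 'ok[.]ru/...' back into 'ok.ru/...'."""
    return indicator.replace("[.]", ".")


def matches_indicator(observed_url: str, indicators: list[str]) -> bool:
    """Check whether an observed URL contains any refanged indicator."""
    normalized = observed_url.lower()
    for prefix in ("https://", "http://", "www."):
        normalized = normalized.removeprefix(prefix)
    return any(refang(item).lower() in normalized for item in indicators)


indicators = [
    "tiktok[.]com/@trech_kich6",
    "tiktok[.]com/@moldova_acum",
    "ok[.]ru/group/70000005349948",
]

print(matches_indicator("https://www.tiktok.com/@moldova_acum", indicators))  # True
print(matches_indicator("https://example.com/unrelated", indicators))         # False
```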