A Pro-Russia Disinformation Campaign Is Using Free AI Tools to Fuel a ‘Content Explosion’

A pro-Russia disinformation campaign known as Operation Overload has dramatically scaled up its output using free consumer AI tools, producing more than twice as much content in the past eight months as it did in the entire previous year. The campaign leverages readily available image generators, voice-cloning technology, and other consumer AI tools to create fake videos, manipulated images, and fabricated posts targeting global elections, Ukraine, and immigration across multiple platforms.
The content explosion: Between September 2024 and May 2025, Operation Overload produced 587 unique pieces of content—more than double the 230 pieces created in the entire previous year from July 2023 to June 2024.
- The majority of recent content was created using AI tools, marking what researchers call a shift toward “more scalable, multilingual, and increasingly sophisticated propaganda tactics.”
- Video production alone jumped from 150 videos over 13 months to 367 videos in just eight months, with most recent videos using AI manipulation technology.
What tools they’re using: Researchers identified Flux AI, a text-to-image generator from Black Forest Labs, a German company founded by former Stability AI employees, as a primary tool in the campaign’s arsenal.
- Using the image-analysis service SightEngine, researchers found a 99 percent likelihood that fake images—including those depicting Muslim migrants rioting in Berlin and Paris—were created with Flux AI.
- The campaign uses prompts containing discriminatory language like “angry Muslim men” to generate inflammatory content that “promotes racism and fuel[s] anti-Muslim stereotypes.”
- Voice cloning technology manipulates videos to make prominent figures appear to say things they never did, including a fabricated video of Isabelle Bourdon, a senior lecturer at France’s University of Montpellier, seemingly encouraging German citizens to riot and vote for the far-right AfD party.
Platform reach and response: The AI-generated content spreads across more than 600 Telegram channels, as well as bot accounts on X, Bluesky, and, most recently, TikTok.
- Just 13 TikTok accounts sharing the content amassed 3 million views before the platform demoted them.
- While Bluesky suspended 65 percent of fake accounts, researchers noted that “X has taken minimal action despite numerous reports on the operation and growing evidence for coordination.”
The unusual email strategy: Operation Overload employs a counterintuitive tactic of directly alerting media and fact-checking organizations to their own fake content.
- Since September 2024, the campaign has sent up to 170,000 emails to more than 240 recipients, asking fact-checkers to investigate whether its fabricated content is real.
- Getting content posted by legitimate news outlets—even when marked as “FAKE”—serves the campaign’s ultimate goal of amplifying their disinformation.
What the experts are saying: The campaign’s sophistication and tool diversity surprised researchers tracking the operation.
- “What came as a surprise to me was the diversity of the content, the different types of content that they started using,” said Aleksandra Atanasova, lead open-source intelligence researcher at Reset Tech, a London-based nonprofit. “It’s like they have diversified their palette to catch as many like different angles of those stories.”
- Black Forest Labs responded that they “build in multiple layers of safeguards to help prevent unlawful misuse” and support partners in implementing moderation tools, though researchers found no metadata in the images they reviewed.
The bigger picture: This campaign reflects a broader trend of Russian disinformation networks embracing AI tools for content creation.
- The American Sunlight Project estimates Russian networks produce at least 3 million AI-generated articles annually, content that’s “poisoning the output” of AI chatbots like ChatGPT and Google’s Gemini.
- As distinguishing real from AI-generated content becomes increasingly difficult, experts predict continued growth in AI-fueled disinformation campaigns.