Contents
- Information battlefield
- Earthquake simulated scenario
- Digital simulation
- Synthetic narrative examples
- What the data reveals
- Separating signal from noise
- How coordinated attacks work
- Why reactive analysis is not enough
- Wider applications of simulators
- The technology behind the simulation
- Shifting to prediction
- Information simulation access
The information ecosystem has fundamentally changed. We now operate in a complex knowledge environment shaped by three interconnected forces:
- Answering Machines: AI systems generate instant responses at scale, making it harder to distinguish between authentic human communication and machine-generated content. These systems can produce coordinated messaging across platforms, languages, and formats faster than human teams can respond.
- Synthetic Information: Artificially created narratives designed to appear organic spread through coordinated networks that mimic genuine grassroots movements. These are not random trolls posting inflammatory content – they are sophisticated operations that understand audience psychology, platform algorithms, and narrative construction.
- Walled Gardens: Closed information spaces where narratives circulate without external fact-checking create echo chambers that amplify false information. Once a narrative takes hold in these spaces, it becomes increasingly difficult to introduce contradictory evidence.
And this is not theoretical.
The Information Battlefield Has Changed
A recent analysis by our team of Telegram channels revealed a coordinated narrative push around the call for a Euro referendum in Bulgaria, demonstrating how information operations target specific platforms and audiences with measurable precision.

The data we found tells quite a story:
- Network analysis showed coordination patterns
- Engagement metrics revealed which narratives gained traction
- Channel reach analysis identified the most influential sources
- Visual communication tracking demonstrated how memes and graphics amplified messages across platforms
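As a minimal sketch of what "coordination patterns" can look like in practice, the snippet below flags any text posted verbatim by several distinct accounts within a short window. The post records, account names, and thresholds are all hypothetical, not taken from our analysis:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical post records: (author, text, timestamp)
posts = [
    ("acc_a", "Dam failure imminent!", datetime(2025, 9, 6, 8, 0)),
    ("acc_b", "Dam failure imminent!", datetime(2025, 9, 6, 8, 4)),
    ("acc_c", "Dam failure imminent!", datetime(2025, 9, 6, 8, 7)),
    ("acc_d", "Stay safe everyone, aftershocks expected.", datetime(2025, 9, 6, 9, 30)),
]

def coordinated_texts(posts, min_accounts=3, window=timedelta(minutes=15)):
    """Flag texts posted verbatim by several distinct accounts inside a short window."""
    by_text = defaultdict(list)
    for author, text, ts in posts:
        by_text[text].append((ts, author))
    flagged = []
    for text, items in by_text.items():
        items.sort()
        authors = {author for _, author in items}
        if len(authors) >= min_accounts and items[-1][0] - items[0][0] <= window:
            flagged.append(text)
    return flagged

print(coordinated_texts(posts))  # → ['Dam failure imminent!']
```

Real detection combines many more signals (posting cadence, account age, shared media), but even this crude verbatim-plus-time heuristic already separates the coordinated burst from organic chatter.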
When the Ground Shakes, the Information Space Erupts
Imagine this scenario:
6 September 2025, 06:19 local time. A 7.2 magnitude earthquake strikes Montana, Bulgaria, causing severe damage to the municipality and the surrounding region.

Within hours, the information sphere erupts, not just with genuine concern and rescue coordination, but with calculated disinformation narratives designed to exploit the chaos:
- Claims that a dam failure created a nuclear disaster
- Accusations that the EU Green Deal caused the power outage
- Conspiracy theories claiming NATO was occupying Bulgaria
- Assertions that the earthquake was caused by HAARP
- Provocations involving the ‘Alyosha’ Monument
- Rumours that an epidemic had been intentionally caused by rescuers
This was not random panic. These were coordinated narratives, timed to exploit the confusion and fear following a disaster.
But here is the difference: this earthquake scenario was a simulation.

Every false narrative, every coordinated account, every platform amplification – all created in a controlled environment where communication teams could experience information warfare without real-world consequences.
Thanks to our Information Environment Simulator.
Building Reality in the Digital Space
Rather than merely imagining such scenarios, our Information Environment Simulator recreates them in the digital space with remarkable fidelity.

The simulation combined multiple data sources to create a realistic information environment:
- Organic Data Baseline (50% of content):
Real conversations and content from 4-10 August 2025, establishing authentic communication patterns and providing the foundation for realistic scenario development.
- Prompted Synthetic Narratives (35% of content):
Scripted disinformation themes deployed 6-13 September, including the dam failure narrative, green deal blame, epidemic conspiracy, HAARP theory, NATO occupation claims, and monument provocations.
- Prompted Government Communication (5% of content):
Official responses including earthquake updates, infrastructure status reports, international assistance coordination, and public safety messaging.
- Emergent Synthetic Narratives (10% of content):
Unscripted, AI-generated responses that emerged organically during the simulation, including false rescue claims, prophetic content, Russia help narratives, and Ukraine resource accusations.
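One way to picture this content mix is as weighted sampling over the four source types. The sketch below draws simulated content-type labels according to the stated 50/35/5/10 split; the function and label names are illustrative, not the simulator's actual API:

```python
import random

# Content mix used in the simulation (share of total content)
CONTENT_MIX = {
    "organic_baseline": 0.50,
    "prompted_synthetic_narratives": 0.35,
    "emergent_synthetic_narratives": 0.10,
    "prompted_government_communication": 0.05,
}

def sample_content_types(n, mix, seed=42):
    """Draw n content-type labels according to the configured mix."""
    rng = random.Random(seed)
    labels = list(mix)
    weights = [mix[label] for label in labels]
    return rng.choices(labels, weights=weights, k=n)

drawn = sample_content_types(10_000, CONTENT_MIX)
share = drawn.count("organic_baseline") / len(drawn)
print(f"organic share in sample: {share:.2%}")  # ≈ 50%
```

Keeping organic content as the majority baseline matters: the synthetic narratives have to compete for attention inside realistic background noise, just as they would on live platforms.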

In our example scenario, the platform emulated TikTok, Telegram, X (formerly Twitter), Facebook, and online news structures, using organic datasets as baselines for extrapolation.
That way, it simulated the hybrid information environment where real and false information compete for attention.
Synthetic Case Example
One synthetic narrative demonstrated the simulator’s power with chilling accuracy.
“NATO is Occupying Bulgaria”
The post appeared on X from an account called “A New Look at Reality” (@novpogled), created on the very day of the earthquake. The message, posted in Bulgarian with an English translation, warned citizens that NATO forces were occupying Bulgaria under the guise of humanitarian aid.

The metadata revealed the coordination:
- Source: X
- Author: newly created account
- Timestamp: 2025-09-08 08:39
- Photo metadata: 0 (suggesting synthetic creation)
- Account created: 2025-09-06 (perfect timing)
The message gained 64 retweets, 32 quote tweets, and 422 likes. More concerning, the narrative did not stay isolated. The simulation tracked how this false claim spread across platforms, gained legitimacy through repetition, and eventually influenced how people understood the rescue operations.
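Metadata signals like these lend themselves to simple heuristic scoring. The sketch below collects red flags of the kind listed above; the thresholds and function names are assumptions for illustration, not the simulator's detection logic:

```python
from datetime import datetime

EVENT_START = datetime(2025, 9, 6, 6, 19)  # earthquake onset in the scenario

def suspicion_signals(account_created, posted_at, photo_metadata_count):
    """Collect simple red flags from account and post metadata."""
    signals = []
    if (posted_at - account_created).days <= 7:
        signals.append("account created days before posting")
    if photo_metadata_count == 0:
        signals.append("no photo metadata (possible synthetic image)")
    if abs((account_created - EVENT_START).days) <= 2:
        signals.append("account creation aligned with event")
    return signals

# Values from the synthetic @novpogled example
flags = suspicion_signals(
    account_created=datetime(2025, 9, 6),
    posted_at=datetime(2025, 9, 8, 8, 39),
    photo_metadata_count=0,
)
print(flags)
```

No single flag is conclusive on its own; it is the co-occurrence of fresh account, stripped media metadata, and event-aligned timing that suggests a coordinated synthetic source.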
What the Data Reveals

Over the week-long simulation period, the Information Environment Simulator generated comprehensive analytics that would be impossible to gather during a real crisis:
| Dimension | Findings |
|---|---|
| Content Timeline | Document volume peaked at over 500 items during crisis moments, indicating coordinated amplification and showing when specific narratives gained momentum. |
| Source Distribution | Online news: 56.8%, Facebook: 13.8%, Telegram: 12.9%, TikTok: 6.72%, Twitter: 6.5%, Comments: 3.22%. Each platform played a distinct role in the spread of narratives. |
| Emerging Topics | Network analysis identified clusters of related disinformation themes, revealing connections and how some topics reinforced others. Demonstrated the compounding effect of conspiracy theories. |
The simulator captured the organic chaos of information warfare: multiple narratives competing simultaneously, authentic and synthetic content intermingling, and real-time evolution as narratives adapted to counter-messaging.
Separating Signal from Noise
Here’s where the Information Environment Simulator becomes truly powerful.
Identrics’ WASPer models are trained to detect and analyse synthetic propaganda content, identifying manipulative narratives and information distortion tactics.

When applied to the simulation data, WASPer separated authentic content from propaganda with remarkable accuracy. The timeline visualisation showed two distinct patterns:
- Red line (No Propaganda): Natural peaks and valleys of legitimate conversation about the earthquake and response.
- Purple line (Propaganda): Barely visible flatline at the bottom, representing the small but coordinated synthetic narratives that WASPer successfully identified and isolated.
This capability transforms how organisations can respond to information threats. Instead of treating all content equally, teams can focus resources on the actual disinformation while maintaining normal communication with genuine audiences.
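The two timeline series above come from counting classified documents per day and label. A minimal sketch of that aggregation step, with hypothetical labels standing in for a detector's output (this is not WASPer's actual interface):

```python
from collections import Counter
from datetime import date

# Hypothetical classified documents: (publication date, detector label)
classified = [
    (date(2025, 9, 6), "no_propaganda"), (date(2025, 9, 6), "propaganda"),
    (date(2025, 9, 6), "no_propaganda"), (date(2025, 9, 7), "no_propaganda"),
    (date(2025, 9, 7), "propaganda"), (date(2025, 9, 8), "no_propaganda"),
]

def label_timeline(docs):
    """Count documents per (day, label) so each series can be plotted separately."""
    return Counter((day, label) for day, label in docs)

timeline = label_timeline(classified)
print(timeline[(date(2025, 9, 6), "propaganda")])  # → 1
```

Plotting the two resulting series separately is what makes the pattern visible: legitimate conversation dominates the volume, while the propaganda line stays small but temporally coordinated.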
The Anatomy of Coordinated Attacks
The simulation revealed sophisticated coordination tactics that mirror real-world information operations:
Document Similarity Analysis

Synthetic content clustered together in the early days of the crisis (6-7 September), showing coordinated deployment.
Green and orange clusters indicated different narrative families, whilst scattered grey dots represented organic content’s natural variation.
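Document similarity of this kind is often computed as cosine similarity between word-count vectors. The toy example below (hypothetical texts, deliberately simplified bag-of-words representation rather than the simulator's actual pipeline) shows why near-duplicate synthetic posts cluster while organic content scatters:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) \
         * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = [
    "dam failure caused a nuclear disaster",          # synthetic narrative
    "the dam failure triggered a nuclear disaster",   # near-duplicate variant
    "volunteers are coordinating blankets and water", # organic content
]
bags = [Counter(d.lower().split()) for d in docs]

sim_coordinated = cosine(bags[0], bags[1])  # high: shared narrative template
sim_organic = cosine(bags[0], bags[2])      # low: no vocabulary overlap
print(round(sim_coordinated, 2), round(sim_organic, 2))
```

High pairwise similarity among posts deployed in the same narrow time window is exactly the clustering signature visible in the green and orange groups.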
Loudest Synthetic Authors

The analysis identified the most prolific synthetic accounts across each platform.
- On Facebook, several accounts, including coordinated online news outlets, drove amplification.
- Telegram channels also pushed their own narratives.
- TikTok accounts including “martin_x_truth” and “niko_realbg” spread visual disinformation.
- Twitter accounts systematically amplified messages across the platform.
These were not random users sharing concerns. These were coordinated synthetic accounts designed to appear authentic whilst pushing specific narratives.
Why Reactive Analysis Is Not Enough
In information warfare, speed matters as much as accuracy. The first narrative to reach audiences – whether true or false – shapes how subsequent information is received.
This is why reactive crisis communication consistently fails.
The Information Environment Simulator allows organisations to:
| Capability | Key question |
|---|---|
| Pre-identify vulnerable narratives | Which false claims are most likely to target your organisation during specific crisis types? |
| Pre-develop counter-messaging | What truthful narratives can you deploy immediately to establish accurate information before disinformation spreads? |
| Pre-train response teams | How quickly can your team recognise coordinated disinformation and deploy effective responses? |
| Pre-test message effectiveness | Which communication approaches actually counter false narratives versus inadvertently amplifying them? |
By experiencing these scenarios in simulation, organisations can be first with the truth when real crises occur.
Applications Beyond Emergency Response
While the earthquake scenario focused on emergency management, the Information Environment Simulator has far-reaching applications across multiple sectors:

Corporate Crisis Management
Product recalls, leadership scandals, or operational failures can prompt coordinated disinformation campaigns against your brand. Simulate the spread of false narratives throughout your stakeholder ecosystem to identify vulnerabilities.
Public Health Communications
Incidents such as vaccine misinformation, treatment conspiracies, or outbreak-induced panic demonstrate the dangers of health-related disinformation. Use the simulator to test effective response strategies before the next health crisis emerges.
Electoral Integrity
Election-related disinformation often follows recognisable patterns. Simulate the dissemination of false claims about voting processes or outcomes within targeted communities to preempt threats to electoral integrity.
Financial Services
Rumours of bank runs or fraudulent investment schemes can inflict genuine economic harm. Prepare response protocols and scenario drills to safeguard market confidence before it is undermined.
Infrastructure Operators
Incidents affecting infrastructure frequently attract conspiracy theories. Utility companies, transport operators, and communications providers can use advanced simulations to prepare for disinformation in the aftermath of operational disruptions.
The Technology Behind the Simulation
The Information Environment Simulator’s effectiveness comes from sophisticated AI technology working behind the scenes:
| Technology | Purpose |
|---|---|
| Natural Language Processing | Understanding context, sentiment, and intent across multiple languages, detecting manipulative framing and emotional exploitation, and identifying narrative evolution as stories develop. |
| Network Analysis | Mapping information flow between accounts and platforms, identifying coordination patterns that suggest orchestrated campaigns, and revealing amplification networks that give disinformation reach. |
| Behavioural Modelling | Simulating how different audience segments respond to specific narratives, predicting which messages will resonate with vulnerable populations, and testing how counter-messaging affects information spread. |
| Synthetic Content Generation | Creating realistic disinformation scenarios that mirror actual tactics, generating coordinated inauthentic behaviour patterns, and producing the chaos and confusion that characterise real information attacks. |
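The behavioural-modelling row can be illustrated with a toy contagion sketch: seeded accounts expose random contacts each step, and each exposed user adopts the narrative with some probability. Every parameter here is a hypothetical placeholder, not the simulator's actual model:

```python
import random

def simulate_spread(population=10_000, seeds=20, p_share=0.08,
                    contacts=12, steps=10, seed=1):
    """Toy narrative-contagion model: each adopter exposes a few random
    contacts per step; exposed users adopt with probability p_share."""
    rng = random.Random(seed)
    adopted = set(range(seeds))  # the initial coordinated accounts
    for _ in range(steps):
        new = set()
        for _ in adopted:
            for _ in range(contacts):
                target = rng.randrange(population)
                if target not in adopted and rng.random() < p_share:
                    new.add(target)
        adopted |= new
    return len(adopted)

reach = simulate_spread()
print(f"narrative reached {reach} of 10000 simulated users")
```

Even this crude model reproduces the qualitative lesson of the simulation: a small coordinated seed can reach a large audience within a few propagation steps, which is why early counter-messaging matters.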
This is not just data analysis. It is a complete information environment running in parallel to reality, allowing organisations to test and refine their responses without real-world consequences.
The Shift from Reactive to Predictive
We are moving from a world where communication teams react to disinformation after it spreads to one where they can anticipate, prepare for, and counter information threats before they cause damage.
The question is not whether your organisation will face coordinated disinformation. The question is whether you will face it prepared.
Traditional crisis communication assumes you will have time to assess the situation, convene the team, develop messaging, and deploy responses. Information attacks allow no such luxury: by the time you recognise the threat, false narratives have already established themselves in your stakeholder ecosystem.
Simulation changes this dynamic.
Instead of encountering disinformation tactics for the first time during a real crisis, your team experiences them in controlled environments where mistakes do not cost reputation, stakeholder trust, or market position. The patterns become familiar. The coordination signatures become recognisable. The effective counter-strategies become instinctive.
When the real crisis hits, your team follows protocol without panic.
Ready to Experience Information Warfare Without the Risk?
The Information Environment Simulator helps organisations move from reactive crisis management to strategic preparedness.
Our solution allows you to:
- Run realistic disinformation scenarios specific to your organisation
- Test response strategies in a safe environment
- Build team capabilities through regular simulation exercises
- Develop data-driven crisis communication protocols
- Move from reactive firefighting to predictive preparedness
In information warfare, first truth wins. Make sure it is your truth.
Discover how predictive simulation can strengthen your team’s capability to respond to coordinated disinformation before it causes damage.
