AI, war and propaganda are becoming increasingly linked as artificial intelligence reshapes how media is created during conflict.
The way information is being presented during war is beginning to change, and not always in ways that are easy to detect.

In recent weeks, there have been growing reports that AI-generated or manipulated war-related content is circulating more widely online. Images, video clips and audio are being questioned more frequently, and in some cases it is becoming harder to distinguish what is real from what has been artificially created (How misinformation and AI deepfakes are reshaping the Iran war).
That alone is significant. But it also raises a deeper and more uncomfortable question.
Propaganda has always played a central role in conflict.
Throughout history, governments and regimes have used media to shape perception, influence public opinion and justify actions. Posters, radio broadcasts and controlled news coverage were all used to present a particular version of events.
If we look back at some of the most powerful propaganda machines of the past, it is difficult not to ask how they might have evolved with access to today’s technology. The scale and speed at which AI can now generate convincing content introduces a very different level of capability.
Artificial intelligence does not just allow content to be edited. It allows it to be created.
Recent examples reported in the media include fabricated or heavily altered footage showing scenes of destruction, military activity or symbolic attacks that never actually occurred, yet still gained significant traction online before being challenged (AI fakes about Iran US war swirl on X despite policy crackdown).
This represents a shift. It is no longer just about controlling the narrative. It is about potentially creating the evidence that appears to support it.

In older science fiction films, there was a recurring idea that technology would reach a point where reality itself could be manipulated: synthetic broadcasts, simulated events and environments where seeing was no longer believing.
That concept no longer feels as distant as it once did.
What makes this particularly complex is that the issue is not simply about fake content.
It is about uncertainty.
Even a relatively small volume of AI-generated or manipulated material can have a wider effect. Once people become aware that convincing content can be fabricated, it changes how all content is viewed. Genuine footage may be questioned. Real events may be doubted.
Some reports suggest that this erosion of certainty is already beginning to take hold, with audiences increasingly unsure about what they are seeing, even when the material is authentic (Deepfakes are already shaping opinions around conflicts).
There is also a distribution dynamic that cannot be ignored.
Content now spreads based on engagement rather than origin. Platforms prioritise material that captures attention, and during periods of conflict, that often means content that is emotional, dramatic or visually striking.
Artificial intelligence plays a role not only in creating content, but also in shaping how it is distributed. In some cases, manipulated or misleading media has reached large audiences before verification processes have had time to catch up (AI deepfakes blur reality in 2026 US midterm campaigns).
At the same time, it is important to remain measured.
Not all questioned content is artificially generated. Misinterpretation, selective editing and the use of material out of context have existed long before AI. The difference now is that the tools available have significantly increased both the realism and the scale at which such content can be produced.
So where does this leave us?
Perhaps in a position that feels slightly unfamiliar.
People are no longer simply consuming information. They are interpreting it, questioning it and making ongoing judgements about its credibility in real time.
There is a certain irony in all of this.
For years, science fiction imagined a future where technology could blur the line between reality and illusion. That future always seemed distant.
Now, it feels much closer.
Not fully realised, not fully understood, but present enough to change how information is viewed.
And once that shift begins, it does not easily reverse.