A fake photo showing an “explosion” near the Pentagon went viral on Twitter today, rattling Wall Street and renewing concerns about the dangers of artificial intelligence (AI) applications.

The controversial photo, apparently created with such an application, forced the US Department of Defense to respond. “We confirm that this is false information and that the Pentagon was not attacked today,” a department spokesman said.

The Arlington Fire Department was also quick to clarify via Twitter that there was no explosion or other incident at the Pentagon or in the surrounding area.

The brief disruption caused by the fake photo sent the S&P 500 down 0.29% in New York.

It was the latest in a series of doctored images to surface recently, demonstrating the capabilities of AI technology, such as the fabricated photos purportedly showing former US President Donald Trump, or Pope Francis wearing a down jacket.

Software such as DALL-E 2, Midjourney and Stable Diffusion allows even users with no Photoshop skills to create lifelike images, which they then share on social media platforms, where the lines between truth and fiction are often blurred.