AI Is Shaping the Online Narrative Around the War

Since the start of the war involving the Islamic Republic, Israel, and the United States, false claims and manipulated media circulating across Persian-language social media have tended to cluster around a few recurring themes. A large share of posts focus on the battlefield, with fabricated strikes, attacks, and military incidents presented as real events, often reinforcing either pro–Islamic Republic or anti–Islamic Republic messaging. Others revolve around the newly appointed supreme leader, with AI-generated videos and altered clips used to construct speeches, appearances, or public reactions that never occurred. Still other claims come directly from officials and are contradicted by the available evidence.
Contradictory Claims Around the Newly Appointed Supreme Leader
AI-generated video falsely showing a cardboard cutout of Iran’s newly appointed supreme leader
A video claiming to show the unveiling of a cardboard cutout of Iran’s newly appointed supreme leader was generated using AI and shared widely on social media. The original footage from the event shows a framed photo being presented, not a cardboard figure. Visual errors—such as distorted text in the background and unnatural placement of the cutout—indicate synthetic creation.

Mojtaba Khamenei falsely claims Iran targeted only military bases
In his first written message, Mojtaba Khamenei said Iran had only targeted military bases and had not attacked neighboring countries. However, documented reports show that strikes hit a wide range of locations across multiple countries, including non-military targets: residential areas, hotels, civilian airports, energy facilities, and commercial ships. The scale and type of targets directly contradict the claim, showing it is false.
AI-generated video falsely presented as first speech by Iran’s newly appointed supreme leader
A video claims to show the first speech of Iran’s newly appointed supreme leader, in which he says he is in good health, dismisses rumours, and calls for unity while urging security forces to avoid harming protesters. However, no verified video or audio message of him in this role exists, and only written statements have been officially released. Unnatural lip movements, inconsistent facial expressions, and clear differences from previously known footage indicate the clip was generated using AI. These signs confirm the video is not real.

False Claims by Officials
Foreign Minister Abbas Araghchi misleadingly presents internet shutdowns as standard wartime practice
Iran’s foreign minister, Abbas Araghchi, claimed during an interview with CBS that internet restrictions were necessary for security and that all countries take similar measures during war. However, available data show that nationwide internet shutdowns during conflicts are rare, not a common global practice. Large-scale wars such as the Russia–Ukraine war have not involved deliberate nationwide shutdowns; only a few countries, mainly Iran and Sudan, have imposed them. Presenting this as a standard wartime measure ignores how unusual such actions are, making the claim misleading.
Hardline politician Hamid Rasai falsely claims a one-week deadline for selecting Iran’s new leader
Hamid Rasai, a hardline Iranian politician and member of parliament, claimed that under the law, the interim leadership council can only operate for one week and a new leader must be appointed within that time. This claim is incorrect. Iran’s constitution does not set any fixed deadline and only states that a new leader should be chosen “as soon as possible.” Until then, the interim council continues to carry out leadership duties with no specified time limit.
Pro-Islamic Republic Content
Misattributed video falsely claimed to show daughter describing Khamenei’s body
A widely shared video claims to show Ali Khamenei’s daughter describing his body on television. The woman in the clip is not related to Khamenei, and the claim is false. The footage is actually from an interview with the wife of an IRGC Quds Force commander, aired on a TV program before the current events. The video has been taken out of context and misidentified.

Former Hamshahri editor-in-chief falsely claims Madrid protest supported Iran
Abdollah Ganji, a political figure close to conservative circles, shared a video claiming a large protest in Madrid was held in support of Iran. The footage does not show any pro-Iran demonstration and predates the current events. The video actually shows a pro-Palestinian protest held in Madrid on October 4, 2025, where Palestinian flags are visible throughout the crowd. The claim misrepresents an unrelated protest to suggest international support for Iran.

AI-generated image falsely claiming to show Basij commander fleeing in disguise
An image circulating online claims to show a Basij commander disguised in a chador while attempting to flee. The image is not real and was generated using AI. Reverse image checks indicate it was created with AI tools, and there is no verifiable evidence confirming the identity mentioned in the claim. Regardless of the identity, the image itself is fabricated.

AI-generated video shared by Iranian embassy accounts falsely showing missile attack on Tel Aviv
A video shared by official Iranian embassy accounts and widely circulated on social media claims to show Iran bombing Tel Aviv, but it is AI-generated. The footage did not originate from any real event and appears to have first been posted by an unrelated social media account before spreading across platforms. Clear visual inconsistencies—such as cars suddenly appearing or freezing in place and a floating Israeli flag with no physical attachment—reveal synthetic generation. These flaws confirm the video does not depict a real missile attack.

Digitally altered image falsely showing message about Epstein victims on Iranian missile
An image shared by official Iranian accounts claims a missile carried the message “in memory of Epstein Island victims,” but the text was added later. The original photo of the missile had been published earlier without any such writing. The mismatch between the original and edited versions shows the image has been manipulated. The claim that the message appeared on the missile is false.

False claims about Iran downing a U.S. F-15 are based on an unrelated crash in Kuwait
Claims that Iran shot down a U.S. F-15 are not supported by any verified evidence and rely on misattributed footage. Videos presented as proof actually show an incident in Kuwait, where three U.S. F-15E jets were brought down by friendly fire and the crews ejected safely. No satellite imagery, official reports, or independently verifiable data confirm any F-15 crash in Iran. The lack of credible evidence, along with the misuse of unrelated footage, shows the claim is false.

AI-generated video of an alleged Iranian missile strike in Qatar
A widely shared video claims to show an Iranian missile hitting a U.S. base in Qatar, but it is AI-generated. The footage shows clear signs of manipulation, including unnatural movements, no realistic reactions to the explosion, and people and vehicles disappearing during the blast. The audio is also unusually clear for such a large explosion, and no verified footage of such an event exists. These issues confirm the video is not real.

AI-generated video misrepresented as massive Iranian strike in southern Tel Aviv
Footage shared online as a large-scale Iranian missile strike in southern Tel Aviv is not authentic and was generated using AI tools. The scene includes unrealistic elements, such as people showing no reaction to the blast, distorted body movements, and an explosion that spreads unnaturally. Although the area has been targeted during the conflict, real footage from those events looks very different. The visual anomalies in this clip show it does not depict a real event.

Anti-Islamic Republic Content
AI-generated video falsely showing a drone strike on Basij and IRGC forces
A video claims to show a drone strike targeting Basij and IRGC forces during the recent conflict. The footage is not real and was generated using AI. Frame analysis shows clear visual glitches, including distorted elements and a person standing motionless during an explosion. These inconsistencies confirm the video does not depict a real attack.

Old videos falsely linked to deaths of U.S. soldiers in Iran-related attacks
Clips shared on Instagram claim to show U.S. military officers informing families that soldiers were killed in recent attacks, including scenes of a child receiving news about her father and a woman learning of her husband’s death. These videos are unrelated to the current conflict and were published online years earlier across platforms such as YouTube, TikTok, and Instagram. In reality, such moments are not publicly filmed, as strict U.S. military protocols protect families’ privacy during casualty notifications. Many similar videos online are staged or produced for training purposes, not real notifications of casualties.

AI-generated video falsely showing the arrest of a military member for disobeying orders
A video claims to show a member of Iran’s security forces being arrested for refusing orders, with a woman in the background shouting that he has a young daughter. The clip has been widely shared with different claims about the man’s affiliation, including the IRGC, Basij, or police. Visual inconsistencies—such as distorted body movement, unrealistic vehicle models, and unreadable license plates—indicate the footage was generated using AI. These signs confirm that the video does not show a real arrest.

AI-generated video falsely showing a precision missile strike on a Basij base in Shahin Shahr
A video circulating on social media claims to show a precise missile strike targeting a Basij base in Shahin Shahr. However, the footage is not real and was generated using artificial intelligence. Visual inconsistencies—including unreadable text on signage, unnatural movement of individuals before and after the explosion, and a lack of physical reaction from surrounding vehicles—indicate synthetic content. These anomalies confirm that the video does not depict a real attack.

AI-altered video falsely showing an attack on a Basij checkpoint in Tehran
A clip described as showing the moment a Basij checkpoint in Tehran was hit is not authentic, and the explosion was added using AI to an older video. The original footage had already been published earlier without any explosion, and the altered version inserts a blast into the scene. Visual inconsistencies—such as people and motorcycles completely disappearing after the explosion with no debris—indicate synthetic manipulation. These signs confirm the video does not show a real event.

AI-generated video falsely showing a sniper targeting civilians
A video shared on Instagram claims to show a sniper identified as “Ruhollah Karami” admitting to shooting civilians, but it is AI-generated. The clip contains multiple inconsistencies, including unreadable text on the uniform, unnatural speech and voice, and the absence of ambient sound despite visible traffic. Additional visual errors—such as incorrect details in the weapon design and anomalies in the uniform—further indicate synthetic content. These issues confirm the video is not authentic.

Digitally manipulated video falsely showing Israeli flag projected on an apartment block in Tehran
A video showing an Israeli flag projected onto a residential building in Tehran has been altered, with the flag added to an existing clip. Comparison with another video from the same time and location—showing identical window patterns and matching audio—confirms the original scene did not include any flag. The manipulation appears to use advanced visual effects rather than clear AI-generated artifacts, but the flag itself is not real. This shows the claim is false.

AI-generated video falsely showing IRGC launcher operators surrendering to drones
A video shared online claims to show two IRGC launcher operators surrendering to U.S. or Israeli drones in a desert setting, but it is AI-generated. The clip includes clear anomalies such as unnatural body movement, inconsistent lighting on hands, distorted shadows, and ground and vegetation behaving unrealistically. These visual inconsistencies indicate synthetic content rather than real footage. The claim that the video shows a real surrender event is false.

Old Instagram video of motorcyclists misrepresented as a drone-related incident in Tehran
An Instagram video showing several motorcyclists fleeing while a police car turns and joins them is being shared as footage from Tehran on March 11, 2026, allegedly after security forces spotted an Israeli drone. However, the same video was already posted on January 24, 2026, with a different explanation, before the current conflict began.
There is no evidence linking the video to the claimed date or location, and its low quality makes key details impossible to verify. Its earlier circulation under a different narrative shows the current claim is false.

AI-generated video of police panic at checkpoint after alleged drone attack
A video circulating across social media claims to show police officers panicking at a checkpoint during a drone attack, but it is AI-generated. Visual inconsistencies reveal manipulation, including unreadable text on the police vehicle and weapons appearing and disappearing in officers’ hands during the clip. These irregularities are consistent with synthetic content and do not reflect real events. The claim is therefore false.

AI-generated video falsely showing an attack on an Iranian police patrol in Izeh
A video circulating online claims to show an armed attack on an Iranian police patrol in Izeh, but it is AI-generated. Visual details do not match the real location, and the police vehicle shows inconsistencies in its design, markings, and lights. Although reports of clashes in the area have appeared in the past, no verified footage exists. These visual inaccuracies and lack of evidence confirm the video is not real.

Old video misrepresented as a funeral for U.S. soldiers killed in Iran conflict
A video shared online claims to show Americans welcoming the bodies of soldiers killed in the recent conflict with Iran. The footage is unrelated and predates the current events.
The video actually shows a memorial for two sheriff’s deputies killed during a police pursuit in Missouri. It has been miscaptioned to falsely suggest it depicts casualties from the Iran-related conflict.
