A viral video showing what appears to be an American military contractor waving and greeting a crowd of Gazans penned inside a newly opened aid distribution site has ignited a fierce online debate over its authenticity. The clip, shared by Daily Wire reporter Kassy Akiva, among others, purports to show Palestinians entering a Secure Distribution Site (SDS) run by the Gaza Humanitarian Foundation (GHF) in Rafah. In the video, a uniformed American contractor, his face covered by a camouflage balaclava, makes a heart sign with his hands and waves to the crowd.
According to Akiva, a source at the site said the crowd had broken through a Hamas roadblock and spontaneously shouted "Thank you, America" after reaching the food aid center. "A source at one of the Gaza distribution sites tells me that Hamas set up a roadblock to prevent Gazans from getting aid," she wrote on X. "They broke through it and were shouting 'thank you America' upon reaching the site."
After the video was posted on X, many users questioned whether the clip was AI-generated. Some claimed the shadows and fence visuals looked digitally altered, while others said the crowd audio sounded inconsistent with the footage. In response, Akiva issued a blunt rebuttal: "This video is not AI, and anyone saying so is simply in denial of the facts."
The Context
The debate erupted against a grim backdrop. That same day, May 27, at least one Palestinian was killed and 48 were wounded at the same GHF site after thousands of civilians pushed through barriers seeking food aid. Gaza's Hamas-run Health Ministry blamed Israeli gunfire for the chaos.
The GHF, a U.S.-backed organization operating independently of the United Nations, has faced criticism from aid groups for its approach to aid distribution, which is designed to bypass Hamas.
What To Know
Within hours, the video sparked a wide-ranging discussion on social media over whether it was genuine, with users speculating that it had been generated or doctored with AI, particularly with Veo 3, the cutting-edge video tool Google released last week. "I think it's quite clear that we're at a point where the lines are becoming very hazy and hard to define, and the implications for how this will be used politically are frightening," said one Reddit user on the ChatGPT subreddit, where users frequently discuss topics related to AI use.
Like their counterparts on X, Reddit users were particularly skeptical about the audio, with some questioning the quality of the recording and arguing that the acoustics of the space don't match the sound heard in the clip.
However, the video's legitimacy has been verified, according to leading misinformation journalists. BBC Verify's Shayan Sardarizadeh confirmed the footage was filmed at the Tel al-Sultan aid hub. "There are viral but false claims that this video of the new Tel al-Sultan aid distribution centre in Gaza is AI-generated and fake," he wrote on X. "The video is real and was filmed by the main entrance to the site, which is visible in satellite images, drone footage and other videos."
In his post, Sardarizadeh tagged journalist Tal Hagin, a research fellow at FakeReporter—an Israel-based group fact-checking war-related claims on social media—who also verified the authenticity of the video using geolocation methods. "I was able to geolocate the image and find a unique detail present in both images that I'd find very hard (and surprising) if the AI had managed to replicate it," Hagin wrote on X, pointing to destroyed buildings and unique fence alignments visible in both the video and satellite maps.
Hagin, who admitted the clip initially struck him as a bit "odd," added more granular detail pointing to the configuration of metal fences near the entrance: four facing left and two to the right.
He noted that these details precisely matched overhead video and mapping data, stating that the alignment would be extremely unlikely for an AI to fabricate with precision. Remnants of a destroyed building are visible in the distance in the clip, and the same building appears in a satellite image in roughly the same direction. Hagin also responded to users who claimed that faces in the video appeared to "phase" through the fences or that shadows looked unnatural.
"A higher quality of the footage doesn't showcase any of the faces 'fazing' through the fence, as it might appear in a screenshot," Hagin explained. "This appearance is very likely due to image quality, not AI." On the issue of shadows, Hagin noted, "The people behind the fence are standing very tightly together, with some of their arms raised—I think those are what the shadows are of."
When a user suggested the video was manufactured, Hagin replied, "There is a direct connection, as markers in the video shared by the [original poster] can be lined up perfectly with satellite/drone imagery of the area. That's geolocation. If you disagree, please state what I got wrong with the geolocation analysis."
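Geolocation of this kind relies on simple geometry rather than proprietary tools: if the footage was really shot at the suspected spot, the compass bearing from that spot to a fixed landmark, such as the ruined building Hagin cites, computed from satellite coordinates should match the direction the landmark appears to lie in the frame. The short Python sketch below illustrates that consistency check; the coordinates and the observed direction are hypothetical placeholders, not the figures used by Hagin or BBC Verify.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Compass bearing (degrees) from point 1 to point 2 on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

# Hypothetical placeholder coordinates -- NOT the real site or landmark positions.
camera = (31.29, 34.25)      # suspected filming spot near the site entrance
landmark = (31.30, 34.24)    # ruined building visible in the frame

bearing_to_landmark = initial_bearing(*camera, *landmark)
observed_direction = 320.0   # direction the landmark appears to lie in the footage (hypothetical)

# If the satellite-derived bearing and the on-frame direction agree within a
# reasonable tolerance, the geolocation claim is consistent.
if abs((bearing_to_landmark - observed_direction + 180) % 360 - 180) < 15:
    print(f"Consistent: landmark bears {bearing_to_landmark:.1f} degrees from the camera position")
else:
    print("Inconsistent: the landmark does not lie where the footage suggests")
```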
Though they authenticated the visuals in the video, both Sardarizadeh and Hagin were unable to verify the audio track. "While I can't speak for the authenticity of the audio," Hagin noted, "the markers in the video align perfectly with satellite/drone imagery." Sardarizadeh similarly confirmed only the location of the footage, not the sound.
Brave New World
The debate over the authenticity of the short clip reflects growing concerns about the accelerating capabilities of generative artificial intelligence to distort reality and fuel misinformation. As synthetic media tools become increasingly sophisticated, the ability to fabricate realistic imagery and sound is beginning to challenge the foundations of trust in visual journalism and digital evidence. Even in cases where the video is verified to be authentic, the mere fact that an AI could conceivably have created it adds a new layer of complexity to the fight against misinformation and so-called "fake news."
"In the past couple of years, we've witnessed the proliferation of generative AI text and image generators, but video feels even more high-stakes." wrote AI reporter Victoria Turnk in an op-ed for The Guardian. Google's Veo 3 and OpenAI's Sora represent the latest wave of high-powered AI tools capable of generating photorealistic videos from text prompts.
Veo 3, introduced by Google just last week, can produce clips that include synchronized speech, background noise, and even environmental effects, raising the bar for what AI-generated content can simulate. In one viral demo of Veo's abilities, a clip purporting to be footage from an auto show was generated entirely from scratch; nothing in it is real.
OpenAI's Sora similarly allows users to create lifelike moving images based solely on descriptions, giving creators—as well as propagandists—access to unprecedented tools of visual storytelling. The implications are especially significant in conflict zones. Verification of footage from war-torn regions has always been difficult, but the rise of accessible, high-fidelity generative video now allows bad actors to simulate real-world events, frame opponents, or discredit authentic documentation.
The Gaza video controversy, where viewers disputed the authenticity of audio and visuals despite expert geolocation and forensic validation, illustrates the fragility of trust in the digital age. "The advances in generative video are stunning, but they're also deeply unsettling," said Hany Farid, a professor at the UC Berkeley School of Information and a leading expert on deepfakes. In an interview with ABC News, Farid warned that "as these tools get better, the burden of proof for visual content becomes steeper.
People will either fall for fakes—or doubt the truth." Both Google and OpenAI have acknowledged these risks. Google has said it will limit Veo access to vetted partners and use metadata tagging to identify AI-generated videos, while OpenAI has stated that it is working on watermarking technology and content provenance tracking to accompany outputs from Sora.
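Neither company has published full technical detail of those systems, and nothing below is Google's or OpenAI's actual implementation. As a rough illustration of how content provenance tagging generally works, a cryptographic hash of the exported file is bound into a signed manifest at creation time, and any later alteration of the file breaks verification. The Python sketch below uses a shared demo key for simplicity; real schemes such as C2PA rely on certificate-backed signatures, and watermarking additionally embeds an imperceptible signal in the pixels themselves.

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key"  # hypothetical; real systems use certificate-backed signatures from a trusted issuer

def make_manifest(video_bytes: bytes, generator: str) -> dict:
    """Bind a content hash and a generator label into a signed provenance record."""
    digest = hashlib.sha256(video_bytes).hexdigest()
    payload = json.dumps({"sha256": digest, "generator": generator}, sort_keys=True)
    signature = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify_manifest(video_bytes: bytes, manifest: dict) -> bool:
    """Re-derive the hash and check the signature; fails if the file or record was altered."""
    expected_sig = hmac.new(SIGNING_KEY, manifest["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected_sig, manifest["signature"]):
        return False
    return json.loads(manifest["payload"])["sha256"] == hashlib.sha256(video_bytes).hexdigest()

clip = b"...video bytes..."                     # placeholder for real file contents
record = make_manifest(clip, generator="text-to-video model")
print(verify_manifest(clip, record))            # True: untouched file
print(verify_manifest(clip + b"edit", record))  # False: any alteration breaks the check
```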
Gaza Aid Starts to Flow
The Gaza Humanitarian Foundation (GHF) began operations on Monday. Recently launched with U.S. and Israeli backing, the group has drawn criticism from the United Nations and Palestinians over its ties to Israeli security groups and fears that it could be used to further displace Gazans to the south of the besieged enclave. U.N.-affiliated agencies have declined to join the operation, saying it violates principles of neutrality.
"What we saw yesterday is a very clear example of the dangers of distributing food under these circumstances," said Ajith Sunghay, head of the U.N. Human Rights Office for the Palestinian territories (UNRWA). Palestinians carry boxes containing food and humanitarian aid packages delivered by the Gaza Humanitarian Foundation, a U.S.-backed organization approved by Israel, in Rafah, southern Gaza Strip, on Tuesday, May 27, 2025. Palestinians carry boxes containing food and humanitarian aid packages delivered by the Gaza Humanitarian Foundation, a U.S.-backed organization approved by Israel, in Rafah, southern Gaza Strip, on Tuesday, May 27, 2025.
Abdel Kareem Hana/AP Photo UNRWA Commissioner-General Philippe Lazzarini was equally blunt. "This new aid model is not only wasteful but a distraction from atrocities," he told Reuters. The GHF system, Lazzarini argued, circumvents established humanitarian norms and risks using food as a weapon of population control.
The aid project has strong ties to President Donald Trump's administration, which has publicly endorsed it as a way to prevent Hamas from intercepting relief supplies. "Aid is getting to the people in need and through their secure distribution system, Israel is kept safe and Hamas empty-handed," a senior Trump official told the Daily Wire.
What People Are Saying
Senior U.S. administration official to Newsweek: "GHF is a threat to Hamas' longstanding system of looting the assistance intended for the people of Gaza.
The UN and other aid agencies were wrong to criticize. Aid is getting to the people in need, and through their secure distribution system, Israel is kept safe and Hamas empty handed." Jens Laerke, spokesperson for the UN aid coordination office, OCHA: "It is a distraction from what is actually needed, which is a reopening of all the crossings into Gaza, a secure environment within Gaza and faster facilitation of permissions and final approvals of all the emergency supplies that we have just outside the border; [aid] needs to get in."
Hamas, in an Arabic-language statement via Telegram: "The scenes of thousands of our people rushing into the center designated for implementing the occupation's mechanism for aid distribution, and the accompanying live fire directed at citizens who had gathered at the distribution center under the pressure of hunger and siege, leave no room for doubt that this suspicious mechanism has failed."
What Happens Next
As uncertainty looms over humanitarian assistance operations on the ground, Israel said hundreds of aid trucks have been permitted to enter and distribute food in Gaza.