The footage is shaky but the sounds of gunfire and “Allahu Akbar!” are unmistakable as the boy darts along the dusty road towards the burnt-out car. Puffs of smoke erupt around him. He falls to his knees. Has he been shot? It’s hard to tell, but a moment later he is up again, running for the shelter of the abandoned car. Yet it’s not over. He emerges holding the hand of an even younger girl dressed in pink. They run, hesitantly at first, then desperately. The fear on their faces is palpable.
This is the ‘Syrian Hero Boy’. The footage appeared on YouTube on 10 November 2014 and quickly went viral as millions of viewers watched, astonished at the boy’s bravery and shocked at a world that could place children in such danger.
But further shock was to come. The film was fake. It was filmed in Malta on the set of Gladiator by Norwegian film-maker Lars Klevberg.
Klevberg’s intention was to spur debate about children and war. By pretending the film was real, he believed that “people would share it and react with hope.”
It also drew attention to an increasingly common scenario: fake footage appearing on social media. “By publishing a clip that could appear to be authentic, we hoped to take advantage of a tool that’s often used in war; make a video that claims to be real,” he said.
In our digitally enabled world, a legion of ‘civilian witnesses’ has sprung up: individuals “in the wrong place at the wrong time” who capture an event and then publish the scrap of footage or the incriminating photograph on social media. But amid the fog of propaganda, hoaxes and digital manipulation, how can we tell what’s real and what’s fake?
Cambridge researchers are developing an automated tool, ‘the Whistle’, to help verify the authenticity of digital evidence.
Behind the Whistle is sociologist Dr Ella McPherson: “There is much excitement about the speed with which news can be captured by bystanders and disseminated on social media. In the field of human rights, it allows fact-finders for NGOs to get digital reports of violations from hard-to-reach places.
“In a country such as Syria, which is largely closed to outside observers, YouTube videos are a crucial source of information for people within and without its borders and contribute to an information environment incomparable to the past.”
She mentions footage that appeared on social media in 2013 which Syrian opposition activists claimed was evidence of a chemical weapons attack. An expert told the BBC that the footage was consistent with such an attack, although he cautioned that it was difficult to verify the film owing to the absence of metadata. Meanwhile the state-run news agency Sana said the claims were “baseless” and an attempt to distract United Nations weapons inspectors.
“This example shows the nature of the terrain we are now in: news is disseminated fast but verification is slow and often contested,” says McPherson. “For human rights NGOs, credibility can be lost in a moment if the evidence they are using for advocacy or in courts is later found to be false. No matter how devastating the documented violations, they cannot act on them unless they can verify them first.”
Many guidelines, handbooks and tools now exist to help the verification process. The ‘witness’ can be checked through their digital footprint – their organisational affiliations or a social media profile, for instance. The image itself can be corroborated through comparison with landmarks and weather data, or checked using tools that ‘reverse image search’ for previous publication.
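The ‘reverse image search’ idea can be sketched in a few lines. This is an illustration only, not any tool mentioned in the article: it implements a simple ‘average hash’, a perceptual fingerprint that stays stable when an image is recompressed or lightly edited, so two fingerprints at a small Hamming distance suggest the same image has been published before. Real services use far more robust variants of the same principle.

```python
# Illustrative average-hash fingerprint (a hypothetical, simplified
# stand-in for the hashing behind reverse image search services).

def average_hash(pixels):
    """pixels: a 2D list of grayscale values (an already-downscaled image).
    Each output bit records whether a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits; small distance = likely the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# A tiny 4x4 "image" and a slightly re-encoded copy of it.
original = [[10, 200, 30, 220], [15, 210, 25, 230],
            [12, 205, 28, 225], [11, 198, 33, 219]]
recompressed = [[12, 198, 32, 218], [16, 208, 27, 228],
                [13, 203, 29, 224], [10, 200, 31, 221]]

distance = hamming(average_hash(original), average_hash(recompressed))
print(distance)  # a small distance despite the pixel-level changes
```

Because the hash depends only on which pixels sit above the image’s own mean brightness, the fingerprint survives the small pixel shifts that re-uploading and recompression introduce.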
However, all of this takes precious time, and the burden of verification can introduce bias – those who are easier to verify may be more likely to be heard than those with few resources and a minimal digital footprint.
Through her research into how human rights NGOs can use social media to hold governments accountable, McPherson became increasingly aware that fact-finders were struggling with the torrent of information. Time spent verifying was in danger of crippling this most powerful means of communication. For Syria, fact-finders have described the number of videos and photos as becoming a ‘Big Data’ problem.
The Whistle is a digital platform that speeds up the whole process. Being developed for mobile and web, the app eases the process of reporting for the witness, and prompts them to furnish the information needed by the fact-finder for verification – the “who, what, why, where, when” metadata. A ‘dashboard’ then aggregates the information and automates the cross-checking process of comparing the civilian witness report to the many databases that are used to corroborate reliability.
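The automated cross-checking the dashboard performs might work along these lines. The sketch below is entirely hypothetical – The Whistle’s data model and corroborating databases are not public – and uses a made-up weather lookup to flag an inconsistency between a report’s claims and reference records.

```python
# Hypothetical sketch of automated corroboration: compare a witness
# report's claimed time, place and conditions against reference data.

WEATHER_DB = {  # invented historical weather records for illustration
    ("Aleppo", "2013-08-21"): "clear",
    ("Aleppo", "2013-08-22"): "rain",
}

def cross_check(report):
    """Return a list of flags where the report conflicts with records."""
    flags = []
    observed = WEATHER_DB.get((report["where"], report["when"]))
    if observed and observed != report.get("weather"):
        flags.append(f"weather mismatch: report says {report['weather']!r}, "
                     f"records say {observed!r}")
    return flags

report = {"where": "Aleppo", "when": "2013-08-22", "weather": "clear"}
print(cross_check(report))  # one mismatch flagged for a fact-finder to review
```

The point of automation here is triage, not judgement: flagged reports still go to a human fact-finder, but the routine lookups no longer consume their time.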
“We knew from fact-finders that civilian witnesses do not necessarily know what metadata is or that they should include it with their information – even something as simple as panning the horizon for landmarks or turning on geolocation features. The Whistle prompts them for the information at the time of upload, so the fact-finders don’t have to piece it together later. It also provides individuals with information literacy – helps them understand what characteristics their information should ideally have in order to do things for them.”
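Prompting for the “who, what, why, where, when” at upload time could look something like the following. The field names and prompts are assumptions for illustration, not The Whistle’s actual schema: the app checks which required metadata a report still lacks and asks the witness before submission, rather than leaving fact-finders to reconstruct it later.

```python
# Hypothetical upload-time metadata prompting (illustrative schema only).

REQUIRED_FIELDS = {
    "who":   "Who captured this? (name, affiliation, or profile link)",
    "what":  "What does the footage show?",
    "where": "Where was it taken? (place name or GPS coordinates)",
    "when":  "When was it taken? (date and local time)",
}

def missing_metadata(report):
    """Return the prompts a witness still needs to answer."""
    return {field: prompt
            for field, prompt in REQUIRED_FIELDS.items()
            if not report.get(field)}

report = {"what": "Shelling of a residential street",
          "when": "2014-11-10 14:30"}
for field, prompt in missing_metadata(report).items():
    print(f"{field}: {prompt}")  # prompts for the 'who' and 'where' fields
```

Asking at capture time is the cheapest moment to gather this information – the same logic behind the article’s example of panning the horizon for landmarks or switching on geolocation before filming.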
Initially funded by Cambridge’s Economic and Social Research Council Impact Acceleration Account, the Whistle is now funded by the European Union as part of ‘ChainReact’, a multi-partner programme to support whistle-blowing in business. The team has grown to six members and plans to start using the demo of the Whistle to gain feedback from NGOs and civilian witnesses.
McPherson sees the Whistle as a tool for NGOs to use in the field, rather than as a global repository of information, since the latter would create security risks for the whistle-blowers: “Security challenges vary a lot according to context, and we don’t see ourselves as ever being able to anticipate all the security challenges of a local context – it depends on the threat model. So we always want to partner with local organisations.” Although civilians may never have heard of the Whistle, they are likely to be aware of the support of a local NGO, which would then direct them towards the tool as a means to submit information. “It would then be up to the NGO to decide what to do with the data,” she explains.
McPherson is reflective when she considers the implications of a digital world that requires tools such as the Whistle to verify trustworthiness. “Reporting violations and fact-finding are communicative acts of ‘bearing witness’ – inherently human activities that involve solidarity, support and rapport. Technological innovations mean that we may increasingly have to replace this with reporting to a machine – how do you balance that opportunity with safeguards around traumatisation? This is something we don’t yet have answers for.”
Nevertheless, she and her team are aware that an increase in digital information on human rights violations only translates into evidence once it is validated. “Tools like the Whistle are desperately needed by fact-finders to reduce the labour time in sifting the wheat from the chaff, and to make it easier for them to evaluate more digital information for evidence.
“We hope that our platform will increase the possibility that those who report violations receive attention, and particularly that those who most need access to human rights mechanisms are heard.”
The team behind the Whistle include Dr Ella McPherson, Rebekah Larsen, Giles Barton-Owen, Isabel Guenette Thornton, Matt Mahmoudi, Sarah Villeneuve, Dr Richard Mills and Scott Limbrick.
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.