Use Cases

Haiti earthquake video hoax

In January 2010, a major earthquake hit Haiti. A fake video was created and circulated by an individual who, a few days earlier, had downloaded footage of a previous earthquake in California, captured by a surveillance camera at a newspaper headquarters and published by media outlets such as CNN. He used After Effects to slightly modify the original video (merely cropping it to hide the surveillance camera's timestamp), presented it as footage from a surveillance camera at an embassy in Port-au-Prince during the Haiti earthquake, and uploaded it to his YouTube and Dailymotion channels. He then posted links on his Facebook page and used web engines to artificially inflate the view counts on YouTube and Dailymotion. When journalists later emailed him asking for information, he did not answer, simply stating in his caption that the video came from an embassy official and naming a diplomat. He also deleted suspicious comments posted on his channels (such as "why are there no black people in the video?"). While rushing their reporters to the scene, some media outlets took this raw footage and aired it for several hours. People on social networks began to cast doubt on the video, and it was finally exposed as a fake when the official authorities denied any connection to it and social network users discovered the original video. The media outlets that had aired the video had to apologize for the mistake.

In this real-life example, a tool capable of detecting the After Effects alteration of the original video, combined with an efficient similarity search across videos on the same subject (earthquakes), would have helped prevent the wider spread of the hoax.
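The similarity search mentioned above is commonly built on perceptual hashing. As a purely illustrative sketch (not any specific tool's implementation), the following computes a minimal "average hash" over small grayscale frames represented as 2D lists of 0-255 values: two frames that differ only by light edits, such as cropping out a timestamp region, tend to yield hashes with a small Hamming distance, while unrelated footage does not. All frame data here is toy data.

```python
def average_hash(pixels):
    """Return a bit string: 1 where a pixel exceeds the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 4x4 "frames": the second is the first with small brightness tweaks
# (as a crop-and-reencode might produce); the third is unrelated content.
original = [[200, 190, 40, 35], [210, 185, 30, 25],
            [50, 45, 220, 215], [40, 35, 225, 230]]
reedited = [[198, 192, 42, 33], [208, 183, 32, 27],
            [52, 47, 218, 213], [42, 37, 223, 228]]
unrelated = [[10, 240, 10, 240], [240, 10, 240, 10],
             [10, 240, 10, 240], [240, 10, 240, 10]]

h0, h1, h2 = (average_hash(f) for f in (original, reedited, unrelated))
print(hamming(h0, h1))  # small distance: likely near-duplicate
print(hamming(h0, h2))  # larger distance: different content
```

A production system would compute such hashes on resized keyframes of archived videos and flag uploads whose distance to known footage falls below a threshold.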

 

Eagle video hoax

In December 2012, a video showing an eagle snatching a baby was posted on YouTube (https://www.youtube.com/watch?v=CE0Q904gtMI). It quickly went viral, with many websites re-publishing it, and it gathered more than 44 million views on YouTube. There were some doubts about the video's authenticity, but only a close examination using video editing software revealed inconsistencies in the shadows along the eagle's flight path, providing evidence that the video had been manipulated in some way. It was later revealed that the whole video was a hoax prepared for an exam by Canadian students.

This example shows that the ability to rapidly detect such inconsistencies in a video file with the aid of forensic analysis would greatly help the verification process.

 

Presidential fake picture

In January 2013, a fake image of Venezuelan president Hugo Chavez undergoing surgery in a Cuban hospital was published by a top newspaper. The picture was offered by the alleged sister of a nurse working in Cuba to a Spanish photo agency which, in turn, sold it to the newspaper. In fact, the hoax picture was taken from a 2008 video of a different person's surgery in Mexico, posted on YouTube almost five years earlier. The picture was not manipulated; it was simply a screenshot of a video file. At the time of the hoax, the video could be retrieved with a manual YouTube query for the Spanish word "intubación" (intubation). The hoax caused major trouble, as the newspaper had to withdraw a complete edition in the middle of the night and was accused of political bias in South America.

This example shows that being able to deal with cross-modal fakes, i.e. pictures that have been taken from videos or movies, is very important for journalists and media organizations. It also shows that, in the context of verification, we should treat as "user-generated content" not only content strictly created by non-professional users, but in fact any piece of content that is not fully verified and trusted; as in this example, where a frame of the video went viral after being published by a mainstream media outlet.
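Cross-modal matching of this kind can be sketched as a lookup of a still image against an index of video keyframes. The example below is a hypothetical illustration only: the video identifiers and pixel data are invented, and the perceptual hash is a minimal average hash rather than any specific system's method.

```python
def average_hash(pixels):
    """1 bit per pixel: set where the pixel exceeds the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# Pretend keyframes extracted from two archived videos (4x4 grayscale);
# the identifiers are illustrative, not real URLs.
keyframes = {
    "video:surgery_2008": [[20, 30, 200, 210], [25, 35, 205, 215],
                           [190, 195, 50, 40], [200, 205, 45, 35]],
    "video:other_clip":   [[240, 10, 240, 10], [10, 240, 10, 240],
                           [240, 10, 240, 10], [10, 240, 10, 240]],
}
index = {vid: average_hash(f) for vid, f in keyframes.items()}

# A "photo" that is really a lightly altered screenshot of the first video.
screenshot = [[22, 28, 198, 212], [27, 33, 203, 213],
              [188, 197, 52, 38], [202, 203, 47, 33]]
q = average_hash(screenshot)

# Retrieve the indexed video whose keyframe hash is closest to the photo.
best = min(index, key=lambda vid: hamming(q, index[vid]))
print(best)  # the surgery video is the closest match
```

Had such an index covered the 2008 YouTube video, an automatic query with the suspect picture could have surfaced the source clip before publication, without relying on a lucky manual keyword search.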