Understanding and Dealing with Deepfakes
Deepfakes are fake videos created using artificial intelligence, specifically a technique known as "deep learning", to make fabricated events look realistic. The word "deepfake" itself is a blend of "deep learning" and "fake".
The technology creates convincing but entirely fictional footage, making people appear to take part in events they never did. It allows anyone with a powerful computer to create a picture or video of people saying things they never actually said.
Researchers and university academics began developing the technology in the 1990s, but it has since spread to amateurs and online communities of computer users.
Audio can be deepfaked too. Simple apps like VoiceApp let users mimic the voices of celebrities and public figures and share the results on messaging applications like WhatsApp. More sophisticated apps, such as Lyrebird AI and Overdub, can be trained by users to mimic almost anybody's voice.
A Tool for Scams
Several deepfake audio scams have been reported worldwide, including the story of a fraudster who mimicked the voice of the chief executive of the German parent company of a UK-based firm, a scam that cost the firm £200,000.
According to a report by Forbes, the "fraudster called the company three times: the first to initiate the transfer, the second to falsely claim it had been reimbursed, and a third time seeking a follow-up payment.
It was at this point that the victim grew skeptical; he could see that the purported reimbursement had not gone through, and he noticed that the call had been made from an Austrian phone number.”
Blowing Up Like a Balloon
The number of deepfakes continues to balloon at an alarming rate. A 2019 report by Deeptrace, a startup that tracks deepfakes, found 7,965 deepfake videos on the internet. Nine months later, the figure had jumped to 14,678, roughly doubling the total produced since the technology emerged.
A simple internet search can turn up hundreds of deepfakes on social media, especially YouTube. The deepfakes online today range from satire to revenge pornography and celebrity scandals.
The use of deepfakes to manipulate voters' decisions is also growing, and there is a likelihood that similar videos could stir religious unrest and public tension, and undermine democratic discourse and development.
In 2018, a video of Gabon's President Ali Bongo, which many citizens suspected was a deepfake, helped trigger an attempted military coup, with the plotters citing the video as a deception by the president.
The biggest threat of deepfakes, according to The Brookings Institution, is the erosion of public trust. As deepfakes become more common, people will trust one another less, and it will become increasingly necessary to debunk claims even when they are true and clear.
According to John Dabson, a former British diplomat and expert on deepfakes, a "deepfake could destroy democracy".
The Art of Detection
Emerging technologies for spotting deepfakes, such as those developed during Facebook's Deepfake Detection Challenge (DFDC), have proved weak and quickly become outdated.
Researchers initially found that the eyes of people in deepfake videos do not blink. Less than a year later, deepfake creators closed that loophole by producing videos with blinking.
With deepfakes on the rise, seeing is no longer believing. Although deepfake videos have not yet reached parity with real footage, we should not deceive ourselves into thinking they never will. The best defence, today and beyond, is critical thinking whenever we watch a controversial video online.
This article is a partnership between HumAngle and the Dubawa 2020 Fellowship (by the Premium Times Centre for Investigative Journalism) in promoting the ethos of “truth” in journalism and enhancing media literacy in Africa.