A video of a little boy learning to walk along the edge of a hydroelectric plant's abyss went viral in recent days. Clinging to a railing, he takes uncertain steps, almost tripping, while the roar of the water below can be heard. The suffocating feeling that he could fall at any moment is interrupted by a revealing cut: the scene is fake. The child is in fact walking on a sidewalk, and between him and the curb, skillful editing has inserted image and sound to manipulate the viewer.
The video is an example of what is called a VFX breakdown: content that shows the before and after of visual effects in a way so fascinating that it makes viewers watch the sequence over and over again. But because it provokes such intense emotions in a matter of seconds (fear that the child will fall, outrage at whoever left him there, shock at the falsification of reality), it also serves as a warning about the destructive potential of audiovisual distortion.
This is precisely the greatest risk posed by the spread of deepfakes: images and videos that use artificial intelligence to alter not only settings and voices but also people's faces and bodies. Broadly speaking, this kind of digital manipulation can superimpose and change faces, expressions, lip movements, gazes, and limbs. In other words: everything, and in such a realistic way that it can convince even the most attentive social media users.
The use of these videos has grown rapidly in recent years. Data from the Dutch cybersecurity company DeepTrace show that between 2018 and 2019 the number of such videos published on streaming and pornography platforms rose by 100%. Worse: 96% of all the videos the company detected were pornographic and, of course, not authorized by the victims, meaning that women, famous or not, have had their faces placed on the bodies of actresses in adult films.
Although the main targets are women, no one is immune to deepfakes. Like other forms of disinformation, these videos can destroy reputations, fuel cyberbullying, and interfere directly in political debate, damaging both individual lives and society as a whole.
The damage can be greater than that caused by "fake news" and WhatsApp chain messages. Hyper-realistic images are harder to verify, and they can confuse our senses and short-circuit our critical thinking far faster and more effectively than print or audio. The power of the moving image is undeniable, and deepfakes take advantage of exactly that.
The more sophisticated the technique becomes, therefore, the more poisonous it can be, especially in polarized environments such as election periods. In the United States, several politicians have already had their faces inserted into doctored footage; perhaps the most famous case is the video in which former President Barack Obama appears to insult Donald Trump. In Brazil, the most emblematic episode took place in 2018, during João Doria's campaign for the governorship of São Paulo, when a supposed erotic video was released shortly before the second-round vote.
It is worth remembering that deepfakes can be used innocently, producing short comedy clips based on scenes from TV series, soap operas, and movies, or even pedagogically, as at the Dalí Museum in the United States, where one exhibition used a digital panel to recreate the artist, who died in 1989, on video. Unfortunately, these uses have lost ground to the many kinds of social corrosion this technology can cause.
While social media platforms such as Meta and Twitter debate what to do with this content, free apps and tools for producing deepfakes grow ever more popular, complicating the debate about copyright and individual freedoms as well as about artificial intelligence itself, since recognizing legitimate media has become increasingly difficult (as if the spell were turning against the sorcerer).
No matter how media-literate we are, we must accept that this type of disinformation, which unsettles our sense of reality, demands constant vigilance. Some researchers believe that, in the not-too-distant future, we will no longer be able to tell what is true from what is false. Not readily believing what our own eyes see, as in the case of the little boy on the edge of the hydroelectric plant and so many others, can be a healthy attitude in both individual and democratic terms. A dose of healthy skepticism is, and will remain, ever more important in hyper-connected times.