Features

To appraise the double-edged: Deepfakes in modern media

Published June 17, 2020 at 5:37 pm
Illustration by Kaitlyn Mercado

AS TECHNOLOGY progresses, developments beyond imagination begin to surface. Such is the case with deepfake technology, a form of artificial intelligence (AI) that makes it possible to alter a person’s appearance in a video or image. While there is potential for much boon in the use of deepfakes for health care and visual storytelling purposes, among many others, the bane is not one to be taken lightly, as deepfakes can make fabricating fake news much easier.

With some praising the technology and others claiming deepfake is where the truth goes to die, a data analyst and a visual effects specialist unpack both the innocuous and malicious implications of the new technology.

The realm of deepfakes

The term “deepfake” is a combination of the words “deep learning” and “fake.” Data analyst and founder of applied analytics consultancy group Cirrolytix Dominic Ligot explains that deep learning is a subset of machine learning, which “allows computer systems to make predictions, classify objects, and generate synthetic information that resembles the original data learned.”

Visual effects specialist Adrian Arcega adds that deepfake technology’s main purpose is to make it possible for computers to “process someone’s likeness and [place] it onto another person’s.” Arcega has independently worked with deepfake technology and has since applied his research to media projects such as Truefaith’s music video Uwian Na (2018), where deepfake technology is used to bring showbiz personality Lourdes Carvajal back to life.

Ligot adds that deepfake “accomplishes what traditional computer-generated images are used for.” The main difference, he says, is that deepfake technology is cheaper and faster because the algorithms make new media with little to no human intervention. 

However, Arcega points out some limitations in the technology, such as the 256×256 pixel limit for face processing. “In other words, it works for low-resolution video, but is a far cry from being effective in high-definition film,” he explains.

Despite deepfake technology’s limitations, Arcega believes that deepfakes may pave the way for major changes in the entertainment industry, claiming that actors might just become another element in the canvas of video and film. Because actors’ schedules are often erratic, Arcega says that deepfakes can replace stars in short scenes when they are unavailable. He further asserts that the industry may have to “redefine the concept of actors” amid developing issues of value and market importance. 

From the point of view of a visual effects practitioner, Arcega also expresses his delight at face replacement technology being far more affordable than it was in the early 2000s, when only big-time companies like Industrial Light & Magic could afford such tools. As the power to create deepfakes becomes more accessible, the software’s possibilities continue to expand.

“Technology has always been a double-edged sword,” Arcega says. “It is no different with anything in the adult video industry, which tends to be on the forefront of internet technologies… [I]deally, [deepfakes] should be used for frivolous or entertainment matters—such as film, TV, the occasionally fun app.” 

However, Arcega recognizes that if deepfake technology were to fall into the wrong hands, it could easily be abused.

What is real and what is not

Deepfake technology is undeniably innovative and intriguing, but its ability to blur the line between real and fake instills a valid fear of identity theft and other illegal activities.

These possibilities pose particular risks for political figures, who could be intentionally misquoted through generated fake faces and voices. Aside from this, Arcega notes that deepfakes could also be misused to impersonate entertainment professionals without their consent. One example Arcega cites is how deepfake technology’s early popularity stemmed from its use in pornographic videos, where celebrities’ faces are superimposed on the bodies of porn actors. “This has its own set of issues in terms of violating a person’s agency for use in someone’s pleasure,” he says.

Arcega adds that the possible misuse of deepfakes also carries “implications on issues of truth,” referring to the easier propagation of fake news with deepfakes as a tool. 

All things considered, Arcega asserts that whether deepfakes are good or bad is best determined by how they are used. Ligot adds that as we appreciate these technological developments, it is imperative to also ensure the prevention of cybercrime through strong data privacy measures. “The deep learning process, which powers deepfakes, can help society with many challenges, but the shocking power of these algorithms need to be tempered with rules [and] guidelines to prevent misuse,” Ligot says.

“We also have to learn to be responsible and discerning with our applications,” Arcega says. “We don’t [want to] hinder or censor technological development, but we sure have to be mindful and conscientious.”
