
Deepfakes Are Like Face Swap On Steroids — But How Scared Should You Be TBH?


The term “deepfake” traces back to a simple interview.

Samantha Cole, an assistant editor at Vice, interviewed the first Redditor to post a video made with highly realistic face-swapping technology, a user who referred to himself as Deepfake.

The technology was new, and its groundbreaking realism intrigued many. However, the video that Deepfake created was a porn video that appeared to star Gal Gadot, even though Gal Gadot has never starred in a porn video.

Deepfake told Cole, "I just found a clever way to do face-swap," referring to his algorithm. "With hundreds of face images, I can easily generate millions of distorted images to train the network," he said. "After that if I feed the network someone else's face, the network will think it's just another distorted image and try to make it look like the training face."

The term “deepfake” now encompasses the entire practice of using face-swapping technology to distort or enhance a video or image. In simpler terms, deepfakes are created by feeding a network hundreds of photos, gathered from something as simple as a Google image search or a Facebook profile, and placing one face on top of another. In other words, no public image is really safe from being used in a deepfake.
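For readers curious about what that training setup actually looks like, here is a rough, simplified sketch in Python using the PyTorch library. It follows the shared-encoder, two-decoder design that hobbyist face-swap tools are generally described as using, but it is not the code of FakeApp or any other specific app, and every class name, layer size, and setting below is an illustrative assumption.

```python
# A rough sketch of the idea Deepfake describes above: one shared encoder,
# plus one decoder per person, trained to reconstruct warped ("distorted")
# face crops. This is NOT the code of any specific app; every layer size,
# loss, and hyperparameter here is an illustrative assumption.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Squeezes a 64x64 RGB face crop down to a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuilds a 64x64 RGB face crop from that latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32x32 -> 64x64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a = Decoder()  # learns to draw person A's face
decoder_b = Decoder()  # learns to draw person B's face

loss_fn = nn.L1Loss()
optimizer = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=5e-5,
)

def train_step(faces_a, faces_b):
    """One training step on batches of (randomly warped) face crops."""
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batches just to show the shapes line up; real training would loop
# over thousands of scraped photos for hours.
print("loss:", train_step(torch.rand(8, 3, 64, 64), torch.rand(8, 3, 64, 64)))

# The "swap": encode a frame of person B's face, then decode it with
# person A's decoder, so the network redraws the frame as person A.
with torch.no_grad():
    swapped = decoder_a(encoder(torch.rand(1, 3, 64, 64)))
```

Because both people's photos pass through the same encoder, the network learns a shared notion of pose and expression; swapping decoders at the end is what puts one face onto another's body.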

What You (Probably Already) Know About Deepfakes & How They Got into the Porn Industry

Ellis Clopton, a junior journalism major at the University of Nebraska-Lincoln, says that most millennials know what a deepfake is, even if they’ve never heard the word. He tells Her Campus, “From my experience, I think most people are familiar with the basic idea of Deepfakes, even if they aren’t familiar with the term. I think any millennial with Snapchat is aware of faceswapping tech. [The tech] the average user plays with right now is pretty crude and mostly harmless, however, I think faceswapping tech is starting to scratch the mainstream, since people are starting to become aware of the pornographic Deepfakes.”

And the pornographic deepfakes, for many, are where the real fear lies.

Mary Anne Franks, a technology law professor at the University of Miami and an advisor for the Cyber Civil Rights Initiative, told Rolling Stone, “We could all be living in a world, as of next year, in which everybody has a fake sex video of themselves out there.”

And while that news is unsettling, a new threat has emerged. For deepfake creators looking to superimpose images of celebrities onto porn actors, a Google image search for an actress might yield photos of her as a minor.

For instance, Emma Watson and Elle Fanning, who began their careers well before they turned 18, have countless images on the internet that can be legally accessed at any time, and the risk of those images being used in a deepfake is now very real.

One face dataset, accessible on deepfakes.com, shows all but one image of Watson taken when she was under the legal age of 18. If a deepfake creator were to make a video using the images of Watson as a minor, they could be found guilty of creating a nonconsensual child porn video.

Fortunately, Congress passed the PROTECT Act in 2003, which classified computer-generated child porn as obscene, and a perpetrator can face up to 30 years in prison. Finding the perpetrator, however, is where the challenge lies: so many of these videos are being made, and many deepfake makers are computer savvy enough to stay under the radar.

And while it may not seem likely that the average American will become the victim of a pornographic deepfake, the risk the technology poses to national security and democracy is another factor that cannot be ignored.

Ben Kreimer, a journalism technologist formerly at BuzzFeed, sees zero benefits from this technology when it comes to the free press. He tells Her Campus, “There are definitely interesting creative and artistic possibilities with the technology, which is why I’m interested in it. As for journalism, I can’t think of any upsides. It’s almost certainly going to create yet another challenge for journalism when it comes to the veracity of information and data.”

So how did this technology get into the hands of everyday users?

On the surface, the technology appears to be something only a trained computer science specialist or a major movie production team would spend hours working on. But the creation of FakeApp has made it possible for anyone whose computer has a graphics processing unit and the software framework that drives it to make a deepfake video of their own. Most computers and smartphones have this hardware built in.

The entire process takes anywhere from eight to 12 hours, and while the finished product from the app is not top quality, it shows just how far the technology has come and raises the question: where will the technology go next?
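To give a sense of how low that hardware bar is, here is a tiny hypothetical check, written in Python with the PyTorch library and not part of FakeApp itself, of the kind a hobbyist might run to see whether their machine has a GPU capable of handling that eight-to-12-hour training run, or whether it would fall back to a far slower CPU.

```python
# Hypothetical hardware check -- not part of FakeApp. It only reports whether
# PyTorch can see a CUDA-capable GPU, the kind of card that makes an
# 8-to-12-hour face-swap training run practical.
import torch

if torch.cuda.is_available():
    print(f"Found GPU: {torch.cuda.get_device_name(0)} -- training is realistic.")
else:
    print("No CUDA GPU detected -- training would be painfully slow on the CPU.")
```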

Uh, why is no one stopping this?

In short, this is not illegal. The First Amendment protects freedom of expression, and as long as a video is advertised as a “deepfake” rather than as real, it is simply treated as prurient entertainment.

However, there are a few arguments a victim of a deepfake could make. They could sue under defamation law, which would require proof that the creator intended to harm their reputation, or they could claim misappropriation of their image, though only for commercial uses. For example, if a porn video, or any video using a person’s image, was used to sell a product, the victim would have a stronger case.

Is there literally any good news here?

Deepfakes have made many people fearful, but Clopton argues the technology can still be used to tell stories in ways we never have before.

"My hope is that the future of journalism is one that embraces new technological developments and seeks to utilize them in new methods of news storytelling," Clopton said. "Today, telling a story shouldn’t be constrained to reading words on a page or watching an anchor on a screen. Remote drones can take journalists up to higher perspectives and into dangerous areas much easier than helicopters. Virtual reality can help someone visualize a natural disaster or a war zone without putting them at risk.”

He points to a deepfake project that took the recent Solo movie trailer and superimposed Harrison Ford’s face onto the new actor taking on Ford’s iconic role, as an example of how not all deepfakes are bad.

The pitfalls of deepfakes lie in the exploitative and salacious use of someone’s image to satisfy the desires of someone else. Still, many have argued that deepfakes are cutting-edge technology that should be embraced (with respect, obviously, for other people’s right to control their own images). Whatever the intent behind a given deepfake, in an age of ever-evolving technology and cutting-edge imagery, deepfakes are new territory, and with that comes new responsibility.

The First Amendment is a powerful weapon for protecting the free press and individuals, and Kreimer notes that it’s the journalists who approach the technology with curiosity rather than fear who will be the ones that succeed.

“Approaching these kinds of tools [deepfakes] from just a fearful perspective will cloud the response," Kreimer said. "When fear is the motivating factor, people don’t always respond thoughtfully. So, the people who will be able to best combat such technology in journalism are those individuals that are more open and embrace the tool, because they will have taken a more thoughtful and curiosity driven approach to understanding the tool and how it works, which will inform how to defend against it.”

