New Sexually Explicit App Allows You To Paste Anyone’s Face Onto The Star’s Body


In what sounds like the plot of a future Black Mirror episode, an app has been created that allows you to paste a person’s face onto the star of a sexually explicit video. The technology first surfaced in a dark corner of the web in December 2017, and since then more and more videos have appeared with celebrities’ faces spliced into adult videos, Newsweek reports.

According to Motherboard, the technology was developed by a Reddit user named “deepfakes” and uses a machine-learning algorithm and open-source code that can be easily accessed on the web for free. But deepfakes didn’t create FakeApp, the app that anyone can download to create these videos themselves. That’s the brainchild of another neural-network-savvy developer who goes by “deepfakeapp.”

It’s not perfect by any means. It may trick you at first glance, but a closer look will reveal that the face has been superimposed.

But the fact that it can be done in a relatively convincing way is raising concerns about the ethics of the new technology. Even though the videos are still rough around the edges, experts say the technique could be refined in the near future to give viewers a more seamless experience, Newsweek notes.

A key concern is that these fake porn creators are taking a person’s image and using it for a purpose it was never intended for. For example, as Motherboard notes, Gal Gadot’s face was used in an incest-themed explicit video, something the Wonder Woman actress would more than likely not be okay with.

Also, if you can paste a celebrity’s face into a sexually explicit video, it’s easy to see how this could evolve into the ability to use anyone’s face. Here’s what the creator of FakeApp had to say about the evolution of the technology in an interview with Motherboard.

“Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”

But, as NY Magazine notes, the app could be used for nefarious purposes well outside the scope of adult entertainment. It isn’t hard to envision it being used to cut and paste the face of a controversial political figure like Kim Jong Un, Donald Trump, or Vladimir Putin into a video and make it seem like they said or did something they didn’t. We’ve seen how gullible people can be when it comes to sharing fake news articles on social media. Imagine the impact of a video made with an upgraded version of deepfakes’ technology.
