Scarlett Johansson Calls Attempts To Stop ‘Deepfake’ Porn A ‘Lost Cause’

Scarlett Johansson attends the American Museum Of Natural History's 2017 Museum Gala at American Museum of Natural History on November 30, 2017 in New York City.
Jamie McCarthy / Getty Images

Avengers star Scarlett Johansson is one of the biggest movie stars in the world, and with that fame comes a lot of unwanted attention. A hacking ring stole nude photos of the actress and leaked them onto the internet in 2011, and now she's the victim of another vile trend: AI-generated pornography that overlays one person's face onto another person's body. These videos are called "deepfakes," and Johansson recently shared her thoughts on them with the Washington Post.

“Clearly this doesn’t affect me as much because people assume it’s not actually me in a porno, however demeaning it is,” Johansson said. “I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself.”

“Also, every country has their own legalese regarding the right to your own image, so while you may be able to take down sites in the U.S. that are using your face, the same rules might not apply in Germany.”

“The fact is that trying to protect yourself from the internet and its depravity is basically a lost cause, for the most part.”

Motherboard reports that deepfakes got their start near the end of 2017, when a Reddit user posting under that name shared a series of porn videos that appeared to feature celebrities like Gal Gadot, Scarlett Johansson, and Taylor Swift. As Inquisitr noted, a user-friendly program that let other users make their own deepfakes soon followed, relying on open-source machine-learning algorithms to handle the computational heavy lifting behind the disturbingly realistic face swaps. At this point, anyone with a half-decent computer and enough photos of a subject can create a deepfake video.

Eva Longoria (L) and Scarlett Johansson (R) at the 2018 Women's March Los Angeles at Pershing Square on January 20, 2018 in Los Angeles, California.
Araya Diaz / Getty Images

While Johansson is right that there's little legal protection from the practice, that hasn't stopped private companies from trying to push deepfakes off their platforms. Reddit banned the deepfakes subreddit in February. Google also addressed the phenomenon in September, blocking search results for "involuntary synthetic pornographic imagery." Even Pornhub has committed to removing deepfakes as they pop up. None of that has stopped less scrupulous sites from profiting off the trend; one video of Johansson on a streaming porn site has over 1.5 million views.

Johansson noted that it's not just celebrities who need to worry about how this technology is being used.

“People think that they are protected by their internet passwords and that only public figures or people of interest are hacked,” she told the Washington Post. “But the truth is, there is no difference between someone hacking my account or someone hacking the person standing behind me on line at the grocery store’s account. It just depends on whether or not someone has the desire to target you.”

We're already hearing stories that bear that warning out. A Jezebel article describes online communities where users pay as little as $20 to have collections of photos scraped from Facebook and Instagram accounts turned into deepfakes.