Fake Sex Tapes Of Emma Watson, Taylor Swift, And Natalie Portman Go Viral Online, Generate Controversy

Jesse Grant / Getty Images

A series of fake sex tapes featuring Emma Watson, Taylor Swift, and Natalie Portman is going viral online, part of a controversial new trend in which people generate fake explicit videos using artificial intelligence algorithms.

Though celebrities’ faces have been cut and pasted onto explicit images for as long as the internet has existed, the latest trend uses sophisticated artificial intelligence technology to insert the faces of famous actresses onto the bodies of adult film stars. The resulting videos are known as Deepfakes, which Variety noted is a portmanteau of “fake” and “deep learning.”

These fake celebrity sex tapes started showing up in December, when a Reddit user began sharing them online. The user told Motherboard that he collected publicly available images and videos and then trained AI algorithms to make the celebrity’s face “react” as if the star were really in the explicit video.

These sex tapes have targeted Taylor Swift and Natalie Portman, among a number of others, but some of the most controversial have been the fake Emma Watson sex tapes. As the Sun noted, several of the smutty images of Emma Watson were taken when the actress was as young as 10 years old.

The report suggested that these videos could actually be illegal, and some hosting services are now taking steps to remove the fake celebrity sex tapes.

This is not the first time Emma Watson has been the target of a fake sex tape, and an earlier incident appeared to be part of a larger targeted leak that struck several celebrities. Last March, a number of images reportedly stolen from Watson showed up online, depicting her trying on swimsuits, along with a video that claimed to show Watson naked in a bathtub, though it showed neither her face nor any identifying features.

As the Sun noted, there could be trouble ahead for people sharing the fake Emma Watson sex tapes. Laws in the U.K. prohibit “simulated” child pornography that uses real pictures of children pasted into explicit scenes, though it is not yet clear whether any legal action is being taken against those spreading the Deepfakes.