Bumble Cracks Down On Unwanted Nude Photos With ‘The Private Detector’

Bumble's founder not only wants to stop the sharing of unwanted nude photos on the app, she wants the act to be illegal.

The Bumble dating app is rolling out a new artificial intelligence system programmed to detect, censor, and flag nude, lewd, and otherwise sexually explicit photos.

Called the “Private Detector,” the system can detect inappropriate photos with an accuracy rating of 98 percent. The dating app hopes to use the AI to prevent its users from seeing unwanted lewd photos.

When an inappropriate photo is shared, the Private Detector will flag the snapshot and blur it before it is delivered to the recipient.

“From there, the user can decide whether to view or block the image, and if compelled, easily report the image to the moderation team,” a representative of Bumble explained in a statement, according to PC Magazine.
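Bumble has not published how the Private Detector works internally, but the flow described above, scoring a photo, blurring it when it looks explicit, and letting the recipient decide what to do, can be illustrated with a minimal sketch. Everything here is hypothetical: explicit_probability stands in for whatever classifier the app actually uses, and the 0.9 threshold and blur radius are arbitrary illustrative values.

```python
from dataclasses import dataclass
from PIL import Image, ImageFilter


def explicit_probability(image: Image.Image) -> float:
    """Hypothetical stand-in for an image classifier.

    Bumble's actual model is not public; replace this with a real
    nudity/lewdness classifier that returns a probability in [0, 1].
    """
    raise NotImplementedError("plug in a real image classifier here")


@dataclass
class ModerationResult:
    flagged: bool
    image: Image.Image  # blurred copy if flagged, original otherwise


def moderate_photo(image: Image.Image, threshold: float = 0.9) -> ModerationResult:
    """Flag a photo and blur it before delivery if it looks explicit.

    The recipient can then choose to reveal, block, or report the
    flagged image instead of seeing it immediately.
    """
    score = explicit_probability(image)
    if score >= threshold:
        blurred = image.filter(ImageFilter.GaussianBlur(radius=30))
        return ModerationResult(flagged=True, image=blurred)
    return ModerationResult(flagged=False, image=image)
```

In a sketch like this, the recipient's client would receive the blurred copy plus the flag, and only fetch the original if the user explicitly chooses to view it.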

For those who are unfamiliar with Bumble, CEO Whitney Wolfe Herd founded the dating app after her experience as a co-founder of Tinder inspired her to build a more woman-first product. Like Tinder, Bumble users can show interest in someone by swiping.

After a match is made, male users of Bumble cannot message female users until the woman initiates the conversation. Wolfe Herd has continued to push the app to define what is and isn’t an acceptable way to behave while dating online. For example, users are barred from sharing pictures exclusively of their children as well as mirror selfies.

According to CNN, there are over 55 million users on Bumble with 5,000 content moderators fielding the 10 million photos shared by these users each day.

Bumble is not the first company to use an AI system to screen out nude photos, and the company has previously used a similar system to remove gun- and gun violence-related images from its platform.

During the reveal of the new AI system, Wolfe Herd also announced her plans to put pressure on Texas state lawmakers to pass a bill criminalizing the sharing of unwanted sexually explicit photos. Under the proposed bill, anyone who sends an unsolicited lewd photo could face a fine of $500.

According to a 2017 survey from YouGov, over 50 percent of millennial women reported receiving lewd photos over the internet, and over 75 percent of those women said the photos were unsolicited. The survey also found that less than 25 percent of male participants admitted to ever sending an unwanted nude photo.

“The digital world can be a very unsafe place overrun with lewd, hateful, and inappropriate behavior. The ‘Private Detector,’ and our support of this bill are just two of the many ways we’re demonstrating our commitment to making the internet safer,” Wolfe Herd added during her statement.


According to The Washington Post, the state of Texas is not alone, as New York lawmakers are also considering a bill that would classify sharing unwanted nude photos or sexually explicit videos as a misdemeanor with repercussions of up to a year behind bars and a $1,000 fine.

While the new system will not eliminate nude photos on the dating app altogether, it will give users the option of viewing, deleting, or reporting such images, rather than being forced to see them the moment a message containing them is opened.

Bumble currently plans to release the Private Detector in June.