Some of the internet’s largest websites are actively banning groups deemed too extreme for their platforms. According to the Huffington Post, sites like Reddit, Facebook, and Twitter have started banning communities that actively traffic in hate speech. In 2017, Reddit banned the group r/incels, which had 40,000 members. But when another group, called r/braincels, emerged on the site with rhetoric that seemed to echo that of r/incels, it was not banned by the “front page of the Internet.” Days later, the website Incels.me was created.
According to a Reddit spokesperson, “at this time, r/braincels is not in violation of our policy,” even though the page has featured “posts bragging about sexually assaulting women and gleeful discussions about hurting or killing women.”
Concern about online extremist groups has grown over the last year, especially after the Toronto attack, in which a 25-year-old man named Alek Minassian drove a van into a crowd of pedestrians, killing eight women and two men. Minassian, who posted on his Facebook page that the “Incel Rebellion has already begun,” was part of several incel (short for “involuntarily celibate”) groups and was inspired by Elliot Rodger. Rodger, who killed six people near UC Santa Barbara in 2014 before shooting himself, wrote a manifesto and posted on an online forum calling on men to “start envisioning a world where WOMEN FEAR YOU.”
Twitter is another platform that has frequently drawn criticism for not banning individuals whose posts seem to promote racism, sexism, homophobia, and xenophobia. One example is the Twitter-verified blogger Daryush “Roosh” Valizadeh. Valizadeh, who tweets photos of random women along with the question “would you bang?” and who has bragged about having sex with women who were “barely conscious, too drunk to consent to sex, or who repeatedly refused his advances,” was barred from entering Australia in 2016. But his Twitter account remains active. When the Huffington Post reached out to Twitter about Valizadeh, the company did not comment.
Facebook has also been slow to remove groups that call for violence against women. After facing criticism over the existence of the group page “If she puts you in the friend zone, you put her in the RAPE zone,” it took the page down in April.
“Major platforms’ latest efforts have mostly been focused on what are conceived to be traditional forms of violent extremism and terrorism. But there is no comprehensive policy in place for addressing other forms of online extremism, such as radical misogyny,” said Ludovica Di Giorgi, a far-right extremism expert.
Although incel and other radical misogynist groups seem difficult to control, experts say that removing them from prominent sites has a real effect. A study conducted after Reddit closed several extremist pages in 2015 found that while some users left the site altogether, others “lessened their hate speech usage significantly.”
Keegan Hankes, a senior research analyst at the Southern Poverty Law Center, has said that the rhetoric in incel and radical misogynist groups is extremely toxic. Though it can be difficult for sites to stop them from existing, “it’s very important to make sure it’s harder to find them.”