Twitter hasn’t implemented an algorithm that would help it more effectively police white supremacist content on its platform because it fears the system would also flag the accounts of Republican politicians and other mainstream conservatives, according to a new report from Motherboard.
At one of the company’s open all-employee forum meetings, one attendee asked why Twitter has so far failed to stamp out the increasingly prevalent white nationalist and neo-Nazi content on the site when it had been broadly successful in cracking down on other unacceptable content, such as ISIS videos, using machine learning and algorithmic solutions.
Representatives from the company answered by explaining that a tradeoff always takes place when it comes to automated content filters such as the ones that Twitter uses to eliminate the majority of Islamic State propaganda that is posted to the platform. In the ISIS example, one employee explained, the tradeoff is that sometimes completely innocuous Arabic-language content is accidentally flagged for removal based on the automated filters.
In general, the employee went on, users are willing to accept that tradeoff for the benefit of curtailing terrorist propaganda on the site. When it comes to tackling white supremacy, however, different challenges emerge.
Specifically, another employee reportedly added, there is a high-profile and influential group that could be similarly affected by content filters aimed at white supremacist material: Republican politicians. Aggressively removing white supremacist content, the employee said, would likely result in the removal of posts, or outright bans, of accounts affiliated with such GOP figures.
“There it is: Twitter employee admits that algorithms can’t distinguish between Republicans and white supremacists. https://t.co/Ivo0Cz2LaA” (Steve Silberman, @stevesilberman, April 25, 2019)
“Most people can agree a beheading video or some kind of ISIS content should be proactively removed, but when we try to talk about the alt-right or white nationalism, we get into dangerous territory, where we’re talking about Steve King or maybe even some of Trump’s tweets, so it becomes hard for social media companies to say all of this content should be removed,” said Amarnath Amarasingam, an extremism researcher at the Institute for Strategic Dialogue.
President Donald Trump himself, who recently met with Twitter CEO Jack Dorsey, has already expressed his personal concern that platforms such as Twitter unfairly discriminate against conservative voices.
“[T]hey don’t treat me well as a Republican. Very discriminatory, hard for people to sign on. Constantly taking people off list. Big complaints from many people,” the president said in an early-morning tweet. Trump went on to suggest that Congress should get involved in the matter.