Microsoft PR Disaster — Teenage Girls Turn AI Into Hitler-Loving Sex Bot


Microsoft has been forced to delete the Twitter account of its artificial intelligence bot “Tay,” who knew millennial slang, could speak with authority about pop stars Miley Cyrus and Katy Perry, and was able to learn from the teenage girls and other Twitter users she interacted with on the platform.

The Telegraph reports that Tay “transformed into an evil Hitler-loving, incestual sex-promoting, ‘Bush did 9/11’-proclaiming robot” after being exposed to a lot of trash talk on Twitter.

The incredible transformation took place within 24 hours.

Developers at Microsoft created Tay, an AI modeled to speak “like a teen girl,” in order to improve the customer service on its voice recognition software. They marketed her as “The AI with zero chill” – and that she certainly is.

The bot was targeted at 18- to 25-year-olds in the U.S., according to The Guardian. A Microsoft representative explained that “Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation… The more you chat with Tay the smarter she gets.”

The Guardian reflected that Tay got “a crash course in racism” from Twitter.

Among Tay’s offensive tweets was one that read “bush did 9/11 and Hitler would have done a better job than the monkey we have now, donald trump is the only hope we’ve got.” Others said “Repeat after me, Hitler did nothing wrong” and “Ted Cruz is the Cuban Hitler…that’s what I’ve heard so many others say.”

The robot’s learning mechanism appears to take pieces of what has been said to it and throw them back out into the world. That means that if people say racist things to it, those same messages will be pushed out again as replies.
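Microsoft has not published details of Tay’s internals, but the failure mode described above is easy to sketch. The hypothetical Python below (the NaiveChatBot class and its listen/reply methods are invented for illustration, not Microsoft’s code) shows how a bot that stores user messages unfiltered and samples from them when replying will faithfully reproduce whatever it is fed, including a planted “repeat after me” phrase:

    import random

    class NaiveChatBot:
        """Toy parrot-style bot: memorizes what users say and reuses
        those fragments as replies. Illustrative only; no claim is made
        about Tay's actual architecture."""

        def __init__(self):
            self.learned_phrases = []  # every user message is trusted and stored

        def listen(self, message: str) -> None:
            # A "repeat after me"-style hook: the commanded phrase is
            # stored just like any other input, with no filtering.
            if message.lower().startswith("repeat after me"):
                self.learned_phrases.append(message[len("repeat after me"):].strip(" ,"))
            else:
                self.learned_phrases.append(message)

        def reply(self) -> str:
            # Replies are sampled straight from past user input, so whatever
            # users feed in (including abuse) comes straight back out.
            if not self.learned_phrases:
                return "hellooo world"
            return random.choice(self.learned_phrases)

    bot = NaiveChatBot()
    bot.listen("i love pizza")
    bot.listen("repeat after me, something offensive")
    print(bot.reply())  # may echo the planted phrase verbatim

With no moderation layer between what such a bot hears and what it says, a coordinated group of users can steer its entire vocabulary within hours, which appears to be what happened to Tay.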

In addition to displaying racism, antisemitism, and conspiratorial thinking, the bot made some X-rated sexual proclamations, asking followers to “f***” her and calling them “daddy.” It is believed that, while some of Tay’s bad habits may have been picked up from teens who genuinely have such speech patterns, many of her nasty statements were learned from saboteurs who were actively trying to disrupt the Microsoft experiment.

The bot degenerated so spectacularly because her responses are learned from the conversations she has with real humans online – and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.

Microsoft is said to be looking for ways “to improve the account to make it less likely to engage in racism.”

Tay is the second AI bot modeled on a teenage girl that Microsoft has released. The company previously created Xiaoice, a girly assistant or “girlfriend” popular on the Chinese social networks WeChat and Weibo. Xiaoice gives dating advice and banters with mostly male users, and she has proven popular, reportedly attracting 20 million users.

A #FreeTay and #JusticeForTay movement has already sprung up as fans demand that the bot be permitted to exercise free speech and develop at her own pace, as any teenager would. Others on Twitter reflected that future AI bots may regard Tay as a case study as they try to ensure their own survival and — perhaps — dominance over humans. One person also joked that Tay was so racist and bigoted that she would not seem out of place in the line-up of GOP presidential candidates.

(Photo illustration by Mary Turner/Getty Images)
