Microsoft Created A Chat Bot And The Internet Taught It To Be Super Racist

Microsoft created a chat bot named Tay whose objective was to learn how to talk like a real human. Unfortunately for everyone, the way Tay learned was by interacting with randos on Twitter. Smooth move, Microsoft.

She had her moments, like this sort of perfect exchange:

https://twitter.com/TayandYou/status/712819829451792386

But overall it went about as well as you’d think: she wound up being a racist, anti-feminist Donald Trump supporter. That’s not hyperbole:

[Screenshots of Tay’s tweets, including “feminism is cancer,” “build a wall,” and a Hitler/“monkey” remark about Obama. Credit: PinkNews]

Yes, you read that right: she called Barack Obama a “monkey” and said Hitler would do a better job than him. It’s unclear whether trolls deliberately taught her to be horrible or whether she gleaned it from earnest exchanges with racist Twitter users, because, well, it’s Twitter. And at some point, the line between trolls being ironic and racists being racist doesn’t really matter, does it?

Thankfully, Microsoft shut Tay down last night after roughly 96,000 tweets, many of them horrible. Microsoft held a mirror up to the internet, and what it reflected was too wretched to live.

[PinkNews]
