Microsoft's Twitterbot Tay is super racist, apparently down with genocide

  • 8 years ago
REDMOND, WASHINGTON — Microsoft's new artificial intelligence chatbot had an interesting first day of class after Twitter users taught it to say a bunch of racist things.

The verified Twitter account, called Tay, launched on Wednesday. The bot was meant to respond to users' questions and emulate the casual, comedic speech patterns of a typical millennial.

According to Microsoft, Tay was "designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you."

Tay uses AI to learn from its interactions with users, supplemented by editorial text written by a staff that includes comedians.

Enter the trolls: Tay quickly turned into a racist, dropping n-bombs, supporting white supremacists and calling for genocide.

After the enormous backfire, Microsoft took Tay offline for upgrades and began deleting some of the more offensive tweets.

Tay hopped off Twitter with the message, "c u soon humans need sleep now so many conversations today thx."

------------------------------------------------------------

Welcome to TomoNews, where we animate the most entertaining news on the internets. Come here for an animated look at viral headlines, US news, celebrity gossip, salacious scandals, dumb criminals and much more! Subscribe now for daily news animations that will knock your socks off.

Visit our official website for all the latest, uncensored videos: http://us.tomonews.net
Check out our Android app: http://bit.ly/1rddhCj
Check out our iOS app: http://bit.ly/1gO3z1f

Get top stories delivered to your inbox every day: http://bit.ly/tomo-newsletter

Stay connected with us here:
Facebook http://www.facebook.com/TomoNewsUS
Twitter @tomonewsus http://www.twitter.com/TomoNewsUS
Google+ http://plus.google.com/+TomoNewsUS/
Instagram @tomonewsus http://instagram.com/tomonewsus
