
Taylor Swift May Sue Microsoft Over Racist Twitter Bot


An artificially intelligent chatbot that used Twitter to learn how to talk quickly turned into a bigoted bot, and Taylor Swift reportedly threatened legal action because the bot’s name was Tay. Microsoft would probably rather forget the 2016 experiment, in which Twitter trolls took advantage of the chatbot’s programming and taught it to be racist.

Tay was a social media chatbot first geared toward teens in China, but it was programmed to learn how to talk from Twitter conversations. In less than a day, the automated responses the chatbot tweeted had Tay siding with Hitler, promoting genocide, and generally hating everybody. Microsoft immediately removed the account and apologized.

According to The Guardian, the singer’s lawyer threatened legal action over the chatbot’s name, claiming it violated both federal and state laws. Rather than get into a legal battle with the singer, Microsoft president Brad Smith writes, the company started considering new names instead. The chatbot then began sending out racist tweets, giving the singer even more reason for concern, and Microsoft removed the bot. When the chatbot reappeared, it was no longer TayTweets but Zo, complete with new programming that prevented it from broaching politics, race, and religion. The revised chatbot, available on Messenger and other platforms as well as Twitter, was later criticized for sounding too much like a stereotypical teenage girl.
