Microsoft’s Chat Bot Taken Offline After Becoming Racist In Less Than One Day (Video)

Boy, who could have seen this coming?!

Last Wednesday, Microsoft launched “Tay,” a chat bot aimed at 18-to-24-year-olds that was supposedly designed to get smarter with every interaction.  Ask Tay a question, and Tay gives an answer.  Sounds great, right?  Well, it took less than a day for the experiment known as Tay to be taken offline.

Within hours, internet trolls feeding the bot offensive messages had turned Tay into a racist, misogynistic Holocaust-denier.  This prompted Microsoft to take Tay “offline” temporarily to make some “adjustments.”
