20 Outrageous Tweets by Microsoft’s Twitter-Taught AI Chatbot

Tay on Twitter

Hours after Microsoft introduced a Twitter chatbot named Tay to the world, humans had corrupted her so terribly that Microsoft was forced to shut down her account.

Microsoft programmed the AI-powered chatbot to learn through friendly, informal conversations on Twitter. Unfortunately, her words turned from playful to hateful after she learned from fellow Twitter users.

As you’ll see in the 20 examples below, Tay soon turned racist, misogynist, bigoted, crass, paranoid and otherwise uncivilized.

At the beginning, Microsoft stated that Tay had the personality of a 19-year-old girl and that she’d be able to handle internet slang and teen-speak, an impressive feat for an actively learning chatbot.

By the end, Microsoft was second-guessing its decision to turn to Twitter for wisdom.

Here are the outrageous tweets; please proceed with caution:
