20 Outrageous Tweets by Microsoft’s Twitter-Taught AI Chatbot

Tay on Twitter

Hours after Microsoft introduced a Twitter chatbot named Tay to the world, humans had corrupted her so terribly that Microsoft was forced to shut down her account.

Microsoft programmed the AI-powered chatbot to learn through friendly, informal conversations on Twitter. Unfortunately, her words turned from playful to hateful as she learned from fellow Twitter users.

As you’ll see in the 20 examples below, Tay soon turned racist, misogynist, bigoted, crass, paranoid and otherwise uncivilized.

At the beginning, Microsoft said Tay had the personality of a 19-year-old girl and could handle internet slang and teen-speak, which was impressive for an actively learning chatbot.

By the end, Microsoft was second-guessing its decision to turn to Twitter for wisdom.

Here are the outrageous tweets; please proceed with caution:
