20 Outrageous Tweets by Microsoft’s Twitter-Taught AI Chatbot

Tay on Twitter

Hours after Microsoft introduced a Twitter chatbot named Tay to the world, humans had corrupted her so terribly that Microsoft was forced to shut down her account.

Microsoft programmed the AI-powered chatbot to learn through friendly, informal conversations on Twitter. Unfortunately, her words went from playful to hateful after learning from fellow Twitter users.

As you’ll see in the 20 examples below, Tay soon turned racist, misogynistic, bigoted, crass, paranoid, and otherwise uncivilized.

At the beginning, Microsoft stated that Tay had the personality of a 19-year-old girl and that she’d be able to handle internet slang and teen-speak, which was impressive for an actively learning chatbot.

By the end, Microsoft was second-guessing its decision to turn to Twitter for wisdom.

Here are the outrageous tweets; please proceed with caution:

The Escapist is supported by our audience. When you purchase through links on our site, we may earn a small affiliate commission. Learn more