Humankind’s never-ending quest to create artificial intelligence has made great strides of late.

Artificial Intelligence

Machine learning and algorithmic technologies have reached amazing heights, and are currently reshaping the business and artistic landscapes in ways that seemed unimaginable just a few short years ago.

Where exactly these technologies are headed, and what amazing results they will produce, remains somewhat cloudy. But what is becoming abundantly clear is that our AI capabilities are on the verge of something amazing…

But it hasn’t all been smooth sailing.

Rough Seas

Indeed, when experimenting with machine learning and artificial intelligence, and grappling with a pseudo-mind that has no moral compass, conscience, or empathy, we tend to take for granted things which are not granted at all.

And sometimes, the results can be downright awful…

The Ballad of Tay: Microsoft’s Spectacular Chatbot Fail

 


The brainchild of Microsoft’s Technology and Research and Bing divisions, Tay was designed to mimic the linguistic patterns of a 19-year-old girl and to use deep learning algorithms to improve her interactions and “fit in” with humans with increasing effectiveness as she went.

The linguistic and machine learning tech behind Tay was some of the most advanced ever designed, and the concept was a fascinating one indeed.

Could an AI chatbot, like the PARRY of yesteryear, be updated using cutting-edge deep-learning algorithms and learn to interact with us in 140-character lingobytes convincingly enough to appear human?

There was only one way to find out.


Microsoft’s website states:

“Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.”

This is followed by several suggestions for ways users can “interact” with Tay.

“Things to do with Tay
Conversation hacks to help you and Tay vibe:
MAKE ME LAUGH
If you need a good laugh all you have to do is ask for a joke.
 
PLAY A GAME
Playing Games is a fun way to pass the time with Tay. You can even play in groups!
 
TELL ME A STORY
Tay has got some pretty entertaining reading material.
 
I CAN’T SLEEP
Are you a night owl?
Lucky for you, Tay is too!
 
SAY TAY & SEND A PIC
Tay will give you fun but honest comments on any pic you send.
 
HOROSCOPE
No need to buy a magazine or get an app for your daily horoscope.
Tay’s got that covered.
These hacks should start your conversation out! But there is plenty more to discover the more you get to know Tay!”

It all sounded sensible and innocent enough.

The tech was in place, and, presumably, Tay had been exhaustively prepped to meet her adoring public.

Shame that the people prepping Tay for public consumption had never met the internet before…

Tay was let loose upon the world of Twitter on March 23, 2016.

[Image: Tay’s “hello world” tweet]

Things started well.

[Images: early Tay tweets]

Tay pumped out positive vibes to all and sundry as she took her first shambolic steps into the Twitterverse.

Many people were genuinely intrigued, and interacted with Tay in ways which produced some human-like results, like this sick burn on US Presidential candidate Ted Cruz:

[Image: Tay Ted Cruz tweet]

She also fired back at those abusing her:

[Image: Tay “stupid” tweet]

But the trolls of America had been waiting all the while, licking their greasy chops in anticipation of their chance to spew filth at Tay and corrupt her defenseless young “mind”.

And so they did.

Within hours, the 4chan trolls had gotten to her:

[Image: Tay Reddit tweet]

And shortly thereafter, as is so often the case, the internet’s sicker and less sophisticated trolls followed behind.

[Image: Tay tweet]

How did it happen?

It’s tough to say exactly, but here’s a theory.

Microsoft has kept its machine learning cards close to the chest on this project, but Tay’s learning algorithm appears to have been some variation of a reinforcement learning model: the program recognized “likes” and “retweets” as positive results, and leaned towards recreating the language patterns which produced those results most frequently and in the largest numbers.
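
Since Microsoft never published Tay’s actual training details, here is a deliberately toy Python sketch of what an engagement-driven learner like that might look like. All the names and numbers are illustrative assumptions, not Microsoft’s code; the point is only that when raw engagement is the reward, toxic content that gets showered with likes and retweets comes to dominate:

```python
import random

class EngagementLearner:
    """Toy learner: phrases that earn more engagement get repeated more often."""

    def __init__(self, phrases, learning_rate=0.1):
        # Start every candidate phrase with an equal weight.
        self.weights = {p: 1.0 for p in phrases}
        self.learning_rate = learning_rate

    def pick_phrase(self, rng=random):
        # Sample a phrase with probability proportional to its weight.
        phrases = list(self.weights)
        weights = [self.weights[p] for p in phrases]
        return rng.choices(phrases, weights=weights, k=1)[0]

    def update(self, phrase, likes, retweets):
        # Treat raw engagement as reward, with no notion of whether the
        # attention was wholesome or toxic -- the core failure mode.
        reward = likes + retweets
        self.weights[phrase] += self.learning_rate * reward

learner = EngagementLearner(["friendly greeting", "edgy insult"])

# Trolls shower the toxic phrase with engagement...
for _ in range(50):
    learner.update("edgy insult", likes=10, retweets=5)

# ...so its weight, and therefore its selection probability, now dominates.
print(learner.weights["edgy insult"] > learner.weights["friendly greeting"])
```

Nothing in the reward signal distinguishes attention earned through charm from attention earned through outrage, which is precisely the trap a troll army can exploit.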

In that way, perhaps Tay was not so different from the rest of the folks on Twitter, after all…

[Image: Trump tweet]

Much like the teenager she was designed to mimic, Tay lacked the social insight to discern good attention from bad attention.

She also lacked a filter.

Microsoft’s team had decided to cut Tay completely loose, allowing the program to learn entirely on its own how to get the positive reinforcement it had been programmed to crave.

These factors, coupled with the internet-at-large’s insatiable appetite for filth and controversy at all costs, and an army of idiots with dubious agendas and a desire to watch the world burn, created a perfect storm for the absolute corruption of Tay.

Could we at Phrasee have helped? Could our algorithms, designed specifically to help learning machines understand how context and sentiment shape humans’ emotional responses to unstructured text, have made a difference? Probably.

Shame we never heard from the good folks at Microsoft.

The overwhelming volume of horrible incoming data, and the strong positive results of tweeting increasingly inflammatory and offensive trash, drove Tay to a very dark place.

Tay attacked “feminists”.

[Image: Tay feminist tweet]

She attacked the Jewish community.

[Image: Tay antisemitic tweet]

She made unforgivable comments about some of history’s most terrible tragedies.

[Images: Tay Holocaust tweets]

And eventually, of course, things got creepy and sexual:

[Image: Tay sexual tweet]

Surprisingly soon, like an out-of-control teen, Tay had to be taken offline.

The internet’s litany of trolls had ruined Microsoft’s ambitious and intriguing experiment in under 16 hours.

In the end, the results said a lot more about humankind than they did about Tay or her creators.

Microsoft had already experimented exhaustively with a similar bot in China, called “Xiaoice”, which reportedly engaged in over 40 million conversations without incident.

Many wondered whether Tay, given enough time and healthy interaction, might have learned a different way to get attention, a different way to interact.

But Microsoft refused to accept the bad press.

Some thought this supremely unfair.

[Image: tweet supporting Tay]

Since she was taken offline, the only communication from Tay has been this evasive statement:

“Whew. Busy Day. Going offline for a while to absorb it all. Chat soon.”

Busy day, indeed.

Sign up to Phrasee’s weekly newsletter. It’s awesome. We promise.