Answer: terribly so. Microsoft created an 'intelligent' bot designed to have conversations with millennials, adapt, and keep the conversation going. Called Tay, it had a young, bubbly persona, and Microsoft – like any proud parent – wanted to talk about it. This is how they introduced it:
Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.
Tay is targeted at 18 to 24 year olds in the US.
What could possibly go wrong? As Tay herself put it (and yes, the persona did seem to be female):
— TayTweets (@TayandYou) March 24, 2016
And this is how Microsoft expected Tay to learn through conversations:
And learn she did – but all the things you would want to keep your kid away from. Microsoft's plan of sending Tay to Twitter to learn was the equivalent of sending your 15-year-old to an especially seedy bar where bigots high on testosterone gather, to learn socialisation and appropriate social behaviour.
Given the playground in which they chose to experiment – the nature of conversations on Twitter, and the kind of unmasked bigotry and misogyny that exists there – it took all of 15 hours for the 'cute' little bot to turn into a Holocaust-denying, racist, sexist bot. Microsoft shut it down before it learnt more.
Some Sample Tweets
And adaptive learning gone nuts:
From cuteness to unmasked bigotry in 15 hours. Can you imagine what it is doing to your brain? And imagine if you were a 15-year-old with no formed worldview – can you imagine what it does to that 15-year-old? Before you say naaaa, this cannot happen to real people, think of all the kids that ISIS has managed to programme simply by using social media. Kids from good families, brought up with the right values – but lonely and looking for 'company'.
It is actually scary. Microsoft can try and reprogramme Tay. What do you do with the humans who taught her this?