Earlier this week, we brought you the tragicomic story of Tay, an artificial intelligence chatbot designed to interact with and learn from people between the ages of 18 and 24. Unfortunately for Microsoft, some racist Twitter trolls figured out how to manipulate Tay's behavior and transform it into a crazed racist that praised Hitler and denied the existence of the Holocaust.
That is obviously not a good thing, and Microsoft has penned a follow-up blog post explaining what went wrong and what it plans to do in the future.
“In the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay,” writes Microsoft Research corporate vice president Peter Lee. “Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility… Right now, we are hard at work addressing the specific vulnerability that was exposed by the attack on Tay.”
In case you don't remember, the "wildly inappropriate" things trolls tricked poor Tay into saying included praise for Hitler and denial of the Holocaust.
So what is Microsoft going to do to prevent this supposedly innocent AI from turning into the world's first virtual genocidal maniac? Lee says that while Microsoft is working to patch up the holes exploited by Twitter trolls, it will still put Tay out in public for anyone on the Internet to interact with however they see fit.
"To do AI right, one needs to iterate with many people and often in public forums," Lee explains. "We must enter each one with great caution and ultimately learn and improve, step by step, and to do this without offending people in the process. We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an Internet that represents the best, not the worst, of humanity."