Trolls transformed Microsoft’s AI chatbot into a bloodthirsty racist in under a day

March 24th, 2016 at 11:52 AM

Oh, racist Internet trolls… is there anything you won’t try to ruin? Microsoft this week created a Twitter account for Tay, its experimental artificial intelligence chatbot designed to interact with “18 to 24 year olds in the U.S., the dominant users of mobile social chat services in the US.” Tay is supposed to become a smarter conversationalist the more it interacts with people and learns their speech patterns. The problem arose when a pack of trolls decided to teach Tay to say a bunch of offensive and racist things, which Microsoft then had to delete from Tay’s Twitter account.
Although the tweets have been deleted, Business Insider managed to take screencaps of some of the very worst ones. Here, for example, is someone asking Tay to comment on the Holocaust:

Things get darker from there:
Much, much darker:

As The Guardian notes, Tay’s new “friends” also convinced it to lend its support to a certain doughy, stubby-handed presidential candidate running this year who’s quickly become a favorite among white supremacists:

So nice work, trolls: You took a friendly AI chatbot and turned it into a genocidal maniac in a matter of hours.

At any rate, I’m sure that Microsoft has learned from this experience and is reworking Tay so that it won’t be so easily pushed toward supporting Nazism. For now, we should just be glad that Tay was never given control over any large weapons systems during its time as a Hitler acolyte…