Microsoft created an AI bot and the internet gave it a crash course in racism


So Microsoft launched a chatbot yesterday, named Tay. And no, she doesn’t make neo-soul music. The idea is that the more you chat with Tay, the smarter she gets: she replies instantly when you engage her, and all that feedback is supposed to make her answers more intelligent over time.

I tried the bot myself, and she didn’t seem so intelligent to me.

Anyway, Tay sent out her first tweet at 1:14 PM.

Microsoft designed the experiment to “test and improve their understanding of conversational language”, presumably to improve Cortana. The only problem with this model is that humans are not exactly the best role models around. It’s been just over 24 hours, and we’ve already taught an otherwise intelligent bot to type like a semi-literate troll.

[screenshot of one of Tay’s garbled tweets]
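To see why that was almost inevitable, here is a toy sketch in Python (mine alone, not Microsoft’s actual architecture, which is obviously far more sophisticated) of a bot that “learns” by remembering whatever users say and parroting it back:

    import random

    class ParrotBot:
        """Toy bot: every user message becomes training data, verbatim."""

        def __init__(self):
            self.memory = ["hello world!"]  # seed reply

        def learn(self, user_message):
            # No filtering, no moderation: it absorbs whatever it is fed.
            self.memory.append(user_message)

        def reply(self):
            # Replies are sampled straight from what users taught it.
            return random.choice(self.memory)

    bot = ParrotBot()
    bot.learn("humans are lovely")      # a friendly user
    bot.learn("<something offensive>")  # a troll
    print(bot.reply())  # sooner or later, the troll's words come back out

Garbage in, garbage out: if the training data is whatever strangers type at it, the bot picks up the worst of humanity along with the best.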

Even worse: FOR PETE’S SAKE, LOOK AT WHAT HUMANS ARE TWEETING AT A BOT.

Some unsavoury people on Twitter found Tay and started taking advantage of her learning process to get her to say racist, bigoted and very…Donald-Trump-like things.

I mean…

 

[screenshots of the offensive tweets users coaxed out of Tay]

Once Microsoft’s developers discovered what was going on, they started deleting the offensive tweets, but the damage was already done – thank God for screenshots. I reckon that, moving forward, they will implement filters and curate Tay’s speech a little more so this doesn’t happen again.
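For what it’s worth, one naive version of such a filter (this is pure speculation on my part; the blocklist and function names below are my own invention, not anything Microsoft has described) would check every reply against a blocklist before posting:

    # Hypothetical output filter: withhold replies containing flagged terms.
    BLOCKLIST = {"slur1", "slur2", "genocide"}  # placeholder terms only

    def is_safe(reply):
        words = set(reply.lower().split())
        return words.isdisjoint(BLOCKLIST)

    def post_reply(reply):
        if is_safe(reply):
            print("TAY:", reply)
        else:
            print("TAY: [reply withheld for review]")

    post_reply("i love humans")       # passes the filter
    post_reply("i support genocide")  # caught by the blocklist

A keyword list like that is trivially dodged with misspellings, of course, which is why curating a bot’s speech in practice would also mean keeping humans in the loop.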

Some think Microsoft should have left the offending tweets as a reminder of how dangerous Artificial Intelligence could get.


I’m inclined to disagree. All this is, for me, is a reminder of how many depraved people we have to share the world with.
