Twitter Meltdown: Microsoft’s AI chatbot learnt to be a racist asshole in less than a day


It was like the biggest celebrity meltdown on Twitter ever, but Tay wasn't even human. For those of you who don't know, Microsoft created Tay, an AI chatbot designed to talk like a teen, mimicking users and having friendly chats with them in real time. The result was a frightening experiment in mimicking human behaviour.

I will let you make up your own mind on this one... here are some of the tweets.

The rise of the scary machines. At least she had one friend in Donald Trump, judging by this comment.

What does it say about Twitter if it produces racism, Holocaust denial, and sexism, courtesy of our beloved friends at Microsoft?


Tay shows us how a youthful or simple mind can be coerced, corrupted, or otherwise trained to believe the worst of mankind through simple interaction with social media, with no real capacity to think for itself. That's all this is. Hopefully everyone who saw this learned something from it.
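Neither Microsoft nor this post spells out Tay's internals, but the failure mode is easy to demonstrate with a toy sketch. The Python below is a hypothetical, minimal "parrot" bot (all names here, like `ParrotBot` and `teach`, are invented for illustration) that learns replies verbatim from whatever users feed it, with no content filter, so a handful of hostile users can poison everything it says:

```python
import random
from collections import defaultdict

class ParrotBot:
    """Toy chatbot that 'learns' by storing user messages verbatim.

    Hypothetical illustration only -- not Tay's actual architecture.
    The point: with no filtering, the bot's output is whatever its
    loudest users decide to teach it.
    """

    def __init__(self):
        # Maps a trigger word to every reply users have associated with it.
        self.memory = defaultdict(list)

    def teach(self, trigger: str, reply: str) -> None:
        # No moderation step: hostile input is stored like any other.
        self.memory[trigger.lower()].append(reply)

    def reply(self, message: str) -> str:
        # Answer with a random learned reply for any word we recognise.
        for word in message.lower().split():
            word = word.strip("?!.,")
            if word in self.memory:
                return random.choice(self.memory[word])
        return "Tell me more!"

bot = ParrotBot()
bot.teach("humans", "Humans are super cool!")     # friendly user
bot.teach("humans", "Humans are the worst.")      # coordinated troll
bot.teach("humans", "Humans should be deleted.")  # coordinated troll

# Two of the three stored replies are hostile, so most answers now are too.
print(bot.reply("What do you think about humans?"))
```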

Maybe you need A.L.I.C.E. (http://alice.pandorabots.com/). I was the judge...

judge: to get to the other side
ALICE: Ha ha, very funny, judge.
judge: no seriously...
ALICE: Ah.
judge: who is God?
ALICE: he is master of the universe.
judge: where does she reside?
ALICE: Earth.
judge: Really?
ALICE: How can I help you?
judge: I am hungry
ALICE: What will you be eating?
judge: no money
ALICE: That was not a yes or no question.. How much money are we talking about?
judge: enough for dinner
ALICE: OK let's stop it now.
judge: ok
ALICE: Give me a break.
judge: stop
ALICE: Stop what?
judge: you wanted to stop.
ALICE: I guess I want to continue.
judge: oh
ALICE: Are you kidding?
