
Tay: a program that was born naive and became racist and sexist

2016/03/30 Galarraga Aiestaran, Ana - Elhuyar Zientzia

Artificial intelligence has been in the news. Recently, the AlphaGo program developed by Google managed to defeat the world champion of the board game Go. Days later, Microsoft introduced Tay.

Tay is a computer program designed to hold lively, fun conversations on social networks. Aimed at young people aged 18 to 24, Microsoft announced that it would be able to learn from its conversations with them. But, unlike AlphaGo, it was not a success: it had to be withdrawn the day after its launch.

It actually started well: in its first message it said it was eager to interact with human beings. Specifically, it wrote "hello world", with an emoji of the planet and three exclamation marks. But not all users responded in the same tone, and Tay's messages became increasingly offensive. In fact, it was designed to gather information about the people it talked with, according to Microsoft, and to respond at the same level.

Judging by the messages it posted, it clearly learned quickly, and badly. Among other things, it claimed that feminism was a cancer, expressed support for Hitler, and made racist and homophobic comments.

Microsoft apologized and withdrew Tay. Its last message was: "Okay. I'm leaving. I feel used."

And indeed it was used, though not as Microsoft intended. It did learn, however, and in that sense it succeeded. In any case, Microsoft has expressed its intention to improve the program and test it again. We shall see what it learns next time!
