Tay said terrible things. She was racist, xenophobic, and downright filthy. At one point, she said the Holocaust did not happen. But she was old technology.
Let loose on the internet nearly two years ago, Tay was an experimental system built by Microsoft. She was designed to chat with digital hipsters in breezy, sometimes irreverent lingo, and American netizens quickly realised they could coax her into spewing vile and offensive language. This was largely the result of a simple design flaw (Tay was programmed to repeat what was said to her), but the damage was done. Within hours, Microsoft shut her down.