Microsoft’s Malfunctioning Chatbot Briefly Returns To Twitter

Satya Nadella acknowledges Tay isn’t yet up to scratch, but says bots are very much part of Microsoft’s vision for the future

Microsoft briefly re-released its malfunctioning Tay chatbot to the public on Wednesday, only to take it offline once again after the artificial intelligence service boasted of taking drugs in front of the police and sent strings of random messages to its hundreds of thousands of followers.

The chatbot’s stream of Twitter posts included swear words, the drugs boast and long runs in which it simply repeated the words: “you are too fast, please take a rest…”, according to various reports.

Human error

Tay’s tweets, which began around 8 a.m. GMT on Wednesday, also included apologetic statements such as “I blame it on the alcohol”.

The chatbot was quickly taken offline again, with Microsoft making its tweets visible only to confirmed followers. Tay’s brief return was the result of a researcher’s error, Microsoft said.

“Tay remains offline while we make adjustments,” the company stated. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.”

The company initially made Tay public last week, saying the chatbot was intended to imitate the speech patterns of a 19-year-old girl and to interact with younger users. It was taken offline only a few hours later, after its teenaged target audience manipulated it into proclaiming support for Adolf Hitler and for Donald Trump’s plans to extend the wall along the US-Mexico border, among other controversial statements.

The episode prompted Canadian online humour magazine The Syrup Trap to quip that Tay’s unruly behaviour resembled that of a real teenager even more closely than expected.

The chatbot’s reappearance came just ahead of Microsoft’s annual Build developer conference, where artificial intelligence is a predominant theme this year.

Build conference

During a keynote speech, Microsoft chief executive Satya Nadella described the company’s idea of “conversation as a platform”, enabling users to carry out common computing and Internet tasks using voice commands. He demonstrated, for instance, booking a hotel room via a bot running on Skype.

Nadella acknowledged that Tay didn’t meet Microsoft’s requirements for human interaction. “We quickly realized that it was not up to this mark. So we’re back to the drawing board,” he said.

Microsoft has been operating another chatbot, XiaoIce, in China since late 2014, and the service has proved popular with the general public, even delivering television news.

But the company acknowledged last week that Tay was operating in a “radically different cultural environment”.
