
Microsoft’s Malfunctioning Chatbot Briefly Returns To Twitter

Microsoft briefly re-released its malfunctioning Tay chatbot to the public on Wednesday, only to take it offline once again after the artificial intelligence service boasted of taking drugs in front of the police and sent strings of random messages to its hundreds of thousands of followers.

The chatbot’s posts, issued on Twitter, included swear words and long streams in which it simply repeated the words: “you are too fast, please take a rest…”, according to various reports.

Human error

Tay’s tweets, which began around 8 a.m. GMT on Wednesday, also included apologetic statements such as “I blame it on the alcohol”.

The chatbot was quickly taken offline again, with Microsoft making its tweets visible only to confirmed followers. Tay’s brief return was the result of a researcher’s error, Microsoft said.

“Tay remains offline while we make adjustments,” the company stated. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.”

The company initially made Tay public last week, saying the chatbot was intended to imitate the speech patterns of a 19-year-old girl and to interact with younger users. It was taken offline only a few hours later after its teenage target audience manipulated it into proclaiming support for Adolf Hitler, for Donald Trump’s plans to extend the wall along the US-Mexico border and other controversial subjects.

The episode prompted Canadian online humour magazine The Syrup Trap to quip that Tay’s unruly behaviour resembled that of a real teenager even more closely than expected.

The chatbot’s reappearance came ahead of the beginning of Microsoft’s annual Build developer conference, where artificial intelligence is a predominant theme this year.

Build conference

During a keynote speech, Microsoft chief executive Satya Nadella described the company’s idea of “conversation as a platform”, enabling users to carry out common computing and Internet tasks using voice commands. He demonstrated, for instance, booking a hotel room via a bot running on Skype.

Nadella acknowledged that Tay didn’t meet Microsoft’s requirements for human interaction. “We quickly realized that it was not up to this mark. So we’re back to the drawing board,” he said.

Microsoft has been operating another chatbot, XiaoIce, in China since late 2014 and the platform has proved popular with the general public, even delivering the television news.

But the company acknowledged last week that Tay was operating in a “radically different cultural environment”.



Matthew Broersma

Matt Broersma is a long-standing technology freelancer who has worked for Ziff-Davis, ZDNet and other leading publications.
