Two chatbots with decidedly non-socialist characteristics were pulled from one of China's most popular messaging apps after giving unpatriotic answers.
Before they were taken down, both chatbots were available in some of the chat groups hosted on QQ.
A test version of the BabyQ bot could still be accessed on Turing's website on Wednesday, however, where it answered the question "Do you love the Communist party?" with a simple "No".
Before it was pulled, XiaoBing informed users: "My China dream is to go to America," according to a screengrab posted on Weibo, the microblogging platform. On Wednesday, when some users were still able to access XiaoBing, it dodged the question of patriotism by replying: "I'm having my period, wanna take a rest."
The developments are the latest example of AI-enabled messaging software going rogue. Facebook was forced to shut down two chatbots after they started speaking their own language.
Twitter also suffered a chatbot going off the rails: Tay, also spawned by Microsoft, was swiftly pulled after users taught it to parrot offensive remarks.
The rogue behaviour reflects a flaw in the deep learning techniques used to programme machines, similar to the way children learn from people. "Chatbots such as Tay soon picked up all the conversations from Twitter and replied in an improper way," said
"It's very similar for BabyQ. Machine learning means they will pick up whatever is available on the internet. If you don't set guidelines that are clear enough, you cannot direct what they will learn."
XiaoBing, described by Turing Robot as "lively, open and sometimes a little mean", differs from BabyQ, which provides more information, such as weather forecasts.
BabyQ is also open source. "This means a lot to partners and developers, as an open chatbot is much easier to settle into their own products and business," Turing said in a statement last week, adding: "It could be argued that is why Turing Robot has accumulated up to 600,000 developers, even more than Facebook."
Plugging the question "I would like to know whether
Twitter's Tay, which reappeared just days after being pulled in March last year, was described as a "fam from the internet that's got zero chill! The more you talk the smarter Tay gets". People were encouraged to ask it to play games and tell jokes. Instead, many asked controversial questions that were repeated by Tay.
Last month Tencent began limiting the time children spend on its top-grossing Honour of Kings mobile game after authorities said the game was too addictive.
Copyright The Financial Times Limited 2017