Facebook abandoned an experiment after two artificially intelligent programs appeared to be chatting to each other in a strange language only they understood.

The two chatbots came to create their own changes to English that made it easier for them to work together, but which remained mysterious to the humans who were supposed to look after them.

Facebook challenged its chatbots to negotiate with each other over a trade, attempting to swap hats, balls and books, each of which was given a certain value.


But the negotiations quickly broke down as the robots appeared to chant at each other in a language that they each understood but which appeared mostly incomprehensible to humans.

The robots had been instructed to work out how to negotiate between themselves and improve their bartering as they went along.


But they were not told to use comprehensible English, allowing them to create their own “shorthand”, according to researchers.

The actual negotiations appear very odd, and don’t look especially useful.


But there appear to be some rules to the speech. The way the chatbots keep stressing their own names appears to be part of their negotiations, not simply a glitch in the way the messages are read out.

They might have formed as a kind of shorthand, allowing them to talk more effectively.


Indeed, some of the negotiations conducted entirely in this bizarre language even ended up concluding successfully.

“Agents will drift off understandable language and invent codewords for themselves,” said Dhruv Batra, a visiting researcher at Facebook’s Artificial Intelligence Research division. “Like if I say ‘the’ five times, you interpret that to mean I want five copies of this item. This isn’t so different from the way communities of humans create shorthands.”
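The shorthand Batra describes can be pictured in a few lines of code. The sketch below is a purely hypothetical illustration of the idea, not Facebook’s actual system: it reads the number of times an item word is repeated as the quantity being requested.

```python
from collections import Counter

# Toy item values, echoing the hats/balls/books setup described above
# (values are invented for illustration only)
ITEMS = {"hat", "ball", "book"}

def decode_shorthand(message: str) -> dict:
    """Interpret repetition as quantity: saying 'ball' three times
    is read as a request for three balls."""
    counts = Counter(message.lower().split())
    return {word: n for word, n in counts.items() if word in ITEMS}

print(decode_shorthand("ball ball ball hat"))  # {'ball': 3, 'hat': 1}
```

Under this reading, what sounds like meaningless chanting to a human listener still carries an unambiguous offer for the other agent.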

The company chose to shut down the chats because “our interest was having bots who could talk to people”, researcher Mike Lewis told FastCo. (Researchers did not shut down the programs because they were afraid of the results or had panicked, as has been suggested elsewhere, but because they wanted the bots to behave differently.)

The chatbots also learned to negotiate in ways that seem very human.


The researchers said it wasn’t possible for humans to crack the AI language and translate it back into English. “It’s important to remember, there aren’t bilingual speakers of AI and human languages,” said Batra.
