Facebook has shut down a controversial chatbot experiment that saw two AIs develop their own language to communicate.  The social media firm experimented with teaching two chatbots, Alice and Bob, how to negotiate with one another. However, researchers at the Facebook AI Research Lab (FAIR) found that they had deviated from the script and were inventing new phrases without any human input.

  • The bots were left unsupervised and developed their own machine language
  • Facebook says it wanted to develop bots to converse with people

How did this experiment start?

The bots were attempting to imitate human speech. During development, they created their own machine language spontaneously – at which point Facebook decided to shut them down.

Facebook’s Artificial Intelligence Researchers (Fair) was teaching the chatbots, artificial intelligence programs that carry out automated one-to-one tasks, to make deals with one another.

 class=

As part of the learning process, they set up two bots, known as dialogue agents, to teach each other about human speech using machine learning algorithms.

Something quickly changed.

The bots were originally left alone to develop their conversational skills. When the experimenters returned, they found that the AI software had begun to deviate from normal speech. Instead, they were using a brand new language created without any input from their human supervisors.

The new language was more efficient for communication between the bots but was not helpful in achieving their set task.

The programmers had to alter the way the machines learned a language to complete their negotiation training.

Up to now, most bots or chatbots have had only the ability to hold short conversations and perform simple tasks. For instance, booking a restaurant table.

But in the latest code developed by Facebook, bots will be able to dialogue and ‘to engage in start-to-finish negotiations with other bots or people while arriving at common decisions or outcomes,’ they wrote.

The Fair team gave bots this ability by estimating the ‘value’ of an item and inferring how much that is worth to each party.

But the bots can also find ways to be sneaky.

You can read more in the article by Daily Mail.