Artificial intelligences can now talk to each other
Robots could soon give verbal instructions to other robots, according to the University of Geneva.
A team led by Alexandre Pouget at the University of Geneva has enabled two artificial intelligences (AIs) to communicate with each other productively. "As far as we know, AI-based language agents have not yet been able to translate a verbal or written instruction into a sensorimotor action, let alone explain it to another AI so that it can reproduce it," says Pouget. The team has now managed to equip an artificial neural model with this dual ability.
"We started with S-Bert, an existing model that has 300 million neurons and is trained to understand language. We connected it to another, simpler network of a few thousand neurons," says Pouget's doctoral student Reidar Riveland. The neuroscientists then trained this network to simulate Wernicke's area, the part of the human brain that enables the perception and interpretation of language.
In a second phase, they trained the network to simulate Broca's area, the region responsible for producing and articulating words. Written instructions in English were then transmitted to the network. Finally, the scientists had the two networks chat with each other, the aim being that one network does what the other tells it to do.
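The setup described above, a large frozen language encoder feeding a much smaller recurrent network that produces actions, can be sketched very loosely in code. Everything below is illustrative: the vocabulary, layer sizes, random projections, and function names are assumptions for the sketch, not the authors' actual model (the real encoder is S-Bert with hundreds of millions of units, and the sensorimotor network is trained, not random).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained sentence encoder such as S-Bert.
# Hypothetical: a fixed random projection of a bag-of-words vector.
VOCAB = ["push", "pull", "the", "lever", "left", "right"]
EMBED_DIM = 16
W_embed = rng.standard_normal((len(VOCAB), EMBED_DIM))

def encode_instruction(text: str) -> np.ndarray:
    """Map a written instruction to a fixed-size embedding vector."""
    counts = np.array([text.split().count(w) for w in VOCAB], dtype=float)
    return counts @ W_embed

# Small recurrent "sensorimotor" network (a few thousand units in the
# study; tiny here) that turns the embedding into a sequence of actions.
HIDDEN, N_ACTIONS, STEPS = 8, 4, 3
W_in = rng.standard_normal((EMBED_DIM, HIDDEN)) * 0.1
W_rec = rng.standard_normal((HIDDEN, HIDDEN)) * 0.1
W_out = rng.standard_normal((HIDDEN, N_ACTIONS)) * 0.1

def run_sensorimotor(embedding: np.ndarray) -> list[int]:
    """Unroll the recurrent network, picking one discrete action per step."""
    h = np.zeros(HIDDEN)
    actions = []
    for _ in range(STEPS):
        h = np.tanh(embedding @ W_in + h @ W_rec)
        actions.append(int(np.argmax(h @ W_out)))
    return actions

actions = run_sensorimotor(encode_instruction("push the lever left"))
print(len(actions))  # one action index per time step
```

The point of the sketch is only the division of labor: language comprehension lives in a large frozen encoder, while a compact trainable network translates its output into motor commands, which is what lets the same verbal instruction also be passed between two such agents.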
Attractive for the robotics sector
At first, one network learned to perform only very simple actions and to describe them in such a way that the other could imitate them. "But this model opens up new horizons for understanding the interaction between language and behavior," write the two researchers.
The approach is particularly promising for the robotics sector, where developing technologies that enable machines to talk to each other is a priority. "Nothing now stands in the way of developing much more complex networks on this basis, which could be integrated into humanoid robots that are able to understand us and other robots," the statement concludes.
Source: University of Geneva