Robots can develop the basics of language skills through interacting with a human, according to new research from the University of Hertfordshire. The research has just been published in the journal PLoS ONE.
The researchers, Dr Caroline Lyon, Professor Chrystopher Nehaniv, and Dr Joe Saunders, have recently conducted experiments as part of the iTalk project with the childlike iCub humanoid robot to explore how language learning emerges. At first the robot can only babble, and it perceives speech as a string of sounds without differentiating words. But after a few minutes of “conversation” with humans, in which the participants were instructed to speak to the robot as if it were a small child, the robot adapted its output to the most frequently heard syllables and produced some word forms, such as the names of simple shapes and colours.
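The article does not describe the iTalk system's actual algorithm, but the core idea it reports — tracking which syllables occur most often in heard speech and adapting output toward them — can be sketched as a toy model. The code below is purely illustrative: utterances are represented as strings of space-separated "syllables", and the function name and tokenisation are assumptions for the example, not part of the published work.

```python
from collections import Counter

def learn_word_forms(utterances, top_n=3):
    """Toy model of frequency-sensitive learning: count how often each
    "syllable" is heard across all utterances, then return the most
    frequent ones as candidate word forms for the robot to produce."""
    counts = Counter()
    for utterance in utterances:
        counts.update(utterance.split())  # split heard speech into syllables
    return [syllable for syllable, _ in counts.most_common(top_n)]

# A toy "conversation": a human names shapes and colours to the robot,
# as the study's participants were instructed to do with a small child.
heard = [
    "look at the red box",
    "the box is red",
    "see the red circle",
]
print(learn_word_forms(heard))
```

Even in this simplified form, the sketch shows why frequently repeated content words like "red" and "box" surface quickly, while it also picks up frequent function words ("the") — mirroring the article's point that the robot learns word *forms* from frequency alone, without knowing their meanings.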
Dr Caroline Lyon said: “It is known that infants are sensitive to the frequency of sounds in speech, and these experiments show how this sensitivity can be modelled and contribute to the learning of word forms by a robot.”
The iTalk project teaches a robot to speak using methods similar to those used with children, and human-robot interaction is a key part of the learning process. Although the iCub robot learns to produce word forms, it does not know their meanings; learning meanings is another part of the iTalk project’s research. This research could have a significant impact on future generations of interactive robotic systems.
Source: AlphaGalileo Foundation
Image Credits: University of Hertfordshire