When human children start to speak, they begin by babbling in gibberish and then learn real words, often between six and 14 months of age. Could a humanoid robot be programmed to "learn" in a similar fashion?
A team of researchers from the University of Hertfordshire wanted to see if a humanoid robot could be taught to speak words this way, according to a report from Discovery News. They programmed an iCub robot, named DeeChee, with the roughly 40,000 syllables found in English.
The three-foot robot took part in eight-minute dialogues with 34 different people. DeeChee listened, spoke and detected positive feedback when it repeated syllables and words that were uttered to it. After a repetition, the robot recognized encouragement from its partner and paid more attention to the syllables that preceded the praise, following a path seen in human babies learning to speak. After the learning sessions, the researchers said, DeeChee spoke real words more often than random syllables.
"This work shows the potential of human-robot interaction systems in studies of the dynamics of early language acquisition," Caroline Lyon, a University of Hertfordshire computer scientist, said in the article on the Adaptive Systems Research Group study that was recently published in PLoS ONE.
“In our experiments some salient one-syllable word forms are learnt by a humanoid robot in real-time interactions with naive participants,” the article says. “Words emerge from random syllabic babble through a learning process based on a dialogue between the robot and the human participant, whose speech is perceived by the robot as a stream of phonemes… We observe that salient content words are more likely than function words to have consistent canonical representations; thus their relative frequency increases, as does their influence on the learner… Word forms are usually produced by the robot after a few minutes of dialogue, employing a simple, real-time, frequency dependent mechanism.”
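The mechanism the paper describes can be caricatured in a few lines of code. The sketch below is purely illustrative, not the researchers' actual system: it assumes a learner that keeps frequency weights over syllables, boosts the syllables heard just before praise, and babbles by sampling from those weights, so that frequent, praised word forms come to dominate the output. The class name, the `praise_boost` parameter and the example syllables are all invented for the example.

```python
import random
from collections import Counter

class BabbleLearner:
    """Toy frequency-dependent babbler (illustrative only)."""

    def __init__(self, praise_boost=5):
        self.weights = Counter()   # syllable -> salience weight
        self.praise_boost = praise_boost
        self.recent = []           # syllables heard most recently

    def hear(self, syllables):
        # Perceive a stream of syllables; each occurrence adds weight.
        for s in syllables:
            self.weights[s] += 1
        self.recent = list(syllables)

    def praised(self):
        # Positive feedback boosts the syllables that preceded the praise.
        for s in self.recent:
            self.weights[s] += self.praise_boost

    def babble(self, n=3):
        # Output is sampled in proportion to learned weights, so
        # frequent (and praised) syllables crowd out random babble.
        syls = list(self.weights)
        probs = [self.weights[s] for s in syls]
        return random.choices(syls, weights=probs, k=n)

learner = BabbleLearner()
learner.hear(["red", "buh", "box", "guh", "red"])
learner.praised()                # caregiver encourages the repetition
learner.hear(["red", "square"])
top = learner.weights.most_common(1)[0][0]
print(top)                       # "red" now dominates the toy lexicon
```

This captures only the frequency-dependent flavor of the learning: content words that recur with a consistent form accumulate weight faster than the surrounding babble, which is roughly the effect the authors report.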
In another recent study, Lola Cañamero of the University of Hertfordshire worked with other researchers to develop prototype robots capable of developing emotions as they interact with human caregivers. The robots can express anger, fear, sadness, happiness, excitement and pride, according to the university.
“They are programmed to learn to adapt to the actions and mood of their human caregivers, and to become particularly attached to an individual who interacts with the robot in a way that is particularly suited to its personality profile and learning needs,” according to a university statement. “The more they interact, and are given the appropriate feedback and level of engagement from the human caregiver, the stronger the bond developed and the amount learned.”
In related news, the University of Hertfordshire was named Entrepreneurial University of the Year 2010 by Times Higher Education (THE), according to SDNzone.
Want to learn more about the latest in communications and technology? Then be sure to attend ITEXPO West 2012, taking place Oct. 2-5 in Austin, TX. ITEXPO offers an educational program to help corporate decision makers select the right IP-based voice, video, fax and unified communications solutions to improve their operations. It's also where service providers learn how to profitably roll out the services their subscribers are clamoring for, and where resellers can learn about new growth opportunities. For more information on registering for ITEXPO click here.
Stay in touch with everything happening at ITEXPO. Follow us on Twitter.
Edited by Brooke Neuman