Robots that develop emotions in interaction with humans

By Damir Beciri
13 August 2010

Researchers have finalized the first prototype robots capable of developing emotions as they interact with their human caregivers and of expressing a whole range of emotions. Developed under the leadership of Dr. Lola Cañamero at the University of Hertfordshire, in collaboration with a consortium of universities and robotics companies across Europe, the robots differ from others in the way they form attachments, interact and express emotion through bodily expression.

Developed as part of the interdisciplinary project FEELIX GROWING (Feel, Interact, eXpress: a Global approach to development with Interdisciplinary Grounding), funded by the European Commission and coordinated by Dr. Cañamero, the robots learn to interact with and respond to humans in much the same way that children do, using the same types of expressive and behavioral cues that babies use to learn to interact socially and emotionally with others.

The robots have been created by modeling the early attachment process that human and chimpanzee infants undergo with their caregivers as they develop a preference for a primary caregiver. They are programmed to learn to adapt to the actions and mood of their human caregivers, and to become particularly attached to an individual who interacts with the robot in a way that is particularly suited to its personality profile and learning needs. The more they interact and receive appropriate feedback and engagement from the human caregiver, the stronger the bond becomes and the more the robot learns.
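
As a rough illustration of the kind of attachment model described above, the sketch below shows how a preference for a primary caregiver might emerge from repeated interaction. It is not the FEELIX GROWING implementation; the class, update rule and parameters are assumptions made purely for illustration.

# Minimal illustrative sketch of a caregiver-attachment model.
# This is NOT the project's code; the update rule, parameters and
# names below are assumptions made for illustration only.

class AttachmentModel:
    def __init__(self, learning_rate=0.1, decay=0.02):
        self.bonds = {}              # caregiver id -> bond strength in [0, 1]
        self.learning_rate = learning_rate
        self.decay = decay           # bonds fade slowly without interaction

    def interact(self, caregiver, feedback_quality):
        """Strengthen the bond according to how well the caregiver's
        feedback and engagement match the robot's needs (0..1)."""
        bond = self.bonds.get(caregiver, 0.0)
        bond += self.learning_rate * feedback_quality * (1.0 - bond)
        self.bonds[caregiver] = min(1.0, bond)

    def idle_step(self):
        """Without interaction, every bond decays a little."""
        for caregiver in self.bonds:
            self.bonds[caregiver] = max(0.0, self.bonds[caregiver] - self.decay)

    def primary_caregiver(self):
        """The robot prefers whoever it has the strongest bond with."""
        if not self.bonds:
            return None
        return max(self.bonds, key=self.bonds.get)


model = AttachmentModel()
model.interact("Alice", feedback_quality=0.9)   # well-suited interaction
model.interact("Bob", feedback_quality=0.3)     # less engaged caregiver
print(model.primary_caregiver())                # -> "Alice"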

The robots are capable of expressing anger, fear, sadness, happiness, excitement and pride, and will show very visible distress if the caregiver fails to comfort them when they face a stressful situation they cannot cope with, or fails to interact with them when they need it. This is the first time that early attachment models of human and non-human primates have been used to program robots that develop emotions in interaction with humans.

“We are working on non-verbal cues, and the emotions are revealed through physical postures, gestures and movements of the body rather than facial or verbal expression,” said Dr. Cañamero.

The researchers led by Dr. Cañamero at the University of Hertfordshire are now extending the prototype and adapting it as part of the EU project ALIZ-E, which will develop robots that learn to be companions to, and carers for, diabetic children in hospital settings.

Within this project, which will run for the next four and a half years, the Hertfordshire group will lead research on the emotions and non-linguistic behavior of the robots. The future robot companions will combine non-linguistic and linguistic communication to interact with the children, becoming increasingly adapted to their individual profiles in order to support both the therapeutic aspects of their treatment and their social and emotional wellbeing.

The algorithms needed to map a user's mood and the surrounding circumstances to a proper reaction and appropriate cues can become quite complex. Although expressing some emotions through cues could be useful in human-robot interaction, some of those emotions could seem a bit creepy (take a look at the video in our article about the Nexi MDS robot).
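
To give a sense of why such mappings grow complicated, the toy sketch below translates an internal emotion state and some context into a non-verbal bodily cue, escalating to distress when the robot is stressed and no comfort is available. None of the states, cues or thresholds come from the project; they are hypothetical placeholders.

# Toy sketch of mapping internal state and context to non-verbal cues.
# Purely hypothetical: the states, cues and rules are illustrative
# assumptions, not the behavior model used by the researchers.

BODY_CUES = {
    "happiness": "upright posture, open arms, bouncing gait",
    "pride": "chest out, head raised, slow deliberate gestures",
    "fear": "crouched posture, arms pulled in, backing away",
    "sadness": "slumped shoulders, lowered head, slow movements",
    "anger": "stiff posture, sharp fast gestures",
    "excitement": "quick gestures, rapid head turns",
    "distress": "trembling, reaching toward caregiver, repeated head shakes",
}

def choose_cue(emotion, stress_level, caregiver_present, caregiver_comforting):
    """Pick a bodily cue, escalating to distress when the robot is
    stressed and no comforting caregiver is available."""
    if stress_level > 0.7 and not (caregiver_present and caregiver_comforting):
        return BODY_CUES["distress"]
    return BODY_CUES.get(emotion, "neutral idle posture")

# Example: a frightened robot whose caregiver is nearby but not responding.
print(choose_cue("fear", stress_level=0.8,
                 caregiver_present=True, caregiver_comforting=False))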

There are still many questions to be answered, such as how well these robots will be accepted, whether this program will have a positive effect on the people and patients who interact with them, how consistent the emotions a robot attaches to particular persons will remain, and how those same emotions could be reset.
