Featured Article

September 11, 2012

What Robots Can Teach Us about Trust

We can imagine robots teaching us lots of things. We can picture robotic teachers wandering the aisles of a classroom, instructing students on topics like history, engineering or mathematics. We can also probably imagine robotic driving instructors or golf coaches. But did you ever think that a robot could teach us about human interactions, about the nature of trust itself?

Well, researchers from Northeastern University, MIT and Cornell recently conducted a study that used robots to help shed light on why people instinctively trust others. The study identified four behaviors that cue our brains to warn us when someone is not to be trusted.

The findings, published in the journal Psychological Science, are telling: the common assumption that avoiding eye contact and fidgeting are red flags for untrustworthiness does not necessarily hold up. The researchers developed a game that allowed subjects to act either trustworthy and cooperative or untrustworthy, creating different risk/reward scenarios across the spectrum of behavior.

Before the subjects started playing, researchers videotaped their interactions and identified four gestures that predicted untrustworthiness: leaning away from others, crossing the arms, rubbing or grasping the hands, and touching one's own face or abdomen. In combination, these gestures were strong predictors of untrustworthy behavior.

This is where the robot comes in. Instead of pairing subjects with other people, researchers teamed them up with friendly-faced robots and let them hold 10-minute conversations with the robot before playing. When the robot mimicked the hand and body gestures associated with distrust, the subjects paired with it made in-game decisions indicating that they did not, in fact, trust the robot. The distrust seemed to be subconscious: subjects did not rate the robots as untrustworthy in follow-up questionnaires.

“It makes no sense to ascribe intentions to a robot,” said an author of the study, Robert H. Frank, an economics professor at Cornell. “But it appears we have certain postures and gestures that we interpret in certain ways. When we see them, whether it’s a robot or a human, we’re affected by it, because of the pattern it evokes in our brain responses.”





Edited by Rich Steeves

