What happens when psychologists and roboticists get together? Well, apparently they create trustworthy robots.
Why do we trust people? An interdisciplinary team of researchers from Northeastern, MIT and Cornell universities in the US believes it comes down to subtle non-verbal cues that we pick up on as we interact with others. The team has created a wide-eyed robot to try to find out exactly which gestures help us decide whether or not a person is trustworthy.
Human participants make small talk with the team's social robot, Nexi, for ten minutes, then play a game called "Give Some," in which they must predict how much money Nexi will give them at the expense of her own profit, and conversely decide how much money they are willing to give her. What the participants don't know is that Nexi is programmed to make specific gestures during the conversation portion of the test, so the researchers can determine how those gestures affect how trustworthy she is perceived to be.
"People tend to mimic each other's body language, which might help them develop intuitions about what other people are feeling - intuitions about whether they'll treat them fairly," said David DeSteno, a psychology professor at Northeastern. "Using a humanoid robot whose every expression and gesture we can control will allow us to better identify the exact cues and psychological processes that underlie humans' ability to accurately predict if a stranger is trustworthy."
"The goal was to simulate a normal conversation with accompanying movements to see what the mind would intuitively glean about the trustworthiness of another," said DeSteno. "Trust might not be determined by one isolated gesture, but rather a 'dance' that happens between the strangers."
Aside from being fascinating in its own right, this research opens up some interesting possibilities. A more complete understanding of how we express ourselves non-verbally could go a long way toward overcoming the uncanny valley effect, although infiltrator droids using it against us will obviously remain a concern.