Robots are increasingly being considered for use in highly tense civilian encounters to minimize person-to-person contact and danger to peacekeeping personnel. Trust, along with physical qualities and cultural considerations, is an essential factor in the effectiveness of these robotic peacekeepers. New research to be presented at the HFES 2015 Annual Meeting in Los Angeles in October examines the importance of social cues when evaluating the role of trust in human-robot interaction.
Joachim Meyer, coauthor of "Manners Matter: Trust in Robotic Peacekeepers" and a professor at Tel Aviv University's Department of Industrial Engineering, notes that "interactions between machines and people should follow rules of behavior similar to the rules used in human-to-human interaction. Robots are not seen as mindless technology; rather, they are considered agents with intentions."
Meyer and coauthor Ohad Inbar asked 30 participants to report their first impressions of a humanoid peacekeeping robot that interacted with individuals at varying levels of politeness. The scenario depicted the robot inspecting people attempting to enter a building.
Results indicated that participants' attitudes toward the robot were largely shaped by whether they perceived it as polite or threatening. Neither the age nor the gender of the person interacting with the robot had a significant effect on participants' impressions.
The authors were surprised by the results. Accepted social etiquette suggests that participants would prefer a "gentler, more polite" approach toward the elderly and women and would therefore judge the robot more harshly in its interactions with those groups. However, Meyer says, "A robot who acted rudely toward an older lady was not evaluated more negatively than one with similar behavior toward a young man. Perhaps our rude robot's behavior was so rude that it overshadowed anything else."