If Robots Need To Persuade Humans, They Shouldn’t Look Like Authority Figures: U of T Study

In the future, socially interactive robots could help the elderly age in place or help residents of long-term care facilities with their daily activities. But will people really take advice or instructions from a machine?

It depends on the behavior of the robot in question, according to a study from the University of Toronto.

“When robots present themselves as human-like social agents, we tend to play into that sense of humanity and treat them as we would a person,” explains Shane Saunderson, a doctoral student in the Department of Mechanical and Industrial Engineering in the Faculty of Applied Science & Engineering and lead author of a paper recently published in Science Robotics.

“But even simple tasks – like asking someone to take their medication – have a lot of social depth to them. If we’re going to put robots in these situations, we need to better understand the psychology of robot-human interactions.”

Saunderson says there is no silver bullet when it comes to persuasion, even among humans. But a key concept is authority, which can be divided into two types: formal authority and real authority.

“Formal authority comes from your role: if someone is your boss, teacher or parent, they have some formal authority,” he explains. “Real authority has to do with control over decisions – often over things such as financial rewards or punishments.”

To simulate these concepts, Saunderson set up an experiment in which a humanoid robot named Pepper helped 32 volunteer test subjects perform a series of simple tasks, such as memorizing and recalling items in a sequence.

For some participants, Pepper was presented as a formal authority figure: he played the role of researcher and was the only “person” with whom the subjects interacted. For others, Saunderson was presented as the researcher, while Pepper helped the subjects complete the tasks.

Each participant completed a set of three tasks twice. First, Pepper offered financial rewards for correct answers, simulating real positive authority. Then, Pepper threatened financial penalties for incorrect answers, simulating real negative authority.

In general, Pepper was less persuasive when presented as an authority figure than when presented as a peer helper. Saunderson says this result may come down to a question of legitimacy.

“Social robots are not common today, and in North America at least, people lack both relationships and a sense of shared identity with robots,” he says. “It may be hard for them to see robots as a legitimate authority.”

Another possibility is that people disobey an authoritative robot because they feel threatened by it. Saunderson notes that this aversion to being persuaded by a robot acting with authority seemed particularly strong among male participants, whom previous studies have found to be more defiant toward authority figures than women, and who may perceive an authoritative robot as a threat to their status or autonomy.

“A robot’s social behaviors are critical to the acceptance, use of and trust in this type of disruptive technology by society as a whole,” says Goldie Nejat, a professor of mechanical engineering who is Saunderson’s supervisor and the paper’s other co-author.

Nejat holds the Canada Research Chair in Robots for Society and is a member of the University of Toronto Robotics Institute. She and Saunderson carried out the work with support from AGE-WELL, a national network dedicated to creating technologies and services that benefit older adults and caregivers, as well as CIFAR.

“This groundbreaking research helps us understand how persuasive robots should be developed and deployed in everyday life, and how they should behave to help different demographics, including vulnerable populations such as older adults,” she says.

Saunderson says the big takeaway for designers of social robots is to position them as collaborative and peer-like, rather than dominant and authoritative.

“Our research suggests that robots face extra barriers to successful persuasion beyond those that humans face,” he says. “If they are to take on these new roles in our society, their designers will have to account for that and find ways to create positive experiences through their behavior.”
