Whiting T, Gautam A, Tye J, Simmons M, Henstrom J, Oudah M, Crandall JW. Confronting barriers to human-robot cooperation: balancing efficiency and risk in machine behavior. iScience 2021;24:101963. PMID: 33458615; PMCID: PMC7797565; DOI: 10.1016/j.isci.2020.101963.
[Received: 07/20/2020] [Revised: 11/27/2020] [Accepted: 12/15/2020]
Abstract
Many technical and psychological challenges make it difficult to design machines that effectively cooperate with people. To better understand these challenges, we conducted a series of studies investigating human-human, robot-robot, and human-robot cooperation in a strategically rich resource-sharing scenario, which required players to balance efficiency, fairness, and risk. In these studies, both human-human and robot-robot dyads typically learned efficient and risky cooperative solutions when they could communicate. In the absence of communication, robot dyads still often learned the same efficient solution, but human dyads achieved a less efficient (less risky) form of cooperation. This difference in how people and machines treat risk appeared to discourage human-robot cooperation, as human-robot dyads frequently failed to cooperate without communication. These results indicate that machine behavior should better align with human behavior, promoting efficiency while simultaneously considering human tendencies toward risk and fairness.
Highlights
• Experiments show that people learned risk-averse solutions without communication
• With and without communication, robot pairs learned risky, but efficient, outcomes
• Human-robot pairs often learned risky, but efficient, solutions with communication
• Without communication, behavioral asymmetries inhibited human-robot cooperation