In Risk Analysis: An Official Publication of the Society for Risk Analysis
Human-automation cooperation has become ubiquitous. In this context, automation refers to autonomous machines, robots, artificial intelligence, and other autonomous nonhuman agents. A human driver shares control of a semiautonomous vehicle (semi-AV) with an automated system and thus shares responsibility for crashes caused by the vehicle. Research has not clarified whether and why people attribute different levels of blame and responsibility to automation (and its creators) versus its human counterpart when each causes an equivalent crash. We conducted four experiments in two studies (total N = 1,045) to measure responses (e.g., severity and acceptability judgments, blame and responsibility attribution, compensation judgments) to hypothetical crashes caused by the human or by the automation in semi-AVs. The results provided previously unidentified evidence of a bias, which we call the "blame attribution asymmetry": a tendency to judge an automation-caused crash more harshly, ascribe more blame and responsibility to the automation and its creators, and believe that the victim of such a crash should be compensated more. This asymmetry arises in part from the stronger negative affect triggered by the automation-caused crash. The bias has a direct policy implication: a policy allowing "not-safe-enough" semi-AVs on roads could backfire, because these vehicles would cause many traffic crashes, which might in turn produce greater psychological costs and deter more people from adopting them. Other theoretical and policy implications of our findings are also discussed.
Liu Peng, Du Yong
affect heuristic, blame and responsibility attribution, blame attribution asymmetry, human-automation cooperation, semiautonomous vehicles