PEOPLE BLAME A VEHICLE’S AUTOMATED SYSTEM MORE THAN ITS DRIVER WHEN ACCIDENTS HAPPEN

Experts predict that autonomous vehicles (AVs) will eventually make our roads safer since the majority of accidents are caused by human error. However, it may be some time before people are ready to put their trust in a self-driving car.

A new study in the journal Risk Analysis found that people are more likely to blame a vehicle’s automation system and its manufacturer than its human driver when a crash occurs.

Semi-autonomous vehicles (semi-AVs), in which a human driver supervises the automation and must be ready to take control of the vehicle at any moment, are already on the road. For example, the 2020 Tesla Model S offers an Autopilot system, and the 2020 Cadillac CT6 has a Super Cruise system.

However, this new study suggests that questions are likely to arise regarding blame, responsibility, and compensation when a semi-AV is involved in a collision.

Researchers led by Peng Liu, an associate professor in the College of Management and Economics at Tianjin University, conducted experiments to measure participants’ responses to hypothetical semi-AV crashes. When a crash was caused by the vehicle’s automated system rather than by its human driver, participants assigned more blame and responsibility to the automation and its manufacturer and indicated that the victim should receive more compensation. They also judged the automation-caused crash to be more severe and less acceptable than one caused by a human, regardless of the crash’s seriousness (whether it involved an injury or a fatality).

Liu and his colleagues call this bias against automated systems “blame attribution asymmetry.” It describes people’s tendency to overreact to automation-caused crashes, possibly owing to the heightened negative affect, or feelings and emotions, these crashes evoke. Negative emotions such as anger can amplify attributions of legal responsibility and blame.

The authors point out that the same kind of affect-induced blame attribution asymmetry may come into play in other cooperative situations involving humans and machines, such as surgeons working with medical robots or pilots operating military drones.

Policymakers and regulators need to be aware of people’s potential overreaction to crashes involving AVs when they set policies for deploying and regulating these vehicles, particularly with regard to financial compensation for victims injured or killed by automated systems. “According to our findings, they might need to consider the possibility that to lay people, victims of AV crashes should be compensated more than commonly calculated,” the authors write.

A policy that allows semi-AVs the public perceives as “unsafe” onto roads could backfire, as the accidents that inevitably occur may deter more people from adopting the technology. To change people’s negative attitudes toward semi-AVs, Liu argues that “public communication campaigns are highly needed to transparently communicate accurate information, dispel public misconceptions, and provide opportunities to experience semi-AVs.”

In a previous study, Liu and his colleagues conducted a field experiment where 300 participants experienced being a passenger in a semi-AV. “This direct experience led to a significant increase in trust and a reduction in negative feelings and emotions about semi-AVs,” he says.

###

About SRA 

The Society for Risk Analysis is a multidisciplinary, interdisciplinary, scholarly, international society that provides an open forum for all those interested in risk analysis. SRA was established in 1980 and has published Risk Analysis: An International Journal, the leading scholarly journal in the field, continuously since 1981. For more information, visit www.sra.org.
