Please use this link to cite this publication or to reference it as an internet source: https://hdl.handle.net/10419/293955
Year of Publication: 
2024
Citation: 
[Journal:] Health Economics [ISSN:] 1099-1050 [Volume:] 33 [Issue:] 5 [Publisher:] Wiley [Place:] Hoboken, NJ [Year:] 2024 [Pages:] 929-951
Publisher: 
Wiley, Hoboken, NJ
Abstract: 
Using a representative survey with 1,317 individuals and 12,815 moral decisions, we elicit Swedish citizens' preferences on how algorithms for self-driving cars should be programmed in cases of unavoidable harm to humans. Participants' choices in different dilemma situations (treatments) show that, at the margin, the average respondent values the lives of passengers and pedestrians equally when both groups are homogeneous and no group is to blame for the dilemma. In comparison, the respondent values the lives of passengers more when the pedestrians violate a social norm, and less when the pedestrians are children. Furthermore, we explain why the average respondent in the control treatment needs to be compensated with two to six passengers spared in order to sacrifice the first pedestrian, even though she values the lives of passengers and pedestrians equally at the margin. We conclude that respondents' choices are highly contextual and take into account the age of the persons involved and whether these persons have complied with social norms.
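Illustrative note: the abstract describes eliciting relative values of life from dilemma choices with a random utility model. The Python sketch below, using simulated data and hypothetical variable names (not the authors' survey data or estimation code), shows the general idea: a binary logit on the attributes entering the utility difference between sparing pedestrians and sparing passengers, with the relative value of life recovered as a ratio of coefficients.

import numpy as np
import statsmodels.api as sm

# Simulated dilemma data (purely hypothetical): each row is one choice
# between staying on course (killing pedestrians) and swerving (killing passengers).
rng = np.random.default_rng(0)
n = 5000
pedestrians_killed = rng.integers(1, 6, size=n)  # lives lost if the car stays on course
passengers_killed = rng.integers(1, 6, size=n)   # lives lost if the car swerves

# Assumed "true" preferences for the simulation: both groups valued equally
# at the margin (utility of -1 per life), plus a logistic taste shock.
utility_stay = -1.0 * pedestrians_killed
utility_swerve = -1.0 * passengers_killed
chose_swerve = (utility_swerve - utility_stay + rng.logistic(size=n)) > 0

# Binary logit (random utility model) on the utility-difference attributes.
X = sm.add_constant(np.column_stack([passengers_killed, pedestrians_killed]))
fit = sm.Logit(chose_swerve.astype(int), X).fit(disp=False)
const, b_passengers, b_pedestrians = fit.params

# Relative value of a pedestrian life in units of passenger lives: the ratio
# of the estimated coefficients (close to 1 under the simulated preferences).
print("relative value (pedestrian / passenger):", b_pedestrians / -b_passengers)

This is a minimal sketch of the technique named in the keywords, not a reproduction of the paper's specification.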
Keywords: 
choice experiments
ethical preferences
random utility model
relative values of life
robot cars
self-driving cars
Persistent Identifier of the first edition: 
Creative Commons License: 
cc-by-nc-nd
Document Type: 
Article
Document Version: 
Published Version

File(s): 1.19 MB
Publications in EconStor are protected by copyright.