Self-driving cars - How moral should we make our robots?
Source: Wired
Ethical dilemmas and legal questions about assigning civil liability have been a hot topic in recent years.
Self-driving cars are already here. Widespread adoption of this technology will lead to less traffic, less pollution, and fewer car accidents. These cars bring to life a longstanding hypothetical question: How moral should we make our robots?
What if an autonomous vehicle is faced with the choice to either kill its passengers or kill others? How should it be programmed?
In online surveys, participants approve of autonomous vehicles that sacrifice the driver to save many others: essentially a utilitarian approach that aims to preserve the most lives. Yet the same respondents would prefer not to buy these kinds of cars, and they don’t want regulations that require self-sacrificing algorithms. In fact, such regulations would make them less willing to buy a self-driving car. The rise of autonomous vehicles therefore creates a social dilemma: roads would be much safer if there were more autonomous cars, but the programming that would actually make driving safer might prevent people from buying these vehicles.
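To make the contrast concrete, here is a minimal sketch of what the two policies might look like as decision rules. Everything in it is hypothetical for illustration (the `Outcome` class, the maneuver names, the casualty numbers); it is not taken from the article or from any real vehicle software.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """A possible maneuver and the casualties it is expected to cause."""
    maneuver: str
    passenger_deaths: int
    bystander_deaths: int

def utilitarian_choice(outcomes):
    # Minimize total expected deaths, regardless of who dies.
    return min(outcomes, key=lambda o: o.passenger_deaths + o.bystander_deaths)

def passenger_first_choice(outcomes):
    # Protect the passengers first; only then minimize other casualties.
    return min(outcomes, key=lambda o: (o.passenger_deaths, o.bystander_deaths))

# Hypothetical dilemma: swerve into a barrier (killing the passenger)
# or stay on course (killing three pedestrians).
dilemma = [
    Outcome("swerve into barrier", passenger_deaths=1, bystander_deaths=0),
    Outcome("stay on course", passenger_deaths=0, bystander_deaths=3),
]

print(utilitarian_choice(dilemma).maneuver)      # -> swerve into barrier
print(passenger_first_choice(dilemma).maneuver)  # -> stay on course
```

The survey finding above is exactly this tension: most people endorse the first rule in the abstract, but would rather ride in a car running the second.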
Imagine shopping for a car and facing the choice: one that occasionally sacrifices its driver to save others, or a “preserve the passenger at all costs” car. What would you do?
Previous posts
○ 3D PRINTING INCREASINGLY PRECISE
Thank you for reading.
Join the community: @cristianv.itali
Tags: #technology #news #science #education #new