Self-driving cars - How moral should we make our robots?


Image source: wired

Ethical dilemmas, and the legal problem of assigning civil liability, have been hot topics in recent years.

Self-driving cars are already here. Widespread adoption of this technology will lead to less traffic, less pollution, and fewer car accidents. These cars bring to life a longstanding hypothetical question: How moral should we make our robots?

What if an autonomous vehicle is faced with the choice to either kill its passengers or kill others? How should it be programmed?

In online surveys, participants approve of autonomous vehicles that sacrifice the driver to save many others: essentially a utilitarian approach that aims to preserve the greatest number of lives. Yet the same respondents would prefer not to buy such cars themselves, and they don't want regulations that require self-sacrificing algorithms. In fact, such regulations would make them less willing to buy a self-driving car at all. The rise of autonomous vehicles thus creates a social dilemma: roads would be much safer if there were more autonomous cars, but the programming that would actually make driving safer might keep people from buying these vehicles.
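To make the dilemma concrete, here is a minimal sketch (in Python) of what a purely utilitarian decision rule might look like. The Maneuver class, the casualty estimates, and the utilitarian_choice function are all hypothetical illustrations, not any manufacturer's actual software; a real system would have to deal with far more uncertainty than this toy example does.

```python
# Toy illustration of a purely utilitarian "self-sacrificing" rule:
# among the available maneuvers, pick the one expected to cost the
# fewest lives, even if those lives include the car's own passengers.
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    expected_passenger_deaths: float
    expected_bystander_deaths: float


def utilitarian_choice(options: list[Maneuver]) -> Maneuver:
    """Return the maneuver with the lowest total expected loss of life."""
    return min(
        options,
        key=lambda m: m.expected_passenger_deaths + m.expected_bystander_deaths,
    )


if __name__ == "__main__":
    options = [
        Maneuver("stay on course", expected_passenger_deaths=0.0,
                 expected_bystander_deaths=3.0),
        Maneuver("swerve into barrier", expected_passenger_deaths=1.0,
                 expected_bystander_deaths=0.0),
    ]
    print(utilitarian_choice(options).name)  # -> "swerve into barrier"
```

Notice that this rule is indifferent to who the casualties are, passengers or bystanders alike: exactly the property survey respondents endorse in the abstract but resist when buying a car for themselves.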

Imagine shopping for a car and having the choice: a car that would sacrifice the driver when necessary, or a "preserve the passengers at all costs" car. What would you do?


Previous posts

○ 3D PRINTING INCREASINGLY PRECISE


Thank you for reading.
Join the community: @cristianv.itali
Tags: #technology #news #science #education #new

