Tesla Model S Incident Overview


The Tesla Model S features an Autopilot mode that the company describes as semi-autonomous driving assistance intended to help reduce accidents. The Model S achieves this through numerous sensors placed around the vehicle, including a forward-facing radar, a forward-facing camera, and ultrasonic sensors. This sensor suite allows the Model S to understand its environment, detect obstacles and other vehicles at any speed, and react to a dynamic environment (Thompson, 2016).

In 2015, Tesla released a software update that allowed the Model S to operate in Autopilot mode, including automatic steering, lane changes, and parking (Truong, 2015). Less than a year later, in 2016, a Model S operating in Autopilot mode crashed into a tractor trailer, killing the driver. A witness described the driver as not paying attention to the road and watching a Harry Potter movie. Tesla CEO Elon Musk explained that the crash occurred because the vehicle’s sensors were unable to differentiate the white tractor trailer from an overhead road sign against a bright sky. As a result, the vehicle did not automatically brake and drove under the trailer (Levin & Woolf, 2016).

This incident thrust Tesla into the spotlight and brought to the surface what is at stake when autonomous technology carries human passengers. A federal investigation was conducted to determine whether Autopilot was to blame and whether a recall was necessary, and in 2017 the investigation found that Tesla’s Autopilot was not to blame for the crash (Boudette, 2017). While some in the automotive industry were critical of the release, and even the naming, of Tesla’s “Autopilot,” the company responded by comparing fatality rates. “Tesla says Autopilot has been used for more than 130 million miles, noting that, on average, a fatality occurs every 94 million miles in the US and every 60 million miles worldwide” (Golson, 2016). By that measure, the vehicles perform above the norm when it comes to safety.
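
To put the quoted figures side by side, here is a minimal sketch of the implied rates. It assumes exactly one Autopilot fatality over the quoted 130 million miles, which is how the statistic was framed at the time; the variable names are mine.

```python
# Rough comparison of the fatality-per-mile figures quoted above.
# Assumes exactly one Autopilot fatality over the quoted 130 million miles.

autopilot_miles = 130e6          # miles driven on Autopilot (Tesla's figure)
autopilot_fatalities = 1         # the single fatal crash discussed here

us_miles_per_fatality = 94e6     # quoted US average
world_miles_per_fatality = 60e6  # quoted worldwide average

autopilot_miles_per_fatality = autopilot_miles / autopilot_fatalities

print(f"Autopilot:  1 fatality per {autopilot_miles_per_fatality / 1e6:.0f}M miles")
print(f"US average: 1 fatality per {us_miles_per_fatality / 1e6:.0f}M miles")
print(f"Worldwide:  1 fatality per {world_miles_per_fatality / 1e6:.0f}M miles")
```

Of course, a single fatality is a very small sample on which to base a rate, a point raised in the comments below.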

In response to the fatal incident, Tesla expressed sadness for the loss of the driver and released a software update that causes the Model S to rely more heavily on information from its radar sensors than its camera sensors, so that such an incident is less likely to occur again (Schiefelbein, 2016). Although the 2016 crash was an anomaly, and occurred partly through the fault of the driver, the system was still unable to perform to what may have been inflated expectations that Tesla itself does not promise. Additionally, it was a highly publicized event that encouraged the public to consider the potential risks of autonomy and driverless cars, and the importance of having a human monitor the vehicle’s performance in case of an emergency. Despite the favorable fatality data, incidents like this one will likely occur again and will slow confidence in the industry. While this technology will improve and undoubtedly make roads safer, I think people will be hesitant until the technology meets or exceeds expectations and is shown to be practically faultless.
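
To give a feel for what “relying more heavily on radar than camera” could mean, here is a purely illustrative sketch. The weights, readings, and the fuse_range() helper are hypothetical and are not Tesla’s actual implementation.

```python
# Purely illustrative: the weights, readings, and fuse_range() helper are
# hypothetical and NOT Tesla's implementation. The sketch only shows what
# "relying more heavily on radar than camera" could mean in principle.

def fuse_range(radar_range_m: float, camera_range_m: float,
               radar_weight: float = 0.8) -> float:
    """Blend two range estimates for the same obstacle, trusting radar more."""
    camera_weight = 1.0 - radar_weight
    return radar_weight * radar_range_m + camera_weight * camera_range_m

# Example: radar reports an obstacle 42 m ahead, while the camera (washed out
# by a bright sky) reports 120 m. A radar-heavy blend stays near the radar value.
print(fuse_range(radar_range_m=42.0, camera_range_m=120.0))  # -> 57.6
```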

References:

Boudette, N. (2017). Tesla’s self-driving system cleared in deadly crash. The New York Times. Web. 10 June 2017.

Edmunds (2016). Tesla Model S. Retrieved from

Golson, J. (2016). Tesla driver killed in crash with Autopilot active, NHTSA investigating. The Verge. Web. 10 June 2017.

Levin, S., & Woolf, N. (2016). Tesla driver killed while using Autopilot was watching Harry Potter, witness says. The Guardian. Web. 10 June 2017.

Schiefelbein, M. (2016). Tesla to update software in wake of fatal crash. CBS News. Web. 10 June 2017.

Thompson, C. (2016). Here’s how Tesla’s Autopilot works. Business Insider. Web. 10 June 2017.

Truong, A. (2015). Tesla just transformed the Model S into a nearly driverless car. Quartz. Web. 10 June 2017.


Joshua Brown (like many other Tesla operators) was led to believe that Autopilot was perfectly safe.
In fact, NHTSA has published a claim (based on Tesla’s data) that Autopilot is 40% safer than a human driver.
It is hardly surprising that Joshua felt very confident in the safety of Autopilot.

The 40% safer claim was based on the number of airbag deployments before and after the introduction of Autosteer and Automatic Emergency Braking.
To provide even vaguely reliable data, you would need a minimum sample size of around 1,000.
Are you telling me that Tesla had 1,000 crashes resulting in airbag deployment during the study period?
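
As a rough illustration of why the sample size matters (the underlying counts aren’t given here, so the crash counts below are made up), a quick Poisson back-of-the-envelope shows how noisy a rate estimate is when it rests on only a few events:

```python
import math

# Illustrative only: the actual airbag-deployment counts behind the comparison
# aren't given here, so these counts are made up. The point is that small event
# counts leave wide uncertainty on any estimated crash rate.

def poisson_relative_uncertainty(count: int) -> float:
    """Approximate 95% relative uncertainty of a Poisson count
    (normal approximation; crude, but fine for illustration)."""
    return 1.96 / math.sqrt(count)

for observed_crashes in (25, 100, 1000):
    rel = poisson_relative_uncertainty(observed_crashes)
    print(f"{observed_crashes:5d} crashes -> roughly ±{rel:.0%} uncertainty "
          f"in the estimated crash rate")
```

With only a few dozen events the uncertainty swamps a 40% difference; only around 1,000 events does it shrink to a few percent.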
