Tesla Autopilot Kills Again?

By George Khoury, Esq. on May 20, 2019 3:01 PM

Tesla's autopilot feature may be the stuff of science-fiction fantasy, but it's starting to look more like one of those dystopian stories where the robots rise up and take control. After all, there have been more than a few autopilot-related fatalities since the feature was released.

It was recently discovered that in the fatal Tesla accident in Florida this past March, not only was the driver speeding, but the autopilot feature was engaged when the vehicle crashed into a semi-truck trailer, ripping the roof right off the car and killing the driver. And while Tesla maintains that the autopilot feature is safe so long as there is an attentive driver still behind (and holding) the wheel, this and other autopilot crashes have raised some serious questions for the eclectic electric automaker. Shockingly, in the Florida fatality, the driver reportedly had taken his hands off the wheel for only eight seconds.

Autopilots Don't Need Maps

According to one commentator, a significant issue with Tesla's autopilot is the fact that it does not rely on detailed maps. Instead, the car builds its own picture of the road in real time using a multitude of sensors. Other companies exploring autonomous driving seem to disagree with this approach, and are opting to use both sensors and detailed maps.

Basically, it has been suggested that if the data from those sensors were used in conjunction with data from detailed maps, some of these Tesla autopilot crashes could have been avoided. There have been two crashes in which Tesla vehicles auto-piloted underneath semi-truck trailers. Both potentially could have been avoided if the vehicle had been able to cross-reference a map and recognize the trailers as obstacles that shouldn't be there, rather than something like an overpass.

Autopilot at Your Own Risk

With all the new automotive technology waiting right around the corner, it's clear that people who want to use features like autopilot shouldn't put blind trust in early versions of the tech. Being an early adopter of a technology that can kill or injure you and others means being extra aware of the risks. It also means that the person who engages autopilot is still liable for any damages and injuries their vehicle causes in an accident. Tesla may face liability as well, as these recent cases are likely to spawn wrongful death lawsuits against the company.