Self-Driving Cars, Accidents and Who is at Fault

For the longest time, self-driving cars were something we could only find in science fiction novels. In recent years, however, Tesla has changed this. Tesla’s Autopilot feature was unveiled in October 2015, combining cruise control and automatic steering to keep Tesla cars within the painted lane lines. [1] While this feature is exciting, it also creates new legal questions about fault in a fatal car crash. Most recently, a California prosecutor filed two counts of vehicular manslaughter against a Tesla driver whose car, operating on Autopilot, ran a red light in 2019, causing a collision that killed two people. Officials say the Tesla was traveling at very high speed as it left the freeway and ran the red light. The National Highway Traffic Safety Administration (NHTSA) later confirmed that Autopilot was in use at the time of the accident. The driver is the first person in the United States to be charged with a felony for a fatal crash involving an automated driving system. In addition to the criminal charges, the families of the deceased have sued both Tesla and the driver, accusing the driver of negligence and Tesla of selling defective vehicles that accelerate without warning and lack adequate emergency braking.

Automated Driving Systems – A History of Failure

The criminal charges in this case are the first relating to a widely used automated driving system, but charges involving such systems have been filed before. For example, in 2020 Arizona officials filed negligent homicide charges against the backup driver in an Uber test of a fully autonomous vehicle after the vehicle struck and killed a pedestrian.

In addition, Autopilot seems to encourage driver misuse and inattention. Examples include the 2018 crash in Culver City, California, in which a Tesla hit a fire truck; officials said the Autopilot system allowed the driver to disengage from the road and do other things. In another case last May, a man was arrested for sitting in the back seat while his Tesla drove itself down the freeway. These are just a few of the 26 incidents and 11 deaths linked to Autopilot since 2016, according to NHTSA records.

Autopilot can control steering, speed, and braking, which can be useful to a driver. But the California filing serves as a warning that Autopilot is a tool, not something drivers can rely on to completely control their vehicle. An estimated 765,000 Tesla vehicles equipped with this technology are on United States roads alone.

These crashes have also placed a spotlight on Tesla, as people question the true capabilities of its self-driving software and the company’s stance on these incidents. Tesla has updated its software to improve emergency-vehicle detection and to make the system harder for drivers to abuse. The company has also made clear that the name of its self-driving feature does not mean the system can drive itself; drivers must still pay attention to the road and react as they normally would. [2]

Legal professionals have also weighed in on these charges. A University of South Carolina law professor said that the felony charges are unprecedented and that Tesla could possibly be held “criminally, civilly, or morally culpable” for putting dangerous technologies on the road. Donald Slavic, a consultant in automotive technology, held similar views and said that Tesla’s disbanding of its media relations department does not reflect well on the company. [3]

Who is to Blame for Self-Driving Accidents?

Who is to blame in these situations? The NHTSA and the National Transportation Safety Board (NTSB) have been reviewing the recent crashes and the misuse of Autopilot, along with the overconfidence and inattention it can produce. They call this “automation complacency” and stress that Autopilot is a tool: drivers remain responsible for the vehicle and must stay attentive to their surroundings at all times.

Although the case will not go to trial until mid-2023, it already tells us something about who takes the blame in Autopilot crashes. The driver is at fault because Autopilot is an assistive technology, and the driver remains responsible for paying attention to the road and stepping in if the technology begins to malfunction. However, Tesla would also bear part of the blame if its system malfunctioned; in this case, the families of the deceased claim that Autopilot malfunctioned and accelerated unexpectedly, and it is also possible that the collision avoidance system did not activate at the time of the crash. If either can be proven, then Tesla is also culpable in the crash.

Despite this, Tesla and CEO Elon Musk have often blamed the driver rather than the technology in these accidents. Many people choose to buy a Tesla because of its self-driving feature and the company’s promise that it enhances safety and convenience on the road. It would be unfair for the company to advertise the product in this way and then take no responsibility when crashes occur. This also raises questions about whether Tesla has received complaints about Autopilot malfunctions in the past. If so, the company has not been as transparent as it could be with its customers about possible safety concerns; had customers known, they might not have put blind faith in a possibly flawed technology. [4] This will certainly not be the last such case heard by United States courts, as driver-assistance features are being released by nearly every auto manufacturer in the world. And if and when self-driving trucks and semis hit the road, what would the casualties look like if those systems were to malfunction?

For more information on the future of self-driving cars, please see Felipe Hernandez’s prophetic essay on The Future of Driver-less Cars and How this will Affect the Field of Personal Injury.

[1] https://www.consumerreports.org/autonomous-driving/timeline-of-tesla-self-driving-aspirations-a9686689375

[2] https://www.8newsnow.com/news/national-news/felony-charges-are-1st-in-a-fatal-crash-involving-autopilot/

[3] https://www.npr.org/2022/01/18/1073857310/tesla-autopilot-crash-charges

[4] https://kramerlawgroup.org/tesla-autopilot-fails-blames-driver-for-crash/
