The National Highway Traffic Safety Administration (NHTSA) recently concluded an investigation into a fatal car accident. The investigation was unlike any other because it involved the autopilot system of a Tesla car. NHTSA closed the investigation with a warning that even when a car is operating with an autopilot system, drivers must still pay careful attention to the road around them.
The case raises interesting questions, especially in light of NHTSA's admonition that drivers cannot rely solely on an autopilot system when they are in a self-driving car. This is a question on which the law will evolve over the next several years as self-driving cars become more common.
Because of the complexity of these types of cases, crash victims, and especially victims of early accidents, will need to make certain they are represented by attorneys who can make the most compelling arguments possible on the issue of liability.
According to the LA Times, the crash involving the Tesla happened when a Model S in autopilot mode drove underneath a big rig truck that was making a left turn across a highway. The driver of the Tesla was killed in the incident. It was the only fatal accident involving Tesla's autopilot mode up to that point, and Tesla's CEO claims there have been substantially fewer accidents per mile driven on autopilot than per mile in cars driven by people.
The driver of the big rig involved in the accident told police he heard a Harry Potter movie playing after the crash, which suggests the driver of the Tesla may not have been paying attention to the road at all but instead may have been watching a movie. Evidence that the driver was not focused on the road prompted NHTSA's warning to drivers about the need to pay attention.
The question in cases like this, and in other cases where autopilot systems contribute to car accidents, is whether the maker of the car and its automated driving system is responsible for accidents, or whether the driver is to blame because it is ultimately the driver's obligation to operate the vehicle safely.
Scientific American reports: "When a computerized driver replaces a human one, experts say the companies behind the software and hardware sit in the legal liability chain, not the car owner or the person's insurance company. Eventually, and inevitably, the car makers will have to take the blame." However, Tesla's on-screen instructions and owner's manual try to make clear that human drivers are ultimately the ones responsible. Until case law begins to develop on this issue, or until lawmakers pass laws clearly outlining who is liable and under what circumstances, this will remain an open question.