Now you know all about autonomous vehicles, but an issue remains: who is responsible when something goes wrong?
As we continue the transition to more highly automated vehicles, liability and responsibility will become murkier concepts. After all, who is truly at fault when a highly automated vehicle is involved in a collision: (a) the driver, (b) the software company/manufacturer, or (c) some combination of the two?
At least one auto manufacturer seems to think that higher automation means liability for the manufacturer. In 2015, Volvo publicly stated that it would accept full liability whenever one of its autonomous vehicles is involved in an incident or accident while in autonomous mode.1
On the other hand, the National Highway Traffic Safety Administration ("NHTSA") seems to view the question of liability - at least for Level 3 vehicles - as still an issue of driver error. For example, on May 7, 2016, a Tesla Model S driven by Joshua Brown was involved in a motor vehicle accident, considered the first crash of a Level 3 autonomous vehicle. Mr. Brown was known for over-relying on his vehicle's autonomous system, and at the time of the accident he had engaged Autopilot and set the cruise control. When a tractor-trailer pulled out from an intersecting road, the system did not properly register or respond to the change in road conditions, and a collision occurred. The NHTSA determined, however, that because the vehicle was a Level 3 AV, the driver was still in charge and had failed to pay attention; the cause of the accident was therefore human error.
The National Transportation Safety Board ("NTSB") has stated that auto manufacturers must be proactive in curtailing driver overreliance on automation. With this in mind, the NTSB has placed at least some responsibility on auto manufacturers to implement restrictions designed to prevent human drivers from becoming overreliant on the AV system.2 In other words, automated vehicle control systems should include limitations that restrict operation to the conditions for which those systems were designed and are appropriate. For example, Tesla has improved its Autopilot by adding new limits on hands-off driving and other features that monitor the driver's attention. Liability can therefore potentially be placed on auto manufacturers for failing to design the software system to curb driver overreliance.
So, how will liability be determined: through an ordinary negligence claim, products liability, or no-fault litigation? A potential test case recently arose when Uber's self-driving test vehicle fatally struck a pedestrian while in autonomous mode. However, the accident occurred on March 18, 2018, and by March 29, 2018, Uber had reached a settlement with the deceased pedestrian's family.3
Because we are still dealing with Level 3 autonomous vehicles, driver input and attention are still required, and negligence suits against drivers are likely to remain the most common claims arising from motor vehicle accidents.
When Level 4 and Level 5 autonomous vehicles enter the marketplace, which the NHTSA projects will occur around 2025,4 driver negligence is likely to fade as the focus of motor vehicle litigation, and suits against manufacturers and sellers may become the norm.
Automation technology is here, and higher automation of motor vehicles is likely to become more commonplace. For now, the duty to exercise ordinary care remains on drivers of motor vehicles, even Level 3 automated vehicles. However, the focus of responsibility and liability in the context of motor vehicle accidents is likely to shift as even higher levels of autonomous vehicles are introduced to the driving public. Be warned and be prepared. A change is coming.
2. William B. Pentecost, Jr., Autonomous Vehicles: Is the Florida Tesla Crash a Harbinger of an Onslaught of Design Defect Claims, or Is It Simply Another Case of Driver Negligence?, The Transportation Lawyer, NEED YEAR FROM MHB, at 16.