Current events focus a spotlight on potential criminal liability for operation of an automated vehicle. In Arizona, the safety driver in an Uber robotaxi pled guilty to an undesignated felony in response to a charge of negligent homicide for a fatality that occurred while an automated driving system (ADS) was engaged. Shortly before that, the owner of a Tesla pled no contest to a charge of negligent homicide for fatalities caused while Tesla Autopilot, which automates vehicle control under driver supervision, was engaged.
In both cases, automation controlled the braking, speed, and steering of the vehicle at the time of the accident. Prosecutors in both cases pursued criminal charges against the human operator on the theory that, despite use of an automation system, each driver retained ultimate responsibility for the safe operation of the vehicle. Assignment of responsibility to the human operator in these cases is consistent with the limited existing case law. However, the decision to prosecute discounted the very real problem of automation complacency as a defense, though it may have been a mitigating factor in sentences that did not include jail time.
The SAE J3016 terminology standard bases the automation level on the manufacturer’s design intent. Tesla states that the automation involved in the California fatalities was a Level 2 feature. Level 2 requires that the human driver remain vigilant at all times, ready to assume immediate control of the vehicle to avoid an accident or dangerous situation. The Tesla owner’s manual similarly required constant driver attention. Uber stated that its robotaxi operated at SAE Level 4, which does not require any human intervention in a series production vehicle. However, during the testing that resulted in the fatality, Uber assigned a safety driver the responsibility to intervene to prevent an accident, just as in the case of the Level 2 Tesla.
Enter the Level 3 Mercedes-Benz, set for deployment in Nevada and California. A series production Level 3 driving feature does not require that a human driver remain vigilant at all times. Rather, it contemplates that the human driver can focus on other tasks while the automated driving system is engaged (e.g., reading a book or watching a video). However, as part of the vehicle’s safety concept, the human driver must remain able to respond to a system request to assume control of the vehicle. UN ECE Regulation No. 157 for Automated Lane Keeping Systems (ALKS), used as a basis for permission to operate in Europe, specifies a 10-second grace period after which the human driver is expected to assume control.
Law reform should clarify several liability points. First, the manufacturer’s stated design intent for a vehicle feature does not necessarily control the legal determination of criminal liability. A court currently is free to decide that the operator of a Level 3 vehicle bears liability for automation failures while the automation is engaged, just as in the Arizona and California cases. Mercedes does not dictate the parameters of criminal law via a paragraph in an owner’s manual or a press release. The law might hold the operator responsible while any type of automation is engaged. Manufacturers should be eager to get clarity from state legislatures because certainty provides assurance to their customers. Selling a Level 3 product for which a human operator has potential criminal responsibility at all times should present a significant marketing problem.
Second, there is the thorny question of liability following a takeover request. Does the operator of a Level 3 vehicle potentially have liability for any accident immediately following a takeover request, or only for accidents occurring after a grace period such as the 10 seconds specified for ALKS (a European standard that has not been adopted anywhere in the United States)? Will there be real-world scenarios in which a reasonable driver could not have assumed control within the 10-second grace period? This might occur if the AV placed the driver in an untenable or unrecoverable position at the time of a takeover request.
Third, there is the question of potential liability for manufacturers. In cases in which the operator of a Level 3 vehicle does not have potential criminal liability, does the manufacturer have potential liability instead? This might arise if, for example, an automated vehicle exceeded the speed limit by an amount sufficient to constitute a felony under state law and did not issue a takeover request, or ran a red light and caused a fatal crash.
Primarily with respect to civil tort liability, the automated vehicle industry repeatedly asserts that existing law and legal frameworks are sufficient to address liability questions. The above criminal law concerns show that this is not the case. Automated vehicle technology is sufficiently novel that society needs new legal approaches to account for differences between the old and the new ways of driving.
William H. Widen is a Professor at University of Miami School of Law, Coral Gables, Florida, researching the regulatory implications of autonomous vehicles. Philip Koopman is an Associate Professor at Carnegie Mellon University, Pittsburgh, Pennsylvania, specializing in autonomous vehicle safety.
Suggested citation: William H. Widen and Philip Koopman, Level 3 Automated Vehicles and Criminal Law, JURIST – Academic Commentary, August 8, 2023, https://www.jurist.org/commentary/2023/08/widen-koopman-automated-vehicles-criminal-law/.
This article was prepared for publication by Hayley Behal, JURIST Commentary Managing Editor. Please direct any questions or comments to her at commentary@jurist.org.