Elon Take the Wheel
By: Stephen Maier, Volume 101 Staff Member

In May 2016, 40-year-old Joshua Brown was driving a Tesla Model S in “Autopilot mode” when a semi turned in front of him.[1] The self-driving computer did not recognize the truck against the horizon, and the car drove straight into it—killing Mr. Brown.[2] Earlier that year, in Handan, China, 23-year-old Gao Yaning was killed when Tesla’s Autopilot failed to notice a street sweeper driving in front of it and, consequently, did not apply the brakes.[3] Tesla CEO Elon Musk has been quick to deflect blame from the company, claiming that the Autopilot mode is designed to be used with “hands on the steering wheel at all times,” and that the driver was equally at fault.[4] There are many assurances that self-driving cars will prove safer than current, manually driven cars.[5] Regardless of the gains in safety, there will be times, like these, when the technology fails and an accident occurs. When that happens, the legal implications are unclear. Volvo has come forward to say that it will accept full liability for crashes caused by its self-driving cars;[6] other companies, however, have offered no such assurances.[7] Volvo’s pledge only covers crashes, however—what if, as the FBI fears, self-driving cars allow criminals to “conduct tasks that require the use of both hands,” like firing at pursuing vehicles?[8] Is Volvo liable for felony-murder if someone, using Volvo’s self-driving cars, commits a drive-by? Clearly not; that takes the issue to the extreme. But as self-driving cars continue to proliferate,[9] the law must be prepared to respond when questions of fault or negligence arise. While the U.S. Department of Transportation and the National Highway Traffic Safety Administration have released a “Federal Automated Vehicles Policy,”[10] they have not clarified the federal government’s position on who is liable when an automated vehicle crashes. This Post explains three of the challenges facing self-driving cars and offers potential solutions to those problems.

Self-driving cars are safer than those driven by humans, but only if properly maintained. Despite indications that routine maintenance and following service bulletins save money and extend a car’s life,[11] there is evidence that many Americans delay or avoid bringing their cars in for service, even when they will not be responsible for the costs.[12] Further, there is a subset of car enthusiasts who find joy in modifying or altering their cars,[13] and tuning companies that make their trade in the modification of production cars. In cases such as these, the liability calculus should shift based on the modifications made. Alterations that might impair the competency of the self-driving system should reduce the company’s liability. Just as there are laws regulating safety features in cars,[14] states must develop laws preventing the modification of the elements comprising the automated system.[15] Further, even an unmodified car that is not maintained can pose a hazard. A misaligned sensor or improperly maintained autonomous system will negate the safety gains from self-driving technology; recognizing that, states must be willing to penalize owners of self-driving cars who fail to properly service their cars.

Outside the realm of alterations, there is a risk that laws related to automated vehicles will create perverse incentives for drivers to be distracted behind the wheel. It might be tempting to assign more liability to a driver who is focused on the road but using automated controls. After all, if such drivers are aware of a potential crash but take no action to take the wheel, shouldn’t they be responsible for their inaction?[16] This rationale, however, encourages drivers to be willfully ignorant of potential road hazards. On the other hand, requiring drivers to stay focused on the roads, rather than being able to ignore “driving,” takes away what many perceive to be the main benefit of self-driving cars.[17] In cases when an able-bodied person is controlling a self-driving car, liability should be the same regardless of whether the “driver” is watching for risks. Companies producing self-driving cars must be willing to stand by their products: they cannot, as Tesla has done, deny responsibility for a crash simply because the driver also failed to prevent it. Even attentive drivers will have slowed reaction times if they believe that their car is going to keep them safe; that slowed reaction might lead to otherwise preventable accidents. The onus (and, consequently, the monetary burden) should be on the company, not the driver, when these situations arise.

When there has been a crash, it has been suggested that product liability laws should apply.[18] This framework is helpful, but it has some crucial gaps. For example, how would a court establish that a particular car is safe enough? There is currently no standard for how safe a car must be.[19] Placing too high a safety burden on cars—for example, requiring that they “perform at least as well as a perfect human driver for every individual driving maneuver”—might have a chilling effect on the development of the technology.[20] Further, in situations where the car works as it should but causes harm (e.g., crashing to prevent a greater harm),[21] where does liability lie? Punishing the company for developing a product that works as it should will also have a chilling effect. In both circumstances, however, the chilling effect is preferable to the alternative: having cars that will, occasionally, make errors, and leaving aggrieved families and customers without recourse. This is a developing technology that provides a luxury service. It is imperative that courts protect individuals, even if that means the technology’s development is slowed.

The rise of self-driving cars has been an exciting development for some: people have jumped at the chance to “test” Tesla’s Autopilot mode,[22] and independent programmers have released an open-source program to “create” a self-driving car.[23] However, self-driving cars have serious ramifications for safety, and they should be regulated accordingly. This regulation is best achieved by forcing carmakers that implement autonomous technology to accept liability for crashes caused by their systems and by enforcing heightened safety standards against such cars.

  1. Danielle Muoio, Here’s the Latest on the Investigation into Tesla’s First Fatal Autopilot Crash, Business Insider (Aug. 11, 2016).
  2. Jordan Golson, Tesla Driver Killed in Crash with Autopilot Active, NHTSA Investigates, The Verge (June 30, 2016).
  3. Neal E. Boudette, Autopilot Cited in Death of Chinese Tesla Driver, N.Y. Times (Sept. 14, 2016).
  4. A Tragic Loss, Tesla Blog (June 30, 2016).
  5. See Myra Blanco et al., Automated Vehicle Crash Rate Comparison Using Naturalistic Data (Jan. 8, 2016); Jemima Kiss, Self-Driving Cars: Safe, Reliable—But a Challenging Sell for Google, The Guardian (Oct. 6, 2015). But see Patrick Lin, The Ethics of Saving Lives with Autonomous Cars is Far Murkier Than You Think, Wired (July 30, 2016) (describing the difficulty in assessing the gains in safety).
  6. Press Release, Volvo, US Urged to Establish Nationwide Federal Guidelines for Autonomous Driving (Oct. 7, 2015).
  7. See Danielle Muoio, Elon Musk: Tesla Not Liable for Driverless Car Crashes Unless It’s Design Related, Business Insider (Oct. 19, 2016) (describing Elon Musk’s belief that Tesla is not responsible for crashes).
  8. Mark Harris, FBI Warns Driverless Cars Could Be Used as “Lethal Weapons”, The Guardian (July 16, 2014).
  9. See, e.g., Dave Lee, Ford’s Self-Driving Car “Coming in 2021”, BBC News (Aug. 17, 2016) (discussing Ford’s plans to have fully autonomous cars available by 2021); Mark Snider, At CES 2017, the Frenzy over Self-Driving Cars Is Palpable, USA Today (Jan. 5, 2017) (describing the self-driving technology on display at the Consumer Electronics Show 2017); Stephen Dobie, CES 2017: NASA’s Helping Nissan Make Driverless Cars, Top Gear (Jan. 6, 2017) (describing Nissan’s plans for autonomous cars).
  10. U.S. Dep’t of Transp. & NHTSA, Federal Automated Vehicles Policy: Accelerating the Next Revolution in Roadway Safety (Sept. 2016).
  11. See, e.g., Russ Heeps, 5 Car Maintenance Moves Consumers Put Off, ABC News (Jan. 1, 2014); Tracey Coenen, Trying to Save Money by Not Maintaining Your Car? It Will Cost You More in the Long Run, AOL (Feb. 5, 2008).
  12. Press Release, Erin Stepp, Director, External Association Communication, AAA, Roadside Breakdowns Preventable with Proper Maintenance, Finds AAA (Oct. 8, 2015) (“a recent AAA survey found that 35% of Americans have skipped or delayed service or repairs”); Jeff Green, Drivers Ignoring Recall Notices Pose Hurdle for GM’s CEO, Bloomberg (Apr. 14, 2014) (“about a third of all recalled cars and trucks don’t get repaired, and about one out of every seven vehicles . . . still on the road have an unrepaired defect”).
  13. Max Power: How the Modified Car Scene Lives On, The Telegraph (May 19, 2016) (describing the “broad church” of modifying cars).
  14. See, e.g., Minn. Stat. § 169.96 (2016) (regulating brakes); Minn. Stat. § 169.71 (2016) (regulating windshields); Minn. Stat. § 169.723 (describing “tires considered unsafe”).
  15. These elements include things like cameras, radar or Lidar sensors, and a central computer. Other systems that might be implicated are ride height, steering, and braking systems. James Armstrong, How Do Driverless Cars Work?, The Telegraph (July 1, 2016).
  16. For an example of this reasoning, see Jeffrey K. Gurney, Sue My Car Not Me: Products Liability and Accidents Involving Autonomous Vehicles, 2013 U. Ill. J.L. Tech. & Pol’y 247, 267–68.
  17. Chris Woodyard, McKinsey Study: Self-Driving Cars Yield Big Benefits, USA Today (Mar. 4, 2015).
  18. See Claire Cain Miller, When Driverless Cars Break the Law, The Upshot, N.Y. Times (May 13, 2014) (expressing a view that civil liability will be found in manufacturers, using product liability laws as a basis); Damien A. Riehl, Car Minus Driver: Autonomous Vehicle Regulation, Liability, and Policy, Part II, Bench & B. Minn. (Nov. 4, 2016).
  19. For a discussion on “how safe is safe enough,” see Bryant Walker Smith, The Reasonable Self-Driving Car, Volokh Conspiracy (Oct. 3, 2013).
  20. Id.
  21. This decisionmaking process has been suggested as a chief concern over self-driving cars. See generally Jean-François Bonnefon, Azim Shariff, & Iyad Rahwan, The Social Dilemma of Autonomous Vehicles, 352 Science 1573 (2016) (describing a survey related to public opinion over that question).
  22. Jack Stewart, Tesla’s Cars Have Driven 140M Miles on Autopilot. Here’s How, Wired (Aug. 17, 2016).
  23. Andrew Silver, Who’s Liable for George Hotz’s Driving Software?, IEEE Spectrum (Dec. 14, 2016).