By: Nicole Bruner
In recent years, companies such as Tesla have engineered autonomous features for vehicles. There has been a gradual progression from adding individual autopilot features to enabling a vehicle to operate without human control. In the drive toward autonomous control, automakers have been forming partnerships that stop short of mergers and acquisitions. For example, major competitors in the auto industry are collaborating to share costs and expertise in bringing autopilot vehicles to market, thereby accommodating consumer interests and trends. Seemingly controversial pairings have formed, such as Volkswagen agreeing to invest $2.6 billion in Ford Motor’s autonomous vehicle startup, Argo AI.
While the auto industry has been making headway in the race for autonomy, several deadly incidents have occurred whose root causes remain unclear. Autonomous vehicles open the door to disputes over culpability, and several disputes have arisen in which no party is clearly to blame. When a vehicle operating under autopilot is involved in a crash, blame can be directed in many ways. Tesla, facing a variety of lawsuits, argues that drivers are instructed to remain able to take over the vehicle at any point while using autopilot features. Conversely, drivers argue that the autopilot features cause accidents for which Tesla is liable. From a product liability standpoint, fault in cases preceding the age of autonomy was directed largely toward drivers, sellers, or manufacturers, depending on the facts of the case. However, as autopilot vehicles have entered the market, more potential defendants may be drawn into legal proceedings. In analyzing product liability for autopilot vehicles that rely on software updates, the parties to a dispute could expand at least to the companies providing funds and expertise to the autonomous venture and to the engineers of the software behind the autopilot features. In a world where technology is programmed to perform in place of humans, entities providing such features arguably take on more legal risk while making innovative gains.
Some states have addressed the age of autonomy with legal regulations. For example, Tennessee allows drivers to view a visual display, since research shows drivers are better able to re-engage a vehicle when instructed to take over for the autopilot features. Further, Michigan allows autonomous vehicles to be driven on public roads and eases testing restrictions for manufacturers, thus allowing commercial use of the technology. However, state regulations remain uneven, since decisions about regulating self-driving vehicles are notably left to the states. For example, Michigan has passed legislation assigning limited liability to manufacturers when an autonomous vehicle’s systems cause a collision, but California has not yet adopted a clear standard for fault. Following a fatal crash in California, the state will need to decide whether to follow Michigan’s lead in placing liability on the manufacturer, or to find that the driver should have been able to mitigate the crash.
In sum, installing autonomous features in consumer vehicles has driven corporations such as Tesla to dominate sales, but questions of liability remain when the technology in these vehicles fails. Government agencies, namely the National Transportation Safety Board, are at odds with corporations in assigning liability and addressing defective products. Additionally, current government standards are not evolving at the speed at which businesses are introducing products to the market, leaving businesses exposed to product liability concerns. Corporations must find ways to balance the need for innovation with safety concerns for the consuming public, and regulations must adapt to keep pace with company trends in innovation.
Zachary Mider, Tesla’s Autopilot Could Save the Lives of Millions, But It Will Kill Some People First (Oct. 9, 2019), https://www.bloomberg.com/news/features/2019-10-09/tesla-s-autopilot-could-save-the-lives-of-millions-but-it-will-kill-some-people-first.
Johana Bhuiyan, The complete timeline to self-driving cars (May 16, 2016), https://www.vox.com/2016/5/16/11635628/self-driving-autonomous-cars-timeline (discussing a “revolutionary path” to self-driving cars).
Michael Wayland, Automakers investing billions in partnerships as industry races toward autonomous and electric vehicles (Dec. 7, 2019), https://www.cnbc.com/2019/12/07/gm-lg-venture-adds-to-multibillion-dollar-partnerships-on-evs-avs.html.
See id. (stating that manufacturers are collaborating to share funds and expertise as part of a global alliance).
See id. (highlighting the pursuit of autonomous vehicles); see also Neal E. Boudette and Jack Ewing, Ford and VW Agree to Share Costs of Self-Driving and Electric Cars (July 12, 2019), https://www.nytimes.com/2019/07/12/business/ford-vw-self-driving-electric-cars.html (stating that both Ford and VW are “alpha wolves” and questioning whether the companies could make necessary compromises for the alliance).
Jay Ramey, Tesla Suspects Brake Issue in Model S Crash (Aug. 1, 2016, 8:00 AM), https://autoweek.com/article/technology/tesla-suspects-brakes-fatal-model-s-crash; Jay Ramey, Lawsuit Labels Tesla Autopilot as ‘Dangerously Defective’ (April 21, 2017), https://autoweek.com/article/autonomous-cars/tesla-autopilot-dangerously-defective-lawsuit-claims; U.S. agency to determine cause of 2018 fatal Tesla ‘Autopilot’ crash, yourNEWS (Jan. 14, 2020), https://yournews.com/2020/01/14/1399557/u-s-agency-to-determine-cause-of-2018-fatal-tesla-autopilot/.
Ryan Felton, Who’s at Fault When Crashes Happen with Tesla’s Autopilot? (June 7, 2018), https://jalopnik.com/who-is-really-responsible-for-tesla-autopilot-crashes-1826638132 (contending that Tesla faults drivers for crashes and may need to consider problems within the company).
See id. (stating that Tesla encourages driver attentiveness).
See id. (finding that Tesla’s autopilot features may encourage drivers to remain inattentive).
Maggiano, DiGirolamo & Lizzi P.C., Product Liability Claims Involving Defective Cars (2020), https://www.maggianolaw.com/product-liability-claims-involving-defective-cars/.
New Mexico Legal News, Who is Responsible for Autopilot Collisions (Sept. 15, 2019), https://fergusonlaw.com/who-is-responsible-for-autopilot-collisions/ (finding that companies behind the hardware and software face liabilities in cases involving self-driving vehicles).
See Paresh Dave, Tesla’s ‘autopilot mode’ puts it at risk for liability in crashes (July 6, 2016), https://www.latimes.com/business/technology/la-fi-tn-tesla-liabilty-20160705-snap-story.html (finding that companies increase their exposure to liability as they add autopilot features).
Ben Husch and Anne Teigen, Regulating Autonomous Vehicles (April 2017), https://www.ncsl.org/research/transportation/regulating-autonomous-vehicles.aspx (explaining that the government may offer funding for more research in autopilot vehicles).
Auto Insurance, Which States Allow Self-Driving Cars (2019), https://www.autoinsurance.org/which-states-allow-automated-vehicles-to-drive-on-the-road/ (contending that the United States does not appear united in self-driving decisions).
Riya Bhattacharjee, Tesla Sued by Family of Apple Engineer Who Died in Model X Autopilot Crash in Silicon Valley (April 30, 2019), https://www.nbcbayarea.com/news/local/tesla-sued-by-family-of-apple-engineer-who-died-in-tesla-model-x-crash-on-hwy-101-in-mountain-view/189906/ (showing Tesla directing potential fault to the driver with statements that the driver was warned to take control of the vehicle seconds before the crash).
Kirsten Korosec, Tesla Sued in Wrongful Death Lawsuit That Alleges Autopilot Caused Crash (2019), https://techcrunch.com/2019/05/01/tesla-sued-in-wrongful-death-lawsuit-that-alleges-autopilot-caused-crash/.
See, e.g., 39 No. 15 Westlaw Journal Automotive 05 (addressing the balance between innovation and the law in auto industries).