By Georgia D. Koutouzos, J.D.
First-of-its-kind action involving a pedestrian death charges that the automaker knew its vehicle's Autopilot system was defective and failed to warn consumers of those defects.
The family of a Japanese man who was struck and killed by a self-driving Tesla Model X on a highway in Tokyo is suing the automaker in California federal court, contending that the company's Autopilot technology and suite of driver assistance features are defective in their design and manufacture and are dangerous to consumers and others because the Model X is prone to the risk of driver overreliance on that technology. Demanding a jury trial, the complaint alleges strict products liability and negligence claims and seeks punitive and exemplary damages (Umeda v. Tesla, Inc., filed April 28, 2020, van Keulen, S.).
A 44-year-old husband and father was killed while standing with others near two stopped vehicles at an accident site on the side of an expressway in Tokyo, Japan, when a Tesla Model X, operating under Autopilot while its driver dozed, suddenly accelerated after the car in front of it cut into another lane. The decedent's spouse and daughter filed suit against Tesla and others, alleging claims for strict products liability—design defects, strict products liability—failure to warn, negligence, wrongful death, survival, and loss of consortium.
According to the decedent's representatives, the Tesla Model X and Tesla's Autopilot technology and suite of driver assistance features are defective in their design, construction, assembly, and manufacture and are dangerous to users and occupants because the Model X is prone to the risk of driver overreliance on the manufacturer's Autopilot technology and its suite of driver assistance features. For one thing, Tesla's method of measuring steering wheel torque to determine driver engagement is a fatal flaw in the company's Autopilot Driver Monitoring System, the complaint states, contending that the Autopilot system and suite of technology erroneously determined that the driver of the at-issue Model X was engaged and otherwise alert because he had his hands on the steering wheel. The manufacturer's steering-wheel torque method of measuring driver awareness failed to detect that the driver had become drowsy and was dozing off in the moments just before the crash.
Tesla has long known that driver inattentiveness and drowsiness are risks tied to overreliance on autonomous driving technologies such as the company's Autopilot system, the complaint maintains, adding that the Autopilot technology and driver assistance features are defective and incapable of handling common driving scenarios such as the "cut-out" situation preceding the accident, in which the lead car in front of the Tesla switched into another lane. The Model X was unable to recognize the objects and people in its path, and its Autopilot and Traffic Aware Cruise Control (TACC) system automatically accelerated in order to reach the cruising speed that the driver already had set.
While Tesla's Autopilot system is not the only automated driver assistance technology that has difficulty with common driving scenarios such as the "cut-out" or "cut-in" situation, Tesla failed to implement reasonable safeguards in its Autopilot software governing how a Tesla vehicle should react in certain driving circumstances. More specifically, the company knew or reasonably should have known that certain scenarios were more difficult to handle and, in a "cut-out" situation such as the one at issue, should have implemented code in its software requiring the driver to confirm that there are no obstacles ahead and to affirmatively confirm this fact by engaging the accelerator pedal. Because of the automaker's failure to introduce safe and effective automated vehicle driving technology, the current iterations of its Autopilot system and suite of technologies continue to pose a significant threat to Tesla drivers and those who share the road with Tesla vehicles, the complaint asserts.
Furthermore, the complaint maintains that Tesla's forward-facing cameras and sensors are defective and that the company has failed to adopt more effective means of detecting physical objects, such as Light Detection and Ranging (LIDAR) technology. Before the sale of its vehicles, including the Model X, Tesla knew of such defects, which would not be recognized by the ordinary user, yet the company failed to provide adequate warnings of those dangers.
Strict liability—design defects. The complaint argues that the risk of danger in the design of Tesla's Model X vehicle outweighs any benefits of the design—especially when safer alternative designs were available at the time of manufacture. Such reasonably safer alternative designs include, but are not limited to: (1) driver-facing cameras that would monitor the driver's eyes and/or head position as a way to determine driver engagement and awareness; (2) LIDAR, or any other reasonable alternative system that may or may not include the use of radar technology, for the detection of obstacles and surroundings of a Tesla vehicle; and (3) recoding of Tesla's proprietary software for its Autopilot technology and suite of driver assistance features—specifically, the Traffic Aware Cruise Control feature—to require that drivers take affirmative steps to confirm acceleration of a Tesla vehicle when the vehicle encounters a "cut-out" driving scenario.
Strict liability—failure to warn. The complaint further asserts that Tesla knew that consumers would drive their vehicles in the same manner as the driver of the at-issue Tesla Model X did at the time of the accident and that an ordinary consumer would not have recognized the potential risks and dangers inherent in the operation and use of a Tesla vehicle with Autopilot engaged. Nevertheless, the company failed to warn of the dangers in the reasonably foreseeable use of its vehicles, the complaint maintains.
Negligence. Building upon its previous allegations, the complaint asserts that the automaker breached its duty to the decedent and his representatives and acted unreasonably in designing, manufacturing, marketing, and releasing products, including the Tesla Model X and Tesla's Autopilot technology and suite of driver assistance features, which it knew would present a substantial and unreasonable risk of injury to vehicle occupants, as well as to other drivers and pedestrians near the vehicle during its operation.
Relief sought. Demanding a jury trial, the complaint requests judgment against Tesla for all damages recoverable under the laws of California and the United States—including California's Wrongful Death Act—and seeks general damages, special damages, punitive and exemplary damages, attorneys' fees and costs, and any further relief deemed appropriate, according to proof at trial.
The case is No. 5:20-cv-2926.
Attorneys: Edward C. Chen (Law Offices of Edward C. Chen) for Tomomi Umeda and Miyu Umeda.
Companies: Tesla, Inc.
Product Liability Law Daily: Breaking legal news at your fingertips