Tesla ordered to pay $243M for fatal Autopilot crash
In a landmark decision, a federal jury in Florida has found Tesla partly liable for a fatal 2019 crash involving its Autopilot driver-assist software, ordering the company to pay approximately $243 million in damages. The verdict marks a rare loss for Tesla in a civil trial over its advanced driver-assistance systems and could set a significant precedent for future lawsuits.
The incident, which occurred on April 25, 2019, in Key Largo, Florida, involved a Tesla Model S driven by George McGee. With Autopilot engaged, McGee’s vehicle reportedly plowed through a stop sign and caution light at a T-intersection, striking 22-year-old Naibel Benavides Leon and her boyfriend, Dillon Angulo, who were standing outside a parked SUV on the road’s shoulder. Benavides Leon was killed, and Angulo suffered grave injuries.
At trial, McGee admitted that he had looked down to retrieve his dropped cell phone moments before the collision. Tesla maintained that McGee's distracted driving was solely responsible for the crash, but the jury, after less than a day of deliberation, determined that Autopilot also played a significant role, apportioning 33% of the fault to Tesla and 67% to McGee.
The jury awarded $129 million in compensatory damages: $35 million to Benavides Leon's mother, $24 million to her father, and $70 million to Angulo. Under the jury's apportionment of fault, Tesla is responsible for 33% of the compensatory award, roughly $43 million, with McGee responsible for the remaining two-thirds. Combined with $200 million in punitive damages assessed against Tesla alone, the company's total liability comes to approximately $243 million out of a $329 million verdict. The plaintiffs had initially sought $345 million in damages.
The case is particularly notable as the first federal wrongful death lawsuit over Tesla's driver-assistance software to reach trial; earlier, similar cases were typically dismissed or settled out of court, avoiding public scrutiny. Lawyers for the plaintiffs argued that Tesla deceptively marketed Autopilot as more capable than it was, leading drivers like McGee to become overly reliant and complacent. Evidence presented at trial suggested that the vehicle's system had detected the stopped SUV and at least one pedestrian before the crash but failed to respond appropriately, and that Autopilot's inability to override driver input while the accelerator was pressed was a central issue.
Tesla, in its post-verdict statement, asserted that “No car in 2019 — and none today — could have prevented this crash,” calling the verdict “wrong” and warning it could “jeopardize Tesla’s and the industry’s efforts to develop life-saving technology.” The company has stated its intent to appeal the decision.
The verdict comes at a critical time for Tesla, as CEO Elon Musk continues to push for wider adoption of more advanced autonomous driving features, including plans for a driverless taxi service. The National Highway Traffic Safety Administration (NHTSA) has been conducting ongoing investigations into Tesla’s Autopilot and Full Self-Driving (FSD) systems, examining incidents where the systems may have failed to detect and disengage in specific situations, particularly in conditions with limited visibility. In late 2023, Tesla issued a recall of over 2 million vehicles to address “insufficient” safeguards against misuse of Autopilot, following a two-year NHTSA investigation into the driver monitoring system. Despite this, NHTSA has continued to investigate the effectiveness of the recall and new crashes.
Legal experts suggest the verdict could "open the floodgates" for other costly lawsuits against Tesla, emboldening more plaintiffs to pursue claims related to the company's driver-assistance technology. While juries have historically sided with Tesla in Autopilot-related cases, attributing accidents to human error, this outcome signals heightened judicial scrutiny and a potential precedent for holding manufacturers accountable for the capabilities and limitations of their autonomous systems.