Tesla to Face Jury Trial over Autopilot Defects Following 70-Page Summary Judgment Opinion

Tesla’s “Autopilot” has been implicated in over a dozen deaths in the U.S. alone, yet the company has never faced a significant finding of liability in a litigated case. That may change soon, as trial is set to begin in federal court today following a blockbuster summary judgment opinion issued only a few weeks ago.

Benavides v. Tesla involves a crash that occurred on a two-lane county road in Key Largo, Florida in 2019. George McGee was driving his Tesla Model S from his office in Boca Raton to his home, a distance of around 100 miles, when he ran through a stop sign at a T-intersection and collided, at around 60 miles per hour, with a Chevy Tahoe parked on the far side of the road. Naibel Benavides, a 22-year-old college student, was standing next to the Tahoe and was killed. Her friend Dillon Angulo—the two were on a date—was severely injured and is also a plaintiff in the case.

The Benavides crash implicates many of the same issues raised by other fatal crashes involving Autopilot. The system, despite its name, is a “driver assistance system” that requires constant oversight by an attentive driver, far short of what most people think of when they imagine an autonomous vehicle. Nor is it capable of functioning in every environment; the instructions explicitly warn drivers not to use it on anything less than a divided, limited-access highway, one without stop signs or crossing traffic.

Because of these limitations, every fatal Autopilot crash has involved a distracted driver. In the Huang case, for example, the driver was killed when his car collided with a concrete barrier on the highway while he played a game on his phone (that case was settled for an undisclosed sum on the eve of trial). The Benavides crash is no different: McGee, the driver, testified in his deposition that he was on the phone with American Airlines trying to book a flight across the country when he dropped his phone and bent down to the floor to pick it up. It was at that moment that he sped through the stop sign and into the parked Chevy. (Benavides’s estate filed suit against McGee as well; that suit was settled for an undisclosed sum.) McGee also used Autopilot on an inappropriate road, manually accelerated to a speed of 62 miles per hour in an area where the speed limit was 45, and repeatedly triggered Autopilot’s warning system for driver inattention.

Unsurprisingly given the facts outlined above, Tesla’s strategy in these cases has been to cast blame on the driver. At times this has been successful. The first trial arising from a fatal crash linked to Autopilot featured a plaintiff-driver who had been drinking, and the jury had no trouble concluding that Tesla bore no blame for the accident. In Benavides, for the first time, the victim is a third party. Still, Tesla argued, it was the driver who was to blame for the crash, not Autopilot.

This theory was largely rejected in a 70-page opinion denying Tesla’s motion for summary judgment as to plaintiffs’ most important theories of defect.[1] A reasonable jury could conclude, the court held, that Autopilot was defectively designed in two ways. First, Autopilot could be activated on roads it was not designed to handle, like the two-lane surface street on which the crash occurred. This implicates the concept of “operational design domain,” the context in which an autonomous system is supposed to be able to function, and has been a theme of Autopilot fatalities going back to the death of Joshua Brown in 2016.

Second, Autopilot arguably does not do enough to ensure that drivers are in fact paying attention. For years Tesla relied on steering wheel torque to detect hands on the wheel, itself a proxy for an attentive driver, and Tesla owners swapped favorite ways to trick the system, like wedging an orange or a water bottle into the steering wheel. McGee’s car would give audible warnings if he failed to pay attention, and if these were not heeded he would receive a “strikeout,” which disabled Autopilot, but only until the car was placed in park and then back into drive.[2] McGee did this often; he received 23 strikeouts in the three months he owned the car, including one on the very drive during which the collision occurred. One of plaintiffs’ experts opined that a longer waiting period following a strikeout would have prevented the crash, an idea that Tesla attacked as too speculative but that the court saw no need to keep from the jury.

Perhaps most damningly for Tesla, the court also rejected its attempt to strike plaintiffs’ claim for punitive damages. With this claim included in the trial, a host of evidence illustrating Tesla’s callous attitude toward the safety of Autopilot will presumably reach the jury, including Elon Musk’s statements that Tesla’s cars can read road signs and detect inanimate objects with “superhuman sensors,” that Autopilot is safer than a human driver, and that soon “having a human intervene will decrease safety.”[3] The court’s opinion also mentions Tesla’s notorious “Paint It Black” commercial from 2016, in which a car is shown driving with Autopilot engaged and the viewer is told that “the person in the driver’s seat is only there for legal reasons. He’s not doing anything. The car is driving itself.” (The footage was later revealed to have been staged.)

The ultimate question here is an old one in products liability law: how far must companies go to make their products foolproof? With cutting-edge systems marketed as ushering in an “autonomous” future, it is possible that a jury might be more willing to direct blame at a product’s designers than the flawed users who interact with it. Presumably the verdict in Benavides will give some insight into the answer.


[1] Benavides v. Tesla, Inc., No. 21-cv-21940, 2025 WL 1768469 (S.D. Fla. June 26, 2025).

[2] Id. at *5.

[3] Id. at *8-*9. These statements prompted calls urging then-FTC Chairman Joseph Simons to investigate Tesla’s “deceptive and unfair practices,” specifically statements implying that “Autopilot is an autonomous vehicle capable of ‘self-driving.’” Id. at *9.
