Tesla to Face Jury Trial over Autopilot Defects Following 70-Page Summary Judgment Opinion

Tesla’s “Autopilot” has been implicated in over a dozen deaths in the U.S. alone, and yet the company has never faced a significant finding of liability in a litigated case. That may change soon, as trial is set to begin in federal court today following a blockbuster summary judgment opinion issued only a few weeks ago.

Benavides v. Tesla involves a crash that occurred on a two-lane county road in Key Largo, Florida in 2019. George McGee was driving his Tesla Model S from his office in Boca Raton to his home, a distance of around 100 miles, when he ran through a stop sign at a T-intersection and collided, at around 60 miles per hour, with a Chevy Tahoe parked on the far side of the road. Naibel Benavides, a 22-year-old college student, was standing next to the Tahoe and was killed. Her friend Dillon Angulo—the two were on a date—was severely injured and is also a plaintiff in the case.

The Benavides crash implicates many of the same issues raised by other fatal crashes involving Autopilot. The system, despite its name, is a “driver assistance system” that requires constant oversight by an attentive driver, far short of what most people think of when they imagine an autonomous vehicle. Nor is it designed to function in every environment; the instructions explicitly warn drivers not to use it on anything less than a divided, limited-access highway, one without stop signs or crossing traffic.

Because of these limitations, every fatal Autopilot crash has involved a distracted driver. In the Huang case, for example, the plaintiff was killed when his car collided with a concrete barrier on the highway while he played a game on his phone (that case was settled for an undisclosed sum on the eve of trial). The Benavides crash is no different: McGee, the driver, testified in his deposition that he was on the phone with American Airlines trying to book a flight across the country when he dropped his phone and bent down to the floor to pick it up. It was at that moment that he sped through the stop sign and into the parked Chevy. (Benavides’s estate filed suit against McGee as well; that suit was settled for an undisclosed sum.) McGee also used Autopilot on an inappropriate road, manually accelerated to a speed of 62 miles per hour in an area where the speed limit was 45, and repeatedly triggered Autopilot’s warning system for driver inattention.

Unsurprisingly, given the facts outlined above, Tesla’s strategy in these cases has been to cast blame on the driver. At times this has been successful. The first trial involving a fatal crash linked to Autopilot featured a plaintiff-driver who had been drinking, and the jury had no trouble concluding that Tesla bore no blame for the accident. In Benavides, for the first time, the victim is a third party. Still, Tesla argued, it was the driver who was to blame for the crash, not Autopilot.


Tesla Wins First Trial Involving “Autopilot,” but More Serious Cases Loom

On April 21, Tesla was handed a victory in the first-ever trial involving Autopilot. The case provides an early glimpse into how efforts to hold Tesla liable for Autopilot crashes might fare with juries, and the press has called the trial a “bellwether” for others that are currently pending. Nevertheless, a close look at the facts indicates that plaintiffs might have more success in the future.

The plaintiff, Justine Hsu, was driving her Tesla in stop-and-go traffic on a surface street with a concrete median in Arcadia, California. She activated Autopilot, a suite of features that includes “traffic aware cruise control,” a system that automatically adjusts speed based on traffic conditions and that is, according to her complaint, popular among Tesla owners in heavy traffic. The car was traveling about 20-30 miles per hour when it suddenly swerved off the road and into the median. At this point the car’s airbag deployed, shattering Hsu’s jaw and knocking out several of her teeth. (The airbag manufacturer was also a defendant in the case, and was likewise found not liable by the jury.)

In a few ways, Hsu’s case represents a classic fact pattern involving Autopilot. On one hand, the system clearly did not function as designed. Autonomous vehicles should not suddenly swerve off the road, and yet in several cases Autopilot has done just that. One such case was the crash that killed Walter Huang, who was using Autopilot on his commute to work when his car veered off the highway and into a concrete barrier at more than 70 miles per hour. On a basic level, Hsu’s case was an effort to hold Tesla liable for this malfunction, and also for the misleading way Autopilot is marketed.

On the other hand, Hsu was using Autopilot in a setting where it was not supposed to be used.


A Glimpse into Tesla’s New “Full Self Driving” Technology

[Photo: aerial view of a Tesla in Full Self Driving mode making a left turn]

This week the New York Times published a fascinating look at the latest iteration of Tesla’s automated driving technology, which the company calls “Full Self Driving.” Reporters and videographers spent a day riding with Tesla owner Chuck Cook, an airline pilot who lives in Jacksonville, Florida and has been granted early access to the new technology as a beta tester. What they found was, to my eye anyway, disturbing.

Mr. Cook’s Tesla navigated a broad range of city streets, selecting a route to a destination, recognizing and reacting to other cars, seeing and understanding traffic lights, and even making unprotected left-hand turns—a routine situation that autonomous vehicles struggle to handle. But the car also behaved erratically at times, requiring Mr. Cook to take over and correct its course. In one instance it veered off the street and into a motel parking lot, almost hitting a parked car. In another, it tried to make a left turn onto a quiet street but then, fooled by shade and branches from an overhanging tree, aborted the turn and ended up heading into oncoming traffic on a divided street. These incidents occurred in a single day of testing.

It is worth considering the experience of the Times reporters in the broader context of autonomous vehicle development, something the article largely fails to do.
