Tesla Wins First Trial Involving “Autopilot,” but More Serious Cases Loom

On April 21, Tesla was handed a victory in the first-ever trial involving Autopilot. The case provides an early glimpse into how efforts to hold Tesla liable for Autopilot crashes might fare with juries, and the press has called the trial a “bellwether” for others that are currently pending. Nevertheless, a close look at the facts indicates that plaintiffs might have more success in the future.

The plaintiff, Justine Hsu, was driving her Tesla in stop-and-go traffic on a surface street with a concrete median in Arcadia, California. She activated Autopilot, a suite of features that includes "traffic-aware cruise control," a system that automatically adjusts speed based on traffic conditions and that is, according to her complaint, popular among Tesla owners in heavy traffic. The car was traveling about 20 to 30 miles per hour when it suddenly swerved off the road and into the median. At that point the car's airbag deployed, shattering Hsu's jaw and knocking out several of her teeth. (The airbag's manufacturer was also a defendant in the case, and the jury likewise found it not liable.)

In a few ways, Hsu’s case represents a classic Autopilot fact pattern. On one hand, the system clearly did not function as designed. Autonomous vehicles should not suddenly swerve off the road, and yet in several cases Autopilot has done just that. One such case was the crash that killed Walter Huang, who was using Autopilot on his commute to work when his car veered off the highway and into a concrete barrier at more than 70 miles per hour. On a basic level, Hsu’s case was an effort to hold Tesla liable for this malfunction, and also for the misleading way Autopilot is marketed.

On the other hand, Hsu was using Autopilot in a setting where it was not supposed to be used. Several warnings in the owner’s manual inform users that Autosteer, Autopilot’s lane-keeping component, “is intended for use only on highways and limited-access roads.” Divided highways are the safest type of road on a per-vehicle-mile-traveled basis, and for autonomous vehicles, navigating them is a much easier task than handling the relative chaos of city streets.

Despite instructions to the contrary, Tesla owners have been using Autopilot on surface streets for years, and there have already been several high-profile fatalities involving crashes on surface streets. These problems are foreseeable, as Tesla’s optical systems are poor at recognizing crossing traffic and stationary objects.

Nevertheless, the jury concluded that Autopilot did not fail to perform as a reasonable consumer would expect, with several jurors telling Reuters that the instructions clearly informed users that they were required to pay attention and be ready to take over from Autopilot at a moment’s notice.

That the warnings carried such weight with the jury certainly spells trouble for other plaintiffs in Autopilot cases. In every fatal crash involving Autopilot, it has been obvious that the driver failed to pay attention to the road. In the Huang crash, for example, NTSB investigators determined that Huang was playing a game on his phone when he died, and that if he had been watching the road he would have noticed the car slowly steering toward the barrier for several long seconds before impact. Joshua Brown and Jeremy Banner, similarly, were not paying attention when their cars drove straight into tractor-trailers turning left across their paths; the NTSB determined in both cases that the trucks would have been plainly visible to an attentive driver for several seconds before impact.

The Huang crash, however, occurred on a divided, limited-access highway, exactly the type of road Autopilot is designed to handle. It’s possible that this difference (as well as the fact that Huang, a father, was killed) will make jurors more receptive to his family’s suit, which is currently scheduled to go to trial in July. On the other hand, not only was Huang playing a game on his phone when he crashed, he had previously noticed and mentioned (in text messages to family and friends) his car’s tendency to veer off the road in the exact spot where he died, the NTSB found.

Also on the horizon are cases involving bystanders, several of which have already been filed. In one case, a Tesla driving on a surface highway in the Florida Keys ran a stop sign at 70 miles per hour and collided with a large SUV parked on the far side of the intersection, killing a woman who was standing next to it. There have also been dozens of incidents in which Teslas operating with Autopilot engaged have crashed into stationary fire trucks, ambulances, and police cars parked on the highway. It remains to be seen whether Tesla will escape liability by deflecting blame onto the drivers.

It is possible that the Hsu verdict presages a wave of defense verdicts in personal injury cases against Tesla based on Autopilot crashes. Such crashes do not happen, after all, when drivers are paying attention, as they have been instructed to do. If the tort system decides in this manner that Autopilot is not defective, advocates who are concerned about its spotty safety record will have to place their trust in NHTSA, which is slowly conducting a sprawling investigation of Autopilot. In the meantime, drivers using Autopilot—and those around them—are likely to keep dying.
