Tesla Wins First Trial Involving “Autopilot,” but More Serious Cases Loom

On April 21, Tesla was handed a victory in the first-ever trial involving Autopilot. The case provides an early glimpse into how efforts to hold Tesla liable for Autopilot crashes might fare with juries, and the press has called the trial a “bellwether” for others that are currently pending. Nevertheless, a close look at the facts indicates that plaintiffs might have more success in the future.

The plaintiff, Justine Hsu, was driving her Tesla in stop-and-go traffic on a surface street with a concrete median in Arcadia, California. She activated Autopilot, a suite of features that includes "traffic aware cruise control," a system that automatically adjusts speed based on traffic conditions and that is, according to her complaint, popular among Tesla owners in heavy traffic. The car was traveling about 20 to 30 miles per hour when it suddenly swerved off the road and into the median. The car's airbag then deployed, shattering Hsu's jaw and knocking out several of her teeth. (The airbag manufacturer was also a defendant in the case, and was likewise found not liable by the jury.)

In a few ways, Hsu’s case represents a classic fact pattern involving Autopilot. On one hand, the system clearly did not function as it was designed to do. Autonomous vehicles should not suddenly swerve off the road, and yet in several cases Autopilot has done just that. One such case was the crash that killed Walter Huang, who was using Autopilot on his commute to work when his car veered off the highway and into a concrete barrier at more than 70 miles per hour. On a basic level, Hsu’s case was an effort to hold Tesla liable for this malfunction, and also for the misleading way Autopilot is marketed.

On the other hand, Hsu was using Autopilot in a setting where it was not supposed to be used.


A Glimpse into Tesla’s New “Full Self Driving” Technology

This week the New York Times published a fascinating look at the latest iteration of Tesla's automated driving technology, which the company calls "Full Self Driving." Reporters and videographers spent a day riding with Tesla owner Chuck Cook, an airline pilot who lives in Jacksonville, Florida and has been granted early access to the new technology as a beta tester. What they found was, to my eye anyway, disturbing.

Mr. Cook's Tesla navigated a broad range of city streets, selecting a route to a destination, recognizing and reacting to other cars, seeing and understanding traffic lights, and even making unprotected left-hand turns, a routine situation that autonomous vehicles struggle to handle. But the car also behaved erratically at times, requiring Mr. Cook to take over and correct its course. In one instance it veered off the street and into a motel parking lot, nearly hitting a parked car. In another, it tried to make a left turn onto a quiet street but then, fooled by shade and branches from an overhanging tree, aborted the turn and ended up heading into oncoming traffic on a divided street. All of these incidents occurred in a single day of testing.

It is worth considering the experience of the Times reporters in the broader context of autonomous vehicle development, something the article largely fails to do.


Palsgraf and Humanity in the Age of Covid

My grandfather recently passed away. It wasn't Covid; not directly, at least. A lifetime of kidney problems and other assorted ailments wasn't helped by the pandemic-induced lockdown. Rather than go out to eat or graze at the local grocery store buffet, as he normally would, he dined on pre-cooked meals, and unsurprisingly his health suffered for it. So no, Covid didn't kill him, but it certainly helped. In legal-speak, it was more of a proximate cause.

In any law school torts class, students learn about proximate cause as it relates to negligence. One widely cited case is Palsgraf v. Long Island Railroad. In this slice of history, a remarkable and tragic chain of events took place. The plaintiff, Mrs. Palsgraf, waited for her train at the railroad's station. As she waited, an employee of the train company unknowingly helped two men load explosives onto a different train. The explosives detonated, and had one of the two men been injured by that explosion, this case would almost assuredly be lost to the sands of time, a simple case of negligence with a simple resolution. Instead, in the hubbub that ensued, a large scale that Mrs. Palsgraf was standing near struck and injured her. The exact manner in which the scale injured her isn't mentioned in the opinion itself.

Every law student learns about this case and its meaning. The legal rules and principles that the majority and dissenting opinions announced are followed to this day. But the decision doesn't spill any ink on Mrs. Palsgraf herself. A terse statement of facts accompanies the majority opinion, in which Mrs. Palsgraf isn't even mentioned by name. She is simply "Plaintiff." Thus, she is reduced to something less than human. I thought of this case as my grandfather lay in hospice, near the end of his life.
