Did Jurors React to Musk in Damage Award Against Tesla?

During a webinar this week devoted to analyzing possible factors that fueled a $243 million jury verdict against Tesla, a lawyer not involved in the case fielded a question about Elon Musk’s impact.

Did animus toward Tesla’s CEO affect the size of the award, an audience member asked Mike Nelson, a partner in the law firm Nelson Niehaus LLC.

Nelson, who is also an expert in advanced driver-assistance systems and electric vehicles, believes it’s possible.

“It’s hard to climb inside the mind of the jury…but these are huge numbers,” he said, referring not just to the $200 million in punitive damages but also to the $129 million in compensatory damages awarded in Benavides vs. Tesla, decided last week. The jury assigned Tesla 33 percent of the responsibility for the compensatory figure (roughly $43 million, which together with the punitive award accounts for the $243 million total) and set the figure at that level even though jurors knew that the parties harmed in the accident at the center of the case had already received a substantial amount of money in a separate settlement with the driver.

Related articles: Jury Decides Tesla Partly Responsible in Autopilot Crash Case; Must Pay $243M; Tesla Must Pay $243 Million Over Fatal Autopilot Crash; Digging Into Tesla’s Liability in Crash Case: Where’s the Data?

“To value a life [and] these injuries in the tens of millions of dollars is what seems to happen more and more nowadays. These would easily qualify for what lawyers called nuclear verdicts. And you usually find that when the jury is frustrated, angry with the other party….They’re almost, in a way, adding their own punitive damages before they get to punitive damages,” Nelson said.

Returning directly to the question posed, Nelson said: “Do I think some of this is about Musk? Absolutely.”

While jurors with anti-Musk views were reportedly screened out of the jury pool, Nelson said another factor could have put Musk at the center of deliberations. “He has so much identified himself as the party who launched this platform and is bringing it to fruition. He’s made it very much about him, even though there are plenty of people in that company,” said Nelson, who has tried cases against Tesla, including one as recently as a month and a half ago.

Without disclosing the status of that case, Nelson noted that his side tried to get a copy of an episode of the TV program 60 Minutes featuring Musk taking his hands off the wheel of a Tesla in the early days of Autopilot, the partially automated driving feature that figured in the crash in the Benavides case.

Nelson and Philip Koopman, faculty emeritus at Carnegie Mellon University, an embedded systems design specialist and co-panelist during the webinar, highlighted such “autonowashing”—the exaggeration of the capabilities of autonomous technology—as a factor that could contribute to drivers’ overreliance on the technology and one that may have riled the jury in the Benavides case.

For the most part, Koopman and Nelson were concerned with shedding light on the implications of last week’s verdict for driver-assist systems and product safety disclosures—and on steps toward minimizing fatalities on the nation’s roadways as vehicles become more automated.

Taking aim at general claims that autonomous vehicles are safer, Koopman stressed, “There’s no credible data. They’re marketing statements.”

“And that’s true of every company. That’s not a Tesla statement. That’s true of every company.”

“The Insurance Institute for Highway Safety has said these kind of systems, where you’re supposed to be keeping an eye on what’s going on, so-called Level Two-plus systems, are a convenience feature, not a safety feature.”

Koopman also spent a lot of time during the webinar reviewing human behavior and response times when faced with potential crash situations.

“We’ve known since the 1940s that people have trouble paying attention to boring things. And we’ve known since the 1990s that that applies specifically to driving automation. When you automate steering, people drop out. And it’s worse than that. Because they dropped out, their perception, reaction, response time, their ability to respond slows down. They lose situational awareness. And asking someone to sit there and watch a system that works impressively well almost all the time, and then to jump in and stop something when they’re about to die, is basically asking them all to be superhuman.”

“When you hold them responsible in a courtroom for not being superhuman, that’s just not a good outcome,” he said. “You can say they should have known better. They signed up for the risk. But it’s not going to stop the next fatality because asking people to be superhuman doesn’t work out.”

Nelson referenced a 2015 Department of Transportation study, co-written by some OEMs and the National Transportation Safety Board, that suggests it takes a human being somewhere between 1.7 and 2.0 seconds to “wake up” to a dangerous situation.

“That’s on a good day,” Koopman said, referencing a traffic design handbook that puts the 85th-percentile perception-response time at 2.7 seconds. “But that’s for drivers who are in a simulator,” anxiously waiting for a stoplight to turn from green to red, he said, contrasting that with a relaxed driver trusting that a car’s automation will “do the right thing.” “You can see numbers up to 10 seconds, 25 seconds, 40 seconds, depending on how complicated the environment is” and how inattentive the driver has been.
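To put those delays in road terms, here is a quick back-of-the-envelope sketch (an illustration added for scale, not a calculation presented in the webinar). The 70 mph speed is an assumed highway pace; the reaction times echo the figures cited above. Distance covered is simply speed multiplied by reaction time:

```python
# Illustrative sketch only: how far a car travels during a driver's
# perception-response delay, using distance = speed * time.
# The 70 mph speed is an assumption for this example; the reaction
# times echo the 1.7-2.7 second figures (and Koopman's longer
# estimates for inattentive drivers) cited above.

MPH_TO_FPS = 5280 / 3600  # convert miles per hour to feet per second

def distance_before_reacting(speed_mph: float, reaction_s: float) -> float:
    """Feet traveled before the driver even begins to respond."""
    return speed_mph * MPH_TO_FPS * reaction_s

for reaction_s in (1.7, 2.0, 2.7, 10.0):
    feet = distance_before_reacting(70.0, reaction_s)
    print(f"{reaction_s:4.1f} s at 70 mph -> {feet:5.0f} ft")
```

Even the optimistic two-second figure works out to more than 200 feet of travel at highway speed before braking or steering even begins; at the 10-second end of Koopman’s range, the car has covered more than 1,000 feet.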

“You’re used to this thing driving excellently for mile after mile after mile after mile,” he said, describing a state of “automation complacency.”

“You’re not going to be below one second responding when all heck breaks loose. It’s going to take you a while to ask, ‘What was that? What’s going on?’ And then it’s too late.”

“That’s not lack of moral fiber because you didn’t pay attention. It’s called being human. That’s what people do,” he said.
