A jury in a federal court in Miami found Tesla partially liable for a fatal 2019 crash involving the company's Autopilot driver-assistance system. The jury awarded the plaintiffs $329 million in punitive and compensatory damages.
Neither the driver nor Autopilot braked in time to stop the car from running through an intersection, where it struck a parked SUV and killed a pedestrian. The jury assigned two-thirds of the responsibility to the driver and one-third to Tesla. (The driver was sued separately.)
The verdict came at the end of a three-week trial over the crash, which killed 22-year-old Naibel Benavides Leon and seriously injured her boyfriend, Dillon Angulo. It is one of the first major legal decisions involving driver-assistance technology to go against Tesla; the company has previously settled lawsuits that included similar claims about Autopilot.
Brett Schreiber, the lead lawyer for the plaintiffs in the case, told TechCrunch in a statement that Tesla designed Autopilot "only for controlled access highways" yet "deliberately chose not to restrict drivers from using it elsewhere."
"Tesla's lies turned our roads into a test track for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm's way," Schreiber said. "Today's verdict represents justice for Naibel's tragic death and Dillon's lifelong injuries, holding Tesla and Musk accountable for propping up the company's trillion-dollar valuation with self-driving hype at the expense of human lives."
Tesla said in a statement provided to TechCrunch that it plans to appeal the verdict "in light of the substantial errors of law and irregularities at trial."
"Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology," the company wrote. "To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs' lawyers blaming the car when the driver admitted and accepted responsibility from day one."
Tesla and Musk have spent years making claims about Autopilot's capabilities that can give drivers excessive confidence in the driver-assistance system, a reality that government officials, and Musk himself, have spoken about for years.
The National Transportation Safety Board (NTSB) reached that conclusion in 2020 after investigating a 2018 crash in which a driver died after hitting a concrete barrier. That driver, Walter Huang, was playing a mobile game while using Autopilot. The NTSB made numerous recommendations following its investigation, which Tesla largely ignored, the safety board later claimed.
During a 2018 conference call, Musk said "complacency" with driver-assistance systems like Autopilot was the problem.
"They just get too used to it, and that tends to be more of an issue. It's not a lack of understanding of what Autopilot can do. It's [drivers] thinking they know more about Autopilot than they do," Musk said at the time.
The trial came as Tesla is in the midst of rolling out the first version of its long-promised robotaxi network, which recently launched in Austin, Texas. Those vehicles use an expanded version of Tesla's more capable driver-assistance system, known as Full Self-Driving.
Update: This story has been updated to include the total amount of compensatory damages.