Multiple Tesla engineers have testified that, between a fatal 2016 crash and a similar fatal accident in 2019, the company made no changes to its Autopilot driver-assistance feature to account for the limitations that contributed to the first crash.
The testimony was excerpted in recent filings in a case brought by the family of the man who died in the 2019 crash, Jeremy Banner, a 50-year-old father of three, Bloomberg reported.
The Banner case is likely to go to a jury trial in October, the first over a fatal accident blamed on Autopilot.
In the 2016 incident, Florida resident Joshua Brown drove into the side of a truck that was crossing the road. Banner, also a Florida resident, died in very similar circumstances.
In 2021 testimony Tesla engineer Chris Payne said that although the company knew “that there’s cross traffic or potential for cross traffic, the Autopilot at the time was not designed to detect that”.
Engineer Nicklas Gustafsson made similar remarks in a 2021 deposition.
Banner’s widow earlier this month revised her complaint to seek punitive damages, arguing that after Brown’s 2016 death Tesla should have modified Autopilot to simply switch off in dangerous circumstances.
“There is evidence in the record that the defendant Tesla engaged in intentional misconduct and/or gross negligence for selling a vehicle with an Autopilot system which Tesla knew to be defective and knew to have caused a prior fatal accident,” the revised complaint reads.
Banner family lawyer Trey Lytal told Bloomberg Tesla had allowed the “same defect” to cause a second death three years after the first.
Lytal said the company “not only knew of this defect, but was warned by regulators for the US government that the system should not be used on roads with cross traffic”.
Tesla says Autopilot is intended for use on highways and limited-access roads, although the system is not disabled in other environments.
It says drivers are repeatedly informed that they must remain alert and ready to take over from Autopilot at a moment’s notice. The drivers involved in the accidents were not paying attention, the company has said.
But critics say Tesla has lulled drivers into a false sense of security with public statements about Autopilot’s capabilities and with a product design that allows their attention to wander while the system controls the vehicle.
In April Tesla attracted scorn for briefly arguing in a separate case that statements about Autopilot by chief executive Elon Musk might be deepfakes, a tactic judge Evette Pennypacker called “deeply troubling”.
Earlier this year the firm won its first jury trial for a non-fatal Autopilot-linked accident, in a case in which a woman said the feature caused her Model S to suddenly veer into the centre median of a Los Angeles street.