Tesla Faces Engineer Claims In Autopilot Fatal Crash Case

Multiple Tesla engineers have testified that, between a fatal 2016 crash and a similar fatal accident in 2019, the company made no changes to its Autopilot driver-assistance feature to account for the limitations that contributed to the earlier death.

The testimony was excerpted in recent filings in a case brought by the family of Jeremy Banner, a 50-year-old father of three who died in the 2019 crash, Bloomberg reported.

The Banner case is likely to go to a jury trial in October, the first over a fatal accident blamed on Autopilot.

In the 2016 incident, Florida resident Joshua Brown was killed when his Tesla drove into the side of a truck that was crossing the road. Banner, also a Florida resident, died in strikingly similar circumstances.


‘Not designed to detect that’

In 2021 testimony Tesla engineer Chris Payne said that although the company knew “that there’s cross traffic or potential for cross traffic, the Autopilot at the time was not designed to detect that”.

Engineer Nicklas Gustafsson made similar remarks in a 2021 deposition.

Banner’s widow earlier this month revised her complaint to seek punitive damages, arguing that following Brown’s 2016 death Tesla should have modified Autopilot to simply switch off in dangerous circumstances.

“There is evidence in the record that the defendant Tesla engaged in intentional misconduct and/or gross negligence for selling a vehicle with an Autopilot system which Tesla knew to be defective and knew to have caused a prior fatal accident,” the revised complaint reads.

‘Same defect’

Banner family lawyer Trey Lytal told Bloomberg Tesla had allowed the “same defect” to cause a second death three years after the first.

Lytal said the company “not only knew of this defect, but was warned by regulators for the US government that the system should not be used on roads with cross traffic”.

Tesla says Autopilot is intended for use on highways and limited-access roads, although the system is not disabled in other environments.

It says drivers are repeatedly informed that they must remain alert and ready to take over from Autopilot at a moment’s notice. The drivers involved in the accidents were not paying attention, the company has said.

Ambiguous message

But critics say Tesla has lulled drivers into a false sense of security with public statements about Autopilot’s capabilities and with a product design that allows their attention to wander while the system controls the vehicle.

In April Tesla attracted scorn for briefly arguing in a separate case that statements about Autopilot by chief executive Elon Musk might be deepfakes, a tactic Judge Evette Pennypacker called “deeply troubling”.

Earlier this year the firm won its first jury trial over an Autopilot-linked accident, a non-fatal case in which a woman said the feature caused her Model S to suddenly veer into the centre median of a Los Angeles street.

Matthew Broersma
