Electric car maker Tesla and US regulators have been sharply criticised by the US National Transportation Safety Board (NTSB).
According to Reuters, NTSB board members questioned Tesla’s design of its semi-automated driving assistance system and condemned the National Highway Traffic Safety Administration (NHTSA) for a “hands-off approach” to regulating the increasingly popular systems.
Last month the NHTSA said it would examine a petition demanding the recall of some 500,000 Tesla electric vehicles over reports of sudden unintended acceleration.
There have been a number of serious and fatal accidents involving Tesla vehicles over the years, with concern centring on whether the vehicles in question may have been on Autopilot at the time.
On 29 December, for example, an NHTSA investigation began after a fatal accident in Gardena, a suburb of Los Angeles. A black Model S reportedly left the 91 freeway and was travelling at high speed when it ran a red light and crashed into a 2006 Honda Civic at an intersection. Both occupants of the Honda Civic (a man and a woman) died at the scene.
But this was not the only fatal crash involving a Tesla. Also on Sunday 29 December 2019, another Tesla crash killed a woman in Indiana.
State police reportedly said that the driver, Derrick Monet, 25, of Prescott Valley, Arizona, was seriously injured after he rear-ended a fire truck parked along Interstate 70 in Putnam County.
His wife, Jenna Monet, 23, was pronounced dead at a hospital.
Derrick Monet told investigators he regularly used his Tesla’s Autopilot mode, but didn’t recall whether he had it activated at the time of the accident.
There have been other Autopilot-related crashes as well.
An NTSB report into one fatal Tesla crash, in March 2019, found that the car's Autopilot self-driving technology was engaged for 10 seconds before the collision.
In that crash, the roof of the Tesla Model X was sheared off and its 50-year-old driver killed when the vehicle drove under the trailer of a semi truck that was crossing its path.
That March incident had similarities to a May 2016 crash in which a Model S also drove under the trailer of a semi truck crossing its path. The investigation into that crash found that Autopilot had failed to detect the white trailer against a bright sky.
But it concluded that the driver was not paying attention to the road and exonerated Tesla.
There are reportedly at least five other fatalities worldwide involving Tesla vehicles on autopilot.
Meanwhile the NHTSA is already investigating a crash in early December in which a Tesla Model 3 on Autopilot crashed into the rear of a police car.
That accident took place in Connecticut, where a Model 3 in its autonomous driving mode failed to avoid a stationary police car that had its blue lights flashing as it attended a broken-down car in the “left centre lane”.
The driver admitted to state police that he had placed his Tesla Model 3 on Autopilot so he could check on his dog in the back seat. He was charged with reckless driving and reckless endangerment.
Tesla recommends that users be prepared to take over from Autopilot at all times, and that drivers do not remove their hands from the steering wheel whilst Autopilot is engaged.
So far the NHTSA has reportedly sent teams to investigate 14 Tesla crashes in which Autopilot is suspected of having been in use.
Into this mix, the NTSB has produced a stinging indictment of both the NHTSA and Tesla.
The NTSB said the NHTSA has “taken a nonregulatory approach to automated vehicle safety” and should “complete a further evaluation of the Tesla Autopilot system to ensure the deployed technology does not pose an unreasonable safety risk,” Reuters reported.
The board also cited Apple and other smartphone makers for refusing to disable devices when users are driving, and called on the US Occupational Safety and Health Administration to use its authority to take action against “employers who fail to address the hazards of distracted driving”.
It should be noted that the NTSB can only make recommendations, while the NHTSA regulates US vehicles.
The report cited a 2018 crash in Mountain View, California, in which the driver had been playing a game on his phone during the fatal trip.
Walter Huang, a 38-year-old Apple software engineer, was driving his Tesla Model X in 2018 in Autopilot mode at about 70 miles per hour (113 kph) when it crashed into a safety barrier. The NTSB said Huang had been using an iPhone and recovered logs showed a word-building game was active.
NHTSA said it would carefully review the NTSB’s recommendations. The agency added that commercial motor vehicles “require the human driver to be in control at all times.”
Tesla reportedly did not respond to requests for comment.
NTSB Chairman Robert Sumwalt reportedly said Tesla – unlike five other auto manufacturers – has ignored NTSB safety recommendations issued in 2017.
The NTSB will meanwhile release in the coming days the probable cause of a third fatal Tesla Autopilot crash, which took place in March 2019 in Florida, and in which there was no evidence the driver’s hands were on the steering wheel in the final eight seconds before the car struck a semi-trailer truck.