“Critical Safety Gap” in Tesla’s Autopilot Linked to Hundreds of Accidents


Tesla’s much-discussed Autopilot software has led to “foreseeable misuse and avoidable accidents.” That is the conclusion of a large-scale investigation into the system by the US traffic safety regulator.

The US National Highway Traffic Safety Administration (NHTSA) this week shared the results of its investigation into Tesla’s Autopilot function, for which it analyzed no fewer than 956 accidents over three years. The analysis shows that Autopilot played a role in 467 of those accidents, thirteen of which were fatal.

System “misused”

According to the NHTSA, all of those accidents were caused by drivers “misusing” the system. However, the regulator blames Tesla for not implementing enough safety measures to prevent that misuse. “Under certain circumstances,” it says, the Autopilot system does not sufficiently ensure that drivers pay attention and use the function correctly.

A Tesla Model S with Autopilot engaged. Archive image. © Reuters

The bottom line is that drivers expect Autopilot to require far less supervision than it actually does, creating “a critical safety gap,” according to the NHTSA. The regulator says Tesla must make Autopilot’s warnings more effective and ensure that drivers better understand what the system can and cannot be used for.

Tesla’s Autopilot is designed to automatically steer the car within its lane and to accelerate and brake; the advanced version can also assist with lane changes on highways. According to the regulator, however, the risk of accidents increased in situations where the driver assistance system was switched on but insufficient care was taken to ensure that drivers themselves kept paying attention.

The name itself

The regulator also takes issue with the name itself: “Autopilot.” That name suggests the mode lets the car drive autonomously, whereas a name containing “assist” or “team” would describe what the system actually does far better, the government body argues. In addition, manual steering adjustments made while Autopilot is active deactivate the system entirely, which can discourage drivers from staying engaged in driving, the NHTSA writes.

Last December, Tesla released a software update for one part of Autopilot, the self-steering Autosteer function, after the NHTSA accused the manufacturer of not sufficiently checking whether drivers were still holding the steering wheel. Tesla pushed the updated assistance system to more than two million vehicles.


New investigation

The accidents analyzed in the current study all occurred before this update was released. It is therefore not clear whether the update addresses the regulator’s concerns. The NHTSA has launched a new investigation to find out.

Alongside Friday’s announcement of the new investigation, the agency said the update is “probably not sufficient” to solve the problems, as several new reports of Autopilot-related accidents have since emerged: in the four months after the update, twenty accidents were reported involving cars that had received it. Moreover, drivers can choose whether to download the update, and it can also be undone.
