Tesla Autopilot Investigated Following Series of Crashes Involving Emergency Vehicles

Tesla Autopilot has resulted in more than a dozen injuries and deaths since 2018. Artur Widak/NurPhoto via Getty Images

The National Highway Traffic Safety Administration (NHTSA) on Monday said it had opened a formal safety probe into Tesla’s Autopilot features to assess their role in a series of crashes involving Tesla vehicles and stationary emergency vehicles.

In a document posted Friday opening the investigation, the NHTSA said it had identified 11 crashes since January 2018 in which Tesla cars “encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” Those accidents, four of which occurred in 2021, have resulted in 17 injuries and one death. The latest crash happened last month in San Diego, when a Tesla car on Autopilot blew through a freeway closure and slammed into the back of a vacant police car.

The NHTSA already had a preliminary evaluation open on Autopilot’s role in some of those incidents. The formal safety probe will “assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” the document said.

The investigation covers about 765,000 Tesla vehicles (Models Y, X, S and 3) made between 2014 and 2021. The NHTSA could demand a recall after conducting an engineering analysis.

This is not the first time the NHTSA has looked into Tesla’s driver assistance capabilities. In 2016, the agency opened a preliminary evaluation of Autopilot covering 43,000 vehicles, but that probe closed in January 2017 without any further action.

The National Transportation Safety Board (NTSB), which is responsible for civil transportation accident investigation, has criticized Tesla’s lack of system safeguards for Autopilot, which allows drivers to keep their hands off the wheel for extended periods. (The NHTSA focuses more on establishing safety regulations for vehicles on the market.)

Tesla in October 2020 rolled out a beta version of FSD (Full Self-Driving), an advanced version of Autopilot. Despite what the name suggests, Tesla has said the final software will remain at Level 2 semi-autonomous driving, which requires an attentive driver behind the wheel at all times.

“NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said in a statement Monday. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles.”
