The US National Highway Traffic Safety Administration (NHTSA) has opened a probe into Tesla’s Autopilot software, citing the cars’ repeated collisions with parked emergency vehicles.
The NHTSA investigation will cover Tesla Model Y, X, S, and 3 vehicles from the 2014 through 2021 model years. The federal agency says that since 2018 it has logged 11 incidents, resulting in 17 injuries and one fatality, in which Tesla vehicles using the company’s Autopilot features, such as Traffic-Aware Cruise Control, crashed into stationary emergency vehicles. Most of these incidents took place after dark, the agency says, with the software failing to respond to scene control measures including warning lights, flares, cones, and an illuminated arrow board.
The investigation will “assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” according to NHTSA’s notes.
A spokesperson for the agency said the investigation is in its preliminary stages and is primarily concerned with gathering additional information about the incidents. The spokesperson added that no commercially available car can drive itself, and that every vehicle requires an attentive human driver in control at all times.
The inability of driving-assistance software from Tesla and other automakers to spot parked emergency vehicles is well documented. Experts told Wired in 2018 that these systems are likely programmed to mostly ignore stationary objects; otherwise they might react to all sorts of items along the roadside, from signs to buildings.
Wired’s report notes that both Tesla’s and Volvo’s driver-assist manuals warn drivers about this limitation. As Tesla’s manual puts it: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”
The NHTSA has stepped up its scrutiny of Tesla in recent months as the company’s Autopilot software has been implicated in a growing number of crashes. Many industry experts have criticized Tesla’s marketing of the software, which often suggests that human oversight of the vehicle is optional.
In April, senators urged the NHTSA to take “corrective actions” against Tesla and prevent misuse of its driving-assist software, and in June the agency issued new rules requiring companies like Tesla and Alphabet’s Waymo to report all incidents involving such systems.