NHTSA Puts Tesla Under The Microscope

The investigation into a series of Tesla crashes with parked emergency vehicles or trucks with warning signs has escalated, bringing Teslas with partially automated driving systems a step closer to a recall. The National Highway Traffic Safety Administration said Thursday that it has upgraded the Tesla investigation to an engineering analysis, a sign of intensified scrutiny of the electric vehicle maker and of automated systems that handle at least some driving tasks.

The agency's documents, released Thursday, raise serious concerns about Tesla's Autopilot system. The agency found that it is being used in areas where its capabilities are limited, and that many drivers ignore the vehicle's warnings and fail to act to avoid crashes. The investigation now covers practically all of the Austin, Texas-based automaker's vehicles sold in the United States since the start of the 2014 model year.

The National Highway Traffic Safety Administration found 16 crashes involving emergency vehicles and trucks with warning signs, resulting in 15 injuries and one death. Investigators will evaluate additional data and vehicle performance, as well as the degree to which Autopilot and related Tesla systems may worsen human-factors or behavioral safety risks and undermine the effectiveness of the driver's supervision.

An engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year whether to seek a recall or close the inquiry. In the majority of the 16 crashes, the Teslas issued collision warnings to the drivers just before impact. In roughly half, automatic emergency braking intervened to at least slow the cars. On average, according to NHTSA documents summarizing the investigation, Autopilot relinquished control of the Teslas less than a second before the crash.

The National Highway Traffic Safety Administration said it is also investigating crashes with similar patterns that did not involve emergency vehicles or trucks with warning signs. The agency found that in many cases drivers had their hands on the steering wheel, as Tesla requires, yet still did nothing to avoid a collision. This suggests drivers are complying with Tesla's monitoring system, which does not guarantee that they are actually paying attention.

In crashes where video is available, NHTSA found that drivers should have been able to see first-responder vehicles an average of eight seconds before impact. Before seeking a recall, the agency must determine whether Autopilot has a safety defect. Investigators noted that a driver's use or misuse of the driver-monitoring system, or operation of the vehicle in an unanticipated manner, does not necessarily preclude a system defect.

One of the crashes under investigation involved a Tesla that rear-ended a Culver City fire truck.

Autonomous Driving Under Scrutiny

According to Bryant Walker Smith, a University of South Carolina law professor who studies automated vehicles, the agency's documents all but state that Tesla's method of making sure drivers pay attention is not good enough, that it is defective, and that the vehicles should be recalled. He says it is easy to keep a hand on the wheel while being entirely disengaged from driving. “Monitoring a driver’s hand position is not effective because it only measures a physical position. It is not concerned with their mental capacity, their engagement, or their ability to respond.”

Other companies' systems, such as GM's Super Cruise, use infrared cameras to monitor a driver's eyes or face and ensure they are looking at the road. But even those, according to Walker Smith, may allow a driver to zone out. In all, the agency investigated 191 crashes but ruled out 85 because other drivers were involved or there was insufficient information to make a definitive assessment. Of the remaining 106, the primary cause of about a quarter appears to be running Autopilot in areas where it has limitations or in conditions that can interfere with its operation.

Other automakers restrict the use of their systems to limited-access divided highways. The National Transportation Safety Board, which has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot's use to areas where it can operate safely, and that NHTSA require Tesla to adopt a better mechanism for ensuring drivers are paying attention. NHTSA has yet to act on the recommendations; the NTSB can only make recommendations to other government agencies.

The National Highway Traffic Safety Administration said in a statement that no self-driving vehicles are currently available for purchase. “Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for operation of their vehicles,” the agency said.