
Tesla Ordered to Turn Over Autopilot Data to the NHTSA After the 12th Crash Involving an Emergency Vehicle


Summary: Tesla is facing growing scrutiny over the safety of its Autopilot driver-assistance system after investigators at the National Highway Traffic Safety Administration (NHTSA) identified the 12th crash involving a police or fire vehicle last week in Florida. Less than two weeks before the latest incident on Aug. 28, the NHTSA said it had opened a formal safety probe into Tesla's Autopilot after 11 prior crashes involving an emergency vehicle.

FutureCar Staff    Oct 08, 2021 1:25 PM PT

Tesla is facing growing scrutiny over the safety of its Autopilot driver-assistance system after investigators at the National Highway Traffic Safety Administration (NHTSA) identified the 12th crash involving a police or fire vehicle last week in Florida.

On Aug. 28, a Florida state trooper in Orange County stopped to offer assistance to a disabled motorist on I-4. While the officer was assisting the driver, a Tesla Model 3 operating with Autopilot engaged struck the patrol car. Luckily, the trooper was outside the car and was not injured, but the incident has once again put the focus on the safety of Tesla's Autopilot.

Less than two weeks before the latest incident on Aug. 28, the NHTSA said it had opened a formal safety probe into Tesla's Autopilot after 11 prior crashes involving police or fire vehicles.

On Tuesday, the agency sent Tesla an 11-page letter with new questions as part of its expanding probe.

As reported by Reuters, the NHTSA wants Tesla to provide the "date and mileage at which the 'Full Self Driving' (FSD) option was enabled" for all of its vehicles, along with all consumer complaints, field reports, crash reports and lawsuits.

The NHTSA also wants Tesla to explain its "methods and technologies used to prevent subject system usage outside" the operational design domain.

The NHTSA said earlier that it had reports of 17 injuries and one death in the 11 crashes, including a December 2019 crash in Indiana in which a Tesla Model 3 collided with a parked fire truck, leaving one passenger dead.


This police car was damaged last week after a Tesla driver allegedly using Autopilot crashed into it while the officer stopped to assist another motorist on a Florida highway.

Tesla recently rolled out the latest beta version of its more advanced Full Self-Driving (FSD) software, which is available to a select group of owners. FSD unlocks additional automated driving features on secondary roads, whereas Autopilot is designed to work on highways.

The FSD beta program serves as a test platform that lets Tesla refine the software before rolling it out to more owners via over-the-air updates. In effect, Tesla owners are testing the new software before its safety has been established, which has caught the attention of regulators.

In July, Tesla launched a $199 monthly subscription for FSD for the first time, giving drivers a more affordable way to try out the feature.

Previously, the only way to get it was to pay an extra $10,000 upfront. Tesla Chief Executive Elon Musk has touted the robust capabilities of FSD, but others, including Tesla employees, have disputed those claims.

During a Tesla earnings call in January, Musk said he was "highly confident that a Tesla vehicle will be able to drive itself with reliability in excess of humans this year." But Musk's public claims did not reflect the assessment of at least one Tesla engineer, documents released in May revealed.

According to a California DMV memo of a March conference call with Tesla representatives, Tesla Autopilot engineer CJ Moore said that "Elon's tweet does not match engineering reality."

"Tesla indicated that they are still firmly in L2 (SAE Level-2)," California DMV wrote in a memo after the meeting with Tesla representatives. "As Tesla is aware, the public's misunderstanding about the limits of the technology and its misuse can have tragic consequences."

According to the Society of Automotive Engineers (SAE), Level 2 driving automation requires human supervision at all times, without exception, meaning that Tesla drivers must monitor the operation of the vehicle whenever Autopilot is engaged.

Tesla also says the current, more advanced FSD features "do not make the vehicle autonomous."

In June, the NHTSA announced it would require automakers selling vehicles equipped with advanced driver assistance systems (ADAS) or SAE Level 3-5 automated driving systems to report crashes involving those systems.

These systems include Tesla's Autopilot and Full Self-Driving (FSD) feature, as well as the Super Cruise automated highway driving feature offered by General Motors, Ford Motor Co's BlueCruise, Nissan's ProPilot and others.

The NHTSA said this action will allow it to collect the information it needs to fulfill its role in keeping U.S. roads safe.

Tesla must respond to NHTSA's questions by Oct. 22, the agency said.
