Self-Driving Tesla's Attempted Highway Homicide

【Summary】Tesla enthusiast and editor-in-chief of Electrek, Fred Lambert, claims that Tesla's Full Self-Driving (FSD) software "tried to kill" him while he was driving his Model 3. The software veered aggressively towards the median strip and attempted to blow through an unmarked U-turn at full speed. Lambert was able to intervene and avoid a crash, but he believes this is a new aggressive bug in the latest FSD updates.

FutureCar Staff    Sep 03, 2023 11:19 PM PT
"Autopilot just tried to kill me, so please fix it."

During a recent drive, Fred Lambert, an avid Tesla supporter and editor-in-chief of the electric vehicle blog Electrek, claims that Full Self-Driving (FSD) "tried to kill" him while he was driving his Model 3. Lambert asserts that had he not intervened, he would have been involved in a high-speed collision.

Initially, everything seemed normal as Lambert drove towards Montreal. He engaged FSD and maintained a steady speed of 73 miles per hour. However, trouble arose when the Tesla attempted to overtake another vehicle.

"I felt FSD Beta veering aggressively to the left towards the median strip," Lambert recounted.

Since he followed the safety protocol of keeping his hands on the steering wheel, he was able to "steer back towards the road."

"Correcting FSD Beta was extremely frightening as I almost lost control," Lambert stated. "I was in the process of passing another vehicle, and if I had overcorrected, I could have crashed into it."

Regrettably, when Lambert re-engaged FSD, the software made the same error only moments later.

When he steered back into the left lane with FSD still engaged, he described how the software once again "veered to the left towards the median strip" and then attempted to speed through an unmarked U-turn designated solely for emergency vehicles.

Fortunately, Lambert was prepared and took control in time, but the potential consequences could have been catastrophic.

Lambert suggests that this malfunction resembles the past issue of Autopilot mistakenly swerving into an exit ramp without warning, a problem he believed had been resolved long ago.

"In the early days, Tesla Autopilot used to attempt to take exit ramps it wasn't supposed to, but Tesla fixed it a while ago," he explained. "I haven't experienced that in years."

Afterward, Lambert submitted a bug report to Tesla, stating, "Autopilot just tried to kill me, so please fix it."

He believes the problem stems from the latest FSD updates, describing it as a "new aggressive bug." While Tesla has had to pull back updates in the past for causing accidents, it may be time to acknowledge a harsh reality: in its current state, this technology is not safe enough for public roads.

In other words, public roads should not be treated as a playground for Tesla's beta software, a sentiment that authorities are gradually coming to embrace.

The US Justice Department initiated an investigation into Tesla last year regarding the dangerously misleading marketing of Autopilot, and California has taken action against Tesla's use of the term "Full Self-Driving." Additionally, the National Highway Traffic Safety Administration has recently intervened to investigate an unsafe "Elon Mode" for Autopilot.

For now, Lambert advises using FSD "as intended," although it may be worth considering not using it at all.

For more information on Tesla, read: Elon Musk Demonstrates FSD, Forced to Intervene When It Attempts to Run a Red Light.
