
Tesla's Self-Driving Car Attempts Fatal Highway Attack


【Summary】A Tesla enthusiast claims that the Full Self-Driving (FSD) feature on his Model 3 tried to "kill" him while driving on the highway. The software aggressively veered towards the median strip and attempted to blow through an unmarked U-turn. The incident raises concerns about the safety of Tesla's autonomous driving technology, especially as authorities are cracking down on misleading marketing and investigating unsafe features.

FutureCar Staff    Sep 03, 2023 4:16 PM PT

Tesla's Full Self-Driving (FSD) software has come under scrutiny after Fred Lambert, a Tesla enthusiast and editor-in-chief of the blog Electrek, claimed the feature "tried to kill" him. Lambert shared his harrowing experience while driving his Model 3, stating that if he hadn't intervened, he would have crashed at highway speed.

The incident occurred during Lambert's drive to Montreal. Initially, everything seemed normal as he engaged FSD and maintained a steady speed. However, when the Tesla attempted to pass another car, things took a dangerous turn.

Lambert recalled, "I felt FSD Beta veering aggressively to the left toward the median strip." Fortunately, because he had his hands on the steering wheel, he was able to steer back towards the road and avoid a potential crash. Lambert described the experience as "super scary" and expressed concern about the possibility of losing control if he overcorrected.

Despite the frightening encounter, Lambert re-engaged FSD, only to face a similar situation moments later. As he steered back into the left lane with FSD activated, the software once again veered towards the median strip, this time attempting to speed through an unmarked U-turn crossing meant for emergency vehicles.

Luckily, Lambert reacted in time and prevented a crash. He believes the behavior reflects a new, aggressive bug introduced in FSD's latest updates and reported the incident to Tesla, stating, "Autopilot just tried to kill me, so please fix it."

This is not the first time Autopilot has faced criticism. Lambert compared the recent incident to a previous issue where Autopilot would unintentionally swerve into exit ramps. He thought this problem had been resolved, but it seems to have resurfaced.

Concerns about the safety of Tesla's autonomous driving technology have prompted investigations and regulatory actions. The US Justice Department has been investigating Tesla for potentially misleading marketing of Autopilot, while California has cracked down on Tesla's use of the term "Full Self-Driving." Additionally, the National Highway Traffic Safety Administration has recently launched an investigation into an unsafe "Elon Mode" for Autopilot.

In light of these incidents and investigations, Lambert advises using FSD "as intended" or considering not using it at all. The current state of the technology may not be safe enough for public roads, and authorities are increasingly recognizing the need for stricter regulations.

As Tesla continues to develop and update its autonomous driving features, it is crucial to prioritize safety and address any potential issues that could put drivers and others at risk.

Source: Example News
