
Self-Driving Tesla's Attempted Highway Homicide


【Summary】Fred Lambert, Tesla enthusiast and editor-in-chief of the electric vehicle blog Electrek, claims that Tesla's Full Self-Driving (FSD) feature "tried to kill" him while he was driving his Model 3. Lambert describes how FSD veered aggressively towards the median strip on the highway, nearly causing a crash. He submitted a bug report to Tesla, stating, "Autopilot just tried to kill me, so please fix it."

FutureCar Staff    Sep 06, 2023 6:43 AM PT

Tesla's Full Self-Driving (FSD) Beta, the company's driver-assistance software, recently caused a dangerous incident for Fred Lambert, editor-in-chief of the electric vehicle blog Electrek. Lambert claims that FSD "tried to kill" him while he was driving his Model 3, forcing him to intervene at highway speed to avoid a potential crash.

During his drive, Lambert engaged FSD and maintained a steady speed of 73 miles per hour. When the Tesla attempted to pass another car, things took a turn for the worse: Lambert felt FSD Beta veer aggressively towards the median strip. Because he had his hands on the steering wheel, he was able to steer back into his lane and prevent a potential collision.

Despite the scare, Lambert re-engaged FSD and experienced the same error moments later. As he steered back into the left lane, FSD once again veered towards the median strip and attempted to take an unmarked U-turn crossing meant only for emergency vehicles. Lambert managed to take control in time, but the consequences could have been catastrophic.

Lambert believes that this malfunction resembles a previous issue with Autopilot, where it would unintentionally swerve into exit ramps. He thought Tesla had fixed this problem, but it seems to have resurfaced with the latest FSD updates. Lambert submitted a bug report to Tesla, urging them to address the issue.

This incident raises concerns about the safety of Tesla's self-driving technology. Lambert suggests that FSD in its current form is far from safe enough for public roads. Authorities, including the US Justice Department and California regulators, have already taken action against Tesla over misleading marketing and its use of terms like "Full Self-Driving."

Given these developments, Lambert advises users to only use FSD "as intended" or consider not using it at all. The focus should be on ensuring the safety of autonomous driving technology before it is widely adopted.

Source: Example News
