Tesla's Self-Driving Vehicle Attempts to Harm Driver


【Summary】Tesla enthusiast and editor-in-chief of Electrek, Fred Lambert, claims that Tesla's Full Self-Driving (FSD) software "tried to kill" him while he was driving his Model 3. Lambert recounts instances where the software veered aggressively towards the median strip and attempted to blow through an unmarked U-turn at full speed. He believes this is a new aggressive bug in FSD's latest updates.

FutureCar Staff    Sep 02, 2023 9:15 AM PT

Full Self-Driving (FSD), Tesla's advanced driver-assistance software, recently caused a dangerous situation for Fred Lambert, an avid Tesla enthusiast and editor-in-chief of Electrek. While driving his Model 3, Lambert claims that FSD "tried to kill" him by veering aggressively toward the median strip while attempting to pass another car. Fortunately, Lambert regained control in time and avoided a potential crash.

During his drive to Montreal, Lambert had engaged FSD and was maintaining a steady speed of 73 miles per hour. However, when the Tesla tried to pass another vehicle, things quickly went wrong. Lambert described the experience as "super scary" and stated that he almost lost control while trying to correct FSD's behavior. He emphasized that if he had overcorrected, a collision could have occurred.

Despite the initial scare, Lambert decided to re-engage FSD, only to encounter the same issue moments later. This time, when he steered back into the left lane, FSD once again veered toward the median strip and attempted to speed through an unmarked U-turn intended for emergency vehicles only. Lambert managed to take control in time, but the consequences could have been catastrophic.

Lambert compared this malfunction to an earlier Autopilot issue in which the system would unintentionally swerve into exit ramps, a problem he believed Tesla had already fixed. The recent incident, however, made him question the safety of FSD's latest updates, which he described as containing a "new aggressive bug."

Concerns about Tesla's self-driving technology are not limited to Lambert's experience. The US Justice Department has been investigating Tesla over allegedly misleading marketing of Autopilot, and California has cracked down on the company's use of the term "Full Self-Driving." Additionally, the National Highway Traffic Safety Administration recently launched an inquiry into "Elon Mode," a hidden Autopilot configuration that disables driver-attention reminders.

Given these developments, Lambert advises Tesla owners to use FSD "as intended," and suggests that some may want to avoid using it altogether. Significant safety concerns clearly still surround Tesla's self-driving capabilities, and further scrutiny and improvement are needed before the technology can be considered safe for public roads.

Source: "Autopilot just tried to kill me, so please fix it" - Electrek
