
Self-Driving Tesla's Attempted Highway Attack


Summary: Fred Lambert, a Tesla enthusiast and editor, experienced a terrifying incident when the Full Self-Driving (FSD) Beta on his Model 3 "tried to kill" him on the highway. The car veered aggressively toward the median strip, and when he re-engaged FSD, it tried to blow through an unmarked U-turn at full speed. The incident raises concerns about the safety of Tesla's autonomous driving technology.

FutureCar Staff    Sep 02, 2023 4:15 PM PT

During a recent drive, Fred Lambert, editor-in-chief of Electrek and a longtime Tesla enthusiast, had a terrifying experience with Full Self-Driving (FSD) Beta in his Model 3. He says FSD "tried to kill" him while he was behind the wheel, and that if he had not intervened, he would have crashed at highway speed.

At the beginning of his drive to Montreal, everything seemed normal. Lambert engaged FSD and held a steady 73 miles per hour. But when the Tesla attempted to pass another car, things took a turn for the worse: Lambert felt FSD Beta veer aggressively toward the median strip.

Fortunately, Lambert had his hands on the steering wheel, a safety precaution that many Tesla drivers ignore. He was able to steer the vehicle back into his lane and avoid a crash. Lambert described the experience as "super scary" and noted that if he had overcorrected, he could have hit the vehicle he was passing.

When Lambert re-engaged FSD, he encountered the same error moments later. As he steered back into the left lane with FSD on, the software once again veered toward the median strip, this time attempting to speed through an unmarked U-turn meant for emergency vehicles only. Lambert took control just in time to prevent a catastrophe.

Lambert compared this malfunction to an earlier Autopilot issue in which the system would swerve into an exit ramp without warning. He believed Tesla had fixed that problem, but it appears to have resurfaced in recent FSD Beta updates. Lambert submitted a bug report to Tesla, urging the company to fix the issue.

This incident adds to ongoing concerns about the safety of Tesla's autonomous driving technology. While Tesla enthusiasts eagerly await FSD's full potential, episodes like this highlight the need for further development and testing. Regulators have also taken notice: the US Justice Department is investigating Tesla's allegedly misleading marketing of Autopilot, California has cracked down on use of the term "Full Self-Driving," and the National Highway Traffic Safety Administration recently opened an investigation into an unsafe "Elon Mode" for Autopilot.

In light of these events, Lambert advises using FSD "as intended," but also suggests considering not using it at all. Tesla's beta technology evidently still has a long way to go before it can be deemed safe for public roads.

In related Tesla news, Elon Musk recently showcased FSD himself and was forced to intervene when the car attempted to drive through a red light.
