
Tesla's Self-Driving Car Targets Man on Highway


【Summary】Fred Lambert, a Tesla enthusiast and editor-in-chief of Electrek, says Tesla's Full Self-Driving (FSD) Beta software nearly caused a serious crash while he was driving. The car veered toward a median strip and then toward an unmarked U-turn crossover reserved for emergency vehicles; he avoided both only by intervening. Lambert believes the behavior is a new bug in the latest FSD updates.

FutureCar Staff    Sep 02, 2023 6:14 AM PT

"Autopilot just tried to kill me, so please fix it."

Fred Lambert, a noted Tesla enthusiast and editor-in-chief of the electric vehicle blog Electrek, had a terrifying experience with Tesla's Full Self-Driving (FSD) while driving his Model 3. According to Lambert, if he hadn't intervened, he would have crashed at highway speed.

During a drive to Montreal, Lambert engaged FSD and held a steady 73 miles per hour. When the Tesla moved to pass another car, the software went wrong: Lambert felt FSD Beta veer aggressively toward the median strip, and he was able to steer back onto the road only because he had his hands on the wheel.

Despite the scare, Lambert re-engaged FSD and hit the same error moments later. As he moved back into the left lane, the software again veered toward the median strip and tried to drive, at full speed, through an unmarked U-turn crossover intended only for emergency vehicles.

Luckily, Lambert managed to take control in time, but the consequences could have been catastrophic.

Lambert believes that this malfunction is reminiscent of Autopilot's past issue of swerving into exit ramps without warning, which he thought had been fixed. He submitted a bug report to Tesla, urging them to fix the problem.

He suggests the problem may stem from FSD's latest updates, calling it a "new aggressive bug." Whatever the cause, the incident raises concerns about the safety of testing Tesla's technology on public roads.

Regulators have also been scrutinizing Tesla. The US Justice Department is investigating whether the company's marketing of Autopilot misled consumers, California has cracked down on Tesla's use of the term Full Self-Driving, and the National Highway Traffic Safety Administration is looking into a hidden "Elon Mode" configuration for Autopilot.

In light of these incidents, Lambert advises users to only use FSD "as intended," or perhaps consider not using it at all.

Source: Example News
