
Self-Driving Tesla's Attempted Highway Homicide


【Summary】A Tesla enthusiast claims that the Full Self-Driving (FSD) feature on his Model 3 tried to "kill" him while driving on the highway. The Tesla veered aggressively towards the median strip and nearly caused a crash. The incident raises concerns about the safety of Tesla's autonomous driving technology, with authorities already investigating the company's misleading marketing of Autopilot. The Tesla enthusiast advises using FSD "as intended" or not using it at all.

FutureCar Staff    Sep 02, 2023 11:16 PM PT

Fred Lambert, a Tesla enthusiast and editor-in-chief of the electric vehicle blog Electrek, recently had a terrifying experience with Tesla's Full Self-Driving (FSD) feature. While driving his Model 3, Lambert claims that FSD "tried to kill" him and that he narrowly avoided a crash at highway speed by intervening.

During his drive to Montreal, Lambert engaged FSD and maintained a steady speed of 73 miles per hour. When the Tesla attempted to pass another car, however, things took a dangerous turn: Lambert felt FSD Beta aggressively veering toward the median strip, and he was able to steer back onto the road only because he had his hands on the steering wheel.

Despite the scare, Lambert re-engaged FSD, and moments later the same error recurred. As he moved back into the left lane with FSD on, the software again veered toward the median strip and even attempted to drive, at full speed, through an unmarked U-turn intended only for emergency vehicles. Fortunately, Lambert took control in time to prevent a catastrophic outcome.

Lambert compared this malfunction to a previous issue with Autopilot, where it would accidentally swerve into exit ramps without warning. He believed that Tesla had fixed this problem, but it seems to have resurfaced with FSD's latest updates. Lambert submitted a bug report to Tesla, urging them to fix the issue.

These incidents raise concerns about the safety of Tesla's autonomous driving technology. While Tesla enthusiasts embrace the beta version of FSD, authorities are growing increasingly skeptical. The US Justice Department has been investigating Tesla over misleading marketing of Autopilot, and California has cracked down on use of the term "Full Self-Driving." Additionally, the National Highway Traffic Safety Administration is now investigating an unsafe "Elon Mode" for Autopilot.

In light of these incidents, Lambert advises users to use FSD only "as intended," and even suggests considering not using it at all. It is clear that Tesla's autonomous driving technology still has a long way to go before it can be deemed safe for public roads.

Source: Elon Musk Shows Off FSD, Forced to Intervene When It Tries to Drive Through Red Light
