According to documents posted by safety regulators, Tesla will send out a software update to fix the problems with its Autopilot system.
The recall follows a two-year investigation conducted by the National Highway Traffic Safety Administration (NHTSA) into a series of crashes that happened while the Autopilot partially automated driving system was in use. Some of the investigated crashes were fatal.
The NHTSA reported that Autopilot's method of ensuring that drivers are paying attention can be inadequate and may result in the "foreseeable misuse of the system."
The recall includes Tesla models Y, S, 3 and X produced between October 5, 2012, and December 7, 2023.
The software update includes additional controls and alerts that will reportedly encourage drivers to "adhere to their continuous driving responsibility," according to the documents.
The update was to be sent to certain affected vehicles on Dec. 12, with the rest receiving it later.
Autopilot includes features called Autosteer and Traffic-Aware Cruise Control. Autosteer is intended for use on limited-access freeways when it is not operating with a more sophisticated feature called Autosteer on City Streets.
The software update will reportedly limit where Autosteer can be used.
According to the recall documents, if a driver tries to engage Autosteer when the conditions for engagement are not met, the feature will warn the driver through visual and audible alerts that it is unavailable, and Autosteer will not engage.
Depending on a Tesla's hardware, the added controls include "increasing prominence" of visual alerts, simplifying how Autosteer is turned on and off, "additional checks on whether Autosteer is being used outside of controlled access roads and when approaching traffic control devices and eventual suspension from Autosteer use if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility."
The documents also revealed that agency investigators met with Tesla starting in October to explain "tentative conclusions" about fixing the monitoring system.
While Tesla did not agree with the agency's analysis, the company agreed to the recall on Dec. 5 to help resolve the investigation.
For years, auto safety advocates have been calling for stronger regulation of the driver monitoring system, which mainly detects if a driver's hands are on the steering wheel.
Advocates have also demanded driver-facing cameras, which other automakers with similar systems already use, to make sure a driver is paying attention.
While Autopilot can steer, accelerate and brake automatically within its lane, it is only a driver-assist system and, despite its name, cannot drive itself.
According to independent tests, the monitoring system is easy to fool. Some drivers have been caught using Autopilot while drunk or even while sitting in the back seat.
In a defect report filed with the NHTSA, Tesla claimed that Autopilot's controls "may not be sufficient to prevent driver misuse."
On its website, Tesla says Autopilot and the more sophisticated Full Self-Driving system cannot drive autonomously; both are meant only to assist drivers, who must be ready to intervene at all times.
Full Self-Driving is still being tested by Tesla owners on public roads.
In a statement posted on X, Tesla claimed that safety is stronger when Autopilot is engaged.
The NHTSA has dispatched investigators to look into 35 Tesla crashes since 2016 in which the agency believes the vehicles were running on an automated system. At least 17 people have been killed in those crashes.
The investigations are part of a larger NHTSA inquiry into several instances of Teslas on Autopilot crashing into parked emergency vehicles that were tending to other crashes. The agency became more aggressive about tracking safety problems with Teslas in 2022, announcing several recalls and investigations, including a recall of Full Self-Driving software.
In May, Transportation Secretary Pete Buttigieg, whose department includes the NHTSA, said Tesla shouldn't be calling the system Autopilot because it can't drive itself.
In a statement, the NHTSA announced that the Tesla investigation remains open as the agency continues to monitor the efficacy of the company's corrections. The agency also continues to work with the automaker to "ensure the highest level of safety."
In January, California prosecutors filed charges against Kevin George Aziz Riad, who reportedly ran a red light and killed two people in 2019 while driving a Tesla on Autopilot. Riad pleaded not guilty to two counts of vehicular manslaughter.
Tesla's automated system was questioned again when a Model Y crashed into a Hyundai, killing the drivers of both cars.
The Tesla, driven by mortgage adviser Fredrick Scheffler II, 49, veered out of control into oncoming traffic along the Sunset Highway near Necanicum, close to the Pacific coast. Scheffler's 2020 car crashed into a Hyundai Santa Fe driven by Kyle Riegler, 26.
Riegler was a band teacher at the nearby Seaside Middle and High Schools. Scheffler, a married father from Portland, died shortly after the crash.
In July, an unnamed female driver, 66, and her male passenger, 67, were killed when their Tesla crashed into the back of a Walmart truck. The impact cut the roof off their vehicle.
The NHTSA reported that, at the time, it was too early to determine a possible cause. However, Tesla's controversial Autopilot technology was being probed because of the incident.
In another incident in July, a motorcyclist was killed by a Tesla operating on Autopilot on a Utah highway.
Landon Embry, 34, was killed when a 2020 Tesla Model 3 merged into his lane and struck the back of his motorcycle. Embry was thrown from his bike and died at the scene.
The collision was similar to several Tesla crashes that have occurred since 2015, in which drivers were killed when their cars ran into tractor-trailers and wound up beneath them.
While the 2015 Teslas were the first to use Autopilot technology, the cars did not include fully automated options. Instead, they offered several driver-assistance features, such as automated in-lane steering and automated lane changes prompted by the driver's commands.
Meanwhile, the electronic door handles of a Tesla are being investigated as part of the cause of death of Dr. Omar Awan, 48, who died following a crash in February 2019.
Awan survived the initial impact, but his family sued Tesla, claiming his death could have been avoided if not for the car's door handles.
A Davie Police officer who witnessed the crash rushed to the scene. He told investigators that he and others tried to get Awan out of the car but could not open the doors.
Watch the video below to learn why car companies are struggling with electric vehicles.
This video is from the High Hopes channel on Brighteon.com.
Sources include:
Static.NHTSA.gov [PDF]