Tesla’s massive recall will make it harder – but not impossible – to abuse Autopilot

Tesla today issued a recall for 2 million vehicles in the US to fix a “defect” with Autopilot, the company’s groundbreaking and controversial advanced driver assistance system. Safety experts say the recall will make Autopilot more difficult to abuse.

More difficult, but not impossible.

“It’s progress,” says Mary “Missy” Cummings, a robotics expert who wrote a paper in 2020 evaluating the risks of Tesla’s Autopilot system, “but minimal progress.”

Cummings said the National Highway Traffic Safety Administration (NHTSA) missed an opportunity to force the company to address concerns about Tesla owners using Autopilot on roads where it was not designed to work. Last week, The Washington Post published an investigation linking at least eight fatal or serious crashes to Tesla drivers using Autopilot on roads the system could not “reliably navigate.”

According to the recall, Tesla will issue a software update to about 2 million cars in the US — nearly every vehicle it has ever sold in the country — that will increase the number of warnings and alerts drivers receive when they aren’t paying attention.

Autopilot, which comes standard in all Tesla vehicles, combines a number of features, including Traffic-Aware Cruise Control and Autosteer, the latter of which is intended for use only on limited-access highways when it is not operating in conjunction with the more advanced Autosteer on City Streets.

According to the recall, the software update should address concerns surrounding the use of Autosteer on roads where it was not intended to operate.

“If the driver attempts to engage Autosteer when the conditions for activation are not met, the feature will alert the driver that it is not available through visual and audible alerts, and Autosteer will not engage,” the recall document states.

Tesla is also expanding its “three strikes and you’re out” system to cover Autopilot for the first time. Previously, only drivers using the more expensive and more comprehensive Full Self-Driving feature could be locked out of the system for failing to pay attention to the road; now Autopilot users are subject to the same policy. According to the recall, drivers will face “eventual suspension from use of Autosteer if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility while the feature is enabled.”

But Cummings, who spent a year as a senior safety adviser at NHTSA, isn’t convinced this will be enough to prevent future incidents. “It’s very vague,” she said.

That’s likely because the recall was the result of a two-year negotiation between NHTSA and Tesla, said Phil Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety. The company disagreed with the agency’s findings but ultimately agreed to issue the recall, which suggests some key elements may have been left out of its scope.

“This has all the hallmarks of a compromise to get the remedy out and avoid another year of negotiations between NHTSA and Tesla,” he said. “So the remedy will likely not be as robust as NHTSA would like to see.”


Cummings agrees. “Tesla probably fought back,” she said. “NHTSA really wants recalls to be voluntary, so Tesla probably used that as a bargaining chip.”

Of course, the nature of the recall itself has some safety experts calling it a huge missed opportunity. Allowing Tesla to push an over-the-air software update ignores many of Autopilot’s structural defects, said Sam Abuelsamid, principal research analyst at Guidehouse Insights.

Tesla’s driver monitoring system, which relies on torque sensors in the steering wheel to detect hand placement and a camera in the cabin to track head movements, is inadequate and easily fooled, Abuelsamid said. The torque sensors are prone to false positives, such as when drivers trick the system by attaching a weight to the steering wheel that counteracts its automatic movements, and false negatives, such as when the wheel fails to register a driver’s hands because they are holding it steady.

Meanwhile, the camera, which Tesla only began using for Autopilot driver monitoring in 2021, doesn’t work in low light, he noted. Other automakers use infrared sensors that can detect depth and operate in low-light conditions. Consumer Reports recently showed that Tesla’s camera-based system could be tricked into thinking someone was in the driver’s seat when no one was.

“This absolutely could have ended a different way,” Abuelsamid said. “NHTSA could do its job and actually force Tesla to issue a recall and install robust eye and hand monitoring for the driver and true geofencing of the system, or disable Autosteer altogether if they can’t do a hardware update.”

“NHTSA has consistently dropped the ball when it comes to Tesla,” he added. “Unfortunately, given the agency’s history of dealing with Tesla, this was probably the best outcome.”
