How U.S. Safety Regulators Have Struggled to Get a Grip on Tesla’s Autopilot

A Tesla Model 3 warns the driver to keep their hands on the wheel and be prepared to take over at any time while driving using FSD (Full Self-Driving) in Encinitas, California, U.S., October 18, 2023. (REUTERS/Mike Blake/File Photo)

Carl Hunter was driving his Tesla Model S home on a highway northeast of Seattle last month. The Autopilot system was turned on and he was looking at his phone, Hunter later told police. He heard a bang as the vehicle lurched forward, ramming into a motorbike.

“I’m not sure how it happened,” Hunter said as he called 911 from the scene. “But I am freaking out.”

Jeffrey Nissen, the bike’s 28-year-old rider, was trapped under the Tesla, according to police. He died at the scene.

The crash happened four months after scrutiny by federal regulators led Tesla to recall more than 2 million cars to fix safety risks in the Autopilot system. The death added to the toll of at least 20 Tesla crashes under review since the recall, raising concerns not just about the effectiveness of Tesla’s fix, but also the adequacy of oversight by the nation’s auto safety regulator, the National Highway Traffic Safety Administration (NHTSA).

Critics say incidents like Nissen’s death show in stark terms how the agency is struggling to keep pace with risks introduced on American highways by Tesla’s driver-assistance system, as well as similar technology being advertised to consumers by other car manufacturers, from Ford to Mercedes.

Some say NHTSA has taken an overly deferential approach to industry in an era of profound automotive innovation, one that promises to make driving easier but creates new risks on the road. Nearly a decade since Tesla first rolled out Autopilot, the agency has yet to set basic standards for self-driving technology.

“Their risk-averse approach to regulation puts drivers at greater risk on the road,” Sen. Richard Blumenthal (D-Conn.), who has long called for more safeguards for driver-assistance systems, said in an interview. “They live in fear of potential criticism or opposition, and that’s just no way for a regulator to view the world. The Elon Musks of the world are not going to just sit still.”

NHTSA has not made full use of its powers to protect the public, nor come to Congress asking for new powers to keep up with the rapidly evolving technology, he said.

The agency’s slow pace is compounded by its inability, under federal law, to act as a gatekeeper to protect U.S. motorists before new features with safety implications are sold in cars. Instead, it scrutinizes technology once safety hazards become apparent through crashes and deaths.

Unlike countries in Europe, meanwhile, the United States permits manufacturers to “self-certify” they are in compliance with basic safety standards.

The agency said in a statement that it was overseeing the new technology with the full range of its authority, including writing regulations, conducting research and taking enforcement action. “NHTSA is committed to protecting the public using all the authorities at its disposal,” the agency said.

Three days after Nissen’s April 19 death, an investigator from NHTSA asked the Washington State Patrol to let her see the vehicle, according to a state patrol investigative log released under a records request. By the end of the week, the agency disclosed it was opening a new investigation into Autopilot to review whether the changes made as part of the recall were sufficient.

The new investigation could empower NHTSA to impose fines and ultimately order Tesla to take specific actions to improve the safety of Autopilot. The company did not respond to questions about its interactions with regulators or the new investigation. Tesla has said the system improves safety by helping drivers avoid accidents.

The hazards of Tesla’s Autopilot have been known for years. The NHTSA investigation that prompted the recall in December examined 467 crashes dating back to early 2018, involving 14 deaths and dozens of injuries, according to a summary of the probe released last week. In more than 200 cases, the front of Tesla vehicles on Autopilot hit objects that, in many instances, investigators said an alert driver could have avoided.

NHTSA concluded that the system was not doing enough to keep drivers engaged and that it suffered from a “critical safety gap.” But it took the agency two years to reach that conclusion, even as the crashes continued.

At the end of last year, NHTSA presented its findings to Tesla. The automaker disputed some of them but agreed to the recall. In mid-December, the company announced that an update to the system, uploaded wirelessly to all 2 million cars, would provide drivers with additional alerts if their car detected that their attention was wandering.

Within days of the firmware update, a Tesla using Autopilot crashed in Pennsylvania.

The sequence of events highlights how Congress has limited NHTSA’s authority to review new features on cars before they are deployed. Instead, it relies on investigative powers once problems arise.

The agency sets minimum safety standards for vehicles, but such rulemaking takes years. And under the law, automakers certify their own compliance with the rules, rather than submitting new vehicle designs to regulators for review before they go on sale.

Regulators and the industry say self-certification encourages innovation, but as a result, automakers are largely free to deploy self-driving systems without sign-off from the federal government.

Rob Heilman, a former NHTSA and Transportation Department technology researcher, said that senior officials lack familiarity with automated systems and that the country was lacking federal leadership.

“Now you have technologies that are going out on the road and you are asking the public to take part in testing without their consent,” Heilman said.

What’s needed, experts say, is a more proactive approach that can set the ground rules for new technologies and head off problems in advance.

The Autopilot updates in the Tesla recall were not adequate, said Phil Koopman, a professor at Carnegie Mellon University who studies vehicle technology. He said the agency should be clear with automakers about what it views as unacceptable risks in their designs.

“Tesla basically thumbed their nose at them,” Koopman said. “The question is how often do they have to go back and forth. NHTSA issues a recall, and the remedy is just a little hand wave. This could go on forever.”

In Europe, car designs are certified by authorities who review vehicles and technologies before they are sold. That has given regulators more power to control the deployment of driver-assistance systems.

In response to questions about how it is ensuring the safety of the new technologies, NHTSA pointed to several steps it has taken in recent years, including the Autopilot recall and a 2021 order that requires manufacturers to disclose details about crashes involving automation systems.

Congress this year approved funding for a new office at the agency with a staff of 10 who would consider rules to ensure safety, and NHTSA said a director of the office will start work Monday. Officials are also expected to soon release a proposal that could give them more oversight of fully automated vehicles. This week, the agency issued a new standard for automatic emergency braking, a significant step toward harnessing new technology to improve safety, but the mandate won’t take effect until 2029.

Ann Carlson, who led NHTSA at the time of the Tesla recall, said the step was a major achievement for the agency.

“Sometimes NHTSA gets accused of not having the technological sophistication to deal with automated technology,” said Carlson, a UCLA law professor who ran the agency last year but whose nomination was withdrawn after opposition from Senate Republicans. “I think Tesla proves the opposite.”

NHTSA said this week that it was opening an investigation into Ford’s BlueCruise system after it was linked to two fatal crashes involving Mach-E electric vehicles in February and March. The National Transportation Safety Board – an independent investigative agency – is also investigating the crashes. Amy Mast, a spokeswoman for Ford, said the company was supporting NHTSA’s investigation.

Established by Congress in 1970, NHTSA is relatively small, employing a staff of about 750 with responsibilities for fuel economy standards and driver safety awareness campaigns, in addition to vehicle safety.

The agency has been beset by leadership turnover, having had a Senate-confirmed leader for just three months since 2017. It is being led temporarily by Deputy Administrator Sophie Shulman, with no new nominee for the top job in the pipeline. NHTSA declined to make Shulman available for an interview.

In the Trump era, agency leadership took a hands-off approach to self-driving technology. James Owens, who led NHTSA between 2019 and 2020, defended the agency’s cautious approach to regulating in a 2020 speech to a U.N. body that oversees vehicle safety.

“Every decision must be made based on sound science, data and transparency,” he said. “It is not our practice to issue a new regulation simply because it is believed it might have a safety benefit or because it might be expedient.”

Under Owens’s leadership, NHTSA issued the first waiver for an autonomous delivery robot that did not comply with safety standards written with human drivers in mind. Owens, who is now the chief legal officer at Nuro, the company behind the robot, declined a request for comment.

In the Biden administration, officials have outlined an approach that would involve the manufacturers of autonomous vehicles sharing more data with the government – a step that could make it easier for government engineers to identify safety risks. The agency has yet to formally propose the rules, however, after announcing the plan last summer.

And even when it does pursue new rules, NHTSA typically requires seven to 10 years to finalize standards, according to a Government Accountability Office report this year.

In some cases, the calls for stronger oversight go back years.

On a Saturday afternoon in May 2016, a Tesla using Autopilot crashed into the trailer of a semi-truck in Williston, Fla. – shearing the roof off the car and killing the driver. The NTSB launched an investigation and recommended the following year that NHTSA work toward verifying that driver-assistance systems could only be activated in conditions they were designed for.

But NHTSA said setting such a requirement was out of reach, telling the safety board in 2020 that it “found this goal to be complex, resource-intensive, potentially impractical, and unlikely to result in changes in available technologies.”

A year later, the NTSB stamped the response as “unacceptable.”

Eight years after that Florida crash, self-driving options continue to outpace government rules. NHTSA advises drivers that “every vehicle currently for sale in the United States requires the full attention of the driver at all times for safe operation.”

In their TV commercials, automakers emphasize how driver assistance can alleviate the stress and boredom of being behind the wheel.

A recent ad for Ford’s system shows a mom taking her hands off the steering wheel of a Mach-E to talk to her son in sign language. A Mercedes commercial shows a man taking his eyes off the road to watch televised golf, using a higher-level system available on a limited number of roads in California and Nevada. Chevrolet touts the benefits of hands-free towing, pulling ATVs for a day in the mountains: “It’ll help you get to the adventure energized, and it will help drive you home.”

Mercedes said its system was approved by officials in California and Nevada, and relies on multiple sensors and redundant systems for safety. Aimee Ridella, a GM spokeswoman, said its Super Cruise system is marketed as offering convenience. Drivers are monitored by camera to ensure they are paying attention, and the vehicles will eventually stop if the driver is not engaged.

In March, the Insurance Institute for Highway Safety, a research group, graded 14 systems offered by nine automakers, including Tesla. It ranked 11 of the systems as performing poorly on tests designed to measure how well they monitor whether drivers remain engaged.

Missy Cummings, a professor at George Mason University who served as an adviser to NHTSA’s leaders, said the findings add to evidence that it’s time for the government to regulate driver-assistance systems, curbing their use to predetermined situations, such as limited-access freeways.

“It’s very clear how to do it,” Cummings said. “The companies know how to do it. The technology exists. There’s no magic here.”

The question, Cummings said, is “how much does a regulatory agency have to regulate, as opposed to how much influence do companies have to make regulatory agencies do what they want them to do?”

It now largely falls to state and local authorities to deal with the consequences of crashes involving driver-assistance technology.

In Washington, state troopers responded to the scene of Nissen’s death. In the aftermath of the crash, his damaged bike lay on the road beside the Tesla as police worked to divert traffic. A trooper described in his report how he retrieved a yellow blanket to cover the body.

Police took Hunter, the Tesla driver, into custody, jailing him on a vehicular homicide charge under existing distracted-driving laws. Hunter did not respond to a request for comment.

Chris Loftis, a state patrol spokesman, said in an email that no matter what features a vehicle is equipped with, “the driver is ALWAYS responsible to operate the vehicle in a safe and legal manner.”