Why Tesla Autopilot Shouldn’t Be Used in as Many Places as You Think

Salwan Georges/The Washington Post
An interior dash view of Tesla’s Model 3 in Washington, D.C.

Tesla’s Autopilot technology has been involved in about 40 fatal and serious car crashes – including at least eight that occurred on roads with cross traffic, where the driver-assistance feature was not designed to be used, according to a Washington Post analysis.

More than 800,000 vehicles have Autopilot, and federal officials have asked Tesla to limit its use essentially to highways with center medians and no cross traffic. The company has largely ignored those requests.

Pointing to commonly cited guidelines from SAE International, a standards-developing group that used to be called the Society of Automotive Engineers, Tesla has said that Autopilot use should be left to the discretion of drivers, and that its user manual explains that drivers are responsible for controlling their cars. But experts say drivers often don’t read the roughly 300-page manuals, leaving many unaware of the technology’s limitations.

So where is Autopilot meant to be used? We looked through Tesla’s user manuals, as well as National Transportation Safety Board investigations and lawsuits against the company, to find an answer.

What has Tesla said about where to use Autopilot?

In its user manual, Tesla says that Autosteer, one of the main functions of Autopilot, is “intended for use on controlled-access highways with a fully attentive driver.”

That usually means highways with on- and off-ramps and a center median to separate opposing lanes. Such highways do not typically have traditional intersections with stoplights or stop signs.

The first time a driver turns on Autosteer, a message pops up on the dashboard screen warning that the function is designed for “highways that have a center divider, clear lane markings, and no cross traffic.”

What has Tesla said about Autopilot’s limitations?

Tesla has said repeatedly that Autopilot is not designed to be used on roads with cross traffic.

In 2019, it told the NTSB that Autopilot is designed for areas with “no cross traffic and clear lane markings.” It also said it’s not for use on winding roads with sharp curves, or in bad weather that causes reduced visibility.

In an August legal filing, Tesla’s lawyers said the technology “was not designed to detect cross traffic because the technology to do so simply did not exist in 2018, nor does it exist today.”

After a 2018 crash in Utah involving Autopilot, Tesla told the Guardian that the driver engaged the technology “contrary to proper use” on a road with no center median and with stoplights.

But the company has occasionally contradicted itself. In an NTSB investigation into a fatal 2019 crash, the company said it is up to the driver to determine where to use Autopilot because it is not an autonomous system.

“Autopilot can be safely used on divided and undivided roads as long as the driver remains attentive and ready to take control,” it told the NTSB. Tesla also told the NTSB that “the driver determines the acceptable operating environment” for Autopilot, citing guidelines from SAE International.

Tesla’s user manual also lists a number of common conditions that may hinder the technology even on controlled-access highways, including:

– Roads with sharp curves

– Bad weather

– Oncoming bright light, such as direct sunlight or headlights

– Hills

– Other cars in blind spots

Where does Tesla allow drivers to turn on Autopilot?

Not everywhere: Sometimes drivers are notified that the features are “unavailable.”

According to Tesla, this could include circumstances where lane markings are missing or hard to detect, visibility is poor, or temperatures are extremely hot or cold, to name a few.

But it’s unclear exactly where Tesla has imposed hard limits on access, because drivers sometimes use Autopilot on roads that are not controlled-access highways – at times with deadly consequences.

What do safety officials recommend?

After the first fatal Autopilot crash in 2016, NTSB officials recommended that Tesla and other automakers add safeguards to limit use of the features to the conditions for which they were designed.

NTSB also recommended that the National Highway Traffic Safety Administration (NHTSA) verify that automakers were imposing those safeguards.

Tesla never officially responded. NHTSA replied that it had “no current plans” to create that process. In a statement to The Post, NHTSA said it would be too complex and resource-intensive to do so.

What do experts make of this?

Tesla is walking a tightrope between safety and driver autonomy, said Andrew Maynard, a professor of advanced technology transitions at Arizona State University.

“Tesla has got a very strong consumer base that expects certain things from the technology,” he said. “And one of those things is a certain degree of freedom that the technology gives them. If they start acting like a nanny company, they’re likely to lose trust.”

But consumers rarely read user manuals, Maynard said, which are often full of dense legal jargon. “At the end of the day, [most people] trust the company to protect them,” he said.

What about Full Self-Driving?

Full Self-Driving, also called Autosteer on City Streets, is Tesla’s most advanced form of driver assistance, and drivers have to shell out $12,000 or pay a monthly subscription fee to make their cars eligible to receive it. Tesla made it widely available last year, pushing the software out to around 400,000 vehicles.

Unlike Autopilot, Full Self-Driving is intended to be used on surface streets with intersections. Tesla lists a number of limitations, however, including interactions with pedestrians, construction zones, narrow roads with oncoming cars, debris on the road and more.

In addition, Tesla rolled out the software in an early form known as a “beta,” warning drivers that glitches should be expected.