US probing Autopilot problems on 765,000 Tesla vehicles

by Joseph K. Clark

DETROIT — The U.S. government has opened a formal investigation into Tesla’s Autopilot partially automated driving system after a series of collisions with parked emergency vehicles. The investigation covers 765,000 vehicles, nearly everything Tesla has sold in the U.S. since the start of the 2014 model year. In the crashes identified by the National Highway Traffic Safety Administration as part of the probe, 17 people were injured and one was killed.

Teslas on Autopilot or Traffic-Aware Cruise Control have hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board, or cones warning of hazards. NHTSA says it has identified 11 crashes since 2018. The agency announced the action Monday in a posting on its website.

The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to ensure drivers pay attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.


Last year the NTSB blamed Tesla, drivers, and lax regulation by NHTSA for two collisions in which Teslas crashed beneath crossing tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crashes by failing to ensure automakers put safeguards in place to limit the use of electronic driving systems.

“We are glad to see NHTSA finally acknowledge our long-standing call to investigate Tesla for putting technology on the road that will be foreseeably misused in a way that is leading to crashes, injuries, and deaths,” said Jason Levine, executive director of the nonprofit Center for Auto Safety, an advocacy group. “If anything, this probe must go far beyond crashes involving first responder vehicles because the danger is to all drivers, passengers, and pedestrians when Autopilot is engaged.”

Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or riding in the back seat while a car rolled down a California highway. A message was left early Monday seeking comment from Tesla, which has disbanded its media relations office.

NHTSA has sent investigative teams to 31 crashes involving partially automated driver-assist systems since June 2016. Such systems can keep a vehicle centered in its lane and a safe distance from cars in front of it. Of those crashes, 25 involved Tesla Autopilot, in which 10 deaths were reported, according to data released by the agency.

Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to crossing semis, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.

The probe by NHTSA is long overdue, said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University who studies automated vehicles.

Rajkumar said that Tesla’s failure to effectively monitor drivers to ensure they’re paying attention should be the top priority in the probe. Teslas detect pressure on the steering wheel to ensure drivers are engaged, but drivers often fool the system.

“It’s straightforward to bypass the steering pressure thing,” Rajkumar said. “It’s been going on since 2014. We have been discussing this for a long time now.”

The crashes into emergency vehicles cited by NHTSA began on Jan. 22, 2018, in Culver City, California, near Los Angeles, when a Tesla using Autopilot struck a parked firetruck that was partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.

Since then, the agency said there were crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.

“The investigation will assess the technologies and methods used to monitor, assist and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” NHTSA said in its investigation documents.

In addition, the probe will cover object and event detection by the system and where it is allowed to operate. NHTSA says it will examine “contributing circumstances” to the crashes and similar impacts.

An investigation could lead to a recall or other enforcement action by NHTSA. “NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves,” the agency said in a statement. “Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for the operation of their vehicles.”

The agency said it has “robust enforcement tools” to protect the public and investigate potential safety issues. It will act when it finds evidence “of noncompliance or an unreasonable safety risk.” In June, NHTSA ordered all automakers to report crashes involving fully autonomous vehicles or partially automated driver-assist systems. Shares of Tesla, which is based in Palo Alto, California, fell 3.5% at Monday’s opening bell.

Tesla uses a camera-based system, computing power, and sometimes radar to spot obstacles, determine what they are, and decide what the vehicles should do. But Carnegie Mellon’s Rajkumar said the company’s radar was plagued by “false positive” signals and would stop cars after determining overpasses were obstacles.

Now Tesla has eliminated radar in favor of cameras and thousands of images that the computer’s neural network uses to determine whether objects are in the way. The system, he said, does an excellent job on most things that would be seen in the real world. But it has had trouble with parked emergency vehicles and perpendicular trucks in its path.

“It can only find patterns that it has been, quote unquote, trained on,” Rajkumar said. “The inputs the neural network was trained on just do not contain enough images. They’re only as good as the inputs and training. Almost by definition, the training will never be good enough.” Tesla also allows selected owners to test a “full self-driving” system. Rajkumar said that should be investigated as well.
