The US government just opened an investigation into Tesla's self-driving system

1424
6
The US government just opened an investigation into Tesla's self-driving system

The U.S. government has opened a formal investigation into Tesla's partially automated driving system after a series of collisions with parked emergency vehicles.

The investigation covers 765,000 vehicles, nearly everything that Tesla has sold in the U.S. since model year 2014. 17 of the crashes identified by the National Highway Traffic Safety Administration as part of the probe are documented: 13 people were killed and one injured.

NHTSA says it has detected 11 crashes since 2018, in which Teslas have hit cars on autopilot or traffic alert control at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action in a posting on its website on Monday.

The probe is yet another sign that NHTSA is taking a tougher stance on automated vehicle safety under President Joe Biden than under the previous administrations. Previously the agency was reluctant to regulate the new technology for fear of hampering adoption of potentially life-saving systems.

The investigation covers Tesla's entire current model lineup, including Models Y, X, S and 3 from the 2014 to 2021 time period.

The NHTSA and Tesla, which investigated some of the accidents dating to 2016 and has recommended that NHTSA and Tesla restrict Autopilot's use to areas where it can safely operate. The NTSB also recommended that Tesla want NHTSA to have a better system to make sure drivers are paying attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.

Last year, the NTSB blamed Tesla and NHTSA for two collisions in which Teslas crashed underneath on tractor-trailers. The NTSB took the unusual step of accusing NHTSA of aiding in the crash for failing to make sure automakers put safeguards in place to limit use of electronic driving systems.

The agency made the determinations after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was driving on Autopilot when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer crossing in its path.

Autopilot is frequently misused by California highway drivers, caught driving drunk or even running in the back seat while a car was back on the highway.

A message was left early Monday to Tesla, seeking clarification from Tesla, which has disbanded its media relations office.

NHTSA has sent investigative teams to 31 crashes involving partially automated driving assist systems since June of 2016. Such systems can keep a vehicle in its lane and a safe distance from vehicles in front of it. Some of those crashes involved Tesla Autopilot in which 11 deaths were recorded according to the agency's data.

Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. In addition to driving semis, Teslas using Autopilot have crashed into stopped emergency vehicles and a roadway barrier.

The probe by NHTSA is long overdue, said Raj Rajkumar, an electrical and computer engineers who studies automated vehicles at Carnegie Mellon University.

Rajkumar said Tesla's failure to actually monitor drivers to make sure they're paying attention should be the top priority in the probe. Teslas detect pressure on the steering wheel to make sure drivers are engaged, but drivers often fool the system.

'It is very easy to bypass the steering pressure thing, Rajkumar said. 'It is going on for several years. It has been going on since 2014. We have been discussing this with long time now.

The collisions into emergency vehicles cited by NHTSA began on 21 Jan 2018 in Los Angeles when a Tesla using Autopilot struck a parked firetruck with its lights flashing partially in the travel lanes. Crews were handling another crash at the time.

Since then the agency said there were crashes in Tampa, Florida; Huntsville, New Jersey; West Bridgewater, Massachusetts; Little Rock, Texas; Charlotte, North Carolina; Montgomery County, Tennessee; Lansing, Michigan; and Laguna Beach, California.

'The investigation will assess the technologies and methods used to maintain, assist and enforce driver engagement with dynamic driving task during autopilot operation, NHTSA said in its investigation documents.

In addition, the probe will cover object and event detection by the system, as well as where it is allowed to operate. NHTSA says that it will examine similar circumstances to the crashes, as well as contributing crashes.

An investigation could lead to a recall or other enforcement action by NHTSA.

'NHTSA reminds the public that commercial vehicle vehicles are incapable of driving themselves, the agency said in a statement. 'Every available vehicle requires a human driver to hold the control at all times, and state laws hold human drivers responsible for operation of their vehicles.

The agency said it has 'extreme enforcement tools' to protect the public and investigate potential safety issues, and it will act when it finds evidence of noncompliance or an unreasonable risk to safety.

In June NHTSA ordered all automakers to report any crashes involving fully automated vehicles or partially autonomous driver assist systems.

Shares of Tesla Inc. based in Palo Alto, California, fell 35% at the opening bell Monday.

Tesla uses a camera system, a lot of computing power, and sometimes radar to map obstacles, determine what they are and then decide what the cars should do. Aber Carnegie Mellon's Rajkumar said the company's radar was plagued by 'false positive' signals and would stop cars after determining overpasses were obstacles.

Now the Tesla has eliminated radar in favor of cameras and thousands of images that the computer neural network uses to determine if there are objects on the way. The system, he said, does a very good job on most objects that would be seen in the real world. But it has had trouble with parked emergency vehicles and perpendicular trucks in its path.

'It can only find patterns that it has been 'quote unquote' trained on said Rajkumar. 'Currently the inputs that the neural network is trained on do not contain enough images. They're only as good as inputs and training. The training will never be good enough by definition.

In addition, Tesla is allowing selected owners to test what it calls a 'full self-driving' system. Rajkumar said that the inquiry should be investigated as well.