Tesla Autopilot System Found Probably at Fault in 2018 Crash

WASHINGTON — Tesla’s Autopilot driver-assistance system and a driver who relied too heavily on it are likely to blame for a 2018 crash in California in which the driver died, a federal safety agency said on Tuesday.

The agency, the National Transportation Safety Board, criticized several institutions for failing to do more to prevent the crash, including the National Highway Traffic Safety Administration for what some board members described as a hands-off approach to regulating automated-vehicle technology.

“We urge Tesla to continue to work on improving Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken when necessary,” Robert L. Sumwalt, the board’s chairman, said. “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars.”

The board adopted a number of staff findings and recommendations from its nearly two-year investigation into the crash. The findings included the determination that Autopilot failed to keep the driver’s vehicle in the lane, that its collision-avoidance software failed to detect a highway barrier and that the driver was probably distracted by a game on his phone.

The board also determined that the driver, Wei Huang, most likely would have survived had the California Transportation Department fixed the barrier he hit, which was designed to absorb some of the impact of a collision but was damaged during a previous crash.

Mr. Sumwalt also said Tesla had not responded to two recommendations the safety board made to the electric-car company and five other automakers in 2017. The board told the companies to limit use of automated systems to the conditions for which they were designed and to better monitor drivers to make sure they remain focused on the road and have their hands on the wheel.

“It’s been 881 days since these recommendations were sent to Tesla and we’ve heard nothing,” he said. “We’re still waiting.”

Tesla did not respond to requests for comment about criticism of Autopilot.

In a statement, the National Highway Traffic Safety Administration said that all crashes caused by distracted driving, including those in which driver-assistance systems were in use, were a “major concern” and that it planned to review the board’s report.

The board’s conclusion is the latest development in a string of federal investigations into crashes involving Autopilot, which can, among other things, keep a moving car in its lane and match the speed of surrounding vehicles. Tesla has said that the system should be used only under certain conditions, but some safety experts say the company doesn’t do enough to educate drivers about those limitations or take steps to make sure drivers do not become overly reliant on the system and, thus, distracted.

Mr. Huang had been playing a game on his phone during the drive, but it was not clear whether he was engaged with the game in the moments before the crash, according to the investigation.

The concerns about Autopilot have done little to slow Tesla’s rise. The company’s share price has more than tripled since October as Tesla’s financial performance has surpassed even the rosiest of analyst expectations. In September, Tesla earned its first safety award from the nonprofit Insurance Institute for Highway Safety, and last week Consumer Reports named Tesla’s first mass-market electric car, the Model 3, one of its top picks for 2020.

Tesla has repeatedly said that Autopilot makes its vehicles safer. In the fourth quarter of 2019, the company reported one accident for every three million miles driven in a Tesla with Autopilot engaged. Over all, the national rate was one accident for every 498,000 miles driven in 2017, according to NHTSA.

Still, the electric-car maker faces scrutiny on multiple fronts. The N.T.S.B. and the traffic safety administration are currently investigating more than a dozen crashes in which Autopilot might have played a role.

In the 2018 accident, Autopilot had been engaged for nearly 19 minutes, according to the safety board’s investigation. Mr. Huang’s hands were on and off the wheel several times during that period, and in the final minute before the crash, the vehicle detected his hands on the wheel three times for a total of 34 seconds. It did not detect his hands on the wheel in the six seconds before impact.

Tesla’s event data recorders routinely collect a wide variety of information, such as location, speed, seatbelt status, the position of the driver’s seat, the rotation angle of the steering wheel and pressure on the accelerator pedal.

Mr. Huang had been traveling in his 2017 Tesla Model X sport utility vehicle on U.S. 101 in Mountain View when the car struck a median barrier at about 71 miles per hour. The speed limit was 65 m.p.h. The impact spun the car, which was then struck by two other vehicles and caught fire.

Mr. Huang, who worked at Apple, had previously complained to his family about problems with Autopilot along that stretch of highway near State Route 85, his brother told investigators. Data from the vehicle confirmed at least one similar episode near the area dividing the two highways, according to documents from the investigation.

The safety board called on Apple to ban the nonemergency use of company-issued devices while driving. It also called on Apple and other electronics companies to either lock people out of their devices or limit what they can do with the devices while driving.

The first known fatal crash with Autopilot in use occurred in May 2016 in Florida, when a Tesla failed to stop for a truck that was turning in front of it on a highway. The vehicle hit the trailer, continued traveling underneath it and veered off the road. The driver of that car, Joshua Brown, was killed in the accident.

Both the N.T.S.B. and the traffic safety agency investigated that crash but came to somewhat different conclusions. In January 2017, NHTSA cleared Autopilot, finding that it had no defects and did not need to be recalled, though the agency called on automakers to clearly explain to drivers how such systems work. Nine months later, the safety board determined that while Autopilot worked as intended, it had nonetheless “played a major role” in the crash.

“The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” Mr. Sumwalt said at the time.

That finding reflects a common critique of Autopilot — that it does not go far enough in forcing drivers to maintain their focus on the road. Unlike Autopilot, Super Cruise, a driver-assistance system offered by General Motors, works only on certain highways and tracks drivers’ heads to make sure they are paying attention to the road.

Critics also say Tesla and its chief executive, Elon Musk, have exaggerated Autopilot’s capabilities.

In 2018, for example, Mr. Musk was widely criticized for taking his hands off a Tesla Model 3 steering wheel while demonstrating Autopilot for the CBS News program “60 Minutes,” something the vehicle owner’s manual instructs drivers using Autopilot never to do.

In January, Mr. Musk told investors that Tesla’s “full self-driving capability” might be just a few months from having “some chance of going from your home to work, let’s say, with no interventions.”

Jason Levine, executive director of the Center for Auto Safety, an advocacy group, said that “by calling it Autopilot, by using terms like ‘full self-driving,’ Tesla is intentionally misleading consumers as to the capabilities of the technology.”

To avoid false expectations, German regulators reportedly asked Tesla in 2016 to stop using the term Autopilot, arguing that it suggests that the technology is more advanced than it really is.

Autonomous technology is commonly categorized into six levels, from zero to five, as defined by SAE International, an association of automotive engineers. Level 5 represents full autonomy, in which a vehicle can perform all driving functions on its own, including navigating to a chosen destination. Autopilot and Super Cruise are considered Level 2 “partial automation” technologies, which enable a vehicle to control steering, braking and acceleration but require the full attention of a human driver.

Evidence of drivers misusing Autopilot abounds on the internet. And in a survey last year, the Insurance Institute for Highway Safety found that 48 percent of drivers believed it was safe to remove their hands from a steering wheel while using Autopilot. By comparison, 33 percent or fewer drivers said the same thing about similar systems in cars made by other automakers.
