Tesla Self-Driving System Faulted by Safety Agency in Crash

In January, in what was interpreted as a victory for Tesla, the National Highway Traffic Safety Administration’s report on the accident said that the company’s Autopilot-enabled vehicles did not need to be recalled. That inquiry, however, focused only on the question of whether any flaws in the system had led to the crash; it found no such flaws.

The renewed attention to the Tesla system came as automakers are jockeying to push driverless technologies forward, while lawmakers and regulators scramble to keep pace, with the Trump administration putting forward its approach on Tuesday.

The Transportation Department unveiled voluntary guidelines for testing autonomous vehicles on Tuesday as part of a broader government effort to encourage automakers’ development of self-driving technology.

The department announced the initiative as Transportation Secretary Elaine L. Chao visited a testing center for self-driving vehicles in Ann Arbor, Mich.

The proposal establishes a voluntary framework of safety guidelines for companies to test autonomous vehicles on public roads. The approach also aims to clarify the role that state governments play in regulating the technology, including the enforcement of traffic laws and vehicle insurance requirements.

The guidelines replace policies set down by the Obama administration last year that called for automakers to submit safety assessments of their self-driving models before testing them on public roads.

Under the new guidelines, it will be left to automakers and other companies to decide whether to submit safety reviews to federal regulators. While the Trump administration will encourage public disclosures of such assessments, the documents will not be subject to federal approval.

“This is not an enforcement document,” Ms. Chao said. “This is a guidance document.”

There will be no waiting period for a company to begin testing autonomous models, although the vehicles remain subject to broader safety rules and standards for equipment and parts.

Industry officials lauded the less restrictive guidelines, which are intended to be a model for state policies. “The guidance provides the right balance, allowing emerging innovations to thrive while government still keeps a watchful eye over new developments,” said the Alliance of Automobile Manufacturers, a trade group.

Photo: Self-driving Uber cars in Pittsburgh last year. Automotive and technology companies have been accelerating their testing of self-driving cars, and have backed legislation exempting autonomous vehicles from current motor vehicle laws. (Credit: Angelo Merendino/Agence France-Presse — Getty Images)

Separately, the House approved a bill last week allowing automakers to deploy hundreds of thousands of autonomous vehicles on American roads over the next few years. A similar bill is being drafted in the Senate.

In addition to Tesla’s efforts, the competition to develop self-driving cars has become fierce among auto industry giants such as General Motors and Ford Motor, as well as technology companies including Google and Apple.

The companies have been accelerating their testing and have backed legislation exempting autonomous vehicles from current motor vehicle laws.

Some safety campaigners and consumer groups have been critical of the move toward voluntary rules covering self-driving technology, including the guidelines introduced by Ms. Chao, saying they reduce federal oversight that was already too limited.

“The voluntary policy announced today is a retreat from the already flawed guidance provided in 2016,” said Jason K. Levine, the executive director of the Center for Auto Safety in Washington. He said the lack of required safety assessments cedes power to automakers “who have frequently proven they cannot be trusted to protect the public interest in their race for profits.”

Automakers and government officials contend that self-driving technology could reduce vehicle accidents and traffic fatalities, which rose by nearly 8 percent in 2015 to more than 35,000 deaths. Tesla reiterated that safety potential Tuesday after the National Transportation Safety Board issued its report on the Florida crash.

The accident killed Joshua Brown, 40, of Canton, Ohio. His 2015 Tesla Model S was operating under its Autopilot system on May 7, 2016, on a state highway in Williston, Fla., when it crashed into a tractor-trailer that was crossing the road in front of him.

The system’s forward-looking camera failed to recognize the white truck against a bright sky, and neither Mr. Brown nor the Autopilot system activated the brakes. Data from the car showed that it had been traveling at 74 miles per hour at the time of the crash and that Mr. Brown had ignored several warnings to keep his hands on the steering wheel. A preliminary N.T.S.B. report found that he had at least seven seconds to notice the truck before impact.

Like the National Highway Traffic Safety Administration, the N.T.S.B. found that the version of Autopilot in Mr. Brown’s car had performed as it had been designed to.

But Robert L. Sumwalt, the safety board’s chairman, said that version of Autopilot “gave far too much leeway to the driver to divert his attention to something other than driving.” He also said it was intended for use on limited-access highways rather than routes with cross traffic and intersections, such as the state highway Mr. Brown was traveling on.

In a statement, Tesla said it “appreciates” the N.T.S.B.’s analysis and will evaluate the agency’s recommendations. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times,” the company said.

Since the accident, Tesla has modified Autopilot to warn drivers more frequently to keep their hands on the steering wheel. After three warnings, the system cannot be engaged without stopping and restarting the car.

Tesla has also modified how Autopilot’s radar and camera sensors interact to improve its ability to recognize obstacles. The Autopilot upgrade was rolled out a year ago.

Tesla introduced Autopilot in October 2015, to great fanfare. And for a time it seemed that Tesla was far ahead of the big, established automakers as the notion of self-driving cars caught the imagination of both the media and technology enthusiasts.

But before long, some drivers, including Mr. Brown, began to post videos on YouTube showing that it was possible to go several minutes without looking at the road or holding the wheel. Some videos show drivers reading while at the wheel; in one, a driver climbs into his car’s back seat.

Even before the fatal crash, Tesla had come under criticism for releasing Autopilot without greater safeguards to prevent improper use. And early on, the company referred to it as a beta system — a technical term for an experimental version, suggesting it was a work in progress.
