Courtesy of iii.org
A fatal car accident involving a Tesla Model S in autonomous driving mode is drawing widespread scrutiny both in the United States and overseas.
Joshua Brown was killed in May this year when a tractor-trailer made a left turn in front of his Tesla and the self-driving car failed to apply the brakes.
The National Highway Traffic Safety Administration (NHTSA) said it is investigating the incident and will examine the design and performance of the automated driving systems in use at the time of the crash.
Its preliminary evaluation of the incident does not indicate any conclusion about whether the Tesla vehicle was defective, the agency said.
In a blog post, Tesla noted that this is the first known fatality in just over 130 million miles in which Autopilot was activated:
“Among all vehicles in the U.S., there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.”
Tesla further noted that neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied:
“The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”
As companies continue to innovate and invest in self-driving technology, the crash suggests that fully automated cars are still a thing of the future.
The crash also raises important concerns over regulation.
According to a New York Times article:
“Even as companies conduct many tests on autonomous vehicles at both private facilities and on public highways, there is skepticism that the technology has progressed far enough for the government to approve cars that totally drive themselves.”
And the Wall Street Journal reports:
“Tesla now risks being the test case that could prompt new safety regulations or laws limiting the deployment of self-driving technology.”
The crash also highlights liability concerns regarding this emerging technology. Most car crashes are caused by human error, but presumably the NHTSA investigation will also evaluate potential product liability on the part of the manufacturer.
The crux of the issue is weighing the risk of crashes involving self-driving technology against the crashes that technology avoids.
As the Insurance Information Institute (I.I.I.) notes:
“As crash avoidance technology gradually becomes standard equipment, insurers will be able to better determine the extent to which these various components reduce the frequency and cost of accidents. They will also be able to determine whether the accidents that do occur lead to a higher percentage of product liability claims, as claimants blame the manufacturer or suppliers for what went wrong rather than their own behavior.”
Liability laws might evolve to ensure that advances in autonomous vehicle technology are not brought to a halt, the I.I.I. adds.