But that view is now in question after the revelation on Thursday that the driver of a Tesla Model S electric sedan was killed in an accident when the car was in self-driving mode.
Federal regulators, who are in the early stages of setting guidelines for autonomous vehicles, have opened a formal investigation into the incident, which occurred on May 7 in Williston, Florida.
In a statement, the National Highway Traffic Safety Administration said preliminary reports indicated that the crash occurred when a tractor-trailer made a left turn in front of the Tesla, and the car failed to apply the brakes.
It is the first known fatal accident involving a vehicle driving itself by means of sophisticated computer software, sensors, cameras and radar.
The safety agency did not identify the Tesla driver who was killed. But the Florida Highway Patrol identified him as Joshua Brown, 40, of Canton, Ohio. He was a Navy veteran who owned a technology consulting firm.
In a news release, Tesla described him as a man “who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission.”
Brown posted videos of himself riding in autopilot mode. “The car’s doing it all itself,” he said in one, smiling as he took his hands from the steering wheel.
In another, he praised the system for saving his car from an accident.
The death is a blow to Tesla at a time when the company is pushing to expand its product line-up from expensive electric vehicles to more mainstream models. The company on Thursday declined to say whether the technology, the driver or both were at fault in the accident.
In its news release it said, “Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.”
The crash also casts doubt on whether autonomous vehicles in general can consistently make split-second, life-or-death driving decisions on the highway.
Other companies, meanwhile, are increasing investments in self-driving technology. Google, for example, recently announced plans to adapt 100 Chrysler minivans for autonomous driving. Earlier this year, GM acquired the software firm Cruise Automation to accelerate its own self-driving applications.
Even as the companies conduct many tests of autonomous vehicles, both at private facilities and on public highways, there is skepticism that the technology has progressed far enough for the government to approve cars that totally drive themselves.
The traffic safety agency said it was working with the Florida Highway Patrol in the inquiry into Brown’s fatal accident. The agency cautioned that the opening of an investigation did not mean it thought there was a defect in the vehicle being examined.
The federal traffic safety agency is nearing the release of a new set of guidelines and regulations regarding the testing of self-driving vehicles on public roads.
They are expected to be released later this month.

At a recent technology conference in Novi, Michigan, the agency’s leader, Mark Rosekind, said self-driving cars should be at least twice as safe as human drivers to result in a significant reduction in roadway deaths. “We need to start with two times better,” Rosekind said. “We need to set a higher bar if we expect safety to actually be a benefit here.”

Karl Brauer, an analyst with the auto research firm Kelley Blue Book, said the accident served as a signal that the technology might not be as advanced and ready for the market as some proponents have suggested. “This is a bit of a wake-up call,” Brauer said. “People who were maybe too aggressive in taking the position that we’re almost there, this technology is going to be in the market very soon, maybe need to reassess that.”

Tesla said in its news release that it had informed the traffic safety agency about the accident “immediately after it occurred.” But the company reported it publicly only on Thursday, after learning that the agency had begun to investigate.
In the past, Elon Musk, the Tesla chief executive, has praised the company’s self-driving feature, introduced in the Model S last fall, as “probably better than a person right now.”
But in its statement on Thursday, the company cautioned that it was still only a test feature and noted that its use “requires explicit acknowledgment that the system is new technology.”
It noted that when a driver activated the system, an acknowledgment box popped up, explaining that the autopilot mode “is an assist feature that requires you to keep your hands on the steering wheel at all times.”
On Friday, Reuters reported that Tesla Motors’ shares were down about 2 per cent in pre-market trading, with at least one analyst saying that worries stemming from the first fatality involving the company’s Autopilot system were based more on perception than reality.
RBC Capital Markets analyst Joseph Spak said the incident presented more of a “headline risk” than real risk to Tesla. “The bottom line is that no autonomous vehicle will ever be perfect, to say nothing of semi-autonomous or TSLA’s Autopilot (which we would classify as low level semi-autonomous). The idea of autonomous is that it’s safer than a human driver.”
“Bottom line, the media attention may weigh (the first major incident was always a headline risk), but we don’t think much has changed,” he said in a client note.
©2016 The New York Times News Service