A car accident

Tesla's experiments with cars under watch

Business Standard Editorial Comment, New Delhi
About 11 weeks ago, the accident the entire automobile industry feared occurred. At 3.45 p.m. on May 7, on a Florida highway, a Tesla Model S running in its semi-autonomous Autopilot mode was involved in a crash that killed its driver, Joshua Brown. The exact causes of the incident are being examined and the technology is under review. This is an important test case for insurance and liability. But more importantly, the reverberations of the accident continue to be felt across the automobile industry. Self-driving vehicles may already be safer than most human drivers: over 70,000 cars with Tesla's Autopilot technology are currently on the road, and they recorded over 208 million kilometres of driving on Autopilot before this first fatal accident. Globally, a fatal road accident occurs every 96 million kilometres on average, and over 1.3 million people are killed every year. Statistically, then, Autopilot's record compares well with that of human drivers. The Model S has also earned top safety ratings in crash tests conducted by the US National Highway Traffic Safety Administration (NHTSA).
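Those figures imply a simple per-kilometre comparison; a minimal back-of-the-envelope sketch, assuming only the numbers quoted above:

    # Fatality rates implied by the figures quoted above (illustrative only).
    AUTOPILOT_KM_PER_FATALITY = 208e6  # ~208 million km before the first fatal crash
    GLOBAL_KM_PER_FATALITY = 96e6      # one fatal accident every ~96 million km

    # Express both as fatalities per billion kilometres driven.
    autopilot_rate = 1e9 / AUTOPILOT_KM_PER_FATALITY   # ~4.8
    global_rate = 1e9 / GLOBAL_KM_PER_FATALITY         # ~10.4

    print(f"Autopilot: {autopilot_rate:.1f} fatalities per billion km")
    print(f"Global average: {global_rate:.1f} fatalities per billion km")
    print(f"Ratio: {autopilot_rate / global_rate:.2f}x the global average")  # ~0.46x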

What happened in Florida requires closer study. An 18-wheel truck-trailer had turned across the car's path. The car did not brake; it hit the trailer and went under it. The high-bodied trailer was white, against a bright, clear sky, and it is speculated that Autopilot's camera may not have "seen" the white body. Alternatively, the car's radar may have registered it as an overhead road sign, since it sat high off the road. Moreover, it appears Joshua Brown was watching a Harry Potter DVD and may not have had his hands on the wheel. This was despite the fact that, every time Autopilot is engaged, the car warns: "Keep your hands on the wheel. Be prepared to take over at any time."

Fully autonomous driving will not be widespread for years. Before it arrives, authorities must update vehicle licensing laws to provide for the explicit certification of autonomous vehicles. NHTSA, for example, has proposed a five-level rating system for autonomous vehicles, ranging from Level 0 ("no automation") to Level 4 ("full self-driving automation"), as sketched below. In accidents involving self-driving cars, questions will often arise as to who, or what, is responsible, and there will be heavy reliance on data from vehicles' black boxes. Insurers will have to evolve criteria for liability when one or both of the vehicles involved are operating autonomously or semi-autonomously.
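The ladder can be pictured as a simple lookup; a minimal sketch, with level descriptions paraphrasing NHTSA's 2013 preliminary policy (the Autopilot placement noted in the comments is an assumption, not NHTSA's designation):

    # NHTSA's five automation levels, paraphrased from its 2013
    # preliminary policy statement (descriptions are illustrative).
    NHTSA_LEVELS = {
        0: "No automation: the driver is in complete control at all times",
        1: "Function-specific automation, e.g. cruise control alone",
        2: "Combined-function automation, e.g. adaptive cruise with lane centring",
        3: "Limited self-driving: the car drives itself, but the driver must be ready to retake control",
        4: "Full self-driving automation: the car handles the entire trip",
    }

    # Tesla's Autopilot, which demands hands on the wheel, broadly
    # corresponds to Level 2 on this scale (an assumption for illustration).
    print(NHTSA_LEVELS[2])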

There may also be new ethical problems to consider. When an accident is inevitable, self-driving cars must be programmed to minimise damage. That could mean a vehicle placing its own occupant at risk to avoid killing many others: a car with a single passenger may, for example, drive off a narrow road to avoid a collision with a car full of children. In such a case, could the vehicle's programmer be sued for causing deliberate harm to the passenger?

While such thorny problems are being considered, autonomous technologies keep improving. Tesla is the market leader, but every major auto manufacturer is interested, as are Silicon Valley giants such as Google and Apple. Better cameras, radar and LiDAR (Light Detection and Ranging) sensors may soon be augmented by more powerful onboard computers, detailed GPS-based maps updated in real time, and autonomous cars that talk to each other. Greater safety will surely be attained. But while accidents may be drastically reduced, some are bound to occur, and this incident could create scepticism about autonomous and semi-autonomous cars. Unfortunate as it was, the accident should not derail research and development. Instead, it should serve as a wake-up call to the authorities and to insurers everywhere.

First Published: Jul 23 2016 | 9:42 PM IST