The world has had its first self-driving car fatality: a Tesla autopilot failed. So far the world hasn’t freaked out. I think self-driving cars will be way safer than human-driven cars. But there’s a lot of truth-shaping in Tesla’s announcement.
(Fair warning: this blog post is uninformed hot take territory. I’m reacting to Tesla’s description of the crash, published two months after the death. We’ll know a lot more after an independent investigation.)
Tesla’s press release is masterful. It characterizes the cause of the accident like this:
the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

A truck pulled out in front of the car on the highway. It may well have been an unavoidable accident. We’ll know eventually.
But note the convenience of claiming the “driver” didn’t notice the truck. How do we know that? The man is dead; we have no idea what he saw. I don’t know about you, but I’ve never once failed to spot a white truck against a bright sky, particularly when I’m driving towards it at 70mph. I can see how a computer vision system would fail that test, though.
“The brake was not applied”. It takes time to apply the brakes after you see your death coming at you. Doubly so if you’re not actually driving. The passenger-behind-the-wheel almost certainly didn’t have his foot hovering gently near the accelerator / brake the way an engaged driver would. That slows reaction time. I do this all the time with my simple cruise control, and it scares the hell out of me when some slow jerk pulls in front of me and I don’t react quickly.
(I also admire the comfort of “he never saw it coming”. Sort of takes the sting out of the next sentence, which describes the unfortunate’s grisly decapitation.)
The real problem here is Tesla’s autopilot is a half measure, “driver assist”. It doesn’t fully drive the car. This design is the most dangerous of all worlds. I had this experience with my airplane’s autopilot all the time. At some point when the automation does enough work, you can’t help but check out mentally, let the machine take over. But if the machine isn’t capable of taking over entirely you can end up dead.
That’s why I’m in favor of fully autonomous vehicles. No steering wheel, no accelerator, maybe just a single brake or other emergency cutout. Of course in this situation the software has to work reliably: say, a fatality rate half that of human drivers. And insurance and the law have to adapt to this shift of control to software. I believe the technology nerds are very close to having systems that can fully drive a car with no “driver assist” ever needed, at least in clear weather. It will be a better future. And those robot cars will kill some of their passengers. Far fewer than humans are killing now.