A little while back, I saw the following tweet:
I can print mostly. My wifi works often. The Xbox usually recognises me. Siri sometimes works. But my self driving car will be *perfect*.
The tweet has since been deleted, so I won’t name the author, but it’s a thought-provoking idea. At first, I agreed with it. I’m a programmer and know full well just how shoddy 99.9% of the code we all write is. The idea that I would put my life in the hands of a coder like myself is a bit worrying.
But then I remembered something else: I put my life in my own hands every day. And my own hands are even less capable outside of coding. Or to put it another way:
I can walk in a straight line mostly. I pay attention to the road often. My reaction time is usually fast enough. I sometimes check my blind spots. But giving me control of a 2 ton steel death machine that goes 85 mph will be *perfect*.
The reality is that self-driving cars don’t need to be perfect. They just need to be better than the alternative: human-driven cars. And that is a much lower bar, as human beings are remarkably bad at driving. Consider the following stats (source):
- Nearly 1.3 million people die in road crashes each year.
- An additional 20-50 million are injured or disabled.
To put that into perspective, if it took you 1 minute to get to this point in this blog post, then since you started reading, there have been roughly 100 car accidents and 2 deaths. Somehow, we are all OK with this. I can’t help but wonder if this will be one of the things we are all ashamed of in the future:
"so what did you do before self-driving cars?"
"we just drove 'em ourselves!"
"wow, no one died that way?"
"oh no, millions of people died"
— gregory erskine (@cat_beltane), April 15, 2015
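As a rough sanity check on those per-minute numbers, here is a quick back-of-envelope calculation. This is a minimal sketch: the minutes-per-year figure and the use of the annual injury count as a stand-in for the number of accidents are my own assumptions; only the annual totals come from the stats above.

```python
# Rough back-of-envelope check of the per-minute figures above.
# Assumptions (mine): ~525,600 minutes in a year, and the 20-50 million
# annual injuries used as a rough proxy for the number of accidents.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

deaths_per_year = 1_300_000
injuries_per_year_low, injuries_per_year_high = 20_000_000, 50_000_000

deaths_per_minute = deaths_per_year / MINUTES_PER_YEAR
accidents_per_minute_low = injuries_per_year_low / MINUTES_PER_YEAR
accidents_per_minute_high = injuries_per_year_high / MINUTES_PER_YEAR

print(f"Deaths per minute: ~{deaths_per_minute:.1f}")      # ~2.5
print(f"Accidents per minute: ~{accidents_per_minute_low:.0f}-"
      f"{accidents_per_minute_high:.0f}")                  # ~38-95
```

At roughly 2-3 deaths and up to ~100 accidents every minute, each additional minute you spend reading adds another batch to the tally, which is where the cumulative figures at the end of this post come from.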
Self-driving cars don’t get tired. They don’t get drunk. They don’t get distracted by friends or a crying baby. They don’t look away from the road to send a text message. They don’t speed, tailgate, brake too late, forget to use their blinker, drive too fast in bad weather, run red lights, race other cars at red lights, or miss exits. Self-driving cars aren’t going to be perfect, but they will be a hell of a lot better than you and me.
That said, there is no doubt that self-driving cars will have problems, especially early on. There will be bugs. There will be accidents. Some people will die as a result of programmer error. And yet, the number of such deaths will be minuscule compared to what we have today from human error. Self-driving cars could reduce accidents by 90%, which would make them one of the greatest health achievements of the century. Even today we have some evidence of automation saving lives. Check out this video of Tesla Autopilot (which is still a long way from full automation) preventing an accident:
Besides saving millions of lives, self-driving cars will likely lead to other profound changes. For example, car sharing will become much more common. Why pay thousands of dollars to own a car that sits unused in a parking lot or driveway 99% of the time when you could pay a subscription and have a car come and pick you up whenever you needed it (think Uber, but with automated drivers)? Parking in busy cities would become less of a headache, as your car could drop you off and head back home or find a less busy area to park. In fact, roughly one-third of the land in many US cities is used for parking lots, most of which could be repurposed if self-driving cars became the norm.
The ability to see and react better than humans also means self-driving cars could potentially drive at significantly higher speeds than would be safe for a human-operated vehicle. Imagine commuting at 150 mph. While you sleep. You may even be able to take vacations by sleeping in the car overnight while it drives you to some new destination. Of course, this means changing the design of the car, but many car manufacturers are working on that already:
There are plenty of challenges and unanswered questions ahead. For example, who is responsible if a self-driving car gets in an accident? The car manufacturer? The owner? Someone else? Also, if an accident is inevitable, how does the car decide who to protect and who to sacrifice? Say a child jumps out into the middle of the road: should the car swerve into oncoming traffic, possibly killing you and someone in another car, or should it continue on and hit the child? These sorts of hypothetical questions are straight out of Isaac Asimov’s novels, and there may be no simple answers, or at least none that actually work. But these questions are worth answering, even if our answers are imperfect, as doing so could help prevent the 300 additional accidents and 6 additional deaths that have happened in the time it took you to read the rest of this blog post.
If you enjoyed this post, you may also like my books, Hello, Startup and Terraform: Up & Running. If you need help with DevOps or infrastructure, reach out to me at Gruntwork.