As many people know (especially my wife), I hate it when I’m right. Case in point:
In 2015, I wrote a post called, “Highways of Algorithmic Morality”. It was about an ostensible study conducted by three sociopathic nitwits from the Toulouse School of Economics in France, shared in an article from MIT Technology Review called, with touching compassion, “Why Self-Driving Cars Must Be Programmed to Kill”. In response to allegedly ethical questions, the article concluded:
Taking algorithmic morality seriously has never been more urgent.
I, on the other hand, concluded this:
Unless some governmental bureaucrats — who are probably even more detached than the academic jugheads who conduct [studies like “Why Self-Driving Cars Must Be Programmed to Kill”] — shove self-driving cars down the throats of the motoring public, people will likely be more comfortable determining their own fates than they will be surrendering them to some morally conflicted algorithm.
Well, Golly Dang
Wouldn’t you know it? On May 26 of this year, Forbes published an article with this headline: “Tesla On Autopilot Slams Into Stalled Car On Highway”. I hate it when that happens almost as much as I hate it when I’m right. With as much compassion as MIT Technology Review contended self-driving cars must be programmed to kill, the Forbes article reported:
A Tesla on Autopilot … rammed into a stalled vehicle on a highway, doing so while the Tesla was moving along at a speed of around 60 mph … According to the driver of the Tesla, another car cut in front of him, staying there fleetingly, then moved rapidly over to the next lane, and within moments it became apparent that a car was stalled up ahead.
That kind of grace and tenderness just brings a tear to your eye, doesn’t it?
If you’re addled enough to be snoozing in the back seat with your car on Autopilot, you don’t get to call it an accident — unless, of course, you put your car on Autopilot by accident, or you’re willing to call the inevitable an accident-waiting-to-happen. So, then, MIT Technology Review’s article and the Forbes recounting of the auto-piloted collision raise two philosophical questions:
- Is someone who isn’t driving a car that’s on Autopilot still the driver?
- Are the people who make self-driving cars crazy, or do they think we are?
Save It, Please
I know what you’re going to say. I’ve heard it all before: “Well, there are bound to be some fatalities when some new technology is being tried.” Please. Spare me.
I gave up my rotary-dial land line to try cellular technology. It didn’t kill me. I gave up my manual portable typewriter to try computer technology. It didn’t kill me. I gave up the notebook I used to lug around in my gym bag to try app-based fitness-tracking technology on my iPhone. It didn’t kill me. I gave up my bicycle for internal-combustion automobile technology. It didn’t kill me. But I actually drive my car.
Let’s try this instead. Since the first American astronaut went into space in 1961, 17 people have died in NASA spacecraft — seven aboard Challenger, seven aboard Columbia, and three aboard Apollo 1. That’s an average of 0.293 fatalities per year. Since driverless cars were introduced in 2013, seven people have died, six in the United States and one in China. That’s an average of 1.17 people per year (rounded). At that rate, 67.67 people will die in driverless cars in the next 58 years — if the number of driverless cars doesn’t go up.
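For anyone who wants to check the back-of-the-envelope math, here’s a quick sketch. (It assumes 2019 as the year of writing — the span that makes the per-year figures come out as quoted — so treat the year variables as assumptions, not gospel.)

```python
# Back-of-the-envelope fatality rates from the paragraph above.

nasa_deaths = 7 + 7 + 3         # Challenger, Columbia, Apollo 1
nasa_years = 2019 - 1961        # since the first American astronaut flew (58 years)
nasa_rate = nasa_deaths / nasa_years

driverless_deaths = 7           # six in the United States, one in China
driverless_years = 2019 - 2013  # since driverless cars were introduced
driverless_rate = driverless_deaths / driverless_years

# Projecting the driverless rate over the same 58-year span:
projected = driverless_rate * nasa_years

print(round(nasa_rate, 3))        # 0.293 fatalities per year
print(round(driverless_rate, 2))  # 1.17 fatalities per year
print(round(projected, 2))        # 67.67 fatalities over 58 years
```

In other words, the 67.67 projection is just the driverless-car rate multiplied out over the same 58 years that produced NASA’s 0.293.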
I know what you’re going to say here, too: “Well, O’Brien, nearly 1.25 million people die in road crashes each year, on average 3,287 deaths a day. And an additional 20-50 million are injured or disabled.” Agreed. But driverless cars don’t yet exist in anywhere near the numbers that driven cars do. (Thank God.) And if we think the blind sheep are leading the blind sheep on our highways now (they are), wait till we get somewhere in the neighborhood of 281.3 million Autopilots on the road.
I’ll take my chances with my hands on the wheel and my feet on the pedals long before I’ll consign my fate to some morally conflicted algorithm.
Photo courtesy of the National Transportation Safety Board.