People are going to die in Waymo cars, and sometimes we'll wonder if the accident was avoidable and if the car made the wrong choice leading to someone's death.
It'll look bad for the brand when the news reports, "Google self-driving car kills family en route to Disneyland".
I'm not saying self-driving cars are dangerous, but it's just the numbers. People will die and self-driving car deaths will be shocking news and make headlines for a long time. I wouldn't want my brand name anywhere near them.
Sure, just look at Rolls Royce. A related company with that brand name makes jet engines, which of course are thus involved in tragic and headline-making air accidents.
As a result Rolls Royce cars are... huh, no, people actually nod right along, that makes sense, Rolls Royce jet engines, I would want a reliable, high quality jet engine and that's what I associate with this car brand. Most people have no idea it isn't even the same company.
News outlets will connect Google to such deaths regardless. "Google car kills two" the headline will say, and Google's PR person will know better than to insist "Actually it should say Waymo not Google".
It looks bad if it becomes a trend. "Man dies in car crash" isn't going to destroy your brand. Ask every single car company in the world. Even Volvo, whose brand is specifically all about safety. It requires a trend, which is what hurt Boeing recently: bad news after bad news without respite, that'll do it. But there's no reason to think Waymo will ship something dangerous enough to cause such a trend.
There's a difference between someone crashing their own Volvo, and someone sitting in the backseat of a Waymo that misinterprets the lane markers and crashes into a barrier. One is not news, one is front page news.
I guarantee the first Waymo death is going to be publicized everywhere. It doesn't need to be a trend, and as I said, it doesn't mean the cars are more dangerous than human drivers. However, it's going to be news. There are going to be all sorts of moral debates when an algorithm decides to drive over a child instead of turning into an oncoming car. People will want answers. How does Waymo rank the value of different lives? I imagine it'll be news for the next decade until there are thousands of deaths and it becomes normal.
It's an industry that's going to be full of "firsts", and those will be the news. Waymo drives over dog while auto-piloting to a parking lot. Waymo mistakes grocery cart for stroller and swerves into elderly man. Waymo kills cyclist when poor weather disrupts sensors. It doesn't matter if they ship something safe. Get enough cars on the road and these things will happen, and people will be talking about them.
> There are going to be all sorts of moral debates when an algorithm decides to drive over a child instead of turning into an oncoming car.
No. That's a trolley problem. It's an interesting intellectual exercise, and you can maybe win a debate team trophy for your rousing defence of one choice or the other, but these moral decisions aren't actually ones drivers make, whether they're human or a powerful AI.
People keep acting as though this is an unprecedented situation and invoking weird moral beliefs about thinking machines, when it's actually utterly routine. Let's try another exercise:
How many headlines have you read about a specific brand of elevator decapitating a child? Is it none? Do you see anybody pushing for the big elevator manufacturers to have to reveal how they "rank the value of different lives"? No?
That's not because nobody dies this way. It's because we say: that's just a machine, and obviously if things go wrong you can get seriously injured. The machine doesn't know if you're a nun or a basketball champion; it isn't trying to kill or spare anybody in particular. It's just a machine.
Humans are often tempted to try "escape manoeuvres", and these almost invariably go wrong. We don't teach machines to try such manoeuvres, because the machines are trained on real performance data, not on someone's model of themselves as an immortal superhero.
One of the first Waymo crashes was somebody trying an escape manoeuvre. They found themselves in a potential collision, and rather than doing the correct thing (brake to reduce speed and hit the obstacle, because it's too close to avoid), they tried an abrupt swerve, lost control of course, crossed a median and smashed into an unrelated oncoming Waymo at high speed, writing off both vehicles. Humans do stuff like that; you can try training them not to, but they won't listen. The machines don't have that problem, so it's less "Should I kill the nun, the pregnant woman or the Olympic champion?" and more "Despite maximum braking effort a collision has become inevitable. Preparing safety systems for impact."