Tesla has lane-assist, adaptive cruise control and a few other goodies. Many auto manufacturers have those things. You still have to manually follow GPS directions.
Google/Waymo is actually piloting a car from origin to destination without a steering wheel or pedals. That's a much more difficult challenge.
Put another way, no blind person is going to sit behind the wheel of a Tesla and tell the car to take him/her to the supermarket. Google's car actually does this, on real roads, today... and that's amazing.
> Put another way, no blind person is going to sit behind the wheel of a Tesla and tell the car to take him/her to the supermarket.
That's exactly what Tesla is marketing and working on with their new self-driving features, and you can buy cars that will allegedly be capable of this today.
"All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you."
>Google's car actually does this, on real roads, today.
It should be obvious that Tesla is going to win this race. Tesla already does fully autonomous driving; they released that video in November. And they build real, production cars. "Real artists ship," as Steve Jobs said.
If you were a manufacturing ace, would you join the company with a long list of failed projects and no actual production cars, or would you join the company that is building 500,000 of the most advanced autos right now?
If I were a hardware expert, would I join Google, which shies away from putting a fraction of its cash into hard tech, or Tesla, which buys German automation companies?
Or plain old software. Tesla has orders of magnitude more data and can push neural-network updates to hundreds of thousands of cars at a whim. Soon millions.
I'm not super convinced by that video. That drive had almost no challenges/obstacles. No bicycles, no construction, basically no pedestrians, no other vehicles behaving erratically.
Google has cars that can respond to everything from a cyclist making hand signals, to a school bus (which must not be passed), to a police car pulling it over, to a woman chasing a duck across the road: https://youtu.be/tiwVMrTLUWg?t=8m49s
I'm also not sure this is a simple case of "throw data and hardware at the problem." If it were that easy, it wouldn't have taken Google so long to get to where they are today.
Again, not to take away from the amazing feat Tesla has achieved, but Google has had similar videos going back 2-3 years, and they are still not ready to release.
So either the problem in practice is much harder than a simple video can show, or something else is up.
Though, I think the real issue is that Google wants to go straight for L5 (meaning 100% automated, so you can remove the steering wheel), as compared to L4, which is more like 99.9% automated but still needs a steering wheel for those rare edge cases.
If Tesla executes perfectly they might stand a chance. They don't have much margin for error, though. One lost lawsuit because of an accident caused by a bug in their software, and they are out of business...
I find that notion very frustrating. 30,000 people die on the road in the US every year and no one seems to bat an eye, yet a self-driving vehicle has one fatality and it's the end of the world. The expectations seem rather unrealistic, and the media seems to love creating a controversy.
On their webpage, Waymo breaks that 30k down into smaller pieces. 94% of those deaths were due to human choice or human error. At least 71% were due to speeding, alcohol, distraction, and drowsiness. It's probably safe to say that driving like those 94% during your DMV driving test would mean a human wouldn't even get licensed. Why lower the bar to just being better than them? I want my autonomous car to be better than a good human driver.
The trouble with just throwing the statistics around and saying "it's fewer deaths!" is that it has to be fewer deaths in comparable situations. A tesla having fewer deaths per mile in perfect driving conditions shouldn't be compared to a possibly-drunk person in possibly-awful weather. This isn't the media creating controversy, it's people expressing skepticism when a corporation's incentives are to let a couple people die, and trying to maintain a high bar.
But they should be compared. As you indicated, and as the statistics show, the deaths primarily happen when people are impaired (drugs and alcohol, fatigue, distraction, etc.); the machines substitute those human errors with fewer and farther-between potential engineering faults. With proper root-cause analysis of the investigations (helped by all the data collected from all the accidents), over time fewer and fewer would be expected.
If you can go from 30,000 deaths to 1,000 just by switching to good-enough autonomous cars, that's a change worth making. Then you can look at improving it toward zero, and encourage that process, if not through insurance liability, then through regulations that require ongoing improvement in autonomous safety.
As rtx said, the difference is choice. We could save a ton of lives where people choose to put themselves in danger, but we'd save them by killing a smaller number of people who choose not to put themselves in danger. I don't think that's ok.
Sure, but if the average person really would be safer with a car that self-drives 90% of the time, wouldn't you recommend that everyone get a self driving car? You can't say "Only get a self-driving car if you're a below average driver" because then nobody would get one (and they'd be worse off for it)
I don't think there's a clear answer to that question, but my gut says that would be immoral. What we'd be doing is saving a ton of lives where people would have otherwise gotten themselves killed through their own choices, but at the expense of a few deaths where the fault is entirely our own (the car maker). It's laudable to protect people from themselves when it doesn't otherwise affect them, but when the cost is killing totally innocent responsible people, I think it crosses a line.
On the other hand, if you think fault doesn't matter and it's just one life for another, then it essentially becomes the trolley problem, which doesn't have a clear answer either.
Well, in the former case, that is the fault of the driver (or another driver). In the latter case, it would be the fault of the manufacturer. It's not even remotely the same thing. Remember those out-of-control Priuses, where the manufacturer was at fault: Toyota got plenty of backlash for it.
I find this an interesting question. With the exception of OTC and prescription drugs (for which there are still lawsuits), I have trouble coming up with other consumer products that, when used as directed and properly maintained, are still directly involved in killing people because... well, stuff happens. This isn't an argument against using autopilots, but it is an issue that hasn't really been addressed so far.
The issue is Google is doing it with (prohibitively) expensive hardware, so we don't know yet if Google's self-driving system works with cheaper Autopilot 2.0-level hardware that the other car makers are going to use as well.
"The person in the driver's seat is just there for legal reasons" -- but note just how closely that person's hands are to the wheel, in an obvious state of readiness. That person is 100% ready to take over the car if it messes up.
Tesla had the luxury of running test drives until they had a success. This is a different thing than putting a car on the road with no steering wheel and saying bon voyage. The first is a demonstration of a single successful instance (which is probably repeatable on average); the second is a much stronger demonstration of a > 99% success rate.
Google's car only does 25 mph. They might have made that video when they only had a 95% success rate, because in the 5% chance of a collision it would be at low speed, in a very light vehicle, and very unlikely to cause a fatality.
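To put some rough numbers on why one clean demo drive says little about a >99% success rate: under the statistical "rule of three," n consecutive failure-free trials only bound the per-trial failure rate at roughly 3/n with 95% confidence. This is a back-of-the-envelope sketch assuming independent, identically risky trials (which real drives are not):

```python
def max_failure_rate(n_trials: int, confidence: float = 0.95) -> float:
    """Upper confidence bound on the per-trial failure rate after
    n_trials consecutive successes and zero observed failures.
    Exact form of the 'rule of three': solve (1 - p)^n = 1 - confidence."""
    return 1.0 - (1.0 - confidence) ** (1.0 / n_trials)

# One successful demo drive tells us almost nothing:
print(max_failure_rate(1))    # ~0.95: the true failure rate could be anywhere up to 95%

# Credibly demonstrating a >99% success rate takes ~300 clean trials:
print(max_failure_rate(300))  # ~0.01
```

So a single edited video is consistent with a car that crashes one drive in twenty, while a no-steering-wheel deployment implicitly claims hundreds of failure-free drives' worth of evidence.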