I once threw myself in front of a Cruise vehicle to test it for safety. (That’s a bit of an exaggeration: I saw it coming, sped up, and jumped into an open parking spot on a trajectory that would have taken me in front of it if I hadn’t stopped.) It performed admirably. The safety driver didn’t see me coming, but the computer did. The car decelerated to the point that it would have been able to stop if I had kept coming. The safety driver couldn’t figure out what was happening at first. He was PISSED when he figured it out.
Having done that, and having never seen one commit an error or safety infraction, I now have a high degree of trust in the safety of Cruise vehicles.
Can confirm from personal experience in crosswalks with self driving cars: Cruise is the most timid (read: safest!) of the self driving cars being tested in SF, IMHO. Cruise will proactively alter trajectory (such as decelerating) for pedestrians and cyclists at a noticeably earlier threshold than Waymo. This is much more pleasant for everyone around the vehicle, as it clearly signals that you have been recognized as a being needing space.
I live in a part of San Francisco that has a lot of self driving car testing, and I generally agree with this. I'm not saying that I think Cruise's cars are good at driving, but all the errors I see appear to err on the side of being slow, albeit sometimes to an almost laughable degree. Sometimes it's actually unsafe when you're driving behind them, as they'll randomly slam on the brakes for no obvious reason, but I was taught to leave enough following distance and to pay attention to the road. (Still, I bet they get rear-ended a lot.)
Urban bicyclist here. I don't live in SF, but I'm really curious how current autonomous vehicles behave around bicycles. If you're biking down a one-way street with little room, do most autonomous vehicles just wait behind you? Do they try to pass? What kind of following distance would they give a bicyclist who "dominates" the lane because it's too narrow to let a car pass safely?
I haven’t encountered the exact scenario you describe. In many cases in SF, it is you passing them, not the other way around. Cruise gets confused in intersections sometimes, especially in the presence of unexpected cyclists and pedestrians. Their error state is to freeze, and then of course it’s very easy to get around them. I’ve also noticed that if I pass close to the vehicle, they will brake or adjust course to varying degrees depending on street conditions and which tech stack is involved.
I have never witnessed a self driving car exhibiting aggressive behavior towards cyclists or pedestrians. I have witnessed many humans driving cars exhibiting aggressive behavior.
As a fellow cyclist who must stay vigilant against the frequently bad answers human drivers give to these questions, I’d love to know how any self-driving system approaches them.
Though it probably depends on your definition of 'narrow'. I'm guessing a lot of Americans think the average SF street is narrow, and it would be a medium or even wide street in some other countries.
Unfortunately, during the massive storms in the fall, I witnessed a Cruise car drive right through an intersection where the lights were out. I witnessed a Waymo car make the stop at the same intersection. I live above said intersection and watch it a lot.
Good data point! In reading all of this feedback it does make me think there may be utility for a third-party monitoring service. Think Nielsen but for self driving cars. The intelligence collected from on-road movements could be valuable for both competitors and regulators.
Timid is a good word to describe how I would prefer self-driving vehicles to drive. Consistency and comfort will be more important than minimizing travel times. It's impatient human drivers that are behind a lot of accidents.
I’m a former taxi driver and safe streets advocate, so if anything my motivations are not sympathetic to their cause; I’m just observing their behavior.