> "the resulting congestion required law enforcement to manually manage intersections"
Does anyone know if a Waymo vehicle will actually respond to a LEO giving directions at a dark intersection, or if it will just disregard them in favour of treating it as a 4 way stop?
I suddenly find that I really want an answer to this as well, because I'm now imagining what might ensue if one of these attempted to board a car ferry. Typically there's a sign saying "turn headlights off", you're expected to maintain something like 5 mph (the flow of traffic should never stop), and you get directed by a human across multiple lane markings, often deviating from the path that the vehicle immediately in front of you took.
I think that Waymo isn't concerned about those types of scenarios because they only operate in a limited area, and can tune their systems to operate best in that area (e.g., not worrying about car ferries, human-operated parking lots, etc.)
Right. People still imagine that Level 5 is going to happen, and it is at best a long way off. You're talking full AI at some point.
Your scenario seems to have a lot of overlap with a construction worker directing traffic around a road construction site. I have no idea if Waymo is any good at navigating these, but I am sure there is a lot of model training around these scenarios because they are common in urban driving environments.
Don't they just have a stop/go board? Whereas an LEO at a crossing would have to use hand signals.
This was identified as one of the early challenges of self-driving: reading the hand signals of traffic agents. It does do it. But the jury is out on whether it does it well.
The number of times this has been asked with no confirmation leads me to believe they still do not.
Tesla fanboys gush about how FSD can understand LEOs in irregular traffic conditions, but no company I'm aware of has confirmed their systems are capable.
Teslas currently have a driver in the front who could take over in these situations.
Waymo said they normally handle traffic light outages as 4-way stops, but sometimes call home for help, perhaps if they detect someone in the intersection directing traffic?
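Waymo hasn't published its actual decision logic, but the fallback described here can be sketched as a tiny policy function. Everything below (the names, the remote-confirmation trigger) is my assumption for illustration, not Waymo's API:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    PROCEED_ON_SIGNAL = auto()
    TREAT_AS_ALL_WAY_STOP = auto()
    REQUEST_REMOTE_CONFIRMATION = auto()

@dataclass
class IntersectionState:
    signal_dark: bool               # traffic light has no power
    human_director_detected: bool   # e.g. an officer or flagger in the intersection

def choose_action(state: IntersectionState) -> Action:
    """Hypothetical fallback policy: a dark signal becomes an all-way stop,
    but defer to a remote operator if a human appears to be directing traffic."""
    if not state.signal_dark:
        return Action.PROCEED_ON_SIGNAL
    if state.human_director_detected:
        return Action.REQUEST_REMOTE_CONFIRMATION
    return Action.TREAT_AS_ALL_WAY_STOP
```

The interesting engineering question is the middle branch: detecting "a human is directing traffic" reliably is much harder than detecting "the light is dark".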
Makes you wonder in general how these cars are designed to handle police directing traffic.
How is this mode not a standard part of their disaster recovery plan? Especially in SF and the Bay Area, they need to assume an earthquake is going to take out a lot of infrastructure. Did they not take into account that this would happen?
> While we successfully traversed more than 7,000 dark signals on Saturday, the outage created a concentrated spike in these requests. This created a backlog that, in some cases, led to response delays contributing to congestion on already-overwhelmed streets. We established these confirmation protocols out of an abundance of caution during our early deployment, and we are now refining them to match our current scale. While this strategy was effective during smaller outages, we are now implementing fleet-wide updates that provide the Driver with specific power outage context, allowing it to navigate more decisively.
Sounds like it was, and you're not correctly understanding the complexity of running this at scale.
Sounds like their disaster recovery plan was insufficient, intensified traffic jams in already congested areas because of "backlog", and is now being fixed to support the current scale.
The fact this backlog created issues indicates that it's perhaps Waymo that doesn't understand the complexity of running at that scale, because their systems got overwhelmed.
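The backlog dynamic described in the quoted post can be illustrated with a toy fluid-queue model: whenever confirmation requests arrive faster than remote operators can clear them, pending work grows linearly. All the numbers here are made up purely for illustration:

```python
def simulate_backlog(arrivals_per_min: float, operators: int,
                     clears_per_op_per_min: float, minutes: int) -> list[float]:
    """Toy fluid model of a remote-assistance queue: backlog grows whenever
    request arrivals outpace the operators' total clearing capacity."""
    capacity = operators * clears_per_op_per_min
    backlog, history = 0.0, []
    for _ in range(minutes):
        backlog = max(0.0, backlog + arrivals_per_min - capacity)
        history.append(backlog)
    return history

# Quiet day: 5 requests/min against 10 operators clearing 1/min each -> no queue.
normal = simulate_backlog(5, 10, 1.0, 60)
# Outage spike: 30 requests/min against the same staffing; the queue grows by
# 20 requests every minute, so 1200 are pending (and rising) after an hour.
spike = simulate_backlog(30, 10, 1.0, 60)
```

The point is that a staffing level with comfortable headroom in normal operation can fall off a cliff under a correlated, city-wide spike: the backlog doesn't degrade gracefully, it grows without bound until arrivals drop or capacity is added.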
They probably do, they just don't give a shit. It's still the "move fast and break things" mindset. Internalize profits but externalize failures to be carried by the public. Will there be legal consequences for Waymo (i.e. fines?) for this? Probably not...
If the onboard software has detected an unusual situation it doesn't understand, moving may be a bad idea. Possible problems requiring a management decision include flooding, fires, earthquakes, riots, street parties, power outages, building collapses... Handling all that onboard is tough. For different situations, a nearby "safe place" to stop varies. The control center doesn't do remote driving, says Waymo. They provide hints, probably along the lines of "back out, turn around, and get out of this area", or "clear the intersection, then stop and unload your passenger".
Waymo didn't give much info. For example, is loss of contact with the control center a stop condition? After some number of seconds, probably. A car contacting the control center for assistance and not getting an answer is probably a stop condition. Apparently here they overloaded the control center. That's an indication that this really is automated. There's not one person per car back at HQ; probably far fewer than that. That's good for scaling.
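The stop conditions speculated about above (contact lost for too long, or an assistance request going unanswered) could be sketched as a small watchdog. The class, the grace period, and the injectable clock are all hypothetical, not Waymo's design:

```python
import time

class ConnectivityWatchdog:
    """Hypothetical stop-condition logic: pull over if the car has lost contact
    with the control center for longer than a grace period, or if a pending
    assistance request has gone unanswered for that long."""

    def __init__(self, grace_seconds: float, now=time.monotonic):
        self.grace = grace_seconds
        self.now = now                      # injectable clock, for testing
        self.last_heartbeat = now()
        self.pending_request_since = None

    def heartbeat(self) -> None:
        # Any successful exchange with the control center counts as contact.
        self.last_heartbeat = self.now()

    def request_assistance(self) -> None:
        if self.pending_request_since is None:
            self.pending_request_since = self.now()

    def answered(self) -> None:
        self.pending_request_since = None

    def should_stop(self) -> bool:
        t = self.now()
        lost_contact = (t - self.last_heartbeat) > self.grace
        unanswered = (self.pending_request_since is not None
                      and (t - self.pending_request_since) > self.grace)
        return lost_contact or unanswered
```

Note the second condition is exactly the failure mode in this incident: connectivity was fine, but an overloaded control center meant requests sat unanswered, which (under a policy like this) turns into stopped cars.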
Relying on what is essentially remote dispatch to resolve these error states is a disaster.
> we are now implementing fleet-wide updates
That ~1000 drivers on the road are all better trained on what to do in the next power outage is incredible.
There will always be unexpected events and mistakes made on the roads. Continual improvement that is locked in algorithmically across the entire fleet is way better than any individual driver's learning / training / behavior changes.
Humans seemed to navigate this just fine, even with all the Waymo road blocks and without extra training. If every unknown requires a software update, this system is doomed to repeat this behavior over and over in the long term.
Humans do dumb stuff like drive their cars into flowing floodwaters and they show no signs of stopping. The Waymo Driver (the name for the hardware and software stack) is getting smarter all the time.
>seemed to navigate this just fine
From my understanding, the reason the Waymos didn't handle this well is that humans were breaking traffic rules and going when they shouldn't have been. If most humans had navigated it correctly, the Waymos would have handled it better.
No one seems sufficiently outraged that a private company's equipment blocked the public roads during an emergency.
No one seems sufficiently outraged that human drivers kill 40,000 people a year in the US.
It's approximately one 9/11 a month. And that's just the deaths.
Worldwide, 1.2m people die from vehicle accidents every year; car/motorcycle crashes are the leading cause of death for people aged 5-29 worldwide.
https://www.transportation.gov/NRSS/SafetyProblem
https://www.who.int/news-room/fact-sheets/detail/road-traffi...
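The "one 9/11 a month" comparison above holds up arithmetically, using the widely reported 9/11 death toll of 2,977:

```python
us_road_deaths_per_year = 40_000   # approximate US figure cited above
sept_11_death_toll = 2_977         # widely reported total

deaths_per_month = us_road_deaths_per_year / 12    # ~3,333 per month
ratio = deaths_per_month / sept_11_death_toll      # ~1.12, i.e. roughly one 9/11 per month
```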
Road casualties are tied to geographical areas, and America is an infamously dangerous place to live when it comes to traffic. By fixing education, road design, and other factors, those 40k deaths could be cut sevenfold before you even need to bother with automation. There's a human driver problem, but it's much smaller than the American driver problem.
Also, that still doesn't excuse Waymo blocking roads. These are two different, independent problems. More people die in car crashes than in plane crashes, but that doesn't mean we should be replacing all cars with planes either.
Seriously. People are outraged about the theoretical potential for human harm while there is a god damn constant death rate here that is 4x higher than every other western country.
I mean really. I’m a self driving skeptic exactly because our roads are inherently dangerous. I’ve been outraged at Cruise and Tesla for hiding their safety shortcomings and acting in bad faith.
Everything I’ve seen from Waymo has been exceptional… and I literally live in a damn neighborhood that lost power, and saw multiple stopped Waymos in the street.
They failed safe: not perfect, definitely needs improvement, but safe. At the same time, we have video of a Tesla blowing through a blacked-out intersection, and I saw a damn Muni bus do the same thing, as well as at least a dozen cars doing the same damn thing.
People need to be at least somewhat consistent in their arguments.
Hey, I hear you. And I'm sad. Because I'd like to say that the right way is to:
build infrastructure that promotes safe driving, and
train drivers to show respect for other people on the road
However, those are both non-starters in the US. So your answer, which comes down to "at least self-driving is better than those damn people" might be the one that actually works.
I've spent some time driving in both the US and the UK, and while infrastructure in the US could be improved, I don't think that's the main issue.
What's different is driver training and attitude. Passing a driving test in the US is too easy to force new drivers to actually learn to drive well. And the average American driver shows less respect to pedestrians, cyclists, and other drivers; aggressive driving is relatively common. Bad drivers can be encountered in the UK too, of course, but on average British drivers are better.
Huge SUVs and pickup trucks are also part of the problem: they are more dangerous for everyone except the people inside them.
The difference is those human-driven cars all have a driver who can be held accountable.
If I kill someone with my car, I’m probably going to jail. If a Waymo or otherwise kills someone, who’s going to jail?
> If I kill someone with my car, I’m probably going to jail
This is rarely true in the US. A driver's license is a license to kill with near impunity.
https://www.cbsnews.com/chicago/news/man-gets-10-days-in-jai...
Why lie? If you have a valid point, make it. Don't pull made up stats out of your ass.
The US isn't close to being the highest per traffic fatality rate in the western hemisphere.
I count 14 countries higher.
https://en.wikipedia.org/wiki/List_of_countries_by_traffic-r...
I thought the UK ranked well, I didn't realise it ranked that well.
Maybe there's something to be said for left-hand driving, I see Japan ranks very highly too. ;)
The real reason is I guess we take road safety seriously, we have strict drink-driving laws, and our driving test is genuinely difficult to pass.
I seem to remember road safety also featuring prominently throughout the primary national curriculum.
And of course, our infamous safety adverts that you never quite forget, such as: https://www.youtube.com/watch?v=mKHY69AFstE
> Maybe there's something to be said for left-hand driving

Is this written in jest, or is there something more serious behind it? Off the top of my head, I cannot think of an obvious reason why "road handedness" (left vs right) would matter for road safety. Could it be something about more people being right-handed, so there is some second-order safety effect I am overlooking?
Yes, it was in jest.
The US is just a big place. We drive a lot. Average annual mileage is about 13k vs 7k in the UK.
The USA doesn't do very well on the deaths-per-km metric either.
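That claim can be roughly sanity-checked by combining the mileage figures above with approximate per-capita fatality rates. The per-100k rates below are ballpark public figures, used here purely for illustration:

```python
# Approximate annual road deaths per 100,000 population (ballpark, illustrative).
deaths_per_100k = {"US": 12.8, "UK": 2.6}
# Average annual miles driven per person, from the comment above.
miles_per_person = {"US": 13_000, "UK": 7_000}

def deaths_per_100m_miles(country: str) -> float:
    """Convert a per-capita fatality rate into a per-distance rate."""
    per_person = deaths_per_100k[country] / 100_000
    return per_person / miles_per_person[country] * 100_000_000

us_rate = deaths_per_100m_miles("US")   # ~0.98 deaths per 100M miles
uk_rate = deaths_per_100m_miles("UK")   # ~0.37 deaths per 100M miles
```

Even after normalizing by distance driven, the US rate comes out more than 2.5x the UK's in this rough model, so higher mileage alone doesn't close the gap.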
When people say "western" they often don't mean "western hemisphere" but the "first world". So Peru wouldn't be "western" by this definition but Australia might be.
Yeah, HN just loves the term "The West" / "Western", which weirdly includes Australia and New Zealand, but excludes Japan, South Korea, and Taiwan. (What about South Africa? Unsure.) To me, it is better to say something like "G7-like" (or OECD) nations, because that includes all highly developed nations.
It’s referring to a specific culture of people.
> The US isn't close to being the highest per traffic fatality rate in the western hemisphere.
Is this a serious comment? Is that actually what you think they meant by "Western"? When people talk about Russia vs "the West", do you also think they mean Russia vs the Western hemisphere?
> No one seems sufficiently outraged
Harvesting outrage is about the only reliable function the internet seems to have at this point. You're not seeing enough of it?
I've seen plenty but about the wrong things.
> a private company's equipment blocked the public roads
That would be like every traffic incident ever? I don't think the US has public cars or state-owned utilities.
My concern is that one company can have a malfunction which shuts down traffic in a city. That seems new or historically rare. I understand large scale deployment will find new system design flaws so I’m not outraged, but I do think we should consider what this means for us, if anything.
I think the blog is strongly hinting us to focus on the real problem -- the electrical utility and I have to agree.
The only other option I can think of is to build some kind of high-density, low-power, solar-powered IoT network that is independent of current infrastructure, but then where is the spectrum for that?
A power outage should not cause robot cars to block intersections.
Typically people move aside for emergency vehicles
Ask any EMT or paramedic - an astonishingly large proportion of human drivers panic in the presence of an ambulance and just slam their brakes on.
On the contrary, I would prefer HN detach all threads expressing "concern." That way we don't have to make a subjective call if a comment is "concern" or "concern trolling" at all - they are equally uninteresting and do not advance curiosity.
Based. Anyone complaining about HN being "insufficiently outraged" should go to Twitter and never return.
I was actually wondering more about the people whose streets they are. Didn't mean to indicate that I or anyone cares what HN thinks.
I suspected this. They were moving, but seemingly at random to an observer. I'd seen only about 2 out of maybe 20 Waymos navigating around the Arguello and Geary area in SF Saturday at 6 PM; the rest were stopped. What was worse was that there was little to no connectivity across all 3 main providers deeper in the power outage area as well - Spruce and Geary, or west of Park Presidio (I have 2 phones, with Google Fi/T-Mobile, AT&T, and Verizon).
Interesting that some legacy safety/precaution code caused more timid and disruptive driving behavior than the current software route planner would've chosen on its own.
This reads to me, an angry resident, as an AI-generated article that attempts to leverage the chaos they caused for marketing purposes, not as any sort of genuine remorse, underscoring why we shouldn't be banning AI regulation in the USA.
Do Waymos have Starlink or another satellite-based backup? Otherwise, what do they do if cell service goes down and they need to phone home for confirmation?
Cell service is usually around for a while when power goes down.
I doubt they have more than that.
That seems like a major oversight. Adding Starlink wouldn’t add that much marginal cost.
Related context:
Waymo halts service during S.F. blackout after causing traffic jams
https://news.ycombinator.com/item?id=46342412
Tesla FSD would never have this issue according to Elon Musk.
- written from my flying roadster
The symbolic irony of this situation is almost too rich to bear.