> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle initially remained stopped, then moved to the side of the road and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
> It's likely that a fully-attentive human driver would have done worse.
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to be slowing way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
As someone who lives on a residential street right by a primary school in the UK, the majority of drivers are going over 20mph even at the peak time when there are children everywhere.
While in theory human drivers should be situationally aware of the higher risks of children being around, the reality is that the majority will be in their own bubble of being late to drop their kid off and searching for the first free spot they can find.
I vividly recall a shot within a commercial, in which a driver was shown in slow motion, chucking his coffee into the passenger foot well in order to have two hands on the wheel for an emergency. I don’t remember what was about to happen to the car or the world around it. I’m pretty sure that a collision occurred.
> But most humans would have been aware of the big picture scenario much earlier.
I wouldn't call it likely. Sure, there are definitely human drivers who are better than Waymo, but IME they're few and far between. Much more common to be distracted or careless.
I don't think it makes sense to lump drivers into better-than-Waymo and worse-than-Waymo buckets. A human brain automatically thinks of all the scenarios, where Waymo has pre-programmed ones (and some NN based ones). So it's scenario by scenario.
Consider this scenario:
5 kids are walking on the sidewalk as you drive past them, but a large dumpster suddenly blocks your view of them just as you pass. You saw them before the dumpster, but once your car reaches it, the dumpster completely blocks the view.
Does a human brain carry some worry that they suddenly decide to run and try to cross the street after the dumpster?
Does Waymo carry that worry, or does it just continue to drive at the exact same speed?
Again, it's not like every driver will think about this, but many drivers will (even the bad ones).
> A human brain automatically thinks of all the scenarios
I don't think this is true. There are infinitely many scenarios in a complex situation like a road with traffic, cars parked, pedestrians about, weather, etc. My brain might be able to quickly assess a handful, but certainly not all.
> A human brain automatically thinks of all the scenarios, ...
Patently, obviously false. A human brain will automatically think of SOME scenarios. For instance, if a collision seems imminent, and the driver is holding a cup of coffee, these ideas are likely to occur to the driver:
IF I GRAB THE STEERING WHEEL AND BRAKE HARD, I MIGHT NOT HIT THAT PEDESTRIAN IN FRONT OF ME.
IF I DON'T CONTINUE HOLDING THE COFFEE CAREFULLY, I MIGHT GET SCALDED.
THIS SONG ON MY RADIO IS REALLY ROCKING!
IF I YANK MY WHEEL TO THE LEFT, I MIGHT HIT A CAR INSTEAD OF A HUMAN.
IF I BRAKE HARD OR SWERVE AT ANY TIME IN TRAFFIC, I CAN CAUSE AN ACCIDENT.
Experiments with callosal patients (who have damaged the connective bridge between the halves of their brains) demonstrate that this is a realistic picture of how the brain makes decisions. It offers up a set of possible actions, and attempts to choose the optimal one and discard all others.
A computer program would do likewise, EXCEPT it won't care about the coffee cup nor the radio (remove two bad choices from consideration).
It still has one bad choice (do nothing), but the SNR is much improved.
I'm not being hyperbolic; self-preservation (focusing on keeping that coffee in my hand) is a vital factor in decision-making for a human.
> ...where Waymo has pre-programmed ones (and some NN based ones).
Yes. And as time goes on, more and better-refined scenarios will be added to its programming. Eventually, it's reasonable to believe the car software will constantly reassess how many humans are within HUMAN_RUN_DISTANCE + CAR_TRAVEL_DISTANCE in the next block, and begin tracking any that are within an unsafe margin. No human on Earth does that, continually, without fail.
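A minimal sketch of what that kind of reassessment could look like (every constant, the Pedestrian type, and the envelope geometry here are invented for illustration; this is not Waymo's actual logic):

```python
from dataclasses import dataclass

# Illustrative constants only -- not Waymo's real values.
HUMAN_RUN_SPEED = 3.5  # m/s, roughly a child's sprint
REACTION_TIME = 0.3    # s, assumed perception-to-brake latency
BRAKE_DECEL = 7.0      # m/s^2, ~0.7 g on dry pavement

@dataclass
class Pedestrian:
    lateral: float       # m, distance from the vehicle's path
    longitudinal: float  # m, distance ahead of the vehicle

def stopping_distance(speed: float) -> float:
    """Reaction travel plus braking distance from `speed` (m/s)."""
    return speed * REACTION_TIME + speed ** 2 / (2 * BRAKE_DECEL)

def inside_risk_envelope(ped: Pedestrian, speed: float) -> bool:
    """Could this person reach our path before we could pass or stop?"""
    time_until_we_pass = ped.longitudinal / max(speed, 0.1)
    human_reach = HUMAN_RUN_SPEED * time_until_we_pass
    return (ped.lateral <= human_reach
            and ped.longitudinal <= 2 * stopping_distance(speed))

peds = [Pedestrian(2.0, 10.0), Pedestrian(12.0, 40.0)]
tracked = [p for p in peds if inside_risk_envelope(p, speed=7.6)]  # ~17 mph
print(f"tracking {len(tracked)} pedestrian(s) in the unsafe margin")
```

The point isn't the exact formula; it's that a machine can rerun a check like this every frame for every person in view.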
> Does a human brain carry some worry that they suddenly decide to run and try to cross the street after the dumpster? Does Waymo carry that worry, or does it just continue to drive at the exact same speed?
You continue to imply that Waymo cannot ever improve on its current programming. Does it currently consider this situation? Probably not. Will it? Probably.
For what it's worth, that kind of lumping of drivers is more-or-less one of the metrics Waymo is using to self-evaluate. Perfect safety when multi-ton vehicles share space with sub-300-pound humans is impossible. But they ultimately seek to do better than humans in all contexts.
According to the article the car was traveling at 17 miles an hour before it began braking. Presumably this was in a 25 mph school zone, so it seems the Waymo was already doing exactly what you describe - slowing down preemptively.
This is close to a particular peeve I have. Occasionally I see signs on the street that say "Slow Down". I'm not talking about the electronic ones connected to radar detectors. Just metal and paint.
Here's my problem. If you follow the instructions on the sign, it still says to slow down. There's no threshold for slow enough. No matter how slow you're going, the sign says "Slow Down". So once you become ensnared in the visual cone of this sign, you'll be forced to sit stationary for all eternity.
But maybe there's a loophole. It doesn't say how fast you must decelerate. So if you come into the zone going fast enough, and decelerate slowly enough, you can make it past the sign with some remaining non-zero momentum.
You know, I've never been diagnosed on the spectrum, but I have some of the tendencies. lol.
Obviously a static sign is not aware of your current state, so its message can only be interpreted as relevant to your likely state... i.e. the posted speed limit.
A lot of clickbait headlines have the same problem. "You're using too much washing powder!"
Everyone's replying to you as if you truly don't understand the sign's intention but I'm sure you do. It's just annoying to be doing everything right and the signs and headlines are still telling you you're wrong.
There was a driving safety ad campaign here: "Drive to the conditions. If they change, reduce your speed." You can imagine how slow we'd all be going if the weather kept changing.
Think of it like they're saying "my children play on this street and my neighbors walk here. Please think about that when you decide how fast to go here."
A 25mph school zone? That seems fast. 15mph would be more the norm, which is in line with the 17mph the car believed itself to be traveling.
FYI, unless you are a commercial truck, a cop, or a racer, your speedometer will read slightly fast, sometimes by as much as 5 to 10%. This is normal practice for cars, as it limits manufacturer liability. You can check this using an independent GPS, i.e. not an in-dash unit. (Just imagine the court cases if a speedo read slower than the actual speed and you can understand why this started.)
I mostly see 25 mph for school zones, though I'm in NC. Checking California, it sounds like 25 is standard there as well.[0] Some will drop to 15, but 25 is the norm as far as I can find.
Also, with a different wheel diameter than the speedometer was calibrated with, you will have a larger difference between actual velocity and the speedometer reading. The odometer will also not record the actual distance traveled.
To put it another way. If an autonomous vehicle has a reaction time of 0.3 seconds, the stopping distance from 17 mph is about the same as a fully alert human driver (1 second reaction time) driving 10.33 mph.
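Rough numbers behind that comparison, assuming ~0.7 g of braking (the deceleration figure is my assumption, not from the article):

```python
MPH_TO_MS = 0.44704
DECEL = 7.0  # m/s^2, ~0.7 g -- an assumed braking capability

def stop_distance(speed_mph: float, reaction_s: float) -> float:
    """Total stopping distance: reaction travel + braking distance."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * DECEL)

av = stop_distance(17.0, 0.3)      # autonomous vehicle
human = stop_distance(10.33, 1.0)  # fully alert human
print(f"AV from 17 mph: {av:.1f} m; human from 10.33 mph: {human:.1f} m")
# -> roughly 6.4 m vs 6.1 m: about the same footprint on the road
```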
There's a bus stop right behind my house. I routinely hear the driver honking and yelling at people who ignore when the stop sign is extended (which is a misdemeanor in my state). So forgive me for not assuming a human would have done better.
Precisely. Environmental context is not considered in Waymo's "peer-reviewed model" (I encourage reflexive commenters to first read it: https://waymo.com/safety/collision-avoidance-benchmarking), only basic driver behavior and traffic signal timings.
Note the weaselly "immediately detected the individual as soon as they began to emerge" in the puff piece from Waymo Comms. No indication that they intend to account for environmental context going forward.
If they already do this, why isn't it factored in the model?
I think my problem is that it reacted after seeing the child step out from behind the SUV.
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively"—look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag—slow the fuck down.)
At least it was already slowed down to 17 mph to start. Remember that viral video of some Australian in a pickup ragdolling a girl across the road? Most every comment is "well he was going the speed limit no fault for him!" No asshole, you hit someone. It's your fault. He got zero charges and the girl was seriously injured.
No it's not. The same principle applies to rules of right of way on the water. Technically the 32 foot sailboat has right of way over a triple-E because the triple-E uses mechanical propulsion.
You have a responsibility to be cautious in heavy equipment no matter what the signage on the road says, and that includes keeping a speed at which you can stop safely if a person suddenly steps onto the road in situations where people are around. If you are driving past a busy bar in downtown, a drunk person might step out and you have a responsibility to assume that might happen. If you have to go slower sometimes, tough.
As an aside, because it would not be germane to automotive safety…
In the Coast Guard Auxiliary “Sailing and Seamanship” class that I attended, targeting would-be sailboat skippers, we were told the USS Ranger nuclear-powered aircraft carrier had the right-of-way.
And if a pedestrian jumps from a bridge to land right in front of you? Or how about a passenger jumping out of the car next to you? Still going to stand on your absolute?
You mean the Aussie one where the guy was going an appropriate speed for the area and when the cops arrived the parents and their neighbors LIED TO THE POLICE and said he was hooning down the road at excess speed and hit the kid? And that he was only saved from prison by having a dash cam that proved the lies to be lies? That one?
That logic is utter bs, if someone jumps out when you're travelling at an appropriate speed and you do your best to stop then that's all that can be done. Otherwise by your logic the only safe speed is 0.
Aye, and to always look for feet under and by the front wheel of vehicles like that.
Stopped buses similarly: people get off the bus, whip around the front of it and straight into the street. So many times I've spotted someone's feet under the front before they come around into the street.
Not to take away from Waymo here; I agree with the thread sentiment that their handling seems exemplary.
You can spot someone's feet under the width of a bus when they're on the opposite side of the bus and you're sitting in a vehicle at a much higher position on the opposite side that the bus is on? That's physically impossible.
In normal (traditional?) European city cars, yes, I look for feet or shadows or other signs that there is a person in the other side. In SUVs this is largely impossible but then sometimes you can see heads or backpacks.
Or you look for reflections in the cars parked around it. This is what I was taught as “defensive“ driving.
I think you're missing something though, which I've observed from reading these comments - HN commenters aren't ordinary humans, they're super-humans with cosmic powers of awareness, visibility, reactions and judgement.
I don't see how that's feasible without introducing a lot of friction.
Near my house, almost the entire trip from the freeway to my house is via a single lane with parked cars on the side. I would have to drive 10 MPH the entire way (speed limit is 25, so 2.5x as long).
That's why the speed limit is 25 (lower when children are present in some areas) and not 35 or 40 etc. It's not reasonable to expect people to drive at 40% of the posted speed limit the entire way. We're also not talking about zero visibility (e.g. heavy fog). We're talking about blind spots behind parked cars, which in dense areas of a city is a large part of the city. If we think as a society in those situations the safe speed is 10 mph, then the speed limit should be 10mph.
I mean, you are putting your finger right on the answer: the whole car thing doesn't work or make sense, and trying to make autonomous vehicles solve the unsolvable is never going to succeed.
>reacted after seeing the child step out from behind the SUV.
Lmao most drivers I see on the roads aren't even capable of slowing down for a pedestrian crossing when the view of the second half of the crossing is blocked by traffic (ie they cannot see if someone is about to step out, especially a child).
This is generally the problem with self-driving cars, at least in my experience (Tesla FSD).
They don't look far enough ahead to anticipate what might happen and already put themselves in a position to prepare for that possibility. I'm not sure they benefit from accumulated knowledge? (Maybe Waymo does, that's an interesting question.) I.e., I know that my son's elementary school is around the corner so as I turn I'm already anticipating the school zone (that starts a block away) rather than only detecting it once I've made the turn.
Yes and no. There are tons of situations where this is simply not possible: all the traffic goes at the full allowed speed next to a row of parked cars. If somebody distracted unexpectedly pops out, a tragedy is guaranteed regardless of the driver's skills and experience.
In low traffic, of course, it can be different. But it's unrealistic to expect anybody to drive in the expectation that behind every single car they pass there may be a child about to jump right in front of them. That can easily be thousands of cars, every day, for a whole life. Impossible.
We don't read about 99.9% of the cases where even semi decent driver can handle it safely, but rare cases make the news.
I slow down considerably near parked cars. And I try to slow down much earlier approaching intersections where there are parked cars blocking my view of cross walk entries. I need to be able to come to full stop earlier than intersection if there happens to be a pedestrian there.
I kind of drive that way. I slow down, move as far away in my lane from the parked cars as possible. It's certainly what I would expect from a machine that would claim to be as good as the best human driver.
Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls with Waymos driving on the wrong side of the road recently, and I had a Waymo cut in front of my car at a one-way stop (t intersection) recently when it had been tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS brake to avoid an accident.
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
Certainly, I'm not against constructive criticism of Waymo. I just think it's important to consider the counterfactual. You're right too that an especially prudent human driver may have avoided the scenario altogether, and Waymo should strive to be that defensive.
> I'm not against constructive criticism of Waymo.
I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
> I just think it's important to consider the counterfactual
More than 50% of roadway fatalities involve drugs or alcohol. If you want to spend your efforts improving safety _anywhere_ it's right here. Self driving cars do not stand a chance of improving outcomes as much as sensible policy does. Europe leads the US here by a wide margin.
> I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
Yes, and I find it annoying that some people do seem to think Waymo should never be criticized. That said, we already have an astounding amount of data, and that data clearly shows that the experiment is successful in reducing crashes. Waymos are absolutely, without question already making streets safer than if humans were driving those cars.
> If you want to spend your efforts improving safety _anywhere_ it's right here.
We can and should do both. And as your comment seems to imply but does not explicitly state, we should also improve road design to be safer, which Europe absolutely kicks America's ass on.
> and that data clearly shows that the experiment is successful in reducing crashes
I disagree. You need way more data, like orders of magnitude more. There are trillions of miles driven in the US every year. Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
> without question
There are _tons_ of questions. This is not even a simple problem. I cannot understand this prerogative. It's far too eager or hopeful.
> We can and should do both
Well Google is operating Waymo and "we" control road policy. One of these things we can act on today and the other relies on huge amounts of investments paying off in scenarios that haven't even been tested successfully yet. I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
> More than 50% of roadway fatalities involve drugs or alcohol. If you want to spend your efforts improving safety _anywhere_ it's right here. Self driving cars do not stand a chance of improving outcomes as much as sensible policy does. Europe leads the US here by a wide margin.
Could you spell out exactly what "sensible" policy changes you were thinking of? Driving under the influence of drugs and/or alcohol is already illegal in every state. Are you advocating for drastically more severe enforcement, regardless of which race the person driving is, or what it does to the national prison population? Or perhaps for "improved transit access", which is a nice idea, but will take many decades to make a real difference?
>Driving under the influence of drugs and/or alcohol is already illegal in every state.
FWIW, your first OWI in Wisconsin, with no aggravating factors, is a civil offense, not a crime, and in most states it is rare to do any time or completely lose your license for the first offense. I'm not sure exactly what OP is getting at, but DUI/OWI limits and enforcement are pretty lax in the US compared to other countries. Our standard .08 BAC limit is a lot higher than many other countries.
Absolutely, I can tell you right now that many human drivers are probably safer than the Waymo, because they would have slowed down even more and/or stayed further from the parked cars outside a school; they might have even seen the kid earlier in e.g. a reflection than the Waymo could see.
It seems it was driving pretty slow (17MPH) and they do tend to put in a pretty big gap to the right side when they can.
There are kinds of human sensing that are better when humans are maximally attentive (seeing through windows/reflections). But there's also the seeing-in-all-directions, radar, superhuman reaction time, etc, on the side of the Waymo.
I usually take extra care when going through a school zone, especially when I see some obstruction ('behind a tall SUV', was the waymo overtaking?), and overtaking is something I would probably never do (and should be banned in school zones by road signs).
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
> especially when I see some obstruction ('behind a tall SUV', was the waymo overtaking?)
Yep. Driving safe isn't just about paying attention to what you can see, but also paying attention to what you can't see. Being always vigilant and aware of things like "I can't see behind that truck."
Honestly I don't think sensor-first approaches are cut out to tackle this; it probably requires something more akin to AGI, to allow inferring possible risks from incomplete or absent data.
I appreciate your sensible driving, but here in the UK, roads outside schools are complete mayhem at dropping off/picking up times. Speeding, overtaking, wild manoeuvres to turn round etc.
When reading the article, my first thought was that only going at 17mph was due to it being a robotaxi whereas UK drivers tend to be strongly opposed to 20mph speed limits outside schools.
Most US states cap speed limits around schools at 15mph when children are present. There may also be blinking lights above these signs during the times when children are likely to be present.
I'm not sure how much of that Waymo's cars take into account, as the law technically takes into account line of sight things that a person could see but Waymo's sensors might not, such as children present on a sidewalk.
> Most US states cap speed limits around schools at 15mph when children are present.
Are you sure? The ones I've seen have usually been 20 or 25mph.
Looking on Image Search (https://www.google.com/search?q=school+zone+speed+limit+sign) and limiting just to the ones that are photos of real signs by the side of the road, the first 10 are: 25, 30, 25, 20, 35, 15, 20, 55, 20, 20. So only one of these was 15.
School pick up and drop off traffic is just about the worst drivers anywhere. Like visibly worse than a bunch of "probably a little drunk" people leaving a sports stadium. It's like everyone reverts to "sixteen year old on first day behind the wheel" behavior. It's baffling. And there's always one token dad picking up his kid on a motorcycle or in a box truck or something that they all clutch their pearls at.
A human driver in a school zone during morning drop off would be scanning the sidewalks and paying attention to children that disappear behind a double parked suv or car in the first place, no?
As described by the nhtsa brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" means that waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
> It's likely that a fully-attentive human driver would have done worse.
> a huge portion of human drivers
What are you basing any of these blind assertions on? They are not at all borne out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line, but here on Hacker News you should absolutely challenge that. In particular because it's very far out of line with real world data provided by the government.
It's possible, but likely is a heavy assertion. It's also possible a human driver would have been more aware of children being present on the sidewalk and would have approached more cautiously given obstructed views.
Please please remember that any data from Waymo will inherently support their position and can not be taken at face value. They have significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
I wonder if that is a "fully attentive human driver who drove exactly the same as the Waymo up until the point the child appeared"?
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
I do the same, and try to actively anticipate and avoid situations like this. Sadly, in my experience most drivers instead fixate on getting to their destination as fast as possible.
You clearly don't spend much time around a school measuring the speed of cars. Head on down and see for yourself how often or not a human driver goes >17mph in such a situation.
Waymo is intentionally leaving out the following details:
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's, still it's not being included in their "peer-reviewed model." That comparison is intentionally comparing Waymo performance against the oldest vehicles on the road -- not the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
UK driving theory test has a part called Hazard Perception: not reacting on children milling around would be considered a fail.
Many states in the US have the Basic Speed Law, e.g. California:
> No person shall drive a vehicle upon a highway at a speed greater than is reasonable or prudent having due regard for weather, visibility, the traffic on, and the surface and width of, the highway, and in no event at a speed which endangers the safety of persons or property.
The speed limit isn't supposed to be a carte blanche to drive at that speed no matter what; the basic speed law is supposed to "win." In practice, enforcement is a lot more clear cut at the posted speed limit and officers don't want to write tickets that are hard to argue in court.
That law seems more likely to assign blame to drivers if they hit someone. So practically it's not enforced but in accidents it becomes a justification for assigning fault.
I mean yeah. If you were traveling at some speed and caused damage to persons or property, that's reasonable, but refutable, evidence that you were traveling at a speed that endangered persons or property.
And at the same time, if you were traveling at some speed and no damage was caused, it's harder to say that persons or property were endangered.
Exactly. That’s why I’ve always said that driving is a truly AGI-requiring activity. It’s not just about sensors and speed limits and feedback loops. It’s about having a true understanding of everything that’s happening around you:
Having an understanding of the density and makeup of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment, though large, will do no damage and doesn’t justify a swerve.
Getting in the mind of a car in front of you, by seeing subtle hints of where the driver is looking down, and recognizing that they’re not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they’re not quite there yet.
Or in this case, perhaps hearing the sounds of children playing, recognizing that it’s 3:20 PM, and that school is out, other cars, double parked as you mentioned, all screaming instantly to a human driver to be extremely cautious and kids could be jumping out from anywhere.
How many human drivers do you think would pass the bar you're setting?
IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
> How many human drivers do you think would pass the bar you're setting?
How many humans drivers would pass it, and what proportion of the time? Even the best drivers do not constantly maintain peak vigilance, because they are human.
> IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
In practice, this isn't reasonable, because "hey we're slightly better than a population that includes the drunks, the inattentive, and the infirm" is not going to win public trust. And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
I think "better than the average performance of a 75th or 90th percentile human driver" might be a good way to look at things.
It's going to be a weird thing, because odds are the distribution of accidents that do happen won't look much like human ones. It will have superhuman saves (like that scooter one), but it will also crash in situations that we can't really picture humans doing.
I'm reminded of airbags; even first generation airbags made things much safer overall, but they occasionally decapitated a short person or child in a 5MPH parking lot fender bender. This was hard for the public to stomach, and if it's your kid who is internally decapitated by the airbag in a small accident, I don't think you'll really accept "it's safer on average to have an airbag!"
> In practice, this isn't reasonable, because "hey we're slightly better than a population that includes the drunks, the inattentive, and the infirm" is not going to win public trust.
Sadly, you're right, but as rational people, we can acknowledge that it should. I care about reducing injuries and deaths, and the %tile of human performance needed for that is probably something like 30%ile. It's definitely well below 75%ile.
> > And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
It's only if you get the habitually drunk (a group that is overall impoverished), the very old, etc, to ride Waymo that you reap this benefit. And they're probably not early adopters.
Uber and Lyft were supported by police departments because they reduced drunk driving. Drunk driving isn't just impoverished alcoholics. People go to bars and parties and get drunk all the time.
You also solve for people texting (or otherwise using their phones) while driving, which is pretty common among young, tech-adopting people.
> Drunk driving isn't just impoverished alcoholics. People go to bars and parties and get drunk all the time
Yes, but the drivers who are 5th percentile drivers who cause a huge share of the most severe accidents are "special" in various ways. Most of them are probably not autonomy early adopters.
The guy who decided to drive on the wrong side of a double yellow on a windy mountain road and hit our family car in a probable suicide attempt was not going to replace that trip with Waymo.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast
Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.
But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
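The v^2 arithmetic behind the 17^2 - 5^2 > 16^2 point, under the simplification that braking starts at the same spot regardless of initial speed (it ignores reaction-time travel, so it checks the AV-side claim, not the human-reaction comparison in the parenthetical):

```python
import math

# Reported event: braking from 17 mph down to ~5 mph at contact.
# With constant deceleration, v0^2 - vf^2 = 2*a*d, so over a fixed
# distance the same "v^2 budget" is shed whatever the starting speed.
BUDGET = 17 ** 2 - 5 ** 2  # mph^2 shed before contact (= 264)

def contact_speed(v0_mph: float) -> float:
    remaining = v0_mph ** 2 - BUDGET
    return math.sqrt(remaining) if remaining > 0 else 0.0

for v0 in (17, 16, 14, 12):
    print(f"start at {v0} mph -> contact at {contact_speed(v0):.1f} mph")
# 16 mph already stops short: one mph slower and there is no contact.
```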
Swedish schools still have students who walk there. I live near one and there are very few cars that exceed 20km/h during rush hours. Anything faster is reckless even if the max over here is 30 km/h (19 mph).
The schools I'm thinking of have sidewalks with some degree of protection/offset from street, and the crossings are protected by human crossing guards during times when students are going to schools. The posted limits are "25 (MPH) When Children Are Present" and traffic generally moves at 20MPH during most of those times.
There are definitely times and situation where the right speed is 7MPH and even that feels "fast", though, too.
AV’s with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits and when there are kids about people may go even slower. At the same time, the general rule in CA for school zone is 25 mph. Clearly the car had some level of caution which is good.
For me it would be interesting to know if 17 mi/h was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather, were there cars parked which would make a defensive driver slow down even more?
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
Indeed, 15 or 25 mph (24 or 40 km/h) are the speed limits in school zones (when in effect) in CA, for reference. But depending on the general movement and density and category of pedestrians around the road it could be practically reckless to drive that fast (or slow).
If my experience driving through a school zone on my way to work is anything to go off of, I rarely see people actually respecting it. 17 mph would be a major improvement over what I'm used to seeing.
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
So if they were 100% self-sufficient and understood everything they'd be smart enough to interpret a child being hit at 6 mph as progress? Fun how "general public" is always a "they" vs "you".
Your comment sounds like subconsciously you're trying to come off as stronger than the general public, which begs the question: Why? Why do you need to prove your strength over the populace?
Immediately hitting the brakes when a child suddenly appears in front of you, instead of waiting 500ms like a human, and thereby hitting the child at a speed of 6 instead of 14 is a success.
What else do you expect them to do, only run on grade–separated areas children can't access? Blare sirens so children get scared away from roads? Shouldn't human–driven cars do the same thing then?
I don't know the implementation details, but success would be not hitting pedestrians. You have some interesting ideas on how to achieve that but there might be other ways, I don't know.
>I don't know the implementation details, but success would be not hitting pedestrians.
So by that logic, if we cured cancer but the treatment came with terrible side effects it wouldn't be considered a "success"? Does everything have to perfect to be a success?
The limit is 20 MPH in Washington state, in California the default is 25 MPH, but is going to 20 MPH soon and can be further lowered to 15 MPH with special considerations.
The real killer here is the crazy American on street parking, which limits visibility of both pedestrians and oncoming vehicles. Every school should be a no street parking zone. But parents are going to whine they can't load and unload their kids close to the school.
On street parking is so ingrained into the American lifestyle that any change to the status quo is impossible. Cars have more rights on public property than people. Every suburban neighborhood has conflicts over people's imagined "ownership" of the street parking in front of their house. People rarely use their garages to store their car since they can just leave it on the street. There are often laws that prevent people from other neighborhoods from using the public street to park. New roads are paved as wide as possible to allow both street parking and a double-parked car to not impede traffic. And we've started building homes without any kind of parking that force people to use the street.
Europe is much better at this than we are. Even when you have on street parking, they make sure there are clearances around cross walks and places where there are lots of pedestrians. Most US cities don't even care, even a supposedly pedestrian friendly one like Seattle.
If it had no parking, then the parents would be parked somewhere else and loading and unloading their kids there, and then that would need to be a no-parking zone too.
I guess you could keep doing that until kids just walk to and from school?
Our local school has them unload a block away unless they are handicapped. A kid isn't going to die walking a block. But its pointless because they still allow residential on street parking around the school, and my son has to use a crosswalk where cars routinely park so close to, I had to tell him that the traffic (pretty heavy) on the road wouldn't see him easily, and he should always ease his way into a crosswalk and not assume he would be easily seen.
In the UK we have a great big yellow zig-zag road marking that extends 2/3rds the width of an average car across the road. It means "this is a school, take your car and fuck off". You find it around school gates, to a distance of a few car lengths either side of the gate, and sometimes all along the road beside a school.
It doesn't stop all on street parking beside the school, but it cuts it down a noticeable amount.
For a school near me, the road is no parking during pick up/drop off times. It even changes to one way traffic. The no parking windows is similar to alternate street sweeping days. There are signs posted that indicate the times.
Same for my tiny town. Stopping on the road is 100% not allowed, and parking isn't allowed there either. The school has its own parking area to park and pick up/drop off kids, and cars in there creep at 2 or 3 MPH.
This isn't Apollo 13 with a successful failure. A driverless car hit a human that just happened to be a kid. Doesn't matter if a human would have as well, the super safe driverless car hit a kid. Nothing else matters. Driverless car failed.
Tesla report ids from SGO-2021-01_Incident_Reports_ADAS.csv with no or unknown airbag deployment status: 13781-13330, 13781-13319, 13781-13299, 13781-13208, 13781-8843, 13781-13149, 13781-13103, 13781-13070, 13781-13052... and more
Being transparent about such incidents is also what stops them from potentially becoming a business/industry-killing failures. They're doing the right thing here, but they also surely realize how much worse it would be if they tried to deny or downplay it.
Was it unpredictable? They drove past a blind corner (parked SUV) in a school zone. I'm constantly slowing down in these situations as I expect someone might run out at any second. Waymo seemed to default to the view that if it can't see anyone then nobody is there.
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)
This is the fault of the software and company implementing it.
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
I'm picturing a 10 second clip showing a child with a green box drawn around them, and position of gas and brake, updating with superhuman reactions.
That would be the best possible marketing that any of these self driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars especially those huge SUV or pickup trucks that have big covers on the back. You can't see anything incoming unless you stick your head out.
Yeah. I'm a stickler for accountability falling on drivers, but this really can be an impossible scenario to avoid. I've hit someone on my bike in the exact same circumstance - I was in the bike lane between the parked cars and moving traffic, and someone stepped out between parked vehicles without looking. I had nowhere to swerve, so squeezed my brakes, but could not come to a complete stop. Fortunately, I was going slow enough that no one was injured or even knocked over, but I'm convinced that was the best I could have done in that scenario.
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
Building on my own experience, I think you have to own that if you crash into someone, you made a mistake. I do agree that car and road design make it almost impossible for cyclists to get around without taking risks like that.
This is the classic Suddenly Revealed Pedestrian test case, which, afaik, most NCAPs (like Euro NCAP, Japan NCAP) have as part of their standard testing protocols.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_.
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile in my area of the world parents are busy, stressed, and on their phones, and pressing the accelerator hard because they're time pressured and feel like that will make up for the 5 minutes late they are on a 15 minute drive... The truth is this technology is, as far as i can tell, superior to humans in a high number of situations if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nit picking it.
A story, my grandpa drove for longer than he should have. Yes him losing his license would have been the optimal case. But, pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
Err, that is not the desirable statistic you seem to think it is. American drivers average ~3 trillion miles per year [1]. That means ~7,000 child pedestrian injuries per year [2] would be ~1 per 430 million miles. Waymo has done on the order of 100-200 million miles autonomously. So this would be ~2-4x more injuries than the human average.
However, the child pedestrian injury rate is only an official estimate (it is possible it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate.
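The arithmetic, for anyone who wants to check it or rerun it with better inputs:

```python
US_MILES_PER_YEAR = 3.0e12       # ~3 trillion vehicle-miles annually [1]
CHILD_INJURIES_PER_YEAR = 7_000  # estimated child pedestrian injuries [2]
WAYMO_MILES = (100e6, 200e6)     # rough range of fully autonomous miles

human_miles_per_injury = US_MILES_PER_YEAR / CHILD_INJURIES_PER_YEAR
print(f"human average: 1 injury per {human_miles_per_injury / 1e6:.0f}M miles")

for miles in WAYMO_MILES:
    # One Waymo injury over `miles` driven, relative to the human rate:
    ratio = human_miles_per_injury / miles
    print(f"at {miles / 1e6:.0f}M Waymo miles: ~{ratio:.1f}x the human rate")
# -> ~2.1x to ~4.3x, i.e. the "2-4x" above
```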
People's standards for when they're willing to cede control over their lives both as the passenger and the pedestrian in the situation to a machine are higher than a human.
And for reasons that aren't totally irrational: a machine follows its programming and does not fear death, and with 100% certainty the machine has bugs that will eventually kill someone for a really stupid reason, and nobody wants that to be them. Then there's the general https://xkcd.com/2030/ problem of people rightfully not trusting technology, because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and ran right out into the street.
If those rumors are correct, I'll say the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.
Cheers to cities pedestrianizing school streets even in busy capitals (e.g. Paris). Cars have no place near school entrances. Fix your urbanism and public transportation.
Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:
* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,
* safe, separated lanes for biking/walking when that's an option.
Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.
If the speed limit was 15 mph, and the Waymo vehicle was traveling at 17 mph before braking, why do you believe the Waymo vehicle would honor a 12 mph speed limit? It didn't honor the 15 mph limit.
Ignored by some, not all humans. I absolutely drive extra slowly and cautiously when driving past an elementary school during drop off and pick up, precisely because kids do dumb stuff like this. Others do too, though incredibly, not everyone of course.
So the waymo was speeding! All the dumbasses on here defending waymo when it was going 17 > 15.
Oh also, that video says "kid ran out from a double parked suv". Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
The 15 mph speed limit starts on the block the school is on. The article says the Waymo was within two blocks of the school, so it's possible they were in a 25 mph zone.
> Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
Can you imagine being dumb enough to think that exceeding a one size fits all number on a sign by <10% is the main failing here?
As if 2mph would have fundamentally changed this. Pfft.
A double parked car, in an area with chock-full street parking (hence the double park), next to "something" that's a magnet for pedestrians, and probably a bunch of pedestrians around, should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign to tell you that this is a particular zone and that it warrants a particular magic number.
The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.
So let me get this straight, the car should have been going less than the speed limit, but the fact that it was going a hair over the speed limit is the problem?
The car clearly failed to identify that this was a situation it needed to be going slower. The fact that it was going 17 instead of 15 is basically irrelevant here except as fodder for moral posturing. If the car is incapable of identifying those situations no amount of "muh magic number on sign" is going to fix it. You'll just have the same exact accident again in a 20 school zone.
The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers on my bike and watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a person ran right through it and the biker they creamed got stuck in the roof rack.
I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based off of trained models, but I'm curious if their controllers have any formal guarantees under certain conditions, and if the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or if it violated that, making their control stack switch to some other "panic" controller.
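Nobody outside Waymo knows their stack, but a common pattern in the AV safety literature is a supervisory layer that overrides the nominal planner whenever an obstacle violates a guaranteed stopping envelope. A toy version, with every parameter invented for illustration:

```python
BRAKE_DECEL = 7.0  # m/s^2, assumed worst-case guaranteed deceleration
LATENCY = 0.3      # s, assumed sensing-to-actuation delay

def stopping_envelope(speed: float) -> float:
    """Minimum distance (m) needed to stop from `speed` (m/s)."""
    return speed * LATENCY + speed ** 2 / (2 * BRAKE_DECEL)

def control_step(speed: float, obstacle_dist: float, nominal_accel: float) -> float:
    """Pass through the planner's command unless the envelope is violated."""
    if obstacle_dist <= stopping_envelope(speed):
        return -BRAKE_DECEL  # "panic" fallback: command maximum braking
    return nominal_accel     # envelope holds; trust the nominal controller

# A child appears 5 m ahead at 7.6 m/s (~17 mph): the envelope is ~6.4 m,
# so the supervisor overrides with full braking.
print(control_step(speed=7.6, obstacle_dist=5.0, nominal_accel=0.5))  # -7.0
```

Whether the child's emergence fell inside or outside whatever formal parameter set Waymo actually verifies is exactly the interesting unanswered question.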
This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.
From a purely stats pov, in situations where the confusion matrix is very asymmetric in terms of what we care about (false negatives are extra bad), you generally want multiple uncorrelated mechanisms, and simply require that only one flips before deciding to stop. All would have to fail simultaneously to not brake, which becomes vanishingly unlikely (p^n) with multiple mechanisms assuming uncorrelated errors. This is why I love the concept of Lidar and optical together.
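Concretely, with an OR-gated brake trigger over independent detection mechanisms (the miss rate here is made up, just to show the shape of the math):

```python
# If each independent mechanism misses a pedestrian with probability p,
# an OR-gated trigger fails only when every mechanism misses: p**n.
p_miss = 0.01  # assumed per-mechanism miss rate, purely illustrative

for n in (1, 2, 3):
    print(f"{n} uncorrelated mechanism(s): miss probability {p_miss ** n:.0e}")
# 1e-02, 1e-04, 1e-06 -- though a shared occlusion (like this SUV)
# blinds camera and lidar together, which breaks the independence assumption.
```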
Kinetic energy is a bad metric. Acceleration is what splats people.
Jumping out of a plane wearing a parachute vs jumping off a building without one.
But acceleration is hard to calculate without knowing time or distance (assuming it's even constant), and you don't get that exponent over velocity yielding a big number that's great for heartstring-grabbing appeals to emotion, which is why nobody ever uses it.
Yeah, if a human made the same mistakes as the Waymo driving too fast near the school, then they would have hurt the kid much worse than the Waymo did.
So if we're going to have cars drive irresponsibly fast near schools, it's better that they be piloted by robots.
Depends on the school zone. The tech school near me is in a 50 zone and they don't even turn on the "20 when flashing" signs because if you're gonna walk there, you're gonna come in via residential side streets in the back and the school itself is way back off the road. The other school near me is downtown and you wouldn't be able to go 17 even if you wanted to.
Maybe we should not only replace the unsafe humans with robots, but also have the robots drive in a safe manner near schools rather than replicating the unsafe human behavior?
One argument for the robots is that they can be programmed to drive safer, while humans can't.
But that depends on reliability, especially in unforeseen (and untrained-upon) circumstances. We'll have to see how they do, but they have been doing better than expected.
Personally in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive but it was pretty jarring to see it mess up that badly
In recent weeks I've found myself driving in downtown SF congestion more than usual, and observed Waymos doing totally absurd things on multiple occasions.
The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.
They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.
Absent more precise information, this is a statistical negative mark for Waymo, putting its child pedestrian injury rate at roughly 2-4x the US human average.
US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4], so this incident puts them at 1 child pedestrian injury per ~100-200 million miles. That is an injury rate roughly 2-4x higher than the human average.
However, the child pedestrian injury count is only an official estimate (likely undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way). Absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
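The arithmetic, spelled out with the figures cited above (same sources, same caveats):

    miles_human = 3.3e12      # US vehicle-miles per year [1]
    injuries = 7_000          # child pedestrian injuries per year [2]
    human_miles_per_injury = miles_human / injuries   # ~470 million

    for waymo_miles in (100e6, 200e6):                # [3][4], 1 injury
        ratio = human_miles_per_injury / waymo_miles
        print(f"{waymo_miles/1e6:.0f}M miles: {ratio:.1f}x human rate")
    # -> ~4.7x and ~2.4x, i.e. the roughly 2-4x range quoted above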
Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:
A. Their systems are not actually robustly 10x better than human drivers; Waymo's claims are incorrect or non-comparable.
B. There are child-specific risk factors that humans account for that Waymo does not that cause a 20-40x differential risk around children relative to normal Waymo driving.
C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.
That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as hard as possible, and the company handled all the things a human driver would have been expected to do in the same situation. It could have been a lot worse, and probably wouldn't have been any better with a human driver (I'm going to ignore, as no-signal, Waymo's models saying an attentive human driver would have done worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.
When I was a kid (age 12, or so), I got hit by a truck while crossing the road on my bike.
In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.
(That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)
Are you thinking of civil liability or criminal liability?
Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.
For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.
Waymo is going to make sure they are never criminally liable for anything, and even if they were, a criminal case against a corporation just ends up being a modest fine.
A person who hits a child, or anyone, in America, with no resulting injury, stands a roughly 0% chance of facing a judge as a consequence. Part of Waymo's research even shows that injury accidents are rarely reported to the police.
> Waymo said its robotaxi struck the child at six miles per hour, after braking “hard” from around 17 miles per hour. The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?
These systems don't discriminate on whether the object is a child. If an object enters the path of the vehicle, the lidar should spot it immediately and the car should brake.
Sorry, I was being oblique. Humans kill other humans with cars every day. They kill even more on Halloween. Let's start addressing that problem before worrying whether Waymos might someday decide it's OK to drive through ghosts.
Autonomous vehicles won't be perfect. They'll surely make different mistakes from the ones humans currently make. People will die who wouldn't have died at the hands of human drivers. But the overall number of mistakes will be smaller.
Suppose you could wave your magic wand and have a The Purge-style situation where AVs had a perfect safety record 364 days of the year, but for some reason had a tricky bug that caused them to run over tiny Spidermen and princesses on Halloween. The number of fatalities in the US would drop from 40,000 annually to 40. Would you wave that wand?
Waymo is not a machine, it is a corporation, and corporations can, in fact, be held accountable for decisions (and, perhaps more to the point, for defects in goods they manufacture, sell, distribute, and/or use to provide services).
Sure, but the companies building them are just shoving billions of dollars into their ears so they don't have to answer "who's responsible when it kills someone?"
What? No? The main selling point is eliminating costs for a human driver (by enabling people to safely do other things from their car, like answering emails or doomscrolling, or via robotaxis).
> They have to be, as a machine can not be held accountable for a decision
This logic applies equally to all cars, which are machines. Waymo has its decision makers one more step removed than human drivers. But it’s not a good axiom to base any theory of liability on.
It's hard to imagine how any driver could have reacted better in this situation.
The argument that questions "would a human be driving 17mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones, and human drivers routinely drive above 17mph (in some cases, over the typical 20mph or 25mph legal limit). It feels like, in deconstructing some of these incidents, critics imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident that they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.
And before the argument "self driving is acceptable so long as the accident/risk is lower than with human drivers" comes up, can I please get this out of the way: no, it's not. Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans, we will never accept it, because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or to bear personal liability. We accept the risks with humans because those humans accept risk. Self driving abstracts away the legal risk and removes the physical risk.
I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar systems gets a lot of press. This was a major incident with a happy ending. Those are quite rare, and the lethal ones rarer still.
As for more data, there is a chicken-and-egg problem. A phased rollout of Waymo over several years has revealed many potential issues but is also remarkable for the low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.
Tesla has some way to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe, there would be some numbers on that. Normal statistics in the US are ~17 deaths per 100K drivers per year, 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison, of course. But the bar for safety is pretty low as soon as you include human drivers.
Liability weighs higher for companies than safety. It's fine to them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for amounts of miles driven with and without autonomous, there's very little doubt that autonomous driving is already much safer. We can get more data at the price of more deaths by simply dragging out the testing phase.
Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the amount of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.
> ~17 deaths per 100K drivers per year. 40K+ fatalities overall.
I also think one needs to remember that those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are), I don't think it can be representative of the risks of driving in general.
Naturally, robotaxis will benefit from better infrastructure outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. fewer drunk drivers.
I am not sure. Self-driving is complex and involves the behavior of other, non-automated actors. This is not like a compression algorithm where things are easily testable and verifiable. If Waymos start behaving extra-oddly in school zones, it may lead to other accidents where drivers attempt to go around the "broken" Waymo and crash into it, other pedestrians, or other vehicles.
I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the number of disengagements (errors).
And we haven't reached the point where people start walking straight into the paths of cars, either obliviously or defiantly. https://www.youtube.com/shorts/nVEDebSuEUs
> I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
If Waymo is to be believed, they hit the kid at 6 mph and estimated that a human driver at full attention would have hit the kid at 14 mph; the Waymo was traveling at 17 mph. The situation of "kid running out between cars" will likely never be fully solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.
I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
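Rough numbers on that traction cap, assuming dry asphalt with a friction coefficient around 0.8 (both assumptions):

    # Friction limits deceleration to about mu * g no matter how
    # fast the computer reacts. Best-case stop from 17 mph:
    MU, G = 0.8, 9.81          # assumed tire/road friction, gravity
    v0 = 17 * 0.447            # 17 mph -> ~7.6 m/s
    d = v0 ** 2 / (2 * MU * G)
    print(f"{d:.1f} m")        # ~3.7 m (about 12 ft), with zero reaction time

So if the kid emerges less than about 12 feet ahead, some contact is physically unavoidable at 17 mph even with instant reaction; all the car can do is shed speed, which is what happened here.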
That doesn't mean it can't be solved. Don't drive faster than you can see: if you're driving 6 feet from a parked car, you can go slowly enough to stop even assuming a worst case of a sprinter waiting to leap out at every moment.
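You can put a number on "don't drive faster than you can see" by solving stopping distance = sight distance for speed. The friction coefficient and reaction time below are assumptions:

    import math

    MU, G, T_REACT = 0.8, 9.81, 0.3   # assumed friction, gravity, reaction
    A = MU * G

    def max_speed_mph(sight_m: float) -> float:
        # solve v*T_REACT + v^2/(2A) = sight_m for v (positive root)
        v = -A * T_REACT + math.sqrt((A * T_REACT) ** 2 + 2 * A * sight_m)
        return v / 0.447

    print(f"{max_speed_mph(6.0):.0f} mph")   # ~17 mph for 6 m of clear road

(Coincidentally close to the 17 mph the Waymo was doing; a darting sprinter arguably calls for more margin than this bare-stop calculation gives.)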
But with Waymos it would be possible. Mark those streets as "extremely slow" and never go there unless you are dropping someone off. (The computer has more patience than human drivers.)
If that's too annoying, then ban parking near school areas so the situation doesn't happen.
I don't know if you've been to some cities or neighborhoods but almost every street has on-street parking in many of them.
And why would you make Waymos go slower than human drivers, when it's the human drivers who have worse reaction times? I had interpreted the suggestion as applying to all drivers.
Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.
> The situation of "kid running out between cars" will likley never be solved
Nuanced disagreement (I agree with your physics): part of the issue is design. Kids run out between cars on streets that stack building -> yard -> sidewalk -> parked cars -> moving cars.
One simple change could be adding a chain-link fence or boundary between parked cars and moving cars, increasing visibility and reaction time.
There's still an inlet and outlet (kind of like hotel pickup/drop-off loops). It's not absolutely perfect, but it constrains the space where kids can dart out from behind every parked car down to two places.
Also the point isn't the specifics, the point is that the current design is not optimal, it's just the incumbent.
Ok, that's not really a simple change anymore, because you need more space for that. Unless it's really just a drop off queue, but then it's not parked cars, since a parked car blocks the queue.
We would really need to see the site to have an idea of the constraints, Santa Monica has some places where additional roadway can be accomodated and some places where that's not really an option.
In high parking-contention areas, I think there's enough latent demand for parking that you wouldn't observe fewer parked cars until you reduce demand by a much greater amount.
Orders of magnitude? Something like 100 people die on the road in the US each day. If self-driving tech could save 10 lives per day, that wouldn't be good enough?
"It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering if someone will take responsibility? Then that's not immediately standing out as an improvement just because fewer died. We can do better I think. The problem is simply one of responsibility.
If the current situation was every day 40 people die but blame is rarely assigned, would you recommend a change where an additional 10 people are going to die but someone will be held responsible for those deaths?
People don't usually go to jail. Unless the driver is drunk or there's some other level of provable criminal negligence (or someone actively trying to kill people by e.g. driving into a crowd of protesters they disagree with), it's just chalked up as an accident.
Apart from a minority of car related deaths resulting in jail time, what kind of person wants many more people to die just so they can point at someone to blame for it? At what point are such people the ones to blame for so many deaths themselves?
The driver cut in front of a person on an e-bike so fast they couldn't react, and hit them. Then, after the collision, the driver stepped on the accelerator and went over the sidewalk on the other side of the road, killing a 4-year-old. No charges were filed.
This driver will be back on the street right away.
>We accept the risks with humans because those humans accept risk.
It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?
I think a very good reason to want to know who's liable is because Google has not exactly shown itself to enthusiastically accept responsibility for harm it causes, and there is no guarantee Waymo will continue to be safe in the future.
In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.
> Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it
It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.
Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.
But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.
There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.
Who is liable when FSD is used? In Waymo's case, they own and operate the vehicle so obviously they are fully liable.
But in a human driver with FSD on, are they liable if FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail but I would be at fault. In other words, I trust my own driving more than the FSD (I could be right or wrong, but I think most people will feel the same way).
I believe Mercedes is the only consumer car manufacturer that is advertising an SAE Level 3 system. My understanding is that L3 is where the manufacturer says you can take your attention off the road while the system is active, so they're assuming liability.
I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymo did anything "bad" here.
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.
Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
That's pretty hyperbolic. At less than 20 mph, car vs. pedestrian is unlikely to result in death. IIHS says [1], in an article about other things:
> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries
Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized. Still, it is valuable to consider the situation (when we have sufficient information) and test Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate given the context of a busy school dropoff: many human drivers are extra cautious in that context and may never have reached that speed; depending on the end-to-end route, some would have avoided the street with the school altogether at that time of day, etc. It certainly seems like a good result for the premise "child unexpectedly appears from between large parked vehicles", but maybe that appearance should have been expected.
For me, the policy question I want answered is this: if this were a human driver, we would have a clear person to sue for liability and damages. For a computer, who is ultimately responsible when someone sues for compensation? Is it the company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs just to bring suit, whereas a private driver would lean on their insurance.
Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premium go up. Waymo can weather a LOT of that. Business is still good. Thus, poor financial feedback loop. No real skin in the game.
John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.
> John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
John probably (at least where I live) does not have insurance. Maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care either: they aren't going to throw him in jail or even take away his license (if he has one), and they aren't even going to impound his car.
Honestly, I'd much rather be hit by a Waymo than John.
I see. Thank you for sharing. Insurance is mandatory here for all motorists.
If you are hit by an underinsured driver, the government steps in and additional underinsured motorist protection (e.g. hit by an out of province/country motorist) is available to all and not expensive.
Jail time for an at-fault driver here is very uncommon but can be applied if serious injury or death results from a driver's conduct. This is quite conceivable with humans or AI, IMO. Who will face jail time as a human driver would in the same scenario?
Hit and run, leaving the scene, is also a criminal offence with potential jail time that a human motorist faces. You would hope this is unlikely with AI, but if it happens a small percentage of the time, who at Waymo faces jail as a human driver would?
I'm talking about edge cases here, not the usual fender bender. But this thread was about policy/regs and that needs to consider crazy edge cases before there are tens of millions of AI drivers on the road.
Insurance here is also mandatory for all motorists. Doesn't matter if the rules aren't actually enforced.
Waymo has deep pockets, so everyone is going to try and sue them, even if they don't have a legitimate grievance. Where I live, the city/state would totally milk each incident from a BigCo for all it was worth. "Hit and run" by a drunk waymo? The state is just salivating thinking about the possibility.
I don't agree with you that BigCorp doesn't have any skin in the game. They are basically playing the game in a bikini.
>John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
Ah great, so there's a lower chance of that specific John Smith hitting me again in the future!
The general deterrence effect we observe in society is that punishment of one person has an effect on others who observe it, making them more cautious and less likely to offend.
So you're worried that instead of facing off against an insurance company, the plaintiff would be facing off against a private company? Doesn't seem like a huge difference to me.
Is there actually any difference? I'd have thought that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.
No, the fact is that the child sustained minor injuries. And, fact: no human driver made the decision to drive a vehicle at that exact position and velocity. Imagining a human-driven vehicle in the same place is certainly valid, but your imagination is not fact. I imagine the kid would be better off if no vehicle had been there at all. But that's not a fact, that's an interpretation; perhaps the kid would have ended up dead under an entirely different tire if they hadn't been hit by the Waymo!
Disagree. Most human drivers would notice they are near an elementary school with kids coming and going and a crossing guard present, and would be driving very carefully near blocked sight lines.
Better reporting would have asked real people for the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it's under https://www.nhtsa.gov/search-safety-issues
"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."
We're getting into hypotheticals but i will say in general i much much prefer being around Waymos/Zooxs/etc. than humans when riding a bicycle.
We're impatient emotional creatures. Sometimes when I'm on a bike the bike lane merges onto the road for a stretch, no choice but to take up a lane. I've had people accelerate behind me and screech the tyres, stopping just short of my back wheel in a threatening manner which they then did repeatedly as i ride the short distance in the lane before the bike lane re-opens.
To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like i describe above. It also disregards that every time i see an automated taxi it seems to drive on the cautious side already.
Give me the unemotional, infinite patience, drives very much on the cautious side automatic taxi over humans any day.
Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.
That would be one hell of a convoluted route to avoid school zones. I wonder if it would even be possible for a large majority of routes, especially in residential areas.
It might not be possible for a lot of places — I don’t really know.
But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.
Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)
Most people prefer the shortest ride. Circling around school zones would be the opposite of that. Rides are charged based on distance, so maybe this would interest Waymo, but one of the big complaints about taxi drivers was how drivers would "take them for a ride" to increase the fare.
Seems like a solvable problem: make it clear on the app/interior car screens that a school zone is being avoided — I think most riders will understand this.
You also have to drive much more slowly in a school zone than you do on other routes, so depending on the detour, it may not even be that much longer of a drive.
At worst, maybe Waymo eats the cost difference involved in choosing a more expensive route. This certainly hits the bottom line, but there’s certainly also a business and reputational cost from “child hit by Waymo in school zone” in the headlines.
If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or that I am better than a drunk driver does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate - probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that waymo gets a pass.
Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think wtf that's too fast. It's one thing when there are no cars around. But if you've got cars or people around, the appropriate speed changes. Let's audit waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.
The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.
Again, it's not like every driver will consciously think through these hidden-pedestrian scenarios, but many drivers will (even the bad ones).
You are vastly overestimating most drivers. Most drivers aren't even looking out the window the majority of their time driving.
> A human brain automatically thinks of all the scenarios
I don't think this is true. There are infinitely many scenarios in a complex situation like a road with traffic, cars parked, pedestrians about, weather, etc. My brain might be able to quickly assess a handful, but certainly not all.
> A human brain automatically thinks of all the scenarios, ...
Patently, obviously false. A human brain will automatically think of SOME scenarios. For instance, if a collision seems imminent, and the driver is holding a cup of coffee, these ideas are likely to occur to the driver:
IF I GRAB THE STEERING WHEEL AND BRAKE HARD, I MIGHT NOT HIT THAT PEDESTRIAN IN FRONT OF ME.
IF I DON'T CONTINUE HOLDING THE COFFEE CAREFULLY, I MIGHT GET SCALDED.
THIS SONG ON MY RADIO IS REALLY ROCKING!
IF I YANK MY WHEEL TO THE LEFT, I MIGHT HIT A CAR INSTEAD OF A HUMAN.
IF I BRAKE HARD OR SWERVE AT ANY TIME IN TRAFFIC, I CAN CAUSE AN ACCIDENT.
Experiments with callosal patients (who have damaged the connective bridge between the halves of their brains) demonstrate that this is a realistic picture of how the brain makes decisions. It offers up a set of possible actions, and attempts to choose the optimal one and discard all others.
A computer program would do likewise, EXCEPT it won't care about the coffee cup nor the radio (remove two bad choices from consideration).
It still has one bad choice (do nothing), but the SNR is much improved.
I'm not being hyperbolic; self-preservation (focusing on keeping that coffee in my hand) is a vital factor in decision-making for a human.
> ...where Waymo has pre-programmed ones (and some NN based ones).
Yes. And as time goes on, more and better-refined scenarios will be added to its programming. Eventually, it's reasonable to believe the car software will constantly reassess how many humans are within HUMAN_RUN_DISTANCE + CAR_TRAVEL_DISTANCE in the next block, and begin tracking any that are within an unsafe margin. No human on Earth does that, continually, without fail.
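A toy version of that loop; the names and numbers are hypothetical, as above:

    # Track any pedestrian who could reach our path before we pass
    # them: constant vigilance a human can't sustain, a loop can.
    HUMAN_RUN_SPEED = 4.0    # m/s, assumed fast child

    def needs_tracking(ahead_m: float, lateral_m: float, car_ms: float) -> bool:
        time_to_pass = ahead_m / max(car_ms, 0.1)
        return lateral_m <= HUMAN_RUN_SPEED * time_to_pass

    # Kid on the sidewalk 3 m to the side, 20 m ahead, car at 17 mph:
    print(needs_tracking(20.0, 3.0, 7.6))   # True -> slow down / add margin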
> Does a human brain carry some worry that they suddenly decide to run and try to cross the street after the dumpster? Does Waymo carry that worry or just continue to drive at the exact same speed.
You continue to imply that Waymo cannot ever improve on its current programming. Does it currently consider this situation? Probably not. Will it? Probably.
For what it's worth, that kind of lumping of drivers is more-or-less one of the metrics Waymo is using to self-evaluate. Perfect safety when multi-ton vehicles share space with sub-300-pound humans is impossible. But they ultimately seek to do better than humans in all contexts.
In this situation, the car was already driving under the legal speed limit for a school zone (25 mph when children are present).
I think any fair evaluation of this (once the data was available) would conclude that Waymo was taking reasonable precautions.
According to the article the car was traveling at 17 miles an hour before it began braking. Presumably this was in a 25 mph school zone, so it seems the Waymo was already doing exactly what you describe - slowing down preemptively.
This is close to a particular peeve I have. Occasionally I see signs on the street that say "Slow Down". I'm not talking about the electronic ones connected to radar detectors. Just metal and paint.
Here's my problem. If you follow the instructions on the sign, it still says to slow down. There's no threshold for slow enough. No matter how slow you're going, the sign says "Slow Down". So once you become ensnared in the visual cone of this sign, you'll be forced to sit stationary for all eternity.
But maybe there's a loophole. It doesn't say how fast you must decelerate. So if you come into the zone going fast enough, and decelerate slowly enough, you can make it past the sign with some remaining non-zero momentum.
You know, I've never been diagnosed on the spectrum, but I have some of the tendencies. lol.
Obviously a static sign is not aware of your current state, so its message can only be interpreted as relative to your likely state, i.e. the posted speed limit.
A lot of clickbait headlines have the same problem. "You're using too much washing powder!"
Everyone's replying to you as if you truly don't understand the sign's intention but I'm sure you do. It's just annoying to be doing everything right and the signs and headlines are still telling you you're wrong.
There was a driving safety safety ad campaign here: "Drive to the conditions. If they change, reduce your speed." You can imagine how slow we'd all be going if the weather kept changing.
Think of it like they're saying "my children play on this street and my neighbors walk here. Please think about that when you decide how fast to go here."
Think of the sign as a flag, not an instruction.
A 25mph school zone? That seems fast. 15mph would be more the norm, which is in line with the 17mph the car believed itself to be traveling.
FYI, unless you are a commercial truck, a cop, or a racer, your speedometer will read slightly fast, sometimes by as much as 5 to 10%. This is normal practice for carmakers, as it limits manufacturer liability. You can check this using an independent GPS, i.e. not an in-dash unit. (Just imagine the court cases if a speedo read slower than the actual speed and you can understand why this started.)
I mostly see 25 mph for school zones, though I'm in NC. Checking California, it sounds like 25 is standard there as well.[0] Some will drop to 15, but 25 is the norm as far as I can find.
[0] https://www.dmv.ca.gov/portal/handbook/california-driver-han...
I've lived all over California and I agree that 25mph is the norm here.
Edit: However, elsewhere in the thread someone linked this Streetview image that shows that this particular school zone is 15mph: https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac
Also, run a different wheel diameter than the one the speedometer was calibrated for, and you will see a larger gap between actual velocity and the speedometer reading. The odometer will also not record the actual distance traveled.
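The wheel-diameter effect is simple proportionality, since the speedometer really measures wheel revolutions. Illustrative numbers only:

    # Speed inferred from wheel RPM scales with rolling diameter.
    calibrated_d = 0.66   # m, diameter assumed at calibration
    actual_d = 0.63       # m, worn or smaller tire (hypothetical)
    indicated_mph = 17.0
    actual_mph = indicated_mph * actual_d / calibrated_d
    print(f"{actual_mph:.1f} mph actual")   # ~16.2 mph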
In Encinitas, California, that sign would have no more than 20 MPH. In adjacent Carlsbad, I believe 25 is normal.
It was going 17 mph. That is rather slow.
To put it another way: if an autonomous vehicle has a reaction time of 0.3 seconds, the stopping distance from 17 mph is about the same as for a fully alert human driver (1 second reaction time) driving 10.33 mph.
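That equivalence checks out under plausible assumptions; the parent doesn't state its deceleration figure, so the mu = 0.7 below is a reconstruction:

    MU, G = 0.7, 9.81   # assumed friction coefficient and gravity
    A = MU * G

    def stop_dist_m(mph: float, t_react_s: float) -> float:
        v = mph * 0.447                       # mph -> m/s
        return v * t_react_s + v ** 2 / (2 * A)

    print(f"{stop_dist_m(17.0, 0.3):.1f} m")    # AV:    ~6.5 m
    print(f"{stop_dist_m(10.33, 1.0):.1f} m")   # human: ~6.2 m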
I drive like this too, but I think we’re a small minority. Especially here in LA.
There's a bus stop right behind my house. I routinely hear the driver honking and yelling at people who ignore when the stop sign is extended (which is a misdemeanor in my state). So forgive me for not assuming a human would have done better.
Are they not using a ton of ML to take exactly this sort of context into account?
Precisely. Environmental context is not considered in Waymo's "peer-reviewed model" (I encourage reflexive commenters to first read it: https://waymo.com/safety/collision-avoidance-benchmarking), only basic driver behavior and traffic signal timings.
Note the weaselly "immediately detected the individual as soon as they began to emerge" in the puff piece from Waymo Comms. No indication that they intend to account for environmental context going forward.
If they already do this, why isn't it factored in the model?
How is "immediately detected the individual as soon as they began to emerge" worded weaselly?
I think my problem is that it reacted after seeing the child step out from behind the SUV.
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively": look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag; slow the fuck down.)
First, it's still the automobile's fault.
At least it had already slowed to 17 mph to start. Remember that viral video of an Australian in a pickup ragdolling a girl across the road? Almost every comment is "well, he was going the speed limit, no fault for him!" No, asshole, you hit someone. It's your fault. He got zero charges and the girl was seriously injured.
You seem to be implying that there are no circumstances in which a vehicle can hit a pedestrian and the driver not be at fault... which is absurd.
No it's not. The same principle applies to rules of right of way on the water. Technically the 32 foot sailboat has right of way over a triple-E because the triple-E uses mechanical propulsion.
You have a responsibility to be cautious in heavy equipment no matter what the signage on the road says, and that includes keeping a speed at which you can stop safely if a person suddenly steps onto the road in situations where people are around. If you are driving past a busy bar in downtown, a drunk person might step out and you have a responsibility to assume that might happen. If you have to go slower sometimes, tough.
As an aside, because it would not be germane to automotive safety…
In the Coast Guard Auxiliary “Sailing and Seamanship” class that I attended, targeting would-be sailboat skippers, we were told the USS Ranger nuclear-powered aircraft carrier had the right-of-way.
And if a pedestrian jumps from a bridge to land right in front of you? or how about a passenger jumps of out the car next to you? still going to stand on your absolute?
You mean the Aussie one where the guy was going an appropriate speed for the area and when the cops arrived the parents and their neighbors LIED TO THE POLICE and said he was hooning down the road at excess speed and hit the kid? And that he was only saved from prison by having a dash cam that proved the lies to be lies? That one?
That logic is utter bs, if someone jumps out when you're travelling at an appropriate speed and you do your best to stop then that's all that can be done. Otherwise by your logic the only safe speed is 0.
That’s not how fault works
Aye, and to always look for feet under and by the front wheel of vehicles like that.
Stopped buses are similar: people get off the bus, whip around the front, and walk straight into the street. So many times I've spotted someone's feet under the front of a bus before they came around into the street.
Not to take away from Waymo here; I agree with the thread sentiment that they seem to have handled this in an exemplary way.
You can spot someone's feet under the width of a bus when they're on the opposite side of the bus and you're sitting in a vehicle at a much higher position on the opposite side that the bus is on? That's physically impossible.
In normal (traditional?) European city cars, yes, I look for feet or shadows or other signs that there is a person in the other side. In SUVs this is largely impossible but then sometimes you can see heads or backpacks.
Or you look for reflections in the cars parked around it. This is what I was taught as “defensive“ driving.
I think you're missing something though, which I've observed from reading these comments - HN commenters aren't ordinary humans, they're super-humans with cosmic powers of awareness, visibility, reactions and judgement.
I don't see how that's feasible without introducing a lot of friction.
Near my house, almost the entire trip from the freeway to my house is via a single lane with parked cars on the side. I would have to drive 10 MPH the entire way (speed limit is 25, so 2.5x as long).
It's hard to consider it "lots of friction" in a vehicle where you press a button to go faster and another button to slow down.
A single lane residential street with zero visibility seems like an obvious time to slow down. And that's what the Waymo did.
That's why the speed limit is 25 (lower when children are present in some areas) and not 35 or 40 etc. It's not reasonable to expect people to drive at 40% of the posted speed limit the entire way. We're also not talking about zero visibility (e.g. heavy fog). We're talking about blind spots behind parked cars, which in dense areas of a city is a large part of the city. If we think as a society in those situations the safe speed is 10 mph, then the speed limit should be 10mph.
I mean, you are putting your finger right on the answer: the whole car thing doesn't work or make sense, and trying to make autonomous vehicles solve the unsolvable is never going to succeed.
>reacted after seeing the child step out from behind the SUV.
Lmao, most drivers I see on the roads aren't even capable of slowing down for a pedestrian crossing when their view of the second half of the crossing is blocked by traffic (i.e. they cannot see if someone is about to step out, especially a child).
Humans are utterly terrible drivers.
They don't even stop when it's a crosswalk with a flashing light system installed and there are no obstructions.
This is generally the problem with self-driving cars, at least in my experience (Tesla FSD).
They don't look far enough ahead to anticipate what might happen and already put themselves in a position to prepare for that possibility. I'm not sure they benefit from accumulated knowledge? (Maybe Waymo does, that's an interesting question.) I.e., I know that my son's elementary school is around the corner so as I turn I'm already anticipating the school zone (that starts a block away) rather than only detecting it once I've made the turn.
Yes and no. There are tons of situations where this is simply not possible: all the traffic going the full allowed speed next to a row of parked cars. If somebody distracted unexpectedly pops out, a tragedy is guaranteed regardless of the driver's skill and experience.
In low traffic, of course, it can be different. But it's unrealistic to expect anybody to drive in the expectation that behind every single car they pass there may be a child about to jump right in front of them. That can easily be thousands of cars, every day, for a whole life. Impossible.
We don't read about the 99.9% of cases where even a semi-decent driver handles it safely; the rare cases make the news.
I slow down considerably near parked cars. And I try to slow down much earlier approaching intersections where there are parked cars blocking my view of cross walk entries. I need to be able to come to full stop earlier than intersection if there happens to be a pedestrian there.
I kind of drive that way. I slow down, move as far away in my lane from the parked cars as possible. It's certainly what I would expect from a machine that would claim to be as good as the best human driver.
> a machine that would claim to be as good as the best human driver.
Does Waymo claim that? If so I haven't seen it. That should of course be the goal, but "better than the average human driver" should be the bar.
Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls with Waymos driving on the wrong side of the road recently, and I had a Waymo cut in front of my car at a one-way stop (t intersection) recently when it had been tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS brake to avoid an accident.
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
Certainly, I'm not against constructive criticism of Waymo. I just think it's important to consider the counterfactual. You're right too that an especially prudent human driver may have avoided the scenario altogether, and Waymo should strive to be that defensive.
> I'm not against constructive criticism of Waymo.
I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
> I just think it's important to consider the counterfactual
More than 50% of roadway fatalities involve drugs or alcohol. If you want to spend your efforts improving safety _anywhere_ it's right here. Self driving cars do not stand a chance of improving outcomes as much as sensible policy does. Europe leads the US here by a wide margin.
> I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
Yes, and I find it annoying that some people do seem to think Waymo should never be criticized. That said, we already have an astounding amount of data, and that data clearly shows that the experiment is successful in reducing crashes. Waymos are absolutely, without question already making streets safer than if humans were driving those cars.
> If you want to spend your efforts improving safety _anywhere_ it's right here.
We can and should do both. And as your comment seems to imply but does not explicitly state, we should also improve road design to be safer, which Europe absolutely kicks America's ass on.
> and that data clearly shows that the experiment is successful in reducing crashes
I disagree. You need way more data, like orders of magnitude more. There are trillions of miles driven in the US every year. Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
> without question
There are _tons_ of questions. This is not a simple problem. I cannot understand this confidence. It's far too eager or hopeful.
> We can and should do both
Well Google is operating Waymo and "we" control road policy. One of these things we can act on today and the other relies on huge amounts of investments paying off in scenarios that haven't even been tested successfully yet. I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
> More than 50% of roadway fatalities involve drugs or alcohol. If you want to spend your efforts improving safety _anywhere_ it's right here. Self driving cars do not stand a chance of improving outcomes as much as sensible policy does. Europe leads the US here by a wide margin.
Could you spell out exactly what "sensible" policy changes you were thinking of? Driving under the influence of drugs and/or alcohol is already illegal in every state. Are you advocating for drastically more severe enforcement, regardless of which race the person driving is, or what it does to the national prison population? Or perhaps for "improved transit access", which is a nice idea, but will take many decades to make a real difference?
>Driving under the influence of drugs and/or alcohol is already illegal in every state.
FWIW, your first OWI in Wisconsin, with no aggravating factors, is a civil offense, not a crime, and in most states it is rare to do any time or completely lose your license for the first offense. I'm not sure exactly what OP is getting at, but DUI/OWI limits and enforcement are pretty lax in the US compared to other countries. Our standard .08 BAC limit is a lot higher than many other countries.
Absolutely, I can tell you right now that many human drivers are probably safer than the Waymo, because they would have slowed down even more and/or stayed further from the parked cars outside a school; they might have even seen the kid earlier in e.g. a reflection than the Waymo could see.
It seems it was driving pretty slow (17MPH) and they do tend to put in a pretty big gap to the right side when they can.
There are kinds of human sensing that are better when humans are maximally attentive (seeing through windows/reflections). But there's also the seeing-in-all-directions, radar, superhuman reaction time, etc, on the side of the Waymo.
And the fact that Waymo is never drunk/high/tired/texting, which an astounding portion of human drivers are.
I usually take extra care when going through a school zone, especially when I see some obstruction ("behind a tall SUV"; was the Waymo overtaking?), and overtaking is something I would probably never do there (it should be banned in school zones by road signs).
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
> especially when I see some obstruction ('behind a tall SUV', was the waymo overtaking?)
Yep. Driving safe isn't just about paying attention to what you can see, but also paying attention to what you can't see. Being always vigilant and aware of things like "I can't see behind that truck."
Honestly I don't think sensor-first approaches are cut out to tackle this; it probably requires something more akin to AGI, to allow inferring possible risks from incomplete or absent data.
I appreciate your sensible driving, but here in the UK, roads outside schools are complete mayhem at dropping off/picking up times. Speeding, overtaking, wild manoeuvres to turn round etc.
When reading the article, my first thought was that only going at 17mph was due to it being a robotaxi whereas UK drivers tend to be strongly opposed to 20mph speed limits outside schools.
Most US states cap speed limits around schools at 15mph when children are present. There may also be blinking lights above these signs during times when children are likely to be present.
I'm not sure how much of that Waymo's cars take into account, as the law technically takes into account line of sight things that a person could see but Waymo's sensors might not, such as children present on a sidewalk.
> Most US states cap speed limits around schools at 15mph when children are present.
Are you sure? The ones I've seen have usually been 20 or 25mph.
Looking on Image Search (https://www.google.com/search?q=school+zone+speed+limit+sign) and limiting just to the ones that are photos of real signs by the side of the road, the first 10 are: 25, 30, 25, 20, 35, 15, 20, 55, 20, 20. So only one of these was 15.
I don't think this is accurate, 20mph is more common
School pick up and drop off traffic is just about the worst drivers anywhere. Like visibly worse than a bunch of "probably a little drunk" people leaving a sports stadium. It's like everyone reverts to "sixteen year old on first day behind the wheel" behavior. It's baffling. And there's always one token dad picking up his kid on a motorcycle or in a box truck or something that they all clutch their pearls at.
A human driver in a school zone during morning drop off would be scanning the sidewalks and paying attention to children that disappear behind a double parked suv or car in the first place, no?
As described by the nhtsa brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" means that waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
I live near a school zone in LA and most drivers do not obey school zone speed limits.
You will get honked at by aggro drivers if you slow down to the school zone speed limit of 25mph. Most cars go 40ish.
And ofc a decent chunk of those drivers are on tiktok, tinder, Instagram, etc
Some human drivers? Yes, certainly.
Your median human driver? Sadly, I think not. Most would be rushing, or distracted, or careless.
> waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
I don't think we can say at all that the Waymo was driving recklessly with the data we currently have
> It's likely that a fully-attentive human driver would have done worse.
> a huge portion of human drivers
What are you basing any of these blind assertions off of? They are not at all born out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line but here on Hacker News you should absolutely challenge that. In particular because it's very far out of line with real world data provided by the government.
If you have contradicting data I'd be glad to see it
>It's likely that a fully-attentive human driver would have done worse.
Is based on the source I gave in my comment, the peer-reviewed model
> a huge portion of human drivers
Is based on my experience and bits of data like 30% of fatal accidents involving alcohol
Like I said, if you have better data I'm glad to see it
> based on my experience
The data completely disagrees with you.
> Like I said, if you have better data I'm glad to see it
We all have better data. It's been here the entire time:
https://www.nhtsa.gov/research-data/fatality-analysis-report...
"fully attentive human driver ..." is Waymo's claim, and it could be biased in their favor.
Could be! In aggregate though, Waymos have shown to be safer than human drivers, so my prior is that that holds here.
who benefits from a statement like this?
People who reflexively assume a human driver would do better
It's possible, but likely is a heavy assertion. It's also possible a human driver would have been more aware of children being present on the sidewalk and would have approached more cautiously given obstructed views.
Please please remember that any data from Waymo will inherently support their position and can not be taken at face value. They have significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
I remember someone using similar language when Uber self driving killed someone - and when the video was released, it was laughable.
It is also crazy that this happened 6 days ago at this point and video was NOT part of the press releases. LOL
LOL
I wonder if that is a "fully attentive human driver who drove exactly the same as the Waymo up until the point the child appeared"?
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
I do the same, and try to actively anticipate and avoid situations like this. Sadly, in my experience most drivers instead fixate on getting to their destination as fast as possible.
A fully attentive human would've known he was near a school and wouldn't have been driving at 17 mph to begin with.
Doubt
You clearly don't spend much time around a school measuring the speed of cars. Head on down and see for yourself how often or not a human driver goes >17mph in such a situation.
Waymo is intentionally leaving out the following details:
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's, still it's not being included in their "peer-reviewed model." That comparison is intentionally comparing Waymo performance against the oldest vehicles on the road -- not the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
The UK driving theory test has a part called Hazard Perception: not reacting to children milling around would be considered a fail.
[0] https://www.safedrivingforlife.info/free-practice-tests/haza...
Many states in the US have the Basic Speed Law, e.g. California:
> No person shall drive a vehicle upon a highway at a speed greater than is reasonable or prudent having due regard for weather, visibility, the traffic on, and the surface and width of, the highway, and in no event at a speed which endangers the safety of persons or property.
The speed limit isn't supposed to be a carte blanche to drive at that speed no matter what; the basic speed law is supposed to "win." In practice, enforcement is a lot more clear cut at the posted speed limit and officers don't want to write tickets that are hard to argue in court.
That law seems more likely to assign blame to drivers if they hit someone. So practically it's not enforced but in accidents it becomes a justification for assigning fault.
I mean yeah. If you were traveling at some speed and caused damage to persons or property, that's reasonable, but refutable, evidence that you were traveling at a speed that endangered persons or property.
And at the same time, if you were traveling at some speed and no damage was caused, it's harder to say that persons or property were endangered.
Exactly. That’s why I’ve always said that driving is a truly AGI-requiring activity. It’s not just about sensors and speed limits and feedback loops. It’s about having a true understanding of everything that’s happening around you:
Having an understanding of the density and makeup of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment, though large, will do no damage and doesn’t justify a swerve.
Getting into the mind of the driver in front of you, by seeing subtle hints that they’re looking down, and recognizing that they’re not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they’re not quite there yet.
Or in this case, perhaps hearing the sounds of children playing, recognizing that it’s 3:20 PM, and that school is out, other cars, double parked as you mentioned, all screaming instantly to a human driver to be extremely cautious and kids could be jumping out from anywhere.
How many human drivers do you think would pass the bar you're setting?
IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
The bar is very high because humans expect machines to be perfect. As for the expectation of other humans, "pobody's nerfect!"
> How many human drivers do you think would pass the bar you're setting?
How many humans drivers would pass it, and what proportion of the time? Even the best drivers do not constantly maintain peak vigilance, because they are human.
> IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
In practice, this isn't reasonable, because "hey we're slightly better than a population that includes the drunks, the inattentive, and the infirm" is not going to win public trust. And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
I think "better than the average performance of a 75th or 90th percentile human driver" might be a good way to look at things.
It's going to be a weird thing, because odds are the distribution of accidents that do happen won't look much like human ones. It will have superhuman saves (like that scooter one), but it will also crash in situations that we can't really picture humans doing.
I'm reminded of airbags; even first generation airbags made things much safer overall, but they occasionally decapitated a short person or child in a 5MPH parking lot fender bender. This was hard for the public to stomach, and if it's your kid who is internally decapitated by the airbag in a small accident, I don't think you'll really accept "it's safer on average to have an airbag!"
> In practice, this isn't reasonable, because "hey we're slightly better than a population that includes the drunks, the inattentive, and the infirm" is not going to win public trust.
Sadly, you're right, but as rational people, we can acknowledge that it should. I care about reducing injuries and deaths, and the %tile of human performance needed for that is probably something like 30%ile. It's definitely well below 75%ile.
The counterpoint, though:
> > And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
It's only if you get the habitually drunk (a group that is overall impoverished), the very old, etc, to ride Waymo that you reap this benefit. And they're probably not early adopters.
Uber and Lyft were supported by police departments because they reduced drunk driving. Drunk driving isn't just impoverished alcoholics. People go to bars and parties and get drunk all the time.
You also solve for people texting (or otherwise using their phones) while driving, which is pretty common among young, tech-adopting people.
> Drunk driving isn't just impoverished alcoholics. People go to bars and parties and get drunk all the time
Yes, but the drivers who are 5th percentile drivers who cause a huge share of the most severe accidents are "special" in various ways. Most of them are probably not autonomy early adopters.
The guy who decided to drive on the wrong side of a double yellow on a windy mountain road and hit our family car in a probable suicide attempt was not going to replace that trip with Waymo.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast
Hey, I'd agree with this -- and it's worth noting that over a fixed braking distance the speed shed satisfies v0^2 - v1^2 = 2ad, so since 17^2 - 5^2 > 16^2, even 1MPH slower would likely have resulted in no contact in this scenario.
But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
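Here's a quick way to sanity-check that, taking the reported numbers at face value (17 mph in, ~5 mph at contact) and assuming the robotaxi's reaction delay was negligible, so the braking distance is fixed:

    import math

    # Only assumption: the car shed speed from 17 mph to ~5 mph over some
    # fixed braking distance d, so the amount of v^2 that braking removes
    # over d (i.e. 2*a*d) is 17^2 - 5^2 = 264, regardless of starting speed.
    SHED_MPH2 = 17**2 - 5**2

    def contact_speed_mph(initial_mph: float) -> float:
        """v1 = sqrt(v0^2 - 2*a*d), floored at 0 (stopped before contact)."""
        remaining = initial_mph**2 - SHED_MPH2
        return math.sqrt(remaining) if remaining > 0 else 0.0

    for v0 in (17, 16):
        print(f"start at {v0} mph -> contact at {contact_speed_mph(v0):.1f} mph")
    # start at 17 mph -> contact at 5.0 mph
    # start at 16 mph -> contact at 0.0 mph (stops just short)

(A human's reaction delay eats into that braking distance, which is where the 12MPH figure above comes from.)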
Swedish schools still have students who walk there. I live near one and there are very few cars that exceed 20km/h during rush hours. Anything faster is reckless even if the max over here is 30 km/h (19 mph).
The schools I'm thinking of have sidewalks with some degree of protection/offset from street, and the crossings are protected by human crossing guards during times when students are going to schools. The posted limits are "25 (MPH) When Children Are Present" and traffic generally moves at 20MPH during most of those times.
There are definitely times and situation where the right speed is 7MPH and even that feels "fast", though, too.
Whoa! You're allowed to double park outside a school over there?!
No (excluding some circumstances like delivery vehicles).
Wait, is double parking allowed anywhere?
Pretty common at airports; of course, the `parking` only lasts a few minutes at most.
It’s common but almost always illegal based on the posted signage.
People loitering in their cars waiting for a space to pick up their kid. So not actually parked.
More like standing, and quite common in a school zone.
I would not race at 17 MPH through such an area. Of course, Waymo will find a way to describe themselves as the heroes of this situation.
AV’s with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits and when there are kids about people may go even slower. At the same time, the general rule in CA for school zone is 25 mph. Clearly the car had some level of caution which is good.
For me it would be interesting to know if 17 mi/h was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather, were there cars parked which would make a defensive driver slow down even more?
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
17 mph is pretty slow unless it’s a school zone
Indeed, 15 or 25 mph (24 or 40 km/h) are the speed limits in school zones (when in effect) in CA, for reference. But depending on the general movement and density and category of pedestrians around the road it could be practically reckless to drive that fast (or slow).
If my experience driving through a school zone on my way to work is anything to go off of, I rarely see people actually respecting it. 17 mph would be a major improvement over what I'm used to seeing.
Take that particular Waymo car off the road. Seems absurd, but they still hit someone.
The car is not the problem. The problem is the intersection of human and machine operating independently of each other with conflicting intention.
I am personally a fan of entirely automated but slow traffic. 10mph limit with zero traffic is fast enough for any metro area.
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
The general public is stupid.
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
You ARE the general public. _I_ am the general public.
So if they were 100% self-sufficient and understood everything they'd be smart enough to interpret a child being hit at 6 mph as progress? Fun how "general public" is always a "they" vs "you".
That's impossible though. And you and I are part of the general public as well, for things we don't understand.
It isn't me vs them. It is just me being self-aware. Clearly, you had a problem with what I said so I must have struck a nerve.
Welcome to the real world bro.
Your comment sounds like subconsciously you're trying to come off as stronger than the general public, which begs the question: Why? Why do you need to prove your strength over the populace?
They are being very transparent about it.
As every company should, when they have a success. Are they also as transparent about their failures?
How is hitting a child not a failure? And actually, how can you call this a success? Do you think this was a GTA side mission?
Immediately hitting the brakes when a child suddenly appears in front of you, instead of waiting 500ms like a human, and thereby hitting the child at 6 mph instead of 14 mph, is a success.
What else do you expect them to do? Only run in grade-separated areas that children can't access? Blare sirens so children get scared away from roads? Shouldn't human-driven cars do the same thing, then?
I don't know the implementation details, but success would be not hitting pedestrians. You have some interesting ideas on how to achieve that but there might be other ways, I don't know.
>I don't know the implementation details, but success would be not hitting pedestrians.
So by that logic, if we cured cancer but the treatment came with terrible side effects, it wouldn't be considered a "success"? Does everything have to be perfect to be a success?
If you clearly define your goals in advance, then you can make success whatever you want. What are Waymo's goals?
Something tells me it wasn't 0 accidents, given that it's impossible.
17 mph is way too fast near a school if it's around the time children are getting out (or in).
The limit is 20 MPH in Washington state; in California the default is 25 MPH, but it is going to 20 MPH soon and can be further lowered to 15 MPH with special considerations.
The real killer here is the crazy American on-street parking, which limits visibility of both pedestrians and oncoming vehicles. Every school should be a no-street-parking zone. But parents are going to whine that they can't load and unload their kids close to the school.
On street parking is so ingrained into the American lifestyle that any change to the status quo is impossible. Cars have more rights on public property than people. Every suburban neighborhood has conflicts over people's imagined "ownership" of the street parking in front of their house. People rarely use their garages to store their car since they can just leave it on the street. There are often laws that prevent people from other neighborhoods from using the public street to park. New roads are paved as wide as possible to allow both street parking and a double-parked car to not impede traffic. And we've started building homes without any kind of parking that force people to use the street.
> On street parking is so ingrained into the American lifestyle that any change to the status quo is impossible
Plenty of American cities regulate or even eliminated, in various measures, on-street parking.
Europe is much better at this than we are. Even when you have on street parking, they make sure there are clearances around cross walks and places where there are lots of pedestrians. Most US cities don't even care, even a supposedly pedestrian friendly one like Seattle.
If it had no parking, then the parents would be parked somewhere else and loading and unloading their kids there, and then that would need to be a no-parking zone too.
I guess you could keep doing that until kids just walk to and from school?
Our local school has them unload a block away unless they are handicapped. A kid isn't going to die walking a block. But it's pointless, because they still allow residential on-street parking around the school, and my son has to use a crosswalk where cars routinely park so close to it that I had to tell him the (pretty heavy) traffic on the road wouldn't see him easily, and that he should always ease his way into a crosswalk and not assume he would be easily seen.
In the UK we have a great big yellow zig-zag road marking that extends 2/3rds the width of an average car across the road. It means "this is a school, take your car and fuck off". You find it around school gates, to a distance of a few car lengths either side of the gate, and sometimes all along the road beside a school.
It doesn't stop all on street parking beside the school, but it cuts it down a noticeable amount.
This isn't universal. The schools in our Montana town have pickup lanes and short term parking areas for pickup. Stopping on the road isn't allowed.
For a school near me, the road is no parking during pick up/drop off times. It even changes to one way traffic. The no parking windows is similar to alternate street sweeping days. There are signs posted that indicate the times.
Same for my tiny town. Stopping on the road is 100% not allowed, and parking isn't allowed there either. The school has its own parking area to park and pick up/drop off kids, and cars in there creep at 2 or 3 MPH.
"and thereby hitting the child ... is a success."
> What else do you expect them to do? Only run in grade-separated areas that children can't access?
no, i expect them to slow down when children may be present
how slow?
This isn't Apollo 13 with a successful failure. A driverless car hit a human that just happened to be a kid. Doesn't matter if a human would have as well, the super safe driverless car hit a kid. Nothing else matters. Driverless car failed.
If failure is defined such that failure is the only possible outcome, I don't think it's a useful part of an evaluation.
Why didn't sully just not hit the birds?
Skill issue, presumably.
They've gone to the courts to fight to keep some of their safety data secret
https://www.theverge.com/2022/1/28/22906513/waymo-lawsuit-ca...
Well, as a comparison, we know that Tesla has failed to report to NHTSA any collisions that didn't deploy the airbag.
Tesla report ids from SGO-2021-01_Incident_Reports_ADAS.csv with no or unknown airbag deployment status: 13781-13330, 13781-13319, 13781-13299, 13781-13208, 13781-8843, 13781-13149, 13781-13103, 13781-13070, 13781-13052... and more
Is this a success? There was still an incident. I'd argue this was them being transparent about a failure
Being transparent about such incidents is also what stops them from potentially becoming a business/industry-killing failures. They're doing the right thing here, but they also surely realize how much worse it would be if they tried to deny or downplay it.
> they also surely realize how much worse it would be if they tried to deny or downplay it.
Indeed. Waymo is a much more thoughtful and responsible company than Cruise, Uber, or Tesla.
"Cruise admits to criminal cover-up of pedestrian dragging in SF, will pay $500K penalty" https://www.sfgate.com/tech/article/cruise-fine-criminal-cov...
They handled an unpredictable emergency situation better than any human driver.
Was it unpredictable? They drove past a blind corner (parked SUV) in a school zone. I'm constantly slowing down in these situations as I expect someone might run out at any second. Waymo seemed to default to the view that if it can't see anyone then nobody is there.
as far as we know
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)
This is the fault of the software and company implementing it.
> Humans have this intuitive sense.
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
What's the success rate of this intuitive sense that humans have? Intuitions are wrong frequently.
I'm picturing a 10 second clip showing a child with a green box drawn around them, and position of gas and brake, updating with superhuman reactions. That would be the best possible marketing that any of these self driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
I dont think Waymo is interested in using a video of their car striking a child as marketing.
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars especially those huge SUV or pickup trucks that have big covers on the back. You can't see anything incoming unless you stick your head out.
It’s great handling of the situation. They should release a video as well.
Indeed. Rather than having the company telling me that they did great I'd rather make up my own mind and watch the video.
> I honestly cannot imagine a better outcome or handling of the situation.
If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.
We should take their reporting with a grain of salt and wait for official results
EDIT: replies say I'm misremembering, disregard.
That was Cruise, and that was fixed by Cruise ceasing operations.
I don’t think that was Waymo right? Cruise is already wound down as far as I know.
Most humans in that situation won't have the reaction speed to do shit about it, and it could result in a severe injury or death.
Yeah. I'm a stickler for accountability falling on drivers, but this really can be an impossible scenario to avoid. I've hit someone on my bike in the exact same circumstance - I was in the bike lane between the parked cars and moving traffic, and someone stepped out between parked vehicles without looking. I had nowhere to swerve, so squeezed my brakes, but could not come to a complete stop. Fortunately, I was going slow enough that no one was injured or even knocked over, but I'm convinced that was the best I could have done in that scenario.
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
Building on my own experience, I think you have to own that if you crash into someone, you made a mistake. I do agree that car and road design makes it almost impossible for bicycles to move around without risking things like that.
Humans are not going to win on reaction time but prevention is arguably much more important.
How would standard automatic braking (standard in some brands) have performed here?
This is the classic Suddenly Revealed Pedestrian test case, which, afaik, most NCAP programs (like Euro NCAP, Japan NCAP) include in their standard testing protocols.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
[1] Yes, I'm an AV safety expert
[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
(edit: verbiage)
Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile in my area of the world, parents are busy, stressed, on their phones, and pressing the accelerator hard because they're time pressured and feel like that will make up for the 5 minutes late they are on a 15 minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.
A story, my grandpa drove for longer than he should have. Yes him losing his license would have been the optimal case. But, pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
Err, that is not the desirable statistic you seem to think it is. American drivers average ~3 trillion miles per year [1]. That means ~7000 child pedestrian injuries per year [2] would be ~1 per 430 million miles. Waymo has done on the order of 100-200 million miles autonomously. So this would be ~2-4x more injuries than the human average.
However, the child pedestrian injury rate is only an official estimate (it is possible it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
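To make the arithmetic explicit (the figures are the rough ones from [1] and [2]; the Waymo mileage is an order-of-magnitude estimate):

    human_miles_per_year = 3.0e12    # ~3 trillion US vehicle-miles [1]
    child_injuries_per_year = 7000   # NHTSA estimate [2]
    waymo_miles = (100e6, 200e6)     # rough autonomous-mileage range

    human_miles_per_injury = human_miles_per_year / child_injuries_per_year
    print(f"human average: 1 injury per ~{human_miles_per_injury / 1e6:.0f}M miles")
    for m in waymo_miles:
        print(f"Waymo at {m / 1e6:.0f}M miles, 1 injury: ~{human_miles_per_injury / m:.1f}x the human rate")
    # human average: 1 injury per ~429M miles
    # Waymo at 100M miles, 1 injury: ~4.3x the human rate
    # Waymo at 200M miles, 1 injury: ~2.1x the human rate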
If that's the case, then that's great info. Thank you for adding :)
People's standards for when they're willing to cede control over their lives to a machine, both as the passenger and as the pedestrian, are higher than for a human.
And for not totally irrational reasons: a machine follows programming and does not fear death, and with 100% certainty the machine has bugs which will eventually end up killing someone for a really stupid reason, and nobody wants that to be them. Then there's just the general https://xkcd.com/2030/ problem of people rightfully not trusting technology because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and ran right out into the street.
If those rumors are correct, I'll say the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.
When my kids were school age, I taught them that the purpose of crosswalk lines is to determine who pays for your funeral.
They got the point.
This is a very good way of putting it.
Do you think Waymos should be banned from driving through Santa Monica?
No. They are by far the safest drivers in Santa Monica. Ideally we get to a point where human drivers are banned.
I do not like the phrase "it's the kid's fault" for a kid being hit by a robot-car.
It is never a 6 year old's fault if they get struck by a robot.
Exactly. It’s his parents fault.
Cheers to cities pedestrianizing school streets even in busy capitals (e.g. Paris). Cars have no place near school entrances. Fix your urbanism and public transportation.
Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:
* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,
* safe, separated lanes for biking/walking when that's an option.
Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.
The school speed limit there is 15 mph, and that wasn't enough to prevent an accident.
https://www.yahoo.com/news/articles/child-struck-waymo-near-...
https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac
The interesting thing is a 12 mph speed limit would be honored by an autonomous vehicle but probably ignored by humans.
If the speed limit was 15 mph, and the Waymo vehicle was traveling at 17 mph before braking, why do you believe the Waymo vehicle would honor a 12 mph speed limit? It didn't honor the 15 mph limit.
Ignored by some, not all humans. I absolutely drive extra slowly and cautiously when driving past an elementary school during drop off and pick up, precisely because kids do dumb stuff like this. Others do too, though not everyone of course, incredibly.
So the Waymo was speeding! All the dumbasses on here defending Waymo when it was going 17 in a 15.
Oh also, that video says "kid ran out from a double parked suv". Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
Depends on where the Waymo was.
The 15 mph speed limit starts on the block the school is on. The article says the Waymo was within two blocks of the school, so it's possible they were in a 25 mph zone.
https://maps.app.goo.gl/Vhce7puwwYyDYEuo6
> Can you imagine being dumb enough to drive over the speed limit around a double parked SUV in a school zone?
Can you imagine being dumb enough to think that exceeding a one size fits all number on a sign by <10% is the main failing here?
As if 2mph would have fundamentally changed this. Pfft.
A double parked car, in an area with chock-full street parking (hence the double park), near "something" that's a magnet for pedestrians, with probably a bunch of pedestrians around, should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign telling you that this is a particular zone warranting a particular magic number.
The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.
> As if 2mph would have fundamentally changed this. Pfft.
According to https://news.ycombinator.com/item?id=46812226 1mph slower might have entirely avoided contact in this particular case.
The default, with good visibility in ideal conditions, should be to not exceed the speed limit.
In a school zone, when in a situation of low visibility, the car should likely be going significantly below the speed limit.
So, it's not a case of 17mph vs 15mph, but more like 17mph vs 10mph or 5mph.
So let me get this straight, the car should have been going less than the speed limit, but the fact that it was going a hair over the speed limit is the problem?
The car clearly failed to identify that this was a situation it needed to be going slower. The fact that it was going 17 instead of 15 is basically irrelevant here except as fodder for moral posturing. If the car is incapable of identifying those situations no amount of "muh magic number on sign" is going to fix it. You'll just have the same exact accident again in a 20 school zone.
The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers on my bike and how many times I've watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a person ran right through it and the biker they creamed got stuck in the roof rack.
I'm curious as to what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based off of trained models, but I'm curious if their controllers have any formal guarantees under certain conditions, and if the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or if it violated that, making their control stack switch to some other "panic" controller.
This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.
From a purely stats pov, in situations where the confusion matrix is very asymmetric in terms of what we care about (false negatives are extra bad), you generally want multiple uncorrelated mechanisms, and simply require that only one flips before deciding to stop. All would have to fail simultaneously to not brake, which becomes vanishingly unlikely (p^n) with multiple mechanisms assuming uncorrelated errors. This is why I love the concept of Lidar and optical together.
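As a toy illustration of that p^n effect, with a made-up 1% per-mechanism miss rate and the (strong) assumption of fully uncorrelated errors:

    p_miss = 0.01  # assumed false-negative rate per independent detection mechanism
    for n in (1, 2, 3):
        print(f"{n} mechanism(s): P(all miss simultaneously) = {p_miss**n:.0e}")
    # 1 mechanism(s): P(all miss simultaneously) = 1e-02
    # 2 mechanism(s): P(all miss simultaneously) = 1e-04
    # 3 mechanism(s): P(all miss simultaneously) = 1e-06

In practice errors are never fully uncorrelated (fog degrades everything at once), which is part of why the lidar+camera pairing matters: their failure modes overlap less than two cameras' would.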
With above-average human reflexes, the kid would have been hit at 14mph instead of 6mph.
About 5x more kinetic energy.
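Since kinetic energy scales with the square of speed, the ratio works out to (14/6)^2 = 196/36 ≈ 5.4, hence "about 5x".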
Kinetic energy is a bad metric. Acceleration is what splats people.
Jumping out of a plane wearing a parachute vs jumping off a building without one.
But acceleration is hard to calculate without knowing time or distance (assuming it's even linear), and you don't get that exponent over velocity yielding a big number that's great for heartstring-grabbing and appealing to emotion, which is why nobody ever uses it.
Yeah, if a human made the same mistakes as the Waymo driving too fast near the school, then they would have hurt the kid much worse than the Waymo did.
So if we're going to have cars drive irresponsibly fast near schools, it's better that they be piloted by robots.
But there may be a better solution...
But would a human be driving at 17 in a school zone during drop off hours? I'd argue a human may be slower exactly because of this scenario.
Depends on the school zone. The tech school near me is in a 50 zone and they don't even turn on the "20 when flashing" signs because if you're gonna walk there, you're gonna come in via residential side streets in the back and the school itself is way back off the road. The other school near me is downtown and you wouldn't be able to go 17 even if you wanted to.
> would a human be driving at 17 in a school zone during drop off hours?
In my experience in California, always and yes.
Maybe we should not only replace the unsafe humans with robots, but also have the robots drive in a safe manner near schools rather than replicating the unsafe human behavior?
One argument for the robots is that they can be programmed to drive safer, while humans can't.
But that depends on reliability, especially in unforeseen (and untrained-upon) circumstances. We'll have to see how they do, but so far they have been doing better than expected.
Personally in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don’t know if there has been a change to the algorithm lately to make them more aggressive but it was pretty jarring to see it mess up that badly
It honked at you? But local laws dictate that it angrily flashes its high beams at you.
In recent weeks I've found myself driving in downtown SF congestion more than usual, and observed Waymos doing totally absurd things on multiple occasions.
The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.
They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.
For reference, here's a link to Waymo's blog post: https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
Absent more precise information, this is a statistical negative mark for Waymo putting their child pedestrian injury rate at ~2-4x higher than the US human average.
US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. That means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.
However, the child pedestrian injury rate is only an official estimate (likely undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way), but absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:
A. Their systems are not actually robustly 10x better than human drivers. Waymo's claims are incorrect or non-comparable.
B. There are child-specific risk factors that humans account for that Waymo does not that cause a 20-40x differential risk around children relative to normal Waymo driving.
C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
[3] https://www.therobotreport.com/waymo-reaches-100m-fully-auto...
[4] https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...
[5] https://waymo.com/safety/impact/
That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as much as possible, and the company took over all the things a human driver would have been expected to do in the same situation. Could have been a lot worse, and probably wouldn't have been any better with a human driver (I'm just going to ignore, as no-signal, Waymo's models that say an attentive human driver would have done worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.
Who is legally responsible in case a Waymo hits a pedestrian? If I hit somebody, it's me in front of a judge. In the case of Waymo?
When I was a kid (age 12, or so), I got hit by a truck while crossing the road on my bike.
In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.
(That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)
Are you thinking of civil liability or criminal liability?
Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.
For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.
Waymo is going to make sure they are never criminally liable for anything, and even if they were, a criminal case against a corporation just ends up being a modest fine.
A person who hits a child, or anyone, in America, with no resulting injury, stands a roughly 0% chance of facing a judge in consequence. Part of Waymo's research is to show that even injury accidents are rarely reported to the police.
> Waymo said its robotaxi struck the child at six miles per hour, after braking “hard” from around 17 miles per hour. The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?
These systems don't discriminate on whether the object is a child. If an object enters the path of the vehicle, the lidar should spot it immediately and the car should brake.
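At its simplest, that's just a time-to-collision threshold on anything in the drive corridor. A minimal sketch of the idea (not Waymo's actual logic; the threshold and numbers are made up):

    def should_brake(range_m: float, closing_speed_mps: float,
                     ttc_threshold_s: float = 1.5) -> bool:
        """Brake hard if time-to-collision with the object drops below threshold."""
        if closing_speed_mps <= 0:  # object isn't closing on us
            return False
        return range_m / closing_speed_mps < ttc_threshold_s

    # e.g. an object emerging 6 m ahead while we close at 7.6 m/s (~17 mph):
    print(should_brake(6.0, 7.6))  # True: TTC ~0.8 s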
It is more complicated than that. Depends on the size of the object and many other factors.
The object could be a paper bag flying in the wind, or leaves falling from a tree.
You're right: a quick search shows that pedestrian fatalities are 43% higher on Halloween.
That's probably more a function of more people being in the road than people not understanding what object they're about to hit.
Sorry, I was being oblique. Humans kill other humans with cars every day. They kill even more on Halloween. Let's start addressing that problem before worrying whether Waymos might someday decide it's OK to drive through ghosts.
Autonomous vehicles won't be perfect. They'll surely make different mistakes from the ones humans currently make. People will die who wouldn't have died at the hands of human drivers. But the overall number of mistakes will be smaller.
Suppose you could wave your magic wand and have a The Purge-style situation where AVs had a perfect safety record 364 days of the year, but for some reason had a tricky bug that caused them to run over tiny Spidermen and princesses on Halloween. The number of fatalities in the US would drop from 40,000 annually to 40. Would you wave that wand?
Lidar would pick up a moving object in 3D, so it's unlikely the car would just keep going.
"Oh that obstructing object doesn't look like a child? Gun it, YOLO." Lmao.
I suspect the cars are trying to avoid running into anything, as that's generally considered bad.
Oddly I cannot decide if this is cause for damnation or celebration
Waymo hits a kid? Ban the tech immediately, obviously it needs more work.
Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
> Waymo hits a kid? Ban the tech immediately, obviously it needs more work.
> Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
These can be true at the same time. Waymo is held to a significantly higher standard than human drivers.
> Waymo is held to a significantly higher standard than human drivers.
They have to be, as a machine can not be held accountable for a decision.
Waymo is not a machine, it is a corporation, and corporations can, in fact, be held accountable for decisions (and, perhaps more to the point, for defects in goods they manufacture, sell, distribute, and/or use to provide services).
The promise of self-driving cars being safer than human drivers is also kind of the whole selling point of the technology.
Sure, but the companies building them are just shoving billions of dollars into their ears so they don't have to answer "who's responsible when it kills someone?"
What? No? The main selling point is eliminating costs for a human driver (by enabling people to safely do other things from their car, like answering emails or doomscrolling, or via robotaxis).
> They have to be, as a machine can not be held accountable for a decision
This logic applies equally to all cars, which are machines. Waymo has its decision makers one more step removed than human drivers. But it’s not a good axiom to base any theory of liability on.
It's hard to imagine how any driver could have reacted better in this situation.
The argument that questions "would a human be driving 17mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones, and human drivers routinely drive above 17mph (in some cases, over the typical 20mph or 25mph legal limit). It feels like in deconstructing some of these incidents, critics imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident that they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.
Basically Waymo just prevented a kid's potential death.
Had any other car been there, probably including a Tesla, the poor kid would have been hit with 4-10x more force.
You just invented a hypothetical situation in your head then drew conclusions from it. In my version, the other car misses the kid entirely.
And before the argument "Self driving is acceptable so long as the accident/risk is lower than with human drivers" comes up, can I please get this out of the way: No, it's not. Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans, we will never accept it. Because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or have personal liability. We accept the risks with humans because those humans accept risk. Self driving abstracts the legal risk, and removes the physical risk.
I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar solutions gets a lot of press. This was a major incident with a happy ending. Those are quite rare. The lethal ones are even rarer.
As for more data, there is a chicken egg problem. A phased roll out of waymo over several years has revealed many potential issues but is also remarkable in the low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.
Tesla has some ways to go here. Though arguably, with many hundreds of thousands of paying users, if it was really unsafe, there would be some numbers on that. Normal statistics in the US are measured in ~17 deaths per 100K drivers per year, 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples and oranges comparison of course. But the bar for safety is pretty low as soon as you include human drivers.
Liability weighs higher for companies than safety. It's fine to them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for amounts of miles driven with and without autonomous, there's very little doubt that autonomous driving is already much safer. We can get more data at the price of more deaths by simply dragging out the testing phase.
Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the amount of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.
> ~17 deaths per 100K drivers per year. 40K+ fatalities overall.
I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US centric (because that's where the companies and their testing is) I don't think it can be representative for the risks of driving in general. Naturally, robotaxis will benefit from better infra outside the US (e.g. better separation of pedestrians) but it'll also have to clear a higher safety bar e.g. of fewer drunk drivers.
It will also never get worse. This is the worst the algorithms will ever be from this point forward.
I am not sure. Self-driving is complex and involves the behavior of other, non-automated actors. This is not like a compression algorithm where things are easily testable and verifiable. If Waymos start behaving extra-oddly in school zones, it may lead to other accidents where drivers attempt to go around the "broken" Waymo and crash into it, other pedestrians, or other vehicles.
I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the amount of disengagements (errors):
https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...
And we haven't reached the point where people start walking straight into the paths of cars, either obliviously or defiantly. https://www.youtube.com/shorts/nVEDebSuEUs
Has this been true of other Google products? They never get worse?
> I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.
Do you mean like this?
https://waymo.com/safety/impact/
Yes but ideally from some objective source.
Like this? https://waymo.com/blog/2024/12/new-swiss-re-study-waymo
Maybe an objective source that isn't on the waymo.com domain?
"We find that when benchmarked against zip code-calibrated human baselines, the Waymo Driver significantly improves safety towards other road users."
https://pmc.ncbi.nlm.nih.gov/articles/PMC11305169/
If Waymo is to be believed, they hit the kid at 6mph and estimated that a human driver at full attention would have hit the kid at 14mph. The Waymo was traveling 17mph. The situation of "kid running out between cars" will likely never be fully solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.
I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
That doesn't mean it can't be solved. Don't drive faster than you can see. If you're driving 6 feet from a parked car, you can go slow enough to stop assuming a worst case of a sprinter waiting to leap out at every moment.
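You can put rough numbers on "don't drive faster than you can see": the visible gap has to cover reaction distance plus braking distance, d = v*t + v^2/(2*mu*g). A sketch with assumed values (0.5 s reaction, 0.7 road friction; the 3 m sight line is just an example):

    import math

    def max_safe_speed_mps(sight_m: float, t_react_s: float = 0.5,
                           mu: float = 0.7, g: float = 9.81) -> float:
        """Largest v with v*t_react + v^2/(2*mu*g) <= sight_m (positive root)."""
        a = 1.0 / (2.0 * mu * g)  # coefficient on v^2 in the stopping-distance equation
        return (-t_react_s + math.sqrt(t_react_s**2 + 4.0 * a * sight_m)) / (2.0 * a)

    # if a sprinter could appear ~3 m ahead, between parked cars:
    v = max_safe_speed_mps(3.0)
    print(f"max safe speed ~{v:.1f} m/s (~{v * 2.237:.0f} mph)")
    # -> max safe speed ~3.8 m/s (~9 mph)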
If we adopted that level of risk, we'd have 5mph speed limits on every street with parking. As a society, we've decided that's overly cautious.
But with Waymos it would be possible. Mark those streets as "extremely slow" and never go there unless you are dropping someone off. (The computer has more patience than human drivers.)
If that's too annoying, then ban parking near school areas so the situation doesn't happen.
I don't know if you've been to some cities or neighborhoods but almost every street has on-street parking in many of them.
And why would you make Waymo's go slower than human drivers, when it's the human drivers with worse reaction times? I had interpreted the suggestion as applying to all drivers.
Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.
> The situation of "kid running out between cars" will likley never be solved
Nuanced disagree (I agree with your physics), in that an element of the issue is design. Kids running out between cars is a product of streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.
One simple change could be adding a chain link fence / boundary between parked cars and driving cars, increasing the visibility and time.
How do you add a chain link fence between the parked and driving cars for on-street parking?
there's still an inlet and outlet (kinda like hotel pickup/drop off loops). It's not absolutely perfect, but it constrains the space of where kids can dart from every parked car to 2 places.
Also the point isn't the specifics, the point is that the current design is not optimal, it's just the incumbent.
Ok, that's not really a simple change anymore, because you need more space for that. Unless it's really just a drop off queue, but then it's not parked cars, since a parked car blocks the queue.
We would really need to see the site to have an idea of the constraints, Santa Monica has some places where additional roadway can be accomodated and some places where that's not really an option.
Second-order benefit: More Waymos = fewer parked cars
In high parking contention areas, I think there's enough latent demand for parking that you wouldn't observe fewer parked cars until you reduce demand by a much greater amount.
Orders of magnitude? Something like 100 people die on the road in the US each day. If self-driving tech could save 10 lives per day, that wouldn't be good enough?
"It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering if someone will take responsibility? Then that's not immediately standing out as an improvement just because fewer died. We can do better I think. The problem is simply one of responsibility.
If the current situation was every day 40 people die but blame is rarely assigned, would you recommend a change where an additional 10 people are going to die but someone will be held responsible for those deaths?
People don't usually go to jail. Unless the driver is drunk or there's some other level of provable criminal negligence (or someone actively trying to kill people by e.g. driving into a crowd of protesters they disagree with), it's just chalked up as an accident.
Apart from a minority of car related deaths resulting in jail time, what kind of person wants many more people to die just so they can point at someone to blame for it? At what point are such people the ones to blame for so many deaths themselves?
Do they go to jail?
That is not my experience here in the Bay Area. In fact here is a pretty typical recent example https://www.nbcbayarea.com/news/local/community-members-mour...
The driver cuts in front of a person on an e-bike so fast that they can't react, and they collide. Then, after being hit, the driver steps on the accelerator and goes over the sidewalk on the other side of the road, killing a 4-year-old. No charges filed.
This driver will be back on the street right away.
Ugh. That is so despicable both of the driver and as a society that we accept this. Ubiquitous Waymo can't come soon enough.
>We accept the risks with humans because those humans accept risk.
It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?
I think a very good reason to want to know who's liable is because Google has not exactly shown itself to enthusiastically accept responsibility for harm it causes, and there is no guarantee Waymo will continue to be safe in the future.
In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.
Even in terms of plain results, I'd say the consequences-based system isn't working so well if it's producing 40,000 US deaths annually.
> Self driving needs to be orders of magnitude safer for us to acknowledge it. If they're merely as safe or slightly safer than humans we will never accept it
It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.
Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.
> Self driving needs to be orders of magnitude safer for us to acknowledge it
All data indicates that Waymo is ~10x safer so far.
"90% Fewer serious injury or worse crashes"
https://waymo.com/safety/impact/
I generally agree the bar is high.
But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.
There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.
That’s an incentive to reduce risk, but if you empirically show that the AV is even 10x safer, why wouldn’t you chalk that up as a win?
Who is liable when FSD is used? In Waymo's case, they own and operate the vehicle so obviously they are fully liable.
But for a human driver with FSD on, are they liable if FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail while leaving me at fault. In other words, I trust my own driving more than the FSD (I could be right or wrong, but I think most people will feel the same way).
I believe Mercedes is the only consumer car manufacturer that is advertising an SAE Level 3 system. My understanding is that L3 is where the manufacturer says you can take your attention off the road while the system is active, so they're assuming liability.
https://www.mbusa.com/en/owners/manuals/drive-pilot
I'm a big fan of Waymo and have enjoyed my Waymo rides. And I don't think Waymo did anything "bad" here.
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.
Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
Waymo will 100% go down a route human drivers avoid because it will have "less traffic".
A human driver would most likely have killed this child. That's what should be on the ledger.
That's pretty hyperbolic. At less than 20 mph, car vs. pedestrian is unlikely to result in death. IIHS says [1] in an article about other things:
> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries
Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized. Still, it is worth examining the situation (when we have sufficient information) and validating Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate given the context of a busy school dropoff: many human drivers are extra cautious in that setting and may never have reached that speed, and depending on the end-to-end route, some would have avoided the street with the school altogether at that time of day. It certainly seems like a good result for the premise that a child unexpectedly appears from between large parked vehicles, but maybe the appearance shouldn't have been unexpected at all.
[1] https://www.iihs.org/news/detail/vehicle-height-compounds-da...
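One more way to see why the braking matters even though contact still occurred: impact energy scales with the square of speed, so the vehicle's mass cancels and only the speed ratio matters. A quick sketch using the speeds reported in the article:

```python
# Kinetic energy is (1/2) m v^2, so the ratio of impact energies at two
# speeds is just (v1/v2)^2; the vehicle's mass cancels out.
v_impact_mph = 6       # Waymo's reported impact speed
v_initial_mph = 17     # speed before braking
v_human_mph = 14       # Waymo's modeled attentive-human impact speed

print(f"{(v_impact_mph / v_initial_mph) ** 2:.0%}")  # ~12% of a 17 mph impact
print(f"{(v_impact_mph / v_human_mph) ** 2:.0%}")    # ~18% of a 14 mph impact
```

So a 6 mph impact carries roughly an eighth of the energy of an unbraked 17 mph one, which is consistent with the injury statistics above.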
There's a 50/50 chance that a distracted driver wouldn't have slowed at all and run the child over.
> To estimate injury risk at different impact speeds, IIHS researchers examined 202 crashes involving pedestrians ages 16 or older
A child is probably more likely than an adult to die in a collision at the same speed.
For me, the policy question I want answered is about liability. If this were a human driver, we would have a clear person to sue for liability and damages. With a computer, who is ultimately responsible when someone sues for compensation? Is it the company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs just to bring a suit, whereas a private driver would lean on their insurance.
Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premiums go up. Waymo can weather a LOT of that. Business is still good. Thus, a poor financial feedback loop. No real skin in the game.
John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
NOW ... add criminal fault due to a driving decision or the state of the vehicle ... John goes to jail. Waymo? Still making money, by and large. I'd like to see more skin in their game.
> John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
John probably (at least where I live) does not have insurance. Maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care either: they aren't going to throw him in jail or even take away his license (if he has one), and they aren't going to impound his car.
Honestly, I'd much rather be hit by a Waymo than John.
> John probably (at least where I live) does not have insurance, maybe I could sue him, but he has no assets to speak of
https://en.wikipedia.org/wiki/Judgment_proof
I see. Thank you for sharing. Insurance is mandatory here for all motorists.
If you are hit by an underinsured driver, the government steps in, and additional underinsured-motorist protection (e.g., for being hit by an out-of-province/country motorist) is available to everyone and is not expensive.
Jail time for an at-fault driver here is very uncommon but can be applied if serious injury or death results from a driver's conduct. This is quite conceivable with humans or AI, IMO. Who will face jail time as a human driver would in the same scenario?
Hit and run, leaving the scene, is also a criminal offence with potential jail time that a human motorist faces. You would hope this is unlikely with AI, but if it happens a small percentage of the time, who at Waymo faces jail as a human driver would?
I'm talking about edge cases here, not the usual fender bender. But this thread was about policy/regs and that needs to consider crazy edge cases before there are tens of millions of AI drivers on the road.
Insurance here is also mandatory for all motorists. Doesn't matter if the rules aren't actually enforced.
Waymo has deep pockets, so everyone is going to try to sue them, even without a legitimate grievance. Where I live, the city/state would totally milk each incident from a BigCo for all it was worth. A "hit and run" by a drunk Waymo? The state is just salivating thinking about the possibility.
I don't agree with you that BigCorp doesn't have any skin in the game. They are basically playing the game in a bikini.
>John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
Ah great, so there's a lower chance of that specific John Smith hitting me again in the future!
Yes, that is the specific deterrence effect.
The general deterrence effect we observe in society is that punishment of one person has an effect on others who observe it, making them more cautious and less likely to offend.
So you're worried that instead of facing off against an insurance company, the plaintiff would be facing off against a private company? That doesn't seem like a huge difference to me.
Is there actually any difference? I'd have thought that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.
Personally I'm a lot more interested in kids not dying than in making income for injury lawyers. But that's just me.
Your comment implies that they are less interested in kids not dying. Nowhere do they say that.
I'm not interested in the policy question.
Then don't reply??
That still doesn't excuse trying to make them look bad.
It was a reply to my comment.
No, "the ledger" should record actual facts, and not whatever fictional alternatives we imagine.
Fact: This child's life was saved by the car being driven by a computer program instead of a human.
No, the fact is that the child sustained minor injuries. And, fact: no human driver made the decision to drive a vehicle in that exact position and velocity. Imagining a human-driven vehicle in the same place is certainly valid, but your imagination is not fact. I imagine that the kid would be better off if no vehicle was there. But that's not a fact, that's an interpretation -- perhaps the kid would have ended up dead under an entirely different tire if they hadn't been hit by the Waymo!
Instead of a human who was driving exactly the same as the Waymo up until the instant the child ran out. Important distinction.
Would have. Could Have. Should have.
Most humans would be halfway into the other lane after seeing kids near the street.
Apologists see something different than I do.
Perception.
Disagree. Most human drivers would notice they are near an elementary school with kids coming and going and a crossing guard present, and would be driving very carefully near blocked sight lines.
Better reporting would have asked people for the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it's under https://www.nhtsa.gov/search-safety-issues
"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."
We're getting into hypotheticals, but I will say that in general I much, much prefer being around Waymos/Zooxs/etc. than humans when riding a bicycle.
We're impatient, emotional creatures. Sometimes when I'm on a bike, the bike lane merges onto the road for a stretch, and there's no choice but to take up a lane. I've had people accelerate behind me and screech their tyres, stopping just short of my back wheel in a threatening manner, then do it repeatedly as I ride the short distance in the lane before the bike lane re-opens.
To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like I describe above. It also disregards that every time I see an automated taxi, it seems to drive on the cautious side already.
Give me the unemotional, infinitely patient, cautious-by-default automatic taxi over humans any day.
Can we just get Waymo tech in buses?
Big vehicles that demand respect, aren't expected to turn on a dime, and have known stops.
Alternate headline: Waymo saves child's life
In this timeline, we want our headlines to somehow reflect the contents of the story.
Saved the child from what? From the Waymo itself. You can't take full credit for partially solving a problem that you yourself created.
Q: Why did the self-driving car cross the road?
A: It thought it saw a child on the other side.
That's Tesla. Waymo seems mostly ok.
Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.
I’m actually pretty surprised Waymo as a general rule doesn’t completely avoid driving in school zones unless absolutely unavoidable.
Any accident is bad. But accidents involving children are especially bad.
That would be one hell of a convoluted route to avoid school zones. I wonder if it would even be possible for a large majority of routes, especially in residential areas.
It might not be possible for a lot of places — I don’t really know.
But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.
Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)
Most people prefer the shortest ride. Circling around school zones would be the opposite of that. Rides are charged based on distance, so maybe this would interest Waymo, but one of the big complaints about taxi drivers was how drivers would "take them for a ride" to increase the fare.
Seems like a solvable problem: make it clear on the app/interior car screens that a school zone is being avoided — I think most riders will understand this.
You also have to drive much more slowly in a school zone than you do on other routes, so depending on the detour, it may not even be that much longer of a drive.
At worst, maybe Waymo eats the cost difference involved in choosing a more expensive route. This certainly hits the bottom line, but there’s certainly also a business and reputational cost from “child hit by Waymo in school zone” in the headlines.
Again, this all seems very solvable.
Well, I'm a human and I figure out how to avoid school zones.
> The vehicle remained stopped, moved to the side of the road
How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.
My reading of that is that they mean it stopped the progression of the journey, rather than made no movement whatsoever.
I agree, it’s poorly worded but I think that’s what they mean.
I also assume a human took over (called the police, moved the car, etc) once it hit the kid.
They mean the vehicle didn't drive away. It moved to the side of the road and then stopped and waited.
So many tech lovers defending waymo.
If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or that I am better than a drunk driver does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate - probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that waymo gets a pass.
Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think wtf that's too fast. It's one thing when there are no cars around. But if you've got cars or people around, the appropriate speed changes. Let's audit waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.
The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.