Self-driving cars will be a very different level of freedom than the ability to summon a Lyft.
Um, they're pretty much the same thing. The self-driving car may be safer (although the whole process isn't really dangerous to begin with). On the other hand, it won't help you with your bag. Who cares?
All taxis do have the failure modes of "the cloud", though.
I read that line as Zvi talking about privately owned self-driving cars, not just robotaxis. Otherwise yeah it's very similar.
Going Full San Francisco
Waymo goes Full San Francisco West Bay except for SFO:
They can serve SJC, and SFO is almost ready, employee rides are in place and public rides are ‘coming soon.’
Going Down The Highway
Waymo is going to start using freeways in Phoenix, Los Angeles and San Francisco. That’s a big deal for longer rides, but there is still the problem that Waymos have to obey the technical speed limit. On freeways no human driver does this, so obeying the technical speed limit is both slower and more dangerous. We are going to need a regulatory solution, ideally one that allows driving at the average observed speed.
Looking For Adventure
This is all a big unlock, but it depends on having enough cars to take advantage.
At this point, aside from regulatory barriers in some places like my beloved New York City, it all comes down to being able to get enough cars.
And Whatever Comes Our Way
Waymo will be ready for Washington DC in 2026 if legally allowed to proceed, if blocked there will be a Waymo Gap where Baltimore has it but not Washington. Dean Ball notes that councilmember Charles Allen is trying to hold Waymo up over nebulous ‘safety concerns,’ which is the worst possible argument against Waymo. We know for certain that Waymos are vastly safer than human drivers.
One could say this is cherry picking, but the number of (truthful) such tweets about losing a friend to a Waymo is zero, because it has never happened.
Waymo set to deliver DoorDash orders in Phoenix. That presumably means you’ll have to go out to get the food out of the car, which is slightly annoying but seems fine. My actual concern is whether this will be a little slow? Waymos do not understand that when you have hot food, time is of the essence.
The cars, they are coming to a City Near You pending regulatory barriers.
It’s Coming
This is not an expanded service area yet, but look at where Waymo is now officially authorized:
Who Stands In The Way
It is the state that matters, not the city. That helps.
Could we do preemption on state laws about self-driving cars? Please?
The states preempting the cities was key to getting deployment in California and Texas, but we still have a long way to go.
I actually disagree with Neil, I think this should be in the federal domain.
Then there’s the final boss enemies of all that is good and true, those who would permanently cripple our economy so that people could have permanent entirely fake jobs sitting in trucks:
Across the pond, could there be anything more Doomed European than an article that says ‘Europe doesn’t need driverless cars’? As with so many things like air conditioning, free speech and economic growth, the European asks, do we really ‘need’ this? Aren’t European roadways already ‘safer’ than American ones now that we’ve slowed them down to make them thus? Wouldn’t this ‘threaten’ European traditions of bikes and public transportation? Aren’t cars ‘inefficient’? Won’t someone please think of the potential traffic issues?
This emphasizes why I would make the case without emphasizing safety.
My One Weakness
When San Francisco had a power outage, there were mistaken initial reports that Waymos came to a halt or ‘bricked,’ causing traffic disruptions. The transition wasn’t perfect, some cars did come to a stop and behavior was more conservative than you would want.
My understanding is that this was overstated. Waymo has now issued a full report.
Waymo successfully identified the situation. Per a policy decided in advance, Waymos treated every intersection as a four-way stop while the traffic lights were out, as California law requires, and had protocols in place to request additional verification checks. As a result, Waymo suspended service to avoid slowing traffic.
That seems fine? It’s not even clear it is non-ideal from Waymo’s perspective given their incentives? The risk-reward of using a more aggressive policy seems rather terrible, and worse than a service suspension? What would you have them do here?
This seems exactly right. Waymo has to be risk averse for now given that a single incident could derail their entire program. Over time, as they gain experience, they can act more decisively.
The amount of ‘omg never using a self-driving car again’ or ‘police and fire departments will now fight against self-driving cars to the death’ boggles the mind.
If enough cars on the road were self-driving, then they wouldn’t even need the traffic lights, they could coordinate in other ways, and this would all be moot.
Yes, in the case where the internet goes down entirely or Waymos otherwise systemically fail there will be a bigger problem that might not have a great solution right now, but do you think Waymo hasn’t planned for this?
At most, this says that if we had so many self-driving-only cars that we would be in deep trouble if all the self-driving cars died at once, then we want a solution where the cars are, in such an emergency, something a human could override and drive. That does not seem like such a difficult bar to cross?
The most common crisis scenario where things go haywire is very simple:
Human drivers cannot solve this. Self-driving cars in sufficient quantities solve this through coordination. Given these are maximally important scenarios where not getting out often risks death, it’s kind of a big deal. Imagine if things were reversed.
Holly Elmore accused me of missing the point here, that it is about all the things that could go wrong with self-driving cars and that haven’t yet occurred in the field.
To which I say no, it is Holly that is missing the point. The reason why AGI is different is that if you have such a failure, you could be dead or lose control, and be unable to recover from the failure, or suffer truly catastrophic levels of damage. Thus, you need to get such potential difficulties right on the first try, before an incident happens, and you have to do this generally against a potential adversary more intelligent than you that will be out of distribution.
A self-driving car… is a car. It is a normal technology.
Even if something goes systematically wrong with a fleet of such cars, or all such fleets of cars? This is highly recoverable. The damage even for ‘all the Waymos suddenly floor it and crash’ (or even the pure sci-fi ‘suddenly try to do maximum amounts of damage’) is not so high in the grand scheme of things. There are a finite number of things that could happen that involve things going very wrong, and yes you can list all of them and then hardcode what to do in each case.
That is, indeed, how the cars actually learn to drive under normal circumstances. If the regulators want to provide a list of potential incident types and require Waymo to say how they plan to deal with each, including any combination of loss of internet and loss of power and everyone simultaneously fleeing an oncoming tsunami caused by a neutron bomb, then okay, sure, fine, I guess, let’s be overly paranoid to keep everyone happy, it will in expectation cost lives but whatever it takes.
But I think it’s really important, when arguing for AI safety, to be able to differentiate AGI from self-driving cars, and to not draw metaphors that don’t apply.
Cruise Control
The real final boss for self-driving cars is the speed limit.
As everyone knows, the ‘real’ speed limit is by default 10 MPH above the posted one. You’re highly unlikely to get a ticket, in most places, unless you are both more than 10 MPH over the limit and substantially faster than other drivers. If the speed limit is enforced to the letter, that usually involves attempting to trick motorists. We call that a ‘speed trap.’
To be safe, you want to match the speed of other cars around you, so driving the listed speed limit is actively dangerous on many roads.
The wrong answer is to keep the current, obviously too low numbers, and slow all cars down to the current technical speed limits. That’s profoundly stupid. It’s also scarily plausible that we will end up doing it.
The correct answer is to increase our speed limits across the board to the actual limit, beyond which we can and will ticket you.
This generalizes, as per Levels of Friction.
If AI has to obey the rules and humans don’t, the correct answer wherever possible is to change the rules to what we want both AIs and humans to actually have to follow.
In many other places, this creates a real problem, because the true rules are nebulous and involve social incentives and a willingness to adapt to practical conditions. As Robin Hanson notes, an otherwise highly capable AI that had to formally obey all laws in all ways would find many human tasks impossible or impractical.
Growth Rates
I strongly agree that Waymo must pick up the pace. 7% growth per month? That’s it?
The Competition
Tesla continues to not even apply to operate fully autonomous services in the areas it claims it wants to offer those services, such as California, Arizona and Nevada. Please stop thinking Elon Musk’s timelines are ever meaningful.
The actual global competition is probably Chinese, as one would expect.
What Tesla has done is make ‘Robotaxi’ service go live in Austin for select rides, but these rides remain supervised with a Tesla employee in the driver’s seat.
Andrej Karpathy reports the new Tesla self-driving on the HW4 Model X is a substantial upgrade.
Delivery via self-driving e-bikes? Brilliant.
They Took Our Jobs
If enough people lose their jobs at once, society has a big problem.
It’s amazing how easily those opposed to self-driving throw around ‘safety concerns’ when self-driving vehicles are massively safer, or the idea here that gains are ‘short term’ or that there are ‘no productivity gains.’
Even if we literally require a human to be in each truck at all times ‘in case of emergency’ we would still see massive productivity gains, since the trucks could be on the road 24/7.
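A rough back-of-the-envelope on that claim, assuming the standard US 11-hour daily driving limit for human truckers (the FMCSA hours-of-service rule) and ignoring fueling, loading and maintenance time:

```python
# Back-of-the-envelope: how much more road time does a driverless
# truck get versus one bound by hours-of-service rules?
# Assumes the standard US 11-hour daily driving limit for human drivers.
human_hours_per_day = 11
autonomous_hours_per_day = 24  # optimistic upper bound

gain = autonomous_hours_per_day / human_hours_per_day
print(f"{gain:.1f}x road time")  # prints "2.2x road time"
```

Even heavily discounted for downtime, roughly doubling the hours each truck spends moving freight is a large productivity gain before counting any labor savings at all.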
Maintenance is another truly silly objection. Yes, when you need to maintain something you’d (for now at least) bring the truck to a human. Okay.
That leaves the ever mysterious and present ‘edge cases.’
Bikers As Winners
Waymos will reliably yield to bikes, use their turn signals and obey the rules. When you are biking, the problem is that tail risk can literally kill you, so you have to be constantly paranoid that any given car will do something unexpected or crazy. With a Waymo, you don’t have to worry about that.
The Elderly As Big Winners
Both the young and the old, who cannot drive, will benefit greatly. Self-driving cars will be a very different level of freedom than the ability to summon a Lyft. Tesla will likely offer ‘unsupervised’ self-driving very soon.
If you combine self-driving cars with other new smart products, including basic home robots, suddenly assisted living facilities look pretty terrible. They’re expensive and unpleasant, with the upside being that when you need help you really need help. The need for that forces you to buy this entire package of things you mostly don’t want. But what if most of that help was covered?
The social problem requires people who want to interact with you, but note that we’ve solved the transportation problem. That makes it a lot easier.
Save The Cat
What will happen when a Waymo finally does kill someone? Waymos are vastly safer than human drivers, but are we always going to be one accident away from disaster? The CEO of Waymo says people will accept it. I think she’s right if Waymo gets enough traction first, the question is when that point comes and whether we have reached it yet.
In the meantime, they’re trying to drum up outrage because a Waymo killed a cat, a ‘one-of-a-kind’ mascot of a bodega, something that happens to 5.3 million cats per year when struck by human drivers. If ‘cat killed by Waymo’ is news then Waymos are absurdly safe.
(The bottom line should also include millions of cats and other pets as well, of course.)
I mean, if they wanted to ban all vehicles that would at least make some sense.
I will note that the 26 million number comes from Merritt Clifton’s extrapolation from 1993, and it’s basically absurd if you think about it: there simply are not enough cats for this to be real. It’s probably more like 2-5 million cats per year. Not that this changes the conclusion that Waymos are obviously vastly safer for cats than human drivers.
The stats are of course in, and if you use reasonable estimates Waymos probably kill on the order of 75x fewer pets, as in a 98%+ reduction in cats killed per mile.
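As a quick sanity check that the two framings agree, a 75x reduction factor and a ‘98%+ reduction’ are the same claim:

```python
# Sanity check: a 75x reduction in pets killed per mile,
# expressed as a percentage decrease.
ratio = 75  # claimed factor by which Waymos kill fewer pets per mile
reduction = 1 - 1 / ratio
print(f"{reduction:.1%}")  # prints "98.7%"
```

So ‘75x fewer’ and ‘98%+ reduction’ are consistent with each other; the headline number isn’t being exaggerated in translation.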
In the meantime, we continue to deal with things like New York Times articles about a Waymo running over this very special cat, in which they bury the fact Waymos are vastly safer than human drivers.
It’s Going To Be The Future Soon
Even if the rest of AI doesn’t prove that disruptive soon, self-driving will change quite a lot wherever it is allowed to proceed. I too am unreasonably excited.
The agonizingly slow ramp-up, along with the avalanche of other AI things happening, is definitely taking the focus off of self-driving and making us not realize how much the ground is shifting under our feet. The second order effects are going to be huge. The child mobility and safety improvements are especially neglected.
In-Context Old Man Yells At Self-Driving Car
A no good, very bad take but also why competition is good:
To state the obvious I vastly prefer Waymos, and I am confused by the part about catastrophic failures since it seems obvious that rates of ‘things go wrong’ are higher for an Uber. But yeah, if you actively want to talk to drivers and to have another human in the car, and you care more about speed than a smooth ride, I can see it.
The Safety Case
Self-driving cars have been proven vastly safer than human drivers, despite many believing the opposite. The question continues to be, how hard do you push on this?
Human drivers have been grandfathered in as an insanely dangerous thing we have accepted as part of life. We’ve destroyed huge other parts of life in the name of far less serious safety concerns, whereas here we have a solution that is life affirming while also preventing most of a leading cause of death.
Keep It Classy
Auerlien reports that ‘broken windows theory’ very much applies to cars. If you don’t keep cars fully pristine, people stop respecting the car and things escalate quickly, and riders care quite a lot about this. Thus, if a Waymo or other self-driving car gets even a little dirty it needs to head back and get cleaned. And thus, every Waymo I’ve ever ridden in has been pristine.
The Lighter Side