You’ve seen them everywhere in San Francisco and Los Angeles. Those white SUVs with the spinning buckets on top, gliding through intersections like ghost ships. For years, if one of these robotaxis cut you off or blew through a red light, there wasn't much a cop could do. California law had a weird, glaring hole. It allowed police to ticket human drivers, but if nobody was behind the wheel, the paperwork just didn't exist. That's changing right now.
California is closing the loophole that let autonomous vehicle companies off the hook for traffic violations. It’s about time. For a long time, local police departments felt their hands were tied. They’d watch a driverless car commit a blatant moving violation, pull it over, and then realize they couldn't actually issue a citation to a piece of software. It was a bizarre regulatory vacuum that frustrated city officials and terrified pedestrians.
The end of the free pass for robotaxis
Assembly Bill 1777 is the hammer coming down. This piece of legislation specifically targets how law enforcement interacts with autonomous vehicles (AVs). The core change is simple. It treats the AV company as the driver. If a Waymo or a Zoox breaks the law, the company gets the ticket. It sounds like common sense, but the legal reality was a mess of outdated vehicle codes that assumed a person with a pulse was always in control.
The San Francisco Police Department has been vocal about this for a while. Chief William Scott pointed out that officers were essentially spectators to robot malfunctions. Imagine a driverless car blocking an ambulance that has its lights on. Under the old rules, the officer could try to contact the company's remote operations center, but they couldn't slap a fine on the dashboard. Now, the paper trail starts the moment the sirens go off. MIT Technology Review has published related reporting on the subject.
This isn't just about collecting revenue. It’s about data and accountability. When a human gets a ticket, it goes on their record. If they get too many, they lose their license. Until now, we didn't have a centralized, punitive way to track how often these "perfect" drivers were actually breaking the rules of the road. We relied on the companies to self-report their "disengagements" or "incidents." You don't let a teenager grade their own driving test. Why let a billion-dollar tech company do it?
Why the old rules didn't work
The California Vehicle Code was written in an era when the most high-tech thing in a car was a cassette deck. Section 21000 and those following it almost always refer to "the driver" as a person. When the DMV started issuing permits for driverless testing, they focused on safety drivers—humans sitting in the seat ready to take over. But once the cars went "fully driverless," the legal definition of a driver became a ghost.
I’ve seen dozens of videos of San Francisco police officers standing next to a stopped Waymo, looking confused. They can't ask for a license and registration. They can't perform a field sobriety test on a lidar sensor. The tech moved faster than the law, and that gap created a sense of "tech exceptionalism." The companies argued they were safer than humans, which might be true on a statistical level, but that shouldn't grant them immunity from the basic rules of the neighborhood.
We aren't just talking about speeding. We're talking about blocking fire hydrants, stopping in the middle of active bus lanes, and failing to yield to emergency vehicles. These "edge cases" are where the software often trips up. When a robot car gets confused, its default move is often to just stop. In a city like San Francisco, a car that just stops in the middle of a narrow street is a major hazard.
How the new ticketing system actually functions
The process isn't going to look like a standard traffic stop. You won't see a cop trying to hand a yellow slip of paper to a camera lens. Instead, the legislation creates a digital and administrative bridge.
- Identification: Officers will use the vehicle's unique identification number or fleet ID to log the violation.
- Electronic Citations: The ticket goes directly to the company’s registered agent.
- Fines: The companies will be responsible for the same monetary penalties a human would pay.
- Reporting: These violations will be tracked by the DMV, which could eventually impact the company's permit to operate in the state.
Crucially, the law also demands better communication. AV companies have to provide a dedicated, 24/7 phone line for law enforcement. If a car is stuck or obstructing a crime scene, the police shouldn't be on hold with a customer service bot. They need a human operator who can remotely move the car or shut it down immediately.
The myth of the perfect driver
Silicon Valley loves the narrative that self-driving cars will eliminate traffic deaths. It’s a noble goal. Human drivers are, quite frankly, terrible. We get distracted, we get tired, and we get angry. But robots have their own set of flaws. They can't interpret a hand signal from a construction worker. They struggle with heavy rain or thick fog. They sometimes interpret a plastic bag as a solid wall and slam on the brakes.
By allowing police to ticket these vehicles, we’re finally collecting unbiased evidence of these failures. If a specific software update causes a spike in "failure to yield" tickets across a fleet, the DMV has the evidence it needs to suspend testing. It moves the conversation from marketing hype to hard data.
Some critics argue that ticketing won't change behavior because the fines are "pocket change" for companies backed by Alphabet or Amazon. That's a valid point. A $250 fine for a company worth a trillion dollars is nothing. But the real sting isn't the money. It's the public record. Every ticket is a mark against their safety claims. It’s fuel for trial lawyers and fodder for city councils who are already skeptical of robotaxis clogging their streets.
What this means for your next ride
If you're a passenger in a Waymo, don't worry. You aren't the one getting the ticket. The law is very clear that the "operator" is the entity that owns the automated driving system. You're just cargo. If the car turns right at a "No Right on Red" sign, your insurance rates won't go up.
However, you might notice changes in how the cars behave. Companies are likely to tune their software to be even more conservative. Expect more hesitant turns and longer waits at stop signs. They're going to be terrified of getting "pulled over" because of the bad PR it generates. We might see a temporary dip in the "fluidity" of these rides as the algorithms are adjusted to prioritize strict legal compliance over traffic flow.
The tension between cities and the state
This new law is a huge win for local control. For years, the California Public Utilities Commission (CPUC) and the DMV have held all the power, often over the objections of city leaders in San Francisco and LA. Local mayors felt like their cities were being used as laboratories without their consent.
By empowering local police to issue citations, the state is giving a bit of power back to the streets. It allows local jurisdictions to enforce their specific traffic patterns. If a city decides to implement new "No Left Turn" zones to manage congestion, they can now actually enforce those rules against the robot fleets.
It also forces a level of cooperation that was missing. The tech companies have to play by the local rules or pay the price. It’s a step toward integrating autonomous tech into the existing social fabric rather than letting it bypass the rules everyone else has to follow.
Practical steps for California residents
If you’re living in an area with high AV activity, you don't have to just yell at a driverless car anymore.
- Report Violations: If you see a driverless car commit a clear violation, note the time, location, and the car's ID number (usually found on the rear or side).
- Contact Local Authorities: Use non-emergency lines to report recurring issues with AVs at specific intersections.
- Check the Data: Keep an eye on DMV reports. As this ticketing system rolls out, that data will become public. You can see for yourself which companies are actually the "safest."
The "wild west" era of autonomous testing is ending. We’re entering a phase of accountability where the software is treated with the same scrutiny as a teenager with a fresh permit. It’s the only way to build actual trust with the public. If these cars are really the future of transportation, they should be able to pass a basic driving test—and pay the price when they fail.
Watch the intersections. The next time a robot taxi ignores a stop sign, don't be surprised to see blue lights flashing behind it. The ghost in the machine is finally getting a court date.