California police officers experienced a futuristic dilemma last week: what happens when a car breaks traffic laws, but there’s no driver behind the wheel? The incident involved a Waymo autonomous vehicle making an illegal U-turn in San Bruno. The stop illustrates how far traffic law enforcement lags behind the rollout of self-driving cars.
What’s Happening & Why This Matters
A Ticket Without a Driver?

During a DUI enforcement operation, officers pulled over a Waymo robotaxi that executed a prohibited U-turn at a light. Usually, this would result in a citation, but police quickly realized their ticket books lacked an option for “robot.” With no human behind the wheel, they couldn’t issue a fine. Instead, the San Bruno Police Department notified Waymo and requested that the system be updated to prevent similar violations.
Waymo responded by saying its autonomous driving system is designed to follow traffic rules and that the company is investigating the incident. The company reiterated its commitment to improving safety through ongoing updates to its Waymo Driver technology.
California’s New Approach
This isn’t the first time driverless cars have tested the patience of first responders. From blocking firetrucks to dragging a pedestrian, autonomous vehicles in San Francisco and surrounding areas have been involved in troubling cases. Recognizing the issue, Governor Gavin Newsom signed a law in 2023 allowing police to issue a “notice of noncompliance” to companies when driverless cars break laws.

The law, which takes effect in July 2026, also requires autonomous vehicle companies to maintain emergency hotlines for first responders. It gives police a formal mechanism to enforce accountability when there is no human driver to cite. Companies must also move their cars within two minutes if ordered to vacate an area, such as during an emergency response.
Public Concerns and Past Issues
The San Bruno Police Department noted that legislation will soon allow officers to hold companies accountable, pushing back against critics who accused them of being lenient with Waymo. Waymo, once a Google X research project, has faced multiple setbacks. Earlier this year, the company recalled over 1,200 vehicles due to a software flaw that caused crashes into barriers. The National Highway Traffic Safety Administration is also investigating reports of erratic Waymo behavior, including suspected violations of safety laws.
The latest event illustrates that driverless cars pose not only technological challenges, but legal and ethical ones as well. When machines make mistakes on the road, culpability shifts from individuals to corporations, and that shift could reshape how we define liability in transportation.
TF Summary: What’s Next
The San Bruno incident illustrates the gap between technology and regulation. Until the 2026 law takes effect, police officers remain stuck in a gray zone where they can only report problems to companies rather than issue tickets. As self-driving fleets expand, these situations are expected to multiply, adding pressure on lawmakers to update the rules faster.
Autonomous vehicles promise safer roads in theory, but incidents like this remind us that the systems are still learning. Expect more legal debates, more recalls, and a growing need for frameworks that hold companies accountable when algorithms misbehave on public streets.