
Waymo keeps passing stopped school buses while the company dismisses calls to pause service

Image Credit: CBS Austin


Waymo’s self-driving cars keep breaking one of the clearest safety rules on the road: you stop for a school bus.

According to reporting by Karoline Leonard of the Austin American-Statesman and Bettie Cross of CBS Austin, Austin ISD says Waymo’s autonomous vehicles have illegally passed stopped school buses at least 20 times this school year – even after software “fixes,” meetings, and a federal investigation.

And despite a direct plea from the school district to pause service during bus hours, Waymo is still rolling.

Cameras Catch Waymo Sliding Past Stop-Armed Buses

In her detailed report, Karoline Leonard writes that Austin ISD’s bus cameras have recorded 19 illegal passes by Waymo vehicles so far in the 2025–26 school year.

These aren’t fuzzy eyewitness claims.

They’re videos recorded by stop-arm cameras mounted on every AISD bus since 2016.

Leonard explains that under Texas law, drivers must come to a complete stop when a school bus has its red lights flashing and the stop arm extended.

They’re not allowed to move again until the bus does.

Yet the footage Leonard describes shows something different.

Waymo’s driverless cars slow down, sometimes even briefly stop, but then accelerate past the bus anyway while it’s still loading or unloading students.

Leonard notes that, to crack down on drivers who blow past buses, AISD approved a contract with AlertBus to automatically issue a $300 fine for each violation.

So far this school year, AISD Police have issued around 7,000 school bus traffic violations. Nineteen of those, Leonard reports, were committed by Waymo vehicles.

“The Vehicle Was Not Recognizing” The Law

Austin ISD Police Assistant Chief Travis Pickford has been blunt about what he sees in the videos.

In Leonard’s reporting, Pickford says the violations happened in all kinds of conditions – day and night, busy roads and side streets, intersections and turns.

“There was not a whole lot of consistency,” he said, “other than you could tell that the vehicle was not recognizing that it needed to obey the state law of stopping whenever the buses’ red lights were on.”

Image Credit: CBS Austin

That’s a chilling way to describe a safety system.

Not “it misjudged the distance,” or “the timing was tricky,” but it didn’t recognize it needed to obey the law at all.

Pickford also told Leonard that AISD first contacted Waymo on October 29, after noticing 12 violations.

The district asked Waymo to pay $2,100 in fines and to “take immediate action” to fix the problem.

Waymo paid, Leonard reports, and wrote back on November 5 saying it was making “certain software updates” to improve how its cars respond to school buses.

But the tickets didn’t stop.

Between November 5 and November 20, Leonard says AISD recorded seven more violations, bringing the total to 19.

At that point, this stopped being an isolated glitch and started looking more like a pattern.

20th Violation After “Fix” – “It Is Not Fixed”

CBS Austin reporter Bettie Cross picks up the story from there.

In her TV report, Cross says Waymo has now received its 20th citation for a driverless car illegally passing a stopped school bus – after those software updates were supposedly in place.

Cross shows video captured on December 1 by an AISD bus camera.

A driverless Waymo car approaches a bus parked along the curb, red lights flashing, stop sign fully extended.

District officials told Cross the Waymo vehicle doesn’t come to a full stop until it reaches the rear of the bus.

By then, under Texas law, it’s already too late.

“The programming did not recognize what a vehicle is supposed to do around a stopped school bus with students loading and unloading,” Pickford told Cross.

He didn’t sugarcoat the follow-up either.

“It is not fixed,” he said about Waymo’s software. “That is the short answer. It is not fixed.”

Cross reports that AISD officials met with Waymo on December 4 to talk about that December 1 violation.

Waymo told the district it was “looking into it,” and AISD handed over the video and data so the company could analyze what went wrong.

But until something actually changes, the district says it wants something simple: Waymo should stop operating during school bus pick-up and drop-off times.

AISD: Pause During Bus Hours. Waymo: No.

In her print reporting, Karoline Leonard notes that on November 20, AISD formally asked Waymo to cease operations during school bus times:

  • 5:20 a.m. to 9:30 a.m. for morning pick-ups
  • 3 p.m. to 7 p.m. for afternoon drop-offs

AISD runs more than 600 bus routes per day, Leonard writes, and district officials say this is about the most basic kind of road safety: kids stepping off a bus and crossing the street.

Image Credit: CBS Austin

AISD Police Chief Wayne Sneed told Leonard that with a human driver, “you know exactly who’s operating the vehicle.”

With a computer, he said, it’s hard to program for every single situation the car will face.

That’s not a small statement coming from a school police chief.

When Bettie Cross asked directly on camera if Waymo had agreed to pause service during bus times, Pickford’s answer was simple.

“As of right now, Waymo is telling you it will not cease operations. Is that correct?” Cross asked.

“That’s correct,” Pickford replied.

He explained that after AISD pointed out the December 1 violation, Waymo “disagreed with our risk assessment” and refused to stop operating during bus hours.

This is where the tension really shows.

AISD is thinking like a school district: one bad incident with a child is too many.

Waymo is thinking like a tech company: we’re improving, we’re safe overall, and we don’t think the risk justifies shutting down.

From the outside, it feels like two completely different definitions of “safe” colliding in the middle of the street.

Waymo’s Defense: Software Recall And Safety Stats

Both Leonard and Cross note that these school bus problems are on the radar of federal regulators.

Leonard writes that the National Highway Traffic Safety Administration (NHTSA) opened a probe into Waymo’s behavior around school buses after a video from Atlanta showed a Waymo car driving past a bus there.

Cross reports that NHTSA is investigating Waymo’s operations around school buses in both Austin and Atlanta.

Under that pressure, Waymo is trying to reassure regulators and the public.

Leonard reports that Waymo said it completed fleet-wide software updates by November 17 to address the behavior in which its cars slow or briefly stop, then proceed past stopped buses.

“Improving road safety is our top priority at Waymo and we’re deeply invested in safe interaction with school buses,” the company said in a statement Leonard obtained.

“We swiftly implemented software updates to address this and will continue to rapidly improve.”

But then came that December 1 video.

In Cross’s follow-up reporting, Waymo goes a step further, announcing it will file a voluntary software recall with NHTSA.

Waymo’s Chief Safety Officer, Mauricio Peña, defended the company’s bigger picture safety record.

He said Waymo experiences twelve times fewer injury crashes involving pedestrians than human drivers, and that “holding the highest safety standards means recognizing when our behavior should be better.”

Peña said the recall is specifically about “appropriately slowing and stopping” around stopped school buses, and that no injuries have been linked to this behavior.

Waymo also told Cross that the updated software has already improved performance around school buses and claimed it’s now better than human drivers in those scenarios.

On paper, that sounds impressive.

But it doesn’t erase the videos of a driverless car gliding past a bus full of kids with the stop sign out.

When “Good Overall” Isn’t Good Enough Around Kids

Image Credit: CBS Austin

Stepping back from the technical details, there’s a deeper issue that both Karoline Leonard’s and Bettie Cross’s reporting quietly highlight.

If a human driver blew past a school bus 19 or 20 times, as Pickford pointed out to Leonard, there’s a good chance their license would be suspended.

Yet in this case, the “driver” is a fleet of computers.

There’s no license to yank, no single person to hold personally responsible.

Instead, you get meetings, software patches, recalls, and statements about aggregate safety and performance.

Waymo may very well be safer overall than the average distracted human driver in many scenarios.

The company’s pedestrian injury numbers might genuinely be lower.

But parents watching a car roll past a stop-armed bus are not thinking about national statistics.

They’re thinking: Did this machine just ignore the same law my teenager had to memorize for a permit test?

Leonard reports that AISD is now looking into “all legal remedies at its disposal.”

Cross says the district and its police department will keep monitoring for more violations.

However this plays out, the situation in Austin is a warning for every city experimenting with driverless cars.

It’s not enough for an autonomous vehicle company to say, “We’re safer than humans overall.”

When the issue is school buses, the bar isn’t “better than average.”

The bar is zero tolerance – and so far, Waymo’s software keeps coming up short.
