How AI is powering the next generation of robotaxis
Dmitri Dolgov has worked on self-driving cars for 20 years. But even he was surprised when a Waymo vehicle in San Francisco was able to spot a pedestrian hidden behind a bus and swerve to avoid them.
“I was like, ‘What’s going on here?’,” said Dolgov, Waymo’s co-CEO, on a recent podcast. “I know we have pretty darn good sensors, and the software is very capable, but we don’t see through stuff.”
The answer lies in a key change in how the world’s most advanced creator of robotaxis, which is owned by Google’s parent Alphabet, develops its autonomous driving systems. Robotaxis have in recent years benefited from broader advances in artificial intelligence, allowing autonomous vehicles to “generalise” their experiences to new cities and situations — and even to predict a pedestrian’s next steps.
This is how self-driving cars have finally been able to make the leap from small-scale tests to today’s rapid international expansion — and how new operators such as Canada’s Waabi and the UK’s Wayve are setting out to challenge Waymo’s lead.
After decades in development, robotaxis are now facing their biggest test yet as they start to appear on city streets all over the world.
How has the technology evolved?
Waymo began as a Google project in 2009. For the first few years its robot driver had to be taught the rules of the road one at a time. Engineers at Waymo and other early pioneers, such as China’s Baidu, largely had to “hard code” key safety and navigation directives.
These early robot vehicles travelled millions of miles on city streets, first to scan them to build high-definition 3D maps of their environment, then to learn how to detect and respond to changes in that environment. They gathered data using an expensive array of sensors: radar and lidar that bounce radio and light waves off their surroundings and multiple cameras pointing in every direction.

The first step to broadening the robots’ horizons was simulation. All that sensor data was used to build virtual cities, inside which robot cars could be subjected to many more surprises than they encountered in the real world, helping solve many “edge cases” — unlikely but still safety-critical events — without the possibility of hurting anyone.
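The idea of probing for edge cases in simulation can be sketched in miniature. The toy model below is entirely hypothetical (the braking rule, the deceleration figure and the scenario are invented for illustration; it is nothing like Waymo's actual stack), but it shows the principle: replay a scenario many times with random perturbations and let the simulator surface the rare combinations where a simple policy fails.

```python
import random

DECEL = 6.0  # assumed maximum braking deceleration, m/s^2 (illustrative)

def should_brake(speed: float, gap: float, margin: float) -> bool:
    """Brake when the stopping distance v^2/(2a) plus a safety margin
    would exceed the remaining gap to the obstacle."""
    return speed * speed / (2 * DECEL) + margin >= gap

def simulate(v0: float, margin: float, dt: float = 0.05) -> bool:
    """Drive toward an obstacle 200 m ahead at speed v0, braking once
    should_brake fires; return True if the car stops before impact."""
    pos, v, braking = 0.0, v0, False
    while v > 0:
        if not braking and should_brake(v, 200.0 - pos, margin):
            braking = True
        if braking:
            v = max(0.0, v - DECEL * dt)
        pos += v * dt
        if pos >= 200.0:
            return False  # virtual collision: an edge case found
    return True

# Fuzz the scenario: random approach speeds and safety margins.
random.seed(0)
crashes = []
for _ in range(1000):
    v0, margin = random.uniform(5, 30), random.uniform(0, 5)
    if not simulate(v0, margin):
        crashes.append((v0, margin))
```

No one is hurt when a run ends in `return False`; the failing combinations are simply logged and fed back into development, which is the point of testing in a virtual city first.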
It took more than a decade and four generations of Waymo’s technology before its vehicles were ready to be deployed as a fully autonomous commercial service, beginning in suburban Chandler, Arizona in 2020.
The challenge of proving the technology has also meant stop-start progress from other operators. GM’s Cruise closed its project following a mishandled pedestrian-dragging incident in 2023.
Waymo has only started to significantly scale services in the past year. Rivals such as Amazon’s Zoox have tentatively launched services, while smaller developers like Wayve and Waabi are still testing on public roads.
While Baidu and Waymo’s robotaxis still look like conventional cars — albeit with a bigger roof rack loaded with sensors — Zoox, founded in 2014, has gone a step further. Rather than retrofitting a traditional car, it has built its toaster-shaped robotaxi, which has no steering wheel, from the ground up. The company is awaiting regulatory approval of its unconventional design before it can start charging for rides.
How has AI moved things forward?
The “big jump” for Waymo came with its fifth-generation Driver, introduced in 2020, said Dolgov. This is the latest version of the integrated hardware and software system that underpins the company’s autonomous vehicles.
The fifth version runs primarily on the Jaguar I-Pace vehicles that are already ubiquitous in San Francisco and are now starting to appear in London, where Waymo began autonomous testing this month.
Generation five “was when we made this big bet on AI”, Dolgov told Stripe’s Cheeky Pint podcast. Whereas the fourth-generation system used some AI and machine-learning models, “we made a much bigger bet and jump to AI as the backbone for the fifth generation”.
The car that stunned Dolgov by seeming to see right through the bus had been able to detect and predict the pedestrian’s next steps even though they were largely hidden by the vehicle.
By inferring just enough movement from the person’s feet underneath the bus it predicted that they were about to step out around it. “It just blew my mind,” Dolgov said.
Like the large AI models that sit behind OpenAI’s ChatGPT or Anthropic’s Claude, Waymo’s own foundation model underpins three separate systems that together build a safe autonomous car: the “driver” itself; a “simulator” for virtual testing; and a “critic” that rates the driver’s performance.
Combined with a constant stream of new data from Waymo’s vehicles on the road, this creates a feedback loop to enable constant improvement.
A visual language model, in Waymo’s case trained using Google’s Gemini AI system, sees and interprets the road ahead. A decoder system takes these inputs to predict how other road users might respond and plot the best way forward.
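The driver/simulator/critic arrangement the article describes is, in essence, a feedback loop. The sketch below is purely illustrative — every class name, threshold and scoring rule is invented, not Waymo's real architecture — but it shows how a critic scoring simulated rollouts can drive iterative improvement of a driving policy.

```python
class Driver:
    """Maps an observed scene to a driving action (toy policy)."""
    def __init__(self, caution: float):
        self.caution = caution
    def act(self, hazard_distance: float) -> str:
        return "brake" if hazard_distance < 10.0 * self.caution else "cruise"

class Simulator:
    """Replays recorded scenes (here, just distances to a hazard)."""
    def __init__(self, scenes):
        self.scenes = scenes
    def run(self, driver):
        return [(d, driver.act(d)) for d in self.scenes]

class Critic:
    """Scores a rollout: penalise cruising past a nearby hazard."""
    def score(self, rollout) -> int:
        return sum(1 for dist, action in rollout
                   if not (dist < 8.0 and action == "cruise"))

# Feedback loop: evaluate the driver in simulation; while the critic
# finds failures, adjust the driver and re-run.
scenes = [25.0, 12.0, 7.0, 3.0]
driver, sim, critic = Driver(caution=0.5), Simulator(scenes), Critic()
best = critic.score(sim.run(driver))
while best < len(scenes):
    driver.caution += 0.5
    best = critic.score(sim.run(driver))
```

Here the "improvement" is a crude parameter bump; in a real system the critic's verdicts would feed model retraining, alongside the stream of fresh road data the article mentions.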

How are advances in AI shaking up the competition?
The combination of Waymo’s early start and Google backing has propelled it to the forefront of the robotaxi industry; its vehicles have driven more than 200mn fully autonomous miles with passengers.
However, it is now facing competition from new entrants in not only Silicon Valley but the UK, China and beyond. Many are betting that starting with a clean slate in the new era of AI will allow them to scale more quickly and cheaply than Waymo.
Tesla and Wayve believe this AI-first approach can be taken much further, and operators differ on what technology they believe is required for a safe and viable vehicle. While Waymo uses an expensive array of custom hardware, its rivals rely on fewer cameras and radar sensors to make a much cheaper self-driving car, with some claiming lidar is not necessary and doing away with it altogether.
Alex Kendall, chief executive of Wayve, which was founded in 2017, claims his company was the first to use “end to end deep learning” to teach its cars to drive themselves simply by watching how human drivers behave behind the wheel.
Kendall describes the Wayve system as a “general purpose foundational model for driving”, meaning its cars can be deployed in a wide variety of locations and environments that they haven’t seen before.
“We can generalise and scale anywhere,” he says, meaning its vehicles are not restricted to a set “geofenced” area as Waymo’s are today. However, despite high-profile backing from chipmaker Nvidia and the ride-hailing app Uber, its cars have not yet been tested at scale with the general public, so it remains to be seen how successful its approach will be.
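Learning to drive "simply by watching how human drivers behave" is, in machine-learning terms, imitation learning. The fragment below is a deliberately tiny caricature — the demonstration data, the single "curvature" feature and the linear model are all made up for illustration — but it captures the end-to-end idea: fit a policy directly to observed human behaviour rather than hand-coding rules of the road.

```python
# Demonstrations: (observed road curvature, human steering angle) pairs.
# All values are invented for illustration.
demos = [(0.0, 0.0), (0.1, 2.1), (0.2, 3.9), (-0.1, -2.0), (0.3, 6.2)]

# Fit steering = w * curvature by least squares (closed form, 1D).
num = sum(x * y for x, y in demos)
den = sum(x * x for x, y in demos)
w = num / den

def policy(curvature: float) -> float:
    """Learned policy: predict a steering angle from the observation."""
    return w * curvature
```

Real end-to-end systems replace the single weight with a deep network over camera pixels, but the training signal is the same: human behaviour behind the wheel.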
What’s next for the rollout of robotaxis?
Waymo already operates in ten US cities and plans to deliver 1mn paid weekly rides across 17 cities including London by the end of this year. It is forecast to account for more than 7 per cent of the overall US rideshare market by 2030, according to JPMorgan.
Uber is targeting 15 cities globally with a host of robotaxi operators and plans to compete directly with Waymo in San Francisco with a fleet of vehicles from carmaker Lucid and AV developer Nuro. Zoox and Tesla have also laid out an ambitious roadmap for expansion.
Chinese operators are rolling out fleets of robotaxis in Asia, having deployed limited domestic services across cities such as Beijing and Shanghai. Guangzhou-based WeRide in March expanded services into Singapore. Waymo meanwhile is testing vehicles in Tokyo.
Jesse Levinson, co-founder and chief technology officer of Zoox, says that while a bespoke vehicle has higher upfront costs than a retrofitted production car, it lets the company fit larger batteries so the vehicle can run “most of the day and most of the night”. Operating for more hours each day than a traditional cab helps offset those costs.
“These things actually make more of a difference than the actual bill of materials,” he says. “It just makes sense to have a design that was specifically created for the purpose it was [intended].”
Technology companies and investors are spending tens of billions of dollars on robotaxis in a race to deploy them on public roads and gobble up market share in the ride-hailing market. Uber alone has committed the group to more than $10bn in AV investments and vehicle commitments.
“The same thing that’s happening in Generative AI is happening in autonomous vehicles as well,” Uber chief Dara Khosrowshahi told investors last year. “We estimate that the US market alone is a $1tn opportunity.”