Wayve co-founder and CEO Alex Kendall is optimistic about his autonomous vehicle startup’s potential to bring its technology to market. The company’s strategy centers on making its automated driving software cost-effective, adaptable to various hardware, and applicable across advanced driver-assistance systems (ADAS), robotaxis, and even robotics. Kendall laid out this approach during Nvidia’s GTC conference, emphasizing an end-to-end, data-driven learning model.
Under this approach, the system learns to make driving decisions directly from sensor inputs such as cameras, without relying on high-definition maps or rule-based software, a departure from earlier autonomous vehicle (AV) technologies. The strategy has attracted investors. Wayve, which was founded in 2017 and has raised over $1.3 billion in the past two years, intends to license its self-driving software to automotive companies and fleet partners, including Uber.
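Wayve hasn’t published its model internals, but the end-to-end idea can be illustrated in a few lines of code: raw camera frames go in, driving commands come out, and there is no map lookup or hand-written rule in between. The sketch below is a minimal, hypothetical PyTorch example; the class name, layer sizes, and two-value control output are invented for illustration and do not reflect Wayve’s actual architecture.

```python
import torch
import torch.nn as nn

class EndToEndPolicy(nn.Module):
    """Toy end-to-end policy: surround-camera frames in, driving commands out."""

    def __init__(self, num_cameras: int = 6):
        super().__init__()
        # Shared CNN encoder applied to each surround-view camera
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fuse per-camera features and regress controls directly,
        # with no HD map or rule-based planner in the loop
        self.head = nn.Sequential(
            nn.Linear(64 * num_cameras, 128), nn.ReLU(),
            nn.Linear(128, 2),  # e.g. steering angle and acceleration
        )

    def forward(self, cameras: torch.Tensor) -> torch.Tensor:
        # cameras: (batch, num_cameras, 3, height, width)
        b, n, c, h, w = cameras.shape
        feats = self.encoder(cameras.view(b * n, c, h, w)).view(b, -1)
        return self.head(feats)

policy = EndToEndPolicy()
frames = torch.rand(1, 6, 3, 128, 128)  # one batch of surround-camera frames
print(policy(frames))                   # tensor of steering/acceleration values
```

In a real system the network would be far larger and trained on fleet-scale data, but the shape of the problem is the same: sensors map to actions through a single learned model.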
While the company hasn’t yet announced specific automotive partnerships, a spokesperson told TechCrunch that Wayve is having “strong discussions” with multiple original equipment manufacturers (OEMs) about integrating its software into different vehicle types. A key selling point is how easily the software can be deployed on existing vehicle hardware. According to Kendall, OEMs can incorporate Wayve’s ADAS into new vehicles without additional hardware investments because the technology is designed to work with existing sensors, such as surround cameras and radar. The software is also “silicon-agnostic,” meaning it can run on the GPUs already inside its OEM partners’ vehicles.
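Wayve hasn’t said how it achieves that portability, but one common way to make a trained model “silicon-agnostic” in practice is to export it to an intermediate format such as ONNX, which vendor runtimes can then compile for whatever accelerator a vehicle already carries. The snippet below is a generic, hypothetical illustration of that pattern, not Wayve’s deployment pipeline; the stand-in model, file name, and input shape are made up.

```python
import torch
import torch.nn as nn

# Stand-in for a trained end-to-end policy (see the earlier sketch)
policy = nn.Sequential(nn.Flatten(), nn.Linear(3 * 128 * 128, 2))
example = torch.rand(1, 3, 128, 128)  # dummy camera input used for tracing

# Export once to a portable graph; TensorRT, ONNX Runtime, or a chip vendor's
# own toolchain can then run it on the target GPU or NPU.
torch.onnx.export(
    policy, example, "driving_policy.onnx",
    input_names=["cameras"], output_names=["controls"],
    dynamic_axes={"cameras": {0: "batch"}},  # allow variable batch size
)
```

The point is that the same learned weights can be re-targeted to different silicon without rewriting the driving logic itself.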
Wayve’s initial commercialization plan centers on ADAS. “Entering into ADAS is really critical because it allows you to build a sustainable business, to build distribution at scale, and to get the data exposure to be able to train the system up to [Level] 4,” Kendall said during the conference. A Level 4 system can drive itself under specific conditions without human intervention. The startup has also designed its AI driver to function without lidar, a sensor that many Level 4 technology developers consider crucial for building a highly accurate 3D map of the environment with laser light.
Wayve’s approach mirrors Tesla’s: both are developing end-to-end deep learning models to power, and continuously improve, their self-driving software. Like Tesla, Wayve intends to use ADAS to collect the data that will advance its system toward full autonomy. (Tesla’s “Full Self-Driving” software currently performs some automated driving functions but is not fully autonomous.)
A key technical difference between the two companies is their sensor strategy: Tesla relies solely on cameras, while Wayve is open to using lidar to reach full autonomy in the near term. “Longer term, there’s certainly opportunity when you do build the reliability and the ability to validate a level of scale to shrink that [sensor suite] down further,” Kendall explained. “It depends on the product experience you want. Do you want the car to drive faster through fog? Then maybe you want other sensors [like lidar]. But if you’re willing for the AI to understand the limitations of cameras and be defensive and conservative as a result? Our AI can learn that.”
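That tradeoff, a camera-only suite versus cameras plus lidar, can be expressed architecturally as an optional sensor branch that the model uses when the data is present and skips when it isn’t. The sketch below is a hypothetical illustration of that idea, not Wayve’s design; the class name, feature sizes, and fusion-by-addition choice are all invented.

```python
from typing import Optional
import torch
import torch.nn as nn

class SensorFlexiblePolicy(nn.Module):
    """Toy policy with a required camera branch and an optional lidar branch."""

    def __init__(self):
        super().__init__()
        self.camera_branch = nn.Linear(256, 128)  # placeholder camera features
        self.lidar_branch = nn.Linear(64, 128)    # placeholder lidar features
        self.head = nn.Linear(128, 2)             # steering / acceleration

    def forward(self, cam_feats: torch.Tensor,
                lidar_feats: Optional[torch.Tensor] = None) -> torch.Tensor:
        fused = self.camera_branch(cam_feats)
        if lidar_feats is not None:
            # Extra range data can sharpen behavior when available,
            # e.g. driving faster through fog, as Kendall describes
            fused = fused + self.lidar_branch(lidar_feats)
        return self.head(fused)

policy = SensorFlexiblePolicy()
cam = torch.rand(1, 256)
print(policy(cam))                      # camera-only configuration
print(policy(cam, torch.rand(1, 64)))   # camera-plus-lidar configuration
```

What changes between the two configurations is less the code path than the level of reliability that can be validated at scale, which is the question Kendall is pointing to.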
Kendall also discussed GAIA-2, Wayve’s latest generative world model for autonomous driving. The model is trained on extensive real-world and synthetic data and processes video, text, and driving actions, which Kendall says helps Wayve’s AI driver behave in a more adaptive, human-like way. “What is really exciting to me is the human-like driving behavior that you see emerge,” Kendall said. “Of course, there’s no hand-coded behavior. We don’t tell the car how to behave. There’s no infrastructure or HD maps, but instead, the emergent behavior is data-driven and enables driving behavior that deals with very complex and diverse scenarios, including scenarios it may never have seen before during training.”
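Wayve hasn’t detailed GAIA-2’s internals here, but a generative world model of this kind can be pictured as a conditional video predictor: given recent frames, a text description of the scenario, and a driving action, it imagines how the scene unfolds, which is useful for training and testing a driver on situations it has rarely or never encountered. The toy interface below is a hypothetical sketch of that conditioning pattern only; the names, dimensions, and GRU-based rollout bear no relation to GAIA-2’s actual architecture.

```python
import torch
import torch.nn as nn

class ToyWorldModel(nn.Module):
    """Toy world model: past frames + text + action -> imagined future frames."""

    def __init__(self, frame_dim: int = 512, text_dim: int = 64, action_dim: int = 2):
        super().__init__()
        self.fuse = nn.Linear(frame_dim + text_dim + action_dim, frame_dim)
        self.rollout = nn.GRU(frame_dim, frame_dim, batch_first=True)

    def forward(self, past_frames, text_emb, action, horizon: int = 8):
        # past_frames: (batch, T, frame_dim) latent features of recent video
        # text_emb:    (batch, text_dim) embedded scenario description
        # action:      (batch, action_dim) e.g. steering and acceleration
        context = past_frames.mean(dim=1)              # summarize recent history
        state = self.fuse(torch.cat([context, text_emb, action], dim=-1))
        # Roll the fused state forward to "imagine" the next few frames
        future, _ = self.rollout(state.unsqueeze(1).repeat(1, horizon, 1))
        return future                                  # (batch, horizon, frame_dim)

model = ToyWorldModel()
future = model(torch.rand(2, 16, 512), torch.rand(2, 64), torch.rand(2, 2))
print(future.shape)  # torch.Size([2, 8, 512])
```

A production world model would generate actual pixels or richer scene representations, but the conditioning on video, text, and action is the part Kendall is highlighting.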
Wayve shares a similar philosophy with Waabi, an autonomous trucking startup that is also focused on an end-to-end learning system. Both companies emphasize scalable, data-driven AI models that can adapt to various driving environments, and both use generative AI simulators to test and train their technologies.