In a calculated pivot from its troubled past in self-driving technology, Uber Technologies Inc. on January 27, 2026, unveiled AV Labs, a new division designed to collect vast troves of real-world driving data and supply it to robotaxi developers. The move positions the ride-hailing giant as a neutral data powerhouse in the burgeoning autonomous vehicle sector, leveraging its operational scale across more than 600 cities without the capital-intensive burden of manufacturing or deploying its own fleets.
Chief Technology Officer Praveen Neppalli Naga, who leads the effort, told TechCrunch the primary aim is to ‘democratize this data,’ emphasizing that advancing partners’ technology holds greater value than immediate revenue. ‘The value of this data and having partners’ AV tech advancing is far bigger than the money we can make from this,’ Naga said. AV Labs begins modestly with a single Hyundai Ioniq 5 outfitted with lidar, radar, and cameras, but plans to expand to 100 vehicles soon.
The data won’t be handed over raw. Instead, Uber processes it with a ‘semantic understanding’ layer to aid partners in real-time path planning and decision-making. An innovative ‘shadow mode’ runs partners’ software inside AV Labs cars driven by humans, flagging discrepancies between the software’s decisions and the human driver’s actual maneuvers so partners can refine their models toward more human-like performance. The approach reflects the industry’s shift toward reinforcement learning, which depends on massive volumes of edge-case data: rare scenarios such as vehicles illegally passing stopped school buses, as seen with Waymo.
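Uber has not published implementation details for shadow mode, but the core idea is simple to sketch. The Python example below is purely illustrative: the Frame and shadow_mode names, the planner interface, and the 1.5-meter divergence threshold are assumptions made for this sketch, not AV Labs’ actual APIs. It replays logged frames from a human-driven car through a partner’s planner and flags the moments where the planner’s proposal diverges from what the driver actually did.

```python
# Illustrative sketch of "shadow mode" discrepancy flagging.
# Hypothetical: class names, fields, and the divergence threshold are
# assumptions for illustration, not Uber's actual AV Labs interfaces.
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]  # (x, y) position in meters


@dataclass
class Frame:
    """One logged moment from a human-driven data-collection car."""
    timestamp: float
    sensor_blob: dict          # stand-in for lidar/radar/camera features
    human_next_pos: Point      # where the human driver actually went


def euclidean(a: Point, b: Point) -> float:
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5


def shadow_mode(
    frames: List[Frame],
    partner_planner: Callable[[dict], Point],
    threshold_m: float = 1.5,
) -> List[Frame]:
    """Run the partner's planner 'in the shadows' on logged frames and
    return the frames where its proposal diverges from the human driver
    by more than threshold_m meters -- candidates for retraining."""
    flagged = []
    for frame in frames:
        proposed = partner_planner(frame.sensor_blob)
        if euclidean(proposed, frame.human_next_pos) > threshold_m:
            flagged.append(frame)
    return flagged


if __name__ == "__main__":
    # Toy planner that always proposes driving 2 m straight ahead.
    planner = lambda blob: (2.0, 0.0)
    frames = [
        Frame(0.0, {}, (2.0, 0.1)),  # close to the human path: ignored
        Frame(0.1, {}, (0.5, 3.0)),  # human swerved: flagged as edge case
    ]
    print(f"{len(shadow_mode(frames, planner))} discrepancy frame(s) flagged")
```

In practice, the flagged frames would feed back into a partner’s training data, which is the loop the article describes: refining models toward more human-like performance.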
Reviving Lessons from Uber’s AV Exit
Uber’s history with autonomous vehicles is checkered. After a 2018 fatal pedestrian collision in Arizona halted its testing, the company sold its Advanced Technologies Group (ATG) to Aurora Innovation in 2020, a deal that valued Aurora at $10 billion, per TechCrunch. Now, AV Labs marks a return to data collection, but strictly as a service provider. VP of Engineering Danny Guo described the setup as ‘scrappy,’ with the team still installing sensors, yet confident in Uber’s edge: ‘The amount of data Uber can collect just outweighs everything that they can possibly do with their own data collection.’
Guo added in the TechCrunch interview: ‘Because if we don’t do this, we really don’t believe anybody else can.’ The comment underscores Uber’s bet on volume over proprietary development. The company counts more than 20 AV partners, including Waymo, Waabi, Lucid Motors, May Mobility, Volkswagen, and Avride, though no formal contracts for data services have been signed yet.
The division plans to grow to a few hundred employees within a year, focusing on targeted deployments in cities specified by partners to capture location-specific challenges. Uber’s official announcement on its newsroom site highlights transforming operational data from millions of daily trips into high-quality inputs for perception, prediction, and planning.
Strategic Ties to NVIDIA and Expanding Partnerships
AV Labs complements Uber’s deepening alliance with NVIDIA, announced in late 2025, to build a ‘robotaxi data factory’ powered by the NVIDIA Cosmos platform. Uber aims to collect over three million hours of robotaxi-specific data for model training, with plans for up to 100,000 NVIDIA DRIVE-powered vehicles starting in 2027, as detailed in Uber’s investor release. CEO Dara Khosrowshahi stated: ‘NVIDIA is the backbone of the AI era, and is now fully harnessing that innovation to unleash L4 autonomy at enormous scale.’
Other collaborations include a next-generation robotaxi with Lucid Motors and Nuro, targeting 20,000 Lucid Gravity SUVs over six years, unveiled at CES 2026 per Uber’s investor site. Nuro CEO Jiajun Zhu noted: ‘By combining our self-driving technology with Lucid’s advanced vehicle architecture and Uber’s global platform, we’re proud to enable a robotaxi service designed to reach millions.’ Partnerships extend to Avride’s Dallas launch, WeRide in Abu Dhabi, and Volkswagen’s ID. Buzz microbuses in Los Angeles by late 2026.
Uber expects to operate AV services in at least 10 cities by the end of 2026, including Atlanta, Austin, and Phoenix with Waymo, and driverless rides in Abu Dhabi with WeRide, according to TechCrunch. The multi-partner approach mitigates risk; analyst notes cited by Yahoo Finance highlight how Uber’s asset-light model enables scale without fleet ownership.
Investor Views and Market Echoes
On X, formerly Twitter, reactions spotlight Uber’s data moat. Uber executive Andrew Macdonald posted: ‘We’ve done data collection for multiple AV partners already and it’s increasingly clear that Uber has an edge: we can efficiently collect rare, rich, real-world driving data at unparalleled scale,’ linking to the newsroom post, which has garnered over 2,400 views. Investor @stockpickerspb analyzed: ‘Data – $UBER will have the indisputable lead… Compute – $NVDA… This is the team to beat.’
@ManuInvests echoed: ‘$UBER is doubling down on facilitating the growth and fragmentation of the AV industry,’ quoting Naga on data democratization. @WhiteRockInves1 called the move ‘HUGE!!’ for Uber’s neutral-aggregator strategy, which gains operational intelligence without the pitfalls of in-house R&D. On Reddit’s r/SelfDrivingCars, commenters discussed Uber becoming a ‘one-stop shop’ to protect market share, given robotaxi development costs that have exceeded $25 billion in Waymo’s case.
Tekedia reported Guo’s view that Uber has a responsibility to ‘unlock the whole industry,’ and noted shadow mode’s role in refining partner models toward human-like driving. No immediate stock impact was evident, but analysts such as Jefferies urge buying Uber on AV-related dips, citing partnerships that reduce reliance on any single provider, per Investing.com.
Challenges in Data-Driven Autonomy
Critics question scalability and monetization. AV Labs forgoes charging partners at launch, betting on long-term ecosystem gains instead. Regulatory hurdles persist, and incidents like Waymo vehicles illegally passing school buses underscore the data gaps Uber aims to fill. Uber’s footprint of more than 600 cities enables targeted collection, which could eventually extend to its ride-hail fleet, dwarfing what any single partner can gather on its own.
X user @macaronicapital noted: ‘Uber has more than 20 autonomous vehicle partners, and they all want one thing: data.’ NVIDIA CEO Jensen Huang framed the stakes this way: ‘Robotaxis mark the beginning of a global transformation in mobility — making transportation safer, cleaner, and more efficient,’ per NVIDIA Newsroom.
AV Labs is hiring experts in machine learning and computer vision to mine data, simulate scenarios, and validate systems. Uber’s newsroom stresses: ‘Autonomous vehicles are an important part of that future — not as a winner-take-all product, but as a way to make the benefits of autonomy easier to access for more people, faster.’ This data flywheel could redefine Uber’s role in a $1 trillion mobility shift.

