Uber launches ‘AV Labs’ division to collect driving data for robotics partners

**Source**: TechCrunch

**Category**: Transportation, Autonomous Vehicles, Exclusive, Robotaxis, Uber, Uber AV Labs

Uber has more than 20 self-driving car partners, and they all want one thing: data. Now the company says it will make that data available through a new division called Uber AV Labs.

Despite the name, Uber is not getting back into developing its own robotaxis, which it stopped doing after one of its test vehicles killed a pedestrian in 2018. (Uber eventually sold the division in 2020 in a complex deal with Aurora.) But it will send its own cars, outfitted with data-collecting sensors, into cities on behalf of partners like Waymo, Waabi, Lucid Motors, and others — though no contracts have been signed yet.

In general, self-driving cars are in the middle of a shift away from rule-based operation and toward greater reliance on reinforcement learning. As this happens, real-world driving data becomes invaluable for training these systems.

Uber told TechCrunch that the self-driving car companies most interested in this data are the ones that have already collected a lot of it themselves. It’s a sign that they, like many frontier AI labs, have realized that “solving” the most extreme edge cases is ultimately a data game.

Physical limit

Currently, the size of an autonomous vehicle company’s fleet places a physical limit on how much data it can collect. And while many of these companies build simulations of real-world environments to cover rare situations, nothing beats driving on actual roads — and driving a lot — when it comes to surfacing all the weird, difficult, and unexpected scenarios that cars end up in.

Waymo illustrates this gap. The company has had self-driving vehicles in operation or in testing for a decade, yet its robotaxis were recently caught illegally driving past stopped school buses.

Access to a broader set of driving data could help robotaxi companies solve some of these issues before or as they arise, Praveen Nepali Naga, Uber’s chief technology officer, told TechCrunch in an exclusive interview.

Uber will not charge any fees for this. At least not yet.

“Our goal, first of all, is to democratize this data, right? I mean, the value of this data and having partners in developing autonomous vehicle technology is much greater than the money we can make from that,” he said.

Danny Guo, Uber’s vice president of engineering, said the lab must first build a foundation of basic data before it can find product-market fit. “Because if we don’t do it, we don’t think anyone else can do it,” he said. “As someone who can unleash the entire industry and accelerate the entire ecosystem, we believe we have to take on that responsibility now.”

Screws and sensors

The new AV Labs division is starting small. So far it has just one vehicle (a Hyundai Ioniq 5, though Uber says it’s not married to one model), and Guo told TechCrunch that his team is literally still screwing on sensors such as lidars, radars, and cameras.

“We don’t even know yet whether the sensor array will fall off; that’s the stage we’re at,” he said with a laugh. “I think it’s going to take a while until we get, say, 100 cars on the road to start collecting data. But the prototype is there.”

Partners will not receive raw data. Once the Uber AV Labs fleet is up and running, Naga said, the division “will have to massage and work on the data” to match it to partners’ needs. This “semantic understanding” layer is what driving software at companies like Waymo could leverage to improve real-time route planning for robotaxis.

Until then, Guo said, a likely interim step is for Uber to plug a partner’s driving software into AV Labs’ cars and run it in “shadow mode.” Any time an Uber AV Labs driver does something different from what the self-driving software would have done in shadow mode, Uber will report the discrepancy to the partner company.

This will not only help discover shortcomings in the driving software, but will also help train models to drive like a human and not like a robot, Guo said.
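The shadow-mode comparison described above can be sketched roughly like this. To be clear, Uber has not published any implementation details; every name, data structure, and threshold here is hypothetical, and a real system would compare full sensor and trajectory logs rather than two numbers:

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A simplified driving decision at one timestep."""
    steering_deg: float   # steering angle
    accel_mps2: float     # acceleration (negative = braking)

def find_disagreements(human_log, shadow_log, steer_tol=5.0, accel_tol=1.0):
    """Compare what the human driver actually did against what the
    partner's software would have done in shadow mode, collecting the
    timesteps where they meaningfully diverge -- the events that would
    get reported back to the partner."""
    reports = []
    for t, (human, shadow) in enumerate(zip(human_log, shadow_log)):
        if (abs(human.steering_deg - shadow.steering_deg) > steer_tol
                or abs(human.accel_mps2 - shadow.accel_mps2) > accel_tol):
            reports.append({"timestep": t, "human": human, "shadow": shadow})
    return reports

# Example: the human brakes hard at t=1 where the shadow software would coast.
human_log = [Action(0.0, 0.5), Action(0.0, -3.0)]
shadow_log = [Action(0.0, 0.5), Action(0.0, -0.2)]
disagreements = find_disagreements(human_log, shadow_log)
print(len(disagreements))  # one discrepancy to report
```

The key design point is that the shadow software never controls the car; it only emits decisions for offline comparison, which is what makes this a low-risk way to both surface software shortcomings and gather human-like driving examples.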

Tesla approach

If this approach sounds familiar, that’s because it’s basically what Tesla has been doing to train its self-driving vehicle software for the past decade. However, Uber’s approach lacks the same scale, as Tesla has millions of customer cars on roads around the world every day.

This doesn’t bother Uber. Guo said he expects to do more targeted data collection based on the needs of self-driving car companies.

“We have 600 cities that we can pick and choose [from]. If a partner tells us about a particular city they are interested in, we can just deploy our [cars] there,” he said.

Naga said the company expects to grow this new division to a few hundred people within a year, and that Uber wants to move quickly. While he sees a future in which Uber’s entire fleet of cars can be leveraged to collect more training data, he knows the new division has to start somewhere.

“From our conversations with our partners, they just say, ‘Give us anything that would be useful,’” Guo said. “Because the amount of data Uber can collect goes beyond what they can do with their own data collection.”

**Posted on**: January 27, 2026
