The concept of shared rides has long been a noteworthy topic of discussion, and the business model is no longer limited to sharing a car with someone or using an app to get from one place to another.
Uber, which can be described as the digital equivalent of the yellow taxi, plans to process the data collected from vehicles operating in the field through its newly established subsidiary, AV Labs, transform it into high-quality case studies, and then open this processed data to the ecosystem of autonomous-vehicle and software developers.
This real-world data will provide the necessary resource for autonomous driving software to learn faster and more safely.
Observing the world’s largest simulation in real time
From America to Europe, from India to China, Uber has facilitated billions of real-world trips, and in doing so has seen, and continues to see, real-world road conditions. This makes Uber much more than a ride-sharing company; it is now one of the world’s largest mobility companies.
The AV Labs initiative plans to transform this operational data, acquired from real-world vehicles equipped with specialized sensors, into a format usable by automotive and autonomous-software companies through data mining, tagging, simulation, and validation.
In short, it aims to make the data generated by monitoring the world’s largest simulation in near real time more meaningful.
Uber’s goal isn’t to manufacture and deploy its own robotaxi like Tesla’s Cybercab. Rather, the goal is to use its vehicles as a kind of data collector, positioning them as catalysts to accelerate the ecosystem, and then to scale the big data those vehicles collect and offer it to partner companies, all the way down the long tail.
It would be more accurate to define AV Labs as a kind of awareness layer within the ecosystem, because it focuses on rare, chaotic, and unexpected events on the world’s roads and aims to learn from them.
Ultimately, it aims to “sell” these lessons to someone else…
These edge cases are precisely where autonomous systems get stuck and where developers focus their efforts. If someone manages to provide this data, many obstacles will be overcome.
How will AV Labs work?
The working mechanism of AV Labs is not as complicated as it seems.
Test vehicles equipped with specialized sensors such as LiDAR, radar, and cameras, operating as part of Uber’s network, will collect data from the field; the project begins with a specially equipped Hyundai Ioniq 5.
Data containing personal information, or not necessary for the business, will be removed, cleaned, or masked from this big data; in the final stage, the remaining data will be labeled.
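The anonymization step described above can be sketched in a few lines. Everything here is an illustrative assumption: the field names (`rider_id`, `license_plate`, and so on) and the masking rule are hypothetical, not Uber's actual schema.

```python
# Hypothetical sketch: strip identifying fields and mask the plate
# before labeling. Field names are illustrative assumptions.

PII_FIELDS = {"rider_id", "driver_id", "phone", "pickup_address"}

def scrub_record(record: dict) -> dict:
    """Drop PII fields and mask the license plate, keeping sensor data."""
    clean = {k: v for k, v in record.items() if k not in PII_FIELDS}
    if "license_plate" in clean:
        clean["license_plate"] = "***MASKED***"
    return clean

raw = {
    "rider_id": "u-4821",
    "license_plate": "34 ABC 123",
    "lidar_frame": [0.1, 0.4, 0.9],   # stand-in for a sensor payload
    "event": "hard_brake",
}
scrubbed = scrub_record(raw)
print(scrubbed)
```

The sensor payload and event tag survive untouched; only the fields that could identify a person are removed or masked.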
The remaining data will be analyzed, and real-world events from the field will be transferred to a simulation environment.
Software from partner companies will run in shadow mode, comparing its decisions against human driving in these scenarios.
Significant incidents will be incorporated into the training data, and as new data keeps arriving, models will be updated and redeployed to the field.
This approach, called the data flywheel, currently seems to be the most promising way to analyze rare situations in the field.
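The flywheel loop above, collect, scrub, label, compare in shadow mode, retrain, can be sketched end to end with toy stand-ins. The log format, the labeling rule, and the `ToyModel` are all assumptions for illustration; none of this is Uber's actual pipeline.

```python
# Minimal, runnable sketch of the "data flywheel" loop.
# All data structures and the model are toy assumptions.

def scrub(log):
    """Strip PII (here, just the rider id)."""
    return {k: v for k, v in log.items() if k != "rider_id"}

def label(log):
    """Tag the event based on deceleration (illustrative threshold)."""
    log["label"] = "hard_brake" if log["decel"] > 6.0 else "normal"
    return log

class ToyModel:
    def __init__(self):
        self.known = set()          # scenario labels already trained on
    def predict(self, scenario):
        # Untrained scenarios default to "normal" behaviour.
        return "hard_brake" if scenario["label"] in self.known else "normal"
    def train(self, scenarios):
        self.known.update(s["label"] for s in scenarios)

def flywheel_iteration(fleet_logs, model):
    scenarios = [label(scrub(log)) for log in fleet_logs]
    # Shadow mode: compare the model's output to the human driver's
    # action without the model ever controlling the vehicle.
    disagreements = [s for s in scenarios
                     if model.predict(s) != s["human_action"]]
    model.train(disagreements)      # rare cases feed the next round
    return len(disagreements)

logs = [{"rider_id": "u1", "decel": 8.2, "human_action": "hard_brake"},
        {"rider_id": "u2", "decel": 1.1, "human_action": "normal"}]
model = ToyModel()
first = flywheel_iteration(logs, model)   # one disagreement found
second = flywheel_iteration(logs, model)  # retrained: zero disagreements
print(first, second)
```

Each pass around the loop shrinks the set of scenarios where the model disagrees with human drivers, which is exactly the flywheel effect the text describes.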
In short, in the future the driver’s console and steering wheel may become mere symbols. The vehicle you’re riding in will draw on the shared experience distilled from data collected by millions of other vehicles to take you from one place to another.
What advantages does AV Labs offer to the ecosystem?
Uber currently operates across a vast geography and therefore has a very large source of operational data. This data will open new horizons for developers with smaller fleets and limited geographical reach.
This big data will include complex traffic conditions; how drivers behave in sudden, unexpected situations on the roads; and, more broadly, city-specific infrastructure anomalies.
Access to Uber’s data will enable reinforcement-learning and supervised models to reach their expected performance quickly. It will also let smaller developers obtain, faster and at lower cost, scenarios that would otherwise take a very long time to encounter in the real world.
Digital maps are extremely important these days, and developers are investing in them more than ever before. The up-to-date data AV Labs obtains will allow digital maps to be refreshed faster and will provide intelligence on city infrastructure with less delay.
The intelligence I’m referring to includes vital situations such as incorrectly marked roads, closed intersections, and one-way lanes.
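One way such map intelligence could work is by diffing fleet observations against the current digital map, for example flagging a one-way street that vehicles are observed driving the wrong way down. The map records and observation format below are illustrative assumptions, not a real map API.

```python
# Hypothetical sketch: flag streets where observed traffic
# contradicts the digital map (e.g. a wrong-way one-way street).
# Map entries and observation fields are illustrative assumptions.

digital_map = {
    "elm_st": {"one_way": True, "direction": "north"},
    "oak_ave": {"one_way": False, "direction": None},
}

def find_anomalies(observations, dmap):
    """Return streets where an observed heading contradicts the map."""
    flagged = []
    for obs in observations:
        entry = dmap.get(obs["street"])
        if entry and entry["one_way"] and obs["heading"] != entry["direction"]:
            flagged.append(obs["street"])
    return flagged

obs = [{"street": "elm_st", "heading": "south"},   # contradicts the map
       {"street": "oak_ave", "heading": "east"}]   # two-way: fine
anomalies = find_anomalies(obs, digital_map)
print(anomalies)   # ['elm_st']
```

A flagged street would then be queued for a map update, which is the "less delay" advantage the text points to.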
…
We are in a new and completely different phase in improving driving with autonomous vehicles.
The rule-based approach used so far is now being replaced by reinforcement-learning and supervised models built on big data. Which company can best solve the real-world problems of autonomous driving, a question that will shape the future of the automotive industry, will therefore sit at the top of the agenda.
If positioned correctly, AV Labs will act as an accelerator for mobility ecosystems, especially for those developing software related to autonomous driving. Because it will observe, identify, and differentiate rare cases in the real world that have been largely overlooked until now, enriching simulation sets and accelerating modeling…
Resources and further links…
- Uber launches an ‘AV Labs’ division to gather driving data for robotaxi partners
- Uber unveils AV Labs to shape robotaxi development
- Uber launches ‘AV Labs’ to monetize driving data for global robotaxi partners
- Next generation datasets for self-driving perception and forecasting
- A unified longitudinal trajectory dataset for automated vehicle


