Artificial intelligence powers many of the technologies and services underpinning Uber’s platform, allowing engineering and data science teams to make informed decisions that help improve user experiences for products across our lines of business.
At the forefront of this effort is Uber AI, Uber’s center for advanced artificial intelligence research and platforms. Uber AI powers applications in computer vision, natural language processing, deep learning, advanced optimization methods, and intelligent location and sensor processing across the company, as well as advancing fundamental research and engaging with the broader AI community through publications and open-source projects.
These machine learning and AI techniques and models allow Uber to improve outcomes across several verticals, from transportation and mobility to customer support and driver-partner navigation. In 2019 alone, AI research at Uber led to significant improvements in demand prediction and more seamless pick-up experiences.
Read on to learn more about Uber AI’s 2019 highlights:
Improving location accuracy with sensing and perception
In 2019, Uber AI’s Sensing and Perception team worked on projects across our mobile and back-end stack to improve the coverage and accuracy of vehicle location, speed, and heading estimates on the Uber platform. Overcoming the limitations of GPS with more precise location estimates makes it easier for riders and drivers to find one another, improves estimated times of arrival (ETAs), reduces rider and driver cancellations, and makes our marketplace operate more efficiently.
Leveraging computer vision to make Uber safer and more efficient
The Computer Vision Platform team has worked closely with product teams across Uber to enable scalable, reliable, and quick validation of driver identity when drivers go online. And, as Uber onboards a growing number of drivers and restaurants to our platform, we’ve built automated deep learning transcription technology that’s suited to Uber’s specific use case—documents with blocks of text that need to be output as structured data for downstream processing, rather than a text blob.
Enhancing real-time forecasting with neural networks
Uber leverages ML models powered by neural networks to forecast rider demand, pick-up and drop-off ETAs, and hardware capacity planning requirements, among other variables that drive our operations. To improve our forecasting abilities in 2019 and beyond, we developed new tools and techniques to enhance these models, including X-Ray, GENIE, and HotStarts.
X-Ray is an in-house tool that allows us to search thousands of candidate features in parallel, uncovering those that improve a model’s predictions. In 2019, we deployed this tool to production in systems across the company. In 2020, we plan to integrate X-Ray into the Michelangelo feature store for more accurate ML model feature assessment, which will enable us to further fine-tune our predictions.
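X-Ray itself is internal to Uber, but the core idea of screening a large pool of candidate features in parallel can be sketched roughly as follows; the baseline model, scoring metric, and candidate pool below are illustrative assumptions rather than X-Ray’s actual design.

```python
# Illustrative sketch of parallel feature screening (not X-Ray's actual code).
# Each candidate feature is scored by how much it improves a baseline model
# under cross-validation; candidates are evaluated concurrently.
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score


def score_with_feature(X_base, y, feature_column):
    """Cross-validated score of the baseline features plus one candidate column."""
    X_augmented = np.column_stack([X_base, feature_column])
    return cross_val_score(GradientBoostingRegressor(), X_augmented, y, cv=3).mean()


def screen_features(X_base, y, candidates, top_k=10):
    """Score every candidate feature in parallel and keep the top_k improvers."""
    baseline = cross_val_score(GradientBoostingRegressor(), X_base, y, cv=3).mean()
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(
            score_with_feature,
            [X_base] * len(candidates),
            [y] * len(candidates),
            candidates,
        ))
    improvements = [(score - baseline, idx) for idx, score in enumerate(scores)]
    return sorted(improvements, reverse=True)[:top_k]
```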
Also launched in 2019, GENIE, a novel architecture for deep learning creatively applied to temporal prediction, powered a 12.3 percent improvement in demand forecasting in over 100 cities worldwide, while HotStarts for AutoTune, our optimization-as-a-service tool, reduced the cost of tuning ML models and algorithms by a factor of 5-10 for recurring tasks.
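AutoTune and HotStarts are internal services, but the underlying idea of hot-starting a recurring tuning job is straightforward: seed a fresh optimization run with the configurations and scores observed the last time the same task was tuned. Below is a rough sketch using scikit-optimize as a stand-in optimizer; the search space and objective are placeholders, not AutoTune’s interface.

```python
# Generic warm-start sketch (not AutoTune's API): replay (config, score) pairs
# from earlier runs of the same recurring task into a fresh optimizer before
# asking it for new trials.
from skopt import Optimizer
from skopt.space import Real


def tune(objective, previous_trials, n_new_trials=20):
    """previous_trials: list of ([learning_rate, dropout], score); lower is better."""
    opt = Optimizer(dimensions=[
        Real(1e-4, 1e-1, prior="log-uniform"),  # learning rate
        Real(0.0, 0.5),                         # dropout
    ])
    # Hot start: tell the optimizer everything we already know about this task.
    for params, score in previous_trials:
        opt.tell(params, score)
    best_score, best_params = min(
        ((s, p) for p, s in previous_trials), default=(float("inf"), None)
    )
    # Continue tuning from that knowledge instead of starting from scratch.
    for _ in range(n_new_trials):
        params = opt.ask()
        score = objective(params)
        opt.tell(params, score)
        if score < best_score:
            best_score, best_params = score, params
    return best_params, best_score
```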
Creating more seamless communication with conversational AI
To facilitate the best end-to-end experience possible for users, Uber is committed to making communication with our customers easier and more accessible. In 2019, we leveraged Uber’s conversational AI platform, empowering our support teams to resolve user issues as accurately and quickly as possible. Further, we used this platform to lessen the potential for distracted driving by allowing driver-partners to more seamlessly communicate with riders via hands-free pick-up and one-click chat.
To this end, we also developed and open sourced the Plato Research Dialogue System, a flexible conversational AI platform for building, training, and deploying conversational AI agents, enabling state-of-the-art research in conversational AI. While currently used only for research purposes, Plato has the potential to be leveraged in production.
Publishing original AI research
In 2019, we published new and original research on projects as diverse as our Paired Open-Ended Trailblazer (POET), an algorithm that simultaneously generates increasingly complex challenges and optimizes agents to solve them, and Deconstructing Lottery Tickets: Zeros, Signs, and the Supermask, which studies why the sparse, trainable subnetworks identified by the lottery ticket hypothesis work as well as they do. We presented these projects and many others, spanning our work in neural networks, conversational AI, and sensing and perception, among other disciplines, at NeurIPS, ICLR, ICML, and other venues.
We also shared some of our cutting-edge research through blog articles and source code, including EvoGrad, a Python library that makes it easy to prototype natural evolution-like algorithms for training ML models, and Hypothesis GU Funcs, a Python package that extends the Hypothesis library for unit testing. Not only do projects like these contribute to Uber’s own catalog of AI research, but they also enable us to collaborate with others in AI.
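EvoGrad’s own API is documented in its repository; as a rough illustration of the natural-evolution-strategies idea it is designed to help prototype, the sketch below estimates a gradient from fitness evaluations of randomly perturbed parameters (the quadratic fitness function is a placeholder).

```python
# Minimal natural evolution strategies (NES) sketch in NumPy -- a generic
# illustration of the family of algorithms EvoGrad helps prototype, not
# EvoGrad's actual API.
import numpy as np


def fitness(theta):
    """Placeholder objective: higher is better, peak at theta = [3, -2]."""
    return -np.sum((theta - np.array([3.0, -2.0])) ** 2)


def nes_step(theta, alpha=0.05, sigma=0.1, population=100,
             rng=np.random.default_rng(0)):
    """One NES update: perturb theta, weight perturbations by fitness, step uphill."""
    eps = rng.standard_normal((population, theta.size))
    rewards = np.array([fitness(theta + sigma * e) for e in eps])
    rewards = (rewards - rewards.mean()) / (rewards.std() + 1e-8)  # normalize
    grad_estimate = eps.T @ rewards / (population * sigma)
    return theta + alpha * grad_estimate


theta = np.zeros(2)
for _ in range(300):
    theta = nes_step(theta)
print(theta)  # should move toward [3, -2]
```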
Fostering AI collaboration through open source
In 2019, Uber AI was committed to sharing knowledge and best practices with the broader scientific community through open source projects.
For instance, Uber released Ludwig, an open source deep learning toolbox built on top of TensorFlow that allows users to train and test deep learning models without writing code. In addition to being adopted by teams at Apple, IBM, and Nvidia, Ludwig has been incorporated into a variety of our tools and products at Uber, from our Customer Obsession Ticket Agent (COTA) to our food delivery time prediction algorithms. In July 2019, we released Ludwig version 0.2, featuring integration with Comet.ml and the implementation of audio/speech functionality.
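To give a flavor of that declarative workflow, the sketch below defines a model by listing input and output features rather than writing model code; the column names are invented, and the exact argument and type names vary across Ludwig releases.

```python
# Illustrative Ludwig-style declarative model definition; column names are
# made up and API details may differ between Ludwig releases.
from ludwig.api import LudwigModel

config = {
    "input_features": [
        {"name": "restaurant_distance_km", "type": "numerical"},
        {"name": "order_text", "type": "text"},
    ],
    "output_features": [
        {"name": "delivery_minutes", "type": "numerical"},
    ],
}

model = LudwigModel(config)
train_stats = model.train(dataset="orders.csv")        # argument name varies by version
predictions = model.predict(dataset="orders_new.csv")  # likewise
```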
Another example of our commitment to collaboration is the adoption of Pyro, an Uber-created open source project, by the LF AI Foundation, an organization under the auspices of the Linux Foundation that supports open source innovation in AI and deep learning. Pyro, a universal probabilistic programming language written in Python, is used by academic institutions such as Stanford University, the Massachusetts Institute of Technology, and Harvard University, and has reached over 5,700 stars on GitHub.
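For a sense of what probabilistic programming in Pyro looks like, here is a small Beta–Bernoulli model fit with stochastic variational inference; the coin-fairness setup is a textbook example rather than an Uber workload.

```python
# A small Pyro model: infer the bias of a coin from observed flips using
# stochastic variational inference (SVI). A standard introductory example.
import torch
import pyro
import pyro.distributions as dist
from pyro.distributions import constraints
from pyro.infer import SVI, Trace_ELBO
from pyro.optim import Adam

data = torch.tensor([1., 1., 0., 1., 0., 1., 1., 1., 0., 1.])  # observed flips


def model(data):
    fairness = pyro.sample("fairness", dist.Beta(10.0, 10.0))  # prior belief
    with pyro.plate("flips", len(data)):
        pyro.sample("obs", dist.Bernoulli(fairness), obs=data)


def guide(data):
    alpha = pyro.param("alpha", torch.tensor(10.0), constraint=constraints.positive)
    beta = pyro.param("beta", torch.tensor(10.0), constraint=constraints.positive)
    pyro.sample("fairness", dist.Beta(alpha, beta))


svi = SVI(model, guide, Adam({"lr": 0.01}), loss=Trace_ELBO())
for _ in range(1000):
    svi.step(data)

alpha, beta = pyro.param("alpha").item(), pyro.param("beta").item()
print("posterior mean fairness:", alpha / (alpha + beta))
```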
Looking towards 2020
Next year, Uber AI will continue to innovate, collaborate, and contribute to Uber’s platform services through the application of AI across our business. Whether powering sensing and perception systems that improve our routes or enabling the more accurate forecasting of rider demand in the cities we serve, AI is fundamental to Uber’s growth as a company and its ability to deliver safer and more reliable experiences on our platform.
For more on Uber AI, be sure to check out related articles on the Uber Engineering Blog and our AI research papers on the Uber Research publication page. Interested in developing the next generation of AI tools and services? Consider applying for a role on our team as a machine learning engineer.
Zoubin Ghahramani
Zoubin Ghahramani is Chief Scientist of Uber and a world leader in the field of machine learning, significantly advancing the state-of-the-art in algorithms that can learn from data. He is known in particular for fundamental contributions to probabilistic modeling and Bayesian approaches to machine learning systems and AI. Zoubin also maintains his roles as Professor of Information Engineering at the University of Cambridge and Deputy Director of the Leverhulme Centre for the Future of Intelligence. He was one of the founding directors of the Alan Turing Institute (the UK's national institute for Data Science and AI), and is a Fellow of St John's College Cambridge and of the Royal Society.