Introducing EvoGrad: A Lightweight Library for Gradient-Based Evolution

July 22, 2019 / Global
Figure 1. A simple parabolic fitness landscape over a one-dimensional search space (x-axis), with fitness plotted on the y-axis. Maximal fitness is achieved when the search distribution is centered at x=5.
Figure 2. The ES algorithm optimizes the objective by moving its population distribution (shown as a histogram of samples from the distribution) across the search space to the optimum at x=5.
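The ES behavior in Figures 1 and 2 can be sketched in a few lines of NumPy. This is an illustrative sketch, not EvoGrad's API: it assumes the parabolic fitness f(x) = -(x-5)^2 and a Gaussian search distribution N(mu, sigma^2), and uses the standard ES score-function estimate of the gradient of expected fitness with respect to the distribution's mean.

```python
import numpy as np

def fitness(x):
    # parabolic fitness landscape from Figure 1, maximal at x = 5
    return -(x - 5.0) ** 2

def es_step(mu, sigma=0.5, n=200, lr=0.05, rng=None):
    # one generation of ES: sample a population from N(mu, sigma^2),
    # evaluate fitness, and move mu along the estimated gradient
    rng = rng if rng is not None else np.random.default_rng(0)
    eps = rng.standard_normal(n)            # perturbations
    f = fitness(mu + sigma * eps)           # fitness of each sample
    f = (f - f.mean()) / (f.std() + 1e-8)   # normalize for stability
    grad = (f * eps).mean() / sigma         # ES estimate of d E[f] / d mu
    return mu + lr * grad

mu = 0.0
rng = np.random.default_rng(0)
for _ in range(300):
    mu = es_step(mu, rng=rng)
# as in Figure 2, the distribution's mean migrates toward x = 5
```

The fitness normalization and the learning rate here are common practical choices, not values taken from the post.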
Figure 3. A behavior landscape over a one-dimensional search space (x-axis). Maximal evolvability is achieved when the search distribution is centered around x=7.85, because small perturbations there generate the greatest diversity of resulting behaviors.
Figure 4. The blue histogram represents the empirical samples taken from the population distribution in each generation of evolution. The population distribution is successfully optimized by evolvability ES to the highest-variance part of the behavioral landscape.
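The variance-maximizing behavior in Figures 3 and 4 can be sketched with the same score-function machinery, using each sample's squared deviation from the mean behavior as its "fitness." This is a hedged sketch rather than EvoGrad's implementation: the behavior map `behavior(x) = sin(x)` is an illustrative stand-in (it is not the exact landscape of Figure 3), and the normalization is a practical choice.

```python
import numpy as np

def behavior(x):
    # illustrative 1D behavior map (an assumption, not Figure 3's
    # landscape); behavioral diversity under small perturbations is
    # largest where sin is steepest, i.e., near x = 0
    return np.sin(x)

def evolvability_es_step(mu, sigma=0.3, n=500, lr=0.05, rng=None):
    # one generation of variance-maximizing evolvability ES
    rng = rng if rng is not None else np.random.default_rng(0)
    eps = rng.standard_normal(n)
    b = behavior(mu + sigma * eps)           # offspring behaviors
    adv = (b - b.mean()) ** 2                # contribution to behavioral variance
    adv = (adv - adv.mean()) / (adv.std() + 1e-8)
    grad = (adv * eps).mean() / sigma        # gradient of variance w.r.t. mu
    return mu + lr * grad

mu = 1.2  # start near a flat region of sin, where diversity is low
rng = np.random.default_rng(1)
for _ in range(400):
    mu = evolvability_es_step(mu, rng=rng)
# the mean migrates to a steep part of the behavior map, where
# perturbations produce the widest spread of behaviors
```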
Figure 5. As in Figure 4, the blue histogram represents the empirical samples taken from the population distribution in each generation of evolution. The population distribution is successfully optimized by evolvability ES to the highest-entropy part of the behavioral landscape (which in this landscape is the same as the highest-variance part).
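The entropy-maximizing variant in Figure 5 needs a density estimate over behaviors, since entropy cannot be computed from samples directly. The sketch below (again an assumption-laden illustration, not EvoGrad's code) uses a Gaussian kernel-density estimate and rewards samples whose behaviors land in low-density regions; the behavior map and bandwidth are hypothetical choices.

```python
import numpy as np

def behavior(x):
    return np.sin(x)  # illustrative behavior map (assumption)

def entropy_fitness(b, bandwidth=0.1):
    # kernel-density estimate of each behavior's density; a sample in a
    # low-density region contributes more to the population's entropy
    diffs = b[:, None] - b[None, :]
    dens = np.exp(-0.5 * (diffs / bandwidth) ** 2).mean(axis=1)
    return -np.log(dens + 1e-12)

def maxent_es_step(mu, sigma=0.3, n=400, lr=0.05, rng=None):
    # one generation of entropy-maximizing evolvability ES
    rng = rng if rng is not None else np.random.default_rng(0)
    eps = rng.standard_normal(n)
    f = entropy_fitness(behavior(mu + sigma * eps))
    f = (f - f.mean()) / (f.std() + 1e-8)
    return mu + lr * (f * eps).mean() / sigma

mu = 1.2
rng = np.random.default_rng(2)
for _ in range(400):
    mu = maxent_es_step(mu, rng=rng)
# on this landscape, as the caption notes, the highest-entropy region
# coincides with the highest-variance one
```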
Figure 6. The progress of evolvability ES over generations of evolution is shown above. Each dot represents a neural network policy sampled from the population, and its plotted location represents where an ant robot controlled by that neural network locomotes to at the end of its simulation. Over generations of evolution, the population encodes increasingly well-adapted and diverse behaviors, able to walk farther and farther in all directions.
Alex Gajewski

Alex Gajewski is a third-year undergrad at Columbia University studying math and computer science, and was a summer 2018 intern with Uber AI. He is excited by the potential of emerging technologies like machine learning to change the ways we interact with each other and ourselves.

Jeff Clune

Jeff Clune is the former Loy and Edith Harris Associate Professor in Computer Science at the University of Wyoming, a Senior Research Manager and founding member of Uber AI Labs, and currently a Research Team Leader at OpenAI. Jeff focuses on robotics and training neural networks via deep learning and deep reinforcement learning. He has also researched open questions in evolutionary biology using computational models of evolution, including studying the evolutionary origins of modularity, hierarchy, and evolvability. Prior to becoming a professor, he was a Research Scientist at Cornell University, received a PhD in computer science and an MA in philosophy from Michigan State University, and received a BA in philosophy from the University of Michigan. More about Jeff’s research can be found at JeffClune.com.

Kenneth O. Stanley

Before joining Uber AI Labs full time, Ken was an associate professor of computer science at the University of Central Florida (he is currently on leave). He is a leader in neuroevolution (combining neural networks with evolutionary techniques), where he helped invent prominent algorithms such as NEAT, CPPNs, HyperNEAT, and novelty search. His ideas have also reached a broader audience through the recent popular science book, Why Greatness Cannot Be Planned: The Myth of the Objective.

Joel Lehman

Joel Lehman was previously an assistant professor at the IT University of Copenhagen, and researches neural networks, evolutionary algorithms, and reinforcement learning.

Posted by Alex Gajewski, Jeff Clune, Kenneth O. Stanley, Joel Lehman
