Graduate Machine Learning · Population-Based Optimisation

Evolutionary Learning in Machine Learning

Population-based, gradient-free, parallelisable — how Darwinian search complements and sometimes beats gradient descent on the hardest problems in ML.

About This Course

For most of the deep-learning era, gradient-based optimisation was the only game in town. That has changed. Evolutionary learning — a family of population-based, derivative-free, embarrassingly parallel optimisation algorithms with intellectual roots going back to the 1960s — has re-emerged as a serious tool in modern machine learning. Salimans et al. (2017) showed that a basic evolution strategy can match deep RL on Atari in a fraction of the wall-clock time on a CPU cluster. NEAT and MAP-Elites have produced robotic gaits, level-design agents, and protein designs that gradient-based methods systematically miss because they live behind deceptive fitness landscapes.

This course traces the classical foundations — genetic algorithms, evolution strategies, genetic programming — and then develops the modern programme: CMA-ES, neuroevolution, novelty search and Quality-Diversity, evolutionary reinforcement learning, and the open-endedness frontier. The throughline is the question of how to optimise in spaces where gradients lie about the global structure, and what evolutionary computation contributes that gradient methods cannot.

Key Numbers

1975

Holland’s Adaptation in Natural and Artificial Systems

2001

Hansen & Ostermeier publish CMA-ES

2002

NEAT (Stanley & Miikkulainen)

2015

MAP-Elites (Mouret & Clune)

2017

OpenAI ES — ES on Atari at deep-RL scale

8

Modules in this course

Featured Talk — LEAP: Library for Evolutionary Algorithms in Python

Eric Scott (MITRE / George Mason) introduces LEAP, a Python framework for evolutionary computation that ships GAs, ES, GP, neuroevolution, and parallel/distributed runners under a single composable pipeline API. A practical reference implementation that mirrors the algorithms developed across this course.

LEAP · EC Library Tutorial

LEAP — Library for Evolutionary Algorithms in Python

Walkthrough of LEAP's pipeline architecture (operator-stream design), built-in encodings (binary, real-valued, tree, neural), distributed evaluation via Dask, and worked examples on benchmark suites.

Companion Series — The Nature of Code, Ch. 9 (Daniel Shiffman)

Daniel Shiffman’s 19-video chapter on genetic algorithms and evolutionary computing — intuitive, code-along walkthroughs in p5.js. The Shakespeare-monkey example, smart-rocket simulations, the travelling-salesperson GA, and evolutionary steering behaviours. Read this alongside Module 1 (Genetic Algorithms) and Module 4 (Neuroevolution) for a hands-on counterpart to the formal treatment.

Chapter 9 — Genetic Algorithm Foundations

Coding Train · The Nature of Code

9.1 — Genetic Algorithm: Introduction

Coding Train · The Nature of Code

9.2 — Genetic Algorithm: How It Works

Coding Train · The Nature of Code

9.3 — Shakespeare Monkey Example

Coding Train · The Nature of Code

9.4 — Looking at Code

Coding Train · The Nature of Code

9.5 — Fitness, Genotype vs Phenotype

Coding Train · The Nature of Code

9.6 — Improved Fitness Function

Coding Train · The Nature of Code

9.7 — Pool Selection

Coding Train · The Nature of Code

9.8 — Weighted Selection

Coding Train · The Nature of Code

9.9 — Interactive Selection

Coding Train · The Nature of Code

9.10 — Continuous Evolutionary System

Coding Train · The Nature of Code

9.x — Genetic Algorithms & Evolutionary Computing (Outro)

Coding Challenges — GA Applications

Coding Train · Coding Challenge

CC #29 — Smart Rockets in p5.js

Coding Train · Coding Challenge

CC #35.4 — Travelling Salesperson with GA

Coding Train · Coding Challenge

CC #35.5 — TSP with GA & Crossover

Coding Train · Coding Challenge

CC #69 — Evolutionary Steering Behaviours, Pt 1

Coding Train · Coding Challenge

CC #69 — Evolutionary Steering Behaviours, Pt 2

Coding Train · Coding Challenge

CC #69 — Evolutionary Steering Behaviours, Pt 3

Coding Train · Coding Challenge

CC #69 — Evolutionary Steering Behaviours, Pt 4

Coding Train · Coding Challenge

CC #69 — Evolutionary Steering Behaviours, Pt 5 (Bonus)

Eight Modules

M0

History & Foundations

Holland 1975 (genetic algorithms), Rechenberg & Schwefel (evolution strategies), Fogel (evolutionary programming), Koza 1992 (genetic programming). The four classical pillars and how they unify into modern evolutionary computation.

Holland · Rechenberg · Koza

M1

Genetic Algorithms

Bit-string representations, single-point and two-point crossover, mutation, fitness-proportional / tournament / rank selection, the schema theorem, building-block hypothesis, and modern criticisms.

Schema · Crossover · Selection
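The ingredients in this module fit in a few lines of Python. Below is a toy GA on the OneMax benchmark (maximise the number of 1-bits) with tournament selection, single-point crossover, and per-bit mutation — an illustrative sketch with made-up parameter choices, not a reference implementation:

```python
import random

def onemax(bits):
    """Fitness: number of 1-bits (the classic OneMax benchmark)."""
    return sum(bits)

def tournament(pop, fits, k=3):
    """Return the fittest of k randomly chosen individuals."""
    contestants = random.sample(range(len(pop)), k)
    return pop[max(contestants, key=lambda i: fits[i])]

def crossover(a, b):
    """Single-point crossover of two bit-strings."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(bits, rate):
    """Flip each bit independently with probability `rate`."""
    return [b ^ (random.random() < rate) for b in bits]

def ga(n_bits=50, pop_size=60, generations=100, rate=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [onemax(ind) for ind in pop]
        pop = [mutate(crossover(tournament(pop, fits), tournament(pop, fits)), rate)
               for _ in range(pop_size)]
    return max(pop, key=onemax)

random.seed(0)
best = ga()
print(onemax(best))
```

Swapping tournament selection for fitness-proportional or rank selection changes only the `tournament` function — a useful exercise when working through the selection schemes above.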

M2

Evolution Strategies & CMA-ES

(1+1)-ES, (μ,λ)-ES, self-adaptation of step sizes, derandomised mutation, the covariance-matrix update of CMA-ES (Hansen & Ostermeier 2001), step-size control, and modern noisy/large-scale variants.

CMA-ES · (μ,λ)-ES · Self-adapt
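The simplest member of this family, the (1+1)-ES with the 1/5th-success rule, is a few lines of Python. The update constants below are chosen so the expected log step-size change is zero when exactly 1 in 5 mutations succeeds (0.2 · 0.2 − 0.8 · 0.05 = 0); this is an illustrative sketch of step-size self-adaptation, not CMA-ES:

```python
import math, random

def sphere(x):
    """Benchmark objective to minimise."""
    return sum(xi * xi for xi in x)

def one_plus_one_es(dim=10, sigma=1.0, iters=2000, seed=0):
    """(1+1)-ES with the 1/5th-success rule for step-size adaptation."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx = sphere(x)
    for _ in range(iters):
        # Mutate: isotropic Gaussian perturbation with step size sigma.
        y = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fy = sphere(y)
        success = fy < fx
        if success:
            x, fx = y, fy
        # 1/5th rule: grow sigma on success, shrink on failure, so that
        # roughly one in five mutations succeeds at equilibrium.
        sigma *= math.exp(0.2) if success else math.exp(-0.05)
    return x, fx

x, fx = one_plus_one_es()
print(fx)
```

CMA-ES replaces the isotropic Gaussian here with a full covariance matrix adapted from the path of successful steps — the structure of the loop is unchanged.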

M3

Genetic Programming

Koza tree-based GP, automatically defined functions (ADFs), bloat and parsimony pressure, grammatical evolution, linear GP, Cartesian GP, and the relationship to symbolic regression.

Koza Trees · Bloat · SymReg
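A stripped-down illustration of tree-based GP on a symbolic-regression target — reduced here to a (1+1) hill climb with subtree mutation rather than a full population with crossover, to keep it short. The function set, terminals, and target f(x) = x² + x are made up for the example:

```python
import random

OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b, '*': lambda a, b: a * b}
TERMS = ['x', 1.0, 2.0]

def random_tree(depth, rng):
    """Grow a random expression tree: nested (op, left, right) tuples or a terminal."""
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(TERMS)
    op = rng.choice(list(OPS))
    return (op, random_tree(depth - 1, rng), random_tree(depth - 1, rng))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mse(tree, xs, ys):
    return sum((evaluate(tree, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def mutate(tree, rng):
    """Subtree mutation: replace a random node with a fresh random subtree."""
    if rng.random() < 0.3 or not isinstance(tree, tuple):
        return random_tree(2, rng)
    op, left, right = tree
    if rng.random() < 0.5:
        return (op, mutate(left, rng), right)
    return (op, left, mutate(right, rng))

# Target: f(x) = x^2 + x, sampled on a small grid.
rng = random.Random(0)
xs = [i / 10 for i in range(-10, 11)]
ys = [x * x + x for x in xs]
best = random_tree(3, rng)
best_err = mse(best, xs, ys)
for _ in range(3000):
    cand = mutate(best, rng)
    err = mse(cand, xs, ys)
    if err <= best_err:  # keep improvements (and neutral moves)
        best, best_err = cand, err
print(best_err)
```

Note that accepting neutral moves (`<=`) lets trees drift larger without fitness gain — bloat in miniature, motivating the parsimony-pressure techniques covered in the module.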

M4

Neuroevolution & NEAT

Direct vs indirect neural-network encodings, NEAT (Stanley & Miikkulainen 2002), historical markings & speciation, HyperNEAT and CPPN-based development, ES-HyperNEAT.

NEAT · HyperNEAT · CPPN
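NEAT itself (historical markings, speciation, topology mutation) is too involved for a snippet, but the simplest form of neuroevolution — direct encoding of a fixed-topology network's weights, evolved by Gaussian mutation and selection — is a few lines. The 2-2-1 topology, XOR task, and hyperparameters below are illustrative; this hill climb is not guaranteed to solve XOR on every seed:

```python
import math, random

def forward(w, x1, x2):
    """Fixed 2-2-1 MLP with tanh units; w is a flat list of 9 weights/biases."""
    h1 = math.tanh(w[0] * x1 + w[1] * x2 + w[2])
    h2 = math.tanh(w[3] * x1 + w[4] * x2 + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def loss(w):
    """Sum of squared errors over the four XOR cases."""
    return sum((forward(w, x1, x2) - y) ** 2 for (x1, x2), y in XOR)

rng = random.Random(3)
best = [rng.gauss(0, 1) for _ in range(9)]
best_loss = loss(best)
for _ in range(5000):
    # Mutate every weight with Gaussian noise; keep the child if it improves.
    cand = [wi + rng.gauss(0, 0.3) for wi in best]
    cand_loss = loss(cand)
    if cand_loss < best_loss:
        best, best_loss = cand, cand_loss
print(best_loss)
```

NEAT's contribution is precisely what this sketch lacks: evolving the topology alongside the weights while protecting structural innovation through speciation.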

M5

Novelty & Quality Diversity

Lehman & Stanley on deception in objective-driven search, novelty search, MAP-Elites (Mouret & Clune 2015), CMA-ME, illumination of the behaviour space, and applications in robotics and protein design.

Novelty · MAP-Elites · CMA-ME
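The core MAP-Elites loop is remarkably short: an archive keyed by a discretised behaviour descriptor, where a candidate can only displace the incumbent elite of its own cell. A toy sketch — the objective, descriptor (first two coordinates), grid resolution, and mutation scale are all hand-picked for illustration:

```python
import random

def fitness(x):
    """Toy objective: negative sphere (higher is better)."""
    return -sum(xi * xi for xi in x)

def behaviour(x):
    """Behaviour descriptor: the first two coordinates of the genome."""
    return (x[0], x[1])

def cell(bd, bins=10, lo=-1.0, hi=1.0):
    """Discretise a 2-D descriptor into a grid cell, clamping out-of-range values."""
    return tuple(min(bins - 1, max(0, int((b - lo) / (hi - lo) * bins)))
                 for b in bd)

def map_elites(dim=5, iters=20000, seed=0):
    rng = random.Random(seed)
    archive = {}  # cell -> (fitness, genome)
    for _ in range(iters):
        if archive and rng.random() < 0.9:
            # Select a random elite and mutate it.
            _, parent = rng.choice(list(archive.values()))
            x = [xi + rng.gauss(0, 0.1) for xi in parent]
        else:
            # Occasional random restart keeps exploration alive.
            x = [rng.uniform(-1, 1) for _ in range(dim)]
        f, c = fitness(x), cell(behaviour(x))
        # Keep the candidate if its cell is empty or it beats the incumbent.
        if c not in archive or f > archive[c][0]:
            archive[c] = (f, x)
    return archive

archive = map_elites()
print(len(archive))
```

The result is not one optimum but an illuminated map: the best-found solution for every region of behaviour space, which is what makes the method useful for robotics repertoires and protein design.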

M6

Evolutionary RL

OpenAI ES (Salimans et al. 2017) as a scalable RL alternative, NES, PEPG, ARS (augmented random search), evolutionary parameter search vs gradient methods, and their hybrids (ERL, P3S, CERL).

OpenAI ES · NES · ARS
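The estimator at the heart of OpenAI ES is compact: perturb the parameters with Gaussian noise, weight each perturbation by the return it earns, and step in the resulting direction — approximately grad ≈ (1 / (2nσ)) Σᵢ [f(θ + σεᵢ) − f(θ − σεᵢ)] εᵢ with antithetic sampling. The sketch below applies it to a toy objective; it omits the distributed evaluation, rank-shaping, and other machinery of the full system, and all hyperparameters are illustrative:

```python
import random

def f(theta):
    """Toy 'return' to maximise: negative sphere."""
    return -sum(t * t for t in theta)

def es_step(theta, rng, n_pairs=20, sigma=0.1, alpha=0.05):
    """One antithetic ES gradient step on f."""
    d = len(theta)
    grad = [0.0] * d
    for _ in range(n_pairs):
        eps = [rng.gauss(0, 1) for _ in range(d)]
        f_plus = f([t + sigma * e for t, e in zip(theta, eps)])
        f_minus = f([t - sigma * e for t, e in zip(theta, eps)])
        # Antithetic pair: the difference cancels even-order terms of f.
        scale = (f_plus - f_minus) / (2 * n_pairs * sigma)
        grad = [g + scale * e for g, e in zip(grad, eps)]
    return [t + alpha * g for t, g in zip(theta, grad)]

rng = random.Random(0)
theta = [1.0] * 10
for _ in range(200):
    theta = es_step(theta, rng)
print(f(theta))
```

Because each perturbation is evaluated independently, the inner loop parallelises across workers that only need to share random seeds and scalar returns — the property that let OpenAI ES scale across a CPU cluster.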

M7

Open-Ended Frontiers

POET (Wang et al. 2019), AI-GAs (Clune 2019), evolutionary architecture search, evolved AI training, the open-endedness research programme, and the role of evolutionary methods in foundation-model training.

POET · AI-GAs · NAS

Cross-Links

Machine Learning, Machine Learning for Science, Probability & Statistics, Numerical Methods, Origin of Life, Molecular Biology, Meta-Learning for Protein Simulation.