Frank Hutter
[intermediate/advanced] Automated Machine Learning, or Deep Learning 2.0: AI that Builds and Improves AI
Summary
Throughout the history of AI, manual components of AI methods have repeatedly been replaced by better-performing, automatically discovered ones; for example, deep learning (DL) replaced manual feature engineering with learned representations. The logical next step in representation learning is to also (meta-)learn the best architectures for these representations, as well as the best algorithms and hyperparameters for learning them. In this short course, I will discuss various works towards this aim from the area of AutoML: AI that builds and improves AI. Specifically, I will present advances demonstrating that AutoML can be efficient, and I will argue for an emphasis on multi-objective AutoML that also accounts for the various dimensions of trustworthiness (such as algorithmic fairness, robustness, and uncertainty calibration). The first lecture will cover the basics and the second lecture speedup techniques. Finally, in the third lecture, I will focus on AI that builds AI, with a deep dive into a novel approach that learns an entire classification algorithm for small tabular datasets and achieves a new state of the art at the cost of a single forward pass. Throughout, I will illustrate the methods with Colab notebook demos.
Syllabus
Lecture 1: The Basics
- The Big Picture
- Hyperparameter Optimization (HPO)
- Bayesian optimization for HPO + demo
- Multi-objective HPO
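To give a flavor of the topics above, here is a minimal, self-contained sketch (not taken from the course's Colab demos) of hyperparameter optimization by random search, the simplest HPO baseline, together with a Pareto filter for the multi-objective case. The search space, objective functions, and their values are invented purely for illustration.

```python
import random

# Toy HPO sketch: search a configuration space for low validation error,
# then filter for Pareto-optimal trade-offs against a second objective.
# All objectives and ranges here are made up for illustration.

def validation_error(config):
    # Stand-in for training a model and measuring validation error.
    lr, width = config["lr"], config["width"]
    return (lr - 0.01) ** 2 * 1e4 + abs(width - 64) / 64.0

def model_size(config):
    # Second objective for the multi-objective setting: model footprint.
    return config["width"] ** 2

def sample_config(rng):
    return {"lr": 10 ** rng.uniform(-4, -1), "width": rng.choice([16, 32, 64, 128])}

def pareto_front(points):
    # Keep points not dominated in (error, size): no other point is
    # better-or-equal in both objectives while being a different point.
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

rng = random.Random(0)
evals = []
for _ in range(50):  # random search: evaluate 50 sampled configurations
    cfg = sample_config(rng)
    evals.append((validation_error(cfg), model_size(cfg), cfg))

best = min(evals, key=lambda t: t[0])                 # single-objective winner
front = pareto_front([(e, s) for e, s, _ in evals])   # multi-objective trade-offs
print("best error:", round(best[0], 4), "| Pareto points:", len(front))
```

Bayesian optimization, covered in the lecture, replaces the blind sampling loop with a model of the objective that proposes promising configurations; the surrounding structure stays the same.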
Lecture 2: Advanced HPO
- Speedup Techniques for HPO + demo
- AutoML for Foundation Models: QuickTune + demo
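One classic idea in the family of HPO speedup techniques is multi-fidelity evaluation: judge many configurations cheaply at a low budget and grant more budget only to the survivors. The successive-halving sketch below is a standard illustration of this principle, not code from the lecture; the objective function is a synthetic stand-in.

```python
import random

# Minimal successive-halving sketch (a standard multi-fidelity HPO speedup).
# Cheap low-budget evaluations prune poor configurations early; the
# survivors are re-evaluated with eta times more budget each round.

def loss(config, budget):
    # Synthetic objective: loss shrinks with budget and is noisy when
    # the budget is small, mimicking partially-trained models.
    rng = random.Random(hash((config, budget)) & 0xFFFF)
    return abs(config - 0.3) + 1.0 / budget + rng.uniform(0, 0.1)

def successive_halving(configs, min_budget=1, eta=2, rounds=4):
    budget = min_budget
    for _ in range(rounds):
        scored = sorted(configs, key=lambda c: loss(c, budget))
        configs = scored[: max(1, len(configs) // eta)]  # keep the top 1/eta
        budget *= eta                                    # survivors get more budget
        if len(configs) == 1:
            break
    return configs[0]

rng = random.Random(42)
candidates = [rng.random() for _ in range(16)]  # e.g. 16 learning-rate guesses
winner = successive_halving(candidates)
print("selected config:", round(winner, 3))
```

Methods such as PriorBand and ifBO, listed in the references, build on this multi-fidelity idea by additionally exploiting user priors or learned models of learning curves.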
Lecture 3: Meta-learning entire algorithms
- Prior-Data Fitted Networks (PFNs)
- TabPFN v1
- TabPFN v2 + demo
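The key interface idea behind PFNs is that the entire labelled training set and the query point are consumed together in a single forward pass, so "fitting" reduces to storing data and prediction is one call. The toy below only illustrates that interface: the distance-weighted vote is a made-up stand-in predictor, whereas TabPFN realizes the same interface with a transformer pre-trained on synthetic datasets.

```python
import math

# Toy illustration of the PFN interface (NOT the actual TabPFN model):
# one function call takes the whole training set plus a query and returns
# a prediction, collapsing fit + predict into a single pass. The
# inverse-distance vote below is only a stand-in for the learned model.

def predict_in_one_pass(train_X, train_y, query):
    # Weight each training label by inverse distance to the query.
    scores = {}
    for x, y in zip(train_X, train_y):
        d = math.dist(x, query)
        scores[y] = scores.get(y, 0.0) + 1.0 / (d + 1e-9)
    return max(scores, key=scores.get)

train_X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 0.8)]
train_y = [0, 0, 1, 1]
print(predict_in_one_pass(train_X, train_y, (0.95, 0.9)))  # prints 1
```

Because the dataset is an input rather than something the weights are trained on, a single pre-trained PFN can serve many small datasets without per-dataset gradient descent, which is what makes the single-forward-pass cost mentioned in the summary possible.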
References
- Deep Learning 2.0: https://www.automl.org/deep-learning-2-0-extending-the-power-of-deep-learning-to-the-meta-level/
- HPO survey: https://www.automl.org/wp-content/uploads/2019/05/AutoML_Book_Chapter1.pdf
- PriorBand: https://openreview.net/forum?id=uoiwugtpCH
- ifBO: https://openreview.net/forum?id=VyoY3Wh9Wd
- QuickTune: https://openreview.net/forum?id=tqh1zdXIra
- PFNs:
Pre-requisites
Basic knowledge of deep learning.
Short bio
Frank Hutter is a Full Professor for Machine Learning at the University of Freiburg (Germany), currently on leave as a Hector Endowed Fellow at the ELLIS Institute Tübingen. Before taking up a faculty job at the University of Freiburg in 2013, he did a PhD (2004-2009) and postdoc (2009-2013) at the University of British Columbia (UBC) in Canada. He received the 2010 CAIAC doctoral dissertation award for the best thesis in AI in Canada, as well as several best paper awards and prizes in international ML competitions. He is a Fellow of ELLIS and EurAI, Director of the ELLIS unit Freiburg, and the recipient of 3 ERC grants. Frank is best known for his research on automated machine learning (AutoML), including neural architecture search, efficient hyperparameter optimization, and meta-learning. He co-authored the first book on AutoML and the prominent AutoML tools Auto-WEKA, Auto-sklearn and Auto-PyTorch, won the first two AutoML challenges with his team, is co-teaching the first MOOC on AutoML, co-organized 15 AutoML-related workshops at ICML, NeurIPS and ICLR, and founded the AutoML conference as general chair in 2022 and 2023. In recent years, his focus has been on the intersection of foundation models and AutoML, including the first foundation model for tabular data, TabPFN, and improving pretraining and fine-tuning with AutoML.