Meta-Learning for Hyperparameter Optimization

Date: Wednesday, Sep 13, 2023, 14:30 - 16:00

Speakers:
  • Josif Grabocka, University of Freiburg
  • Martin Wistuba, Amazon Research
Website:

https://neural-architecture-search.github.io/tutorial-automlconf-2023/

Abstract:

Hyperparameter Optimization (HPO) has been recognized as a crucial element in achieving state-of-the-art performance across a wide range of machine learning tasks. However, the complexity and compute-intensive nature of HPO make it challenging, often inhibiting its adoption and the successful deployment of ML systems on novel tasks. This tutorial delves into meta-learning for HPO. Our focus will be on presenting a comprehensive guide to meta-learning strategies that facilitate the transfer of design choices across tasks, thereby reducing the computational burden. This tutorial aims to equip participants with the knowledge to implement meta-learning strategies for HPO, ultimately enhancing the efficiency and effectiveness of their machine learning systems. Ideal attendees are machine learning practitioners, researchers, and enthusiasts interested in leveraging meta-learning for hyperparameter optimization.
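One common instance of the transfer idea mentioned above is warm-starting: seeding a new task's hyperparameter search with configurations that performed well on related tasks. The following is a minimal, hypothetical sketch using a toy objective and random search; the objective function, search space, and "meta-learned" configurations are invented for illustration and are not from the tutorial itself.

```python
import random

def objective(config):
    """Toy validation loss (lower is better); optimum near lr=0.1, depth=4.
    Stands in for an expensive model-training run."""
    return (config["lr"] - 0.1) ** 2 + (config["depth"] - 4) ** 2

def random_config(rng):
    """Sample a configuration from a toy search space."""
    return {"lr": rng.uniform(1e-4, 1.0), "depth": rng.randint(1, 10)}

def hpo(n_trials, warm_start=None, seed=0):
    """Random-search HPO, optionally warm-started with configurations
    transferred from previous tasks (the meta-learning component)."""
    rng = random.Random(seed)
    candidates = list(warm_start or [])
    while len(candidates) < n_trials:
        candidates.append(random_config(rng))
    return min(candidates, key=objective)

# Hypothetical meta-knowledge: configs that worked well on related tasks.
meta_learned = [{"lr": 0.12, "depth": 4}, {"lr": 0.08, "depth": 5}]

cold = hpo(n_trials=5)                           # search from scratch
warm = hpo(n_trials=5, warm_start=meta_learned)  # transfer across tasks
```

Under this toy setup, the warm-started search immediately evaluates near-optimal configurations, illustrating how transferred meta-knowledge can cut the number of expensive trials needed on a new task.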

Bio:

Josif Grabocka

Josif Grabocka is an Assistant Professor of Representation Learning at the University of Freiburg, where he heads the RELEA research group. His primary interests lie in hyperparameter optimization, time-series mining, and tabular data.


Martin Wistuba

Martin Wistuba is a researcher at Amazon Web Services, where he works on automating hyperparameter optimization and Neural Architecture Search. Earlier, he was at IBM Research, where he developed tools to automate deep learning. He received his Ph.D. in Machine Learning from the University of Hildesheim. His research interests include AutoML, in particular the idea of transferring meta-knowledge to speed up Bayesian optimization and Neural Architecture Search.