Image Credit: Ina Lindauer
Hyperparameter optimization is a powerful approach to achieving the best performance on many different problems. However, classical approaches to this problem ignore the iterative nature of many algorithms. Dynamic algorithm configuration (DAC) generalizes over prior optimization approaches and can handle hyperparameters that need to be adjusted over multiple time-steps. To use this framework, we need to move from the classical view of algorithms as black boxes to a gray- or even white-box view, unleashing the full potential of AI algorithms with DAC.
Our benchmark library for DAC provides such gray- and white-box versions of several target algorithms across multiple domains, including AI Planning, Evolutionary Computation, and Deep Learning.
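To make the idea concrete, here is a minimal sketch of the DAC setting (the names and the toy task are our own illustration, not the DACBench API): the target algorithm is gradient descent on f(x) = x², and the dynamic configuration task is to choose its step size at every iteration rather than fixing it once up front. A static configuration is simply the special case of a policy that ignores the time step.

```python
def run_gradient_descent(policy, x0=10.0, steps=20):
    """Run gradient descent on f(x) = x**2, asking `policy(t, x)` for the
    step size at each time step. Returns the final |x| (lower is better)."""
    x = x0
    for t in range(steps):
        lr = policy(t, x)      # dynamic configuration: a fresh choice per step
        x = x - lr * 2 * x     # gradient of x**2 is 2x
    return abs(x)

# Static configuration: one step size for the whole run.
static = lambda t, x: 0.4

# Dynamic configuration: a time-dependent schedule that decays the step size.
dynamic = lambda t, x: 0.45 * (0.9 ** t)

print(run_gradient_descent(static))
print(run_gradient_descent(dynamic))
```

In the reinforcement-learning formulation of DAC, the policy would additionally observe state features of the running algorithm (here exposed as `x`) and be learned rather than hand-designed; this gray-box view of the target algorithm is exactly what the benchmarks below provide.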
Our work on DAC
List of Publications on DAC by Our Groups
- S. Adriaensen and A. Biedenkapp and G. Shala and N. Awad and T. Eimer and M. Lindauer and F. Hutter
Automated Dynamic Algorithm Configuration
In: arXiv:2205.13881 [cs.AI]
Link to source code
- A. Biedenkapp and D. Speck and S. Sievers and F. Hutter and M. Lindauer and J. Seipp
Learning Domain-Independent Policies for Open List Selection
In: Workshop on Bridging the Gap Between AI Planning and Reinforcement Learning (PRL @ ICAPS’22)
- A. Biedenkapp and N. Dang and M. S. Krejca and F. Hutter and C. Doerr
Theory-inspired Parameter Control Benchmarks for Dynamic Algorithm Configuration
In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO’22)
Won the best paper award (GECH track)
Link to source code
- T. Eimer and A. Biedenkapp and M. Reimer and S. Adriaensen and F. Hutter and M. Lindauer
DACBench: A Benchmark Library for Dynamic Algorithm Configuration
In: Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI’21)
Link to source code of DACBench and accompanying blog post.
Link to the video presentation at IJCAI’21.
- D. Speck and A. Biedenkapp and F. Hutter and R. Mattmüller and M. Lindauer
Learning Heuristic Selection with Dynamic Algorithm Configuration
In: Proceedings of the Thirty-First International Conference on Automated Planning and Scheduling (ICAPS’21)
Link to source code and data of using DAC to switch heuristics in AI planning.
Link to the video presentation for a workshop version of the paper presented at PRL@ICAPS’20.
- G. Shala and A. Biedenkapp and N. Awad and S. Adriaensen and M. Lindauer and F. Hutter
Learning Step-Size Adaptation in CMA-ES
In: Proceedings of the Sixteenth International Conference on Parallel Problem Solving from Nature (PPSN’20)
Link to source code and data as well as trained policies and accompanying blog post.
Link to the video poster presentation at PPSN’20.
- A. Biedenkapp and H. F. Bozkurt and T. Eimer and F. Hutter and M. Lindauer
Dynamic Algorithm Configuration: Foundation of a New Meta-Algorithmic Framework
In: Proceedings of the Twenty-fourth European Conference on Artificial Intelligence (ECAI’20)
Link to source code of usage of DAC on artificial benchmarks. Link to accompanying blog post.
Link to the video presentation at ECAI’20.
- A. Biedenkapp and H. F. Bozkurt and F. Hutter and M. Lindauer
Towards White-box Benchmarks for Algorithm Control
In: DSO Workshop@IJCAI’19
Note: In this early work we referred to DAC as “Algorithm Control”.