Dynamic Algorithm Configuration

Image Credit: Ina Lindauer

Hyperparameter optimization is a powerful approach to achieving the best performance on many different problems. However, classical approaches to this problem ignore the iterative nature of many algorithms. Dynamic algorithm configuration (DAC) generalizes over prior optimization approaches and can handle hyperparameters that need to be adjusted over multiple time-steps. To use this framework, we need to move from the classical view of algorithms as black boxes to a gray- or even white-box view, unleashing the full potential of AI algorithms with DAC.
Our benchmark library for DAC provides such gray- and white-box versions of target algorithms from several domains, including AI Planning, Evolutionary Computation and Deep Learning.
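To make the contrast with static configuration concrete, here is a minimal, hypothetical sketch (not the DACBench API): DAC treats hyperparameter control as a sequential decision process, where a policy maps the algorithm's internal state to a hyperparameter value at every time-step. The toy policy below adapts the learning rate of gradient descent on f(x) = x² instead of fixing it once up front; a learned policy would replace this hand-written heuristic.

```python
def dac_policy(grad):
    """Hypothetical DAC-style policy: map the algorithm's internal state
    (here, the current gradient) to a hyperparameter value per time-step."""
    return 0.4 if abs(grad) > 1.0 else 0.1  # larger steps far from the optimum

def run_gradient_descent(x0, steps, policy):
    """Iterative target algorithm whose learning rate is chosen anew
    at every step by the given policy (gray-box view of the algorithm)."""
    x = x0
    for _ in range(steps):
        grad = 2 * x          # gradient of f(x) = x^2
        lr = policy(grad)     # hyperparameter set dynamically, per step
        x = x - lr * grad
    return x

# Static configuration fixes the learning rate once; DAC adapts it online.
x_static = run_gradient_descent(5.0, 10, lambda g: 0.1)
x_dynamic = run_gradient_descent(5.0, 10, dac_policy)
```

In this toy example the dynamic schedule reaches the optimum faster than the best-guess static setting, which is exactly the kind of gain DAC aims for on real iterative algorithms.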


Our work on DAC

Dynamic Algorithm Configuration

Dynamic Algorithm Configuration on Artificial Functions

Dynamic Algorithm Configuration for Evolutionary Algorithms

Dynamic Algorithm Configuration for AI Planning

DACBench: Benchmarking Dynamic Algorithm Configuration

Literature Overview

List of Publications on DAC by Our Groups


Related Topics

Contextual RL

Algorithm Configuration

Algorithm Selection