
Neural Architecture Search

Neural Architecture Search (NAS) automates the design of neural network architectures. NAS approaches optimize the topology of the network, including how to connect nodes and which operators to choose. User-defined optimization metrics can include accuracy, model size or inference time, so that the search arrives at an architecture well suited to a specific application. Due to the extremely large search space, traditional evolution- or reinforcement-learning-based AutoML algorithms tend to be computationally expensive. Hence, recent research on the topic has focused on more efficient approaches to NAS. In particular, recently developed gradient-based and multi-fidelity methods have provided a promising path and boosted research in these directions. Our group has been very active in developing state-of-the-art NAS methods and has been at the forefront of driving NAS research forward. Below we summarize a few important recent works released by our group.
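To make the basic idea concrete, the sketch below runs a toy random search over a tiny, made-up search space (one operator choice per layer) and ranks candidates with a user-supplied metric. All names and the scoring function are hypothetical and stand in for the expensive train-and-evaluate step; this is an illustration of the search loop, not any specific NAS method.

```python
import random

# Hypothetical toy search space: pick one operator per layer.
SEARCH_SPACE = {
    "layer_1": ["conv3x3", "conv5x5", "max_pool"],
    "layer_2": ["conv3x3", "skip_connect", "max_pool"],
    "layer_3": ["conv3x3", "conv5x5", "skip_connect"],
}


def sample_architecture(space):
    """Sample one operator per layer uniformly at random."""
    return {layer: random.choice(ops) for layer, ops in space.items()}


def evaluate(architecture):
    """Stand-in for the expensive step: a real NAS run would train the
    network defined by `architecture` and return a user-defined metric
    (e.g. validation accuracy, possibly traded off against model size or
    inference time). Here we return a random score for illustration."""
    return random.random()


def random_search(space, n_trials=100):
    """Keep the best architecture found over `n_trials` samples."""
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(arch_space := space)
        score = evaluate(arch)  # this call dominates the cost of NAS
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score


if __name__ == "__main__":
    best_arch, best_score = random_search(SEARCH_SPACE, n_trials=20)
    print(best_arch, best_score)
```

Because every iteration requires training a network, even this simple loop is expensive, which is exactly why gradient-based, multi-fidelity and benchmark-driven approaches have become so important.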

Open-Source Libraries

NASLib

NASLib is a modular and flexible Neural Architecture Search (NAS) library. Its purpose is to facilitate NAS research in the community and to allow fair comparisons of diverse recent NAS methods by providing a common modular, flexible and extensible codebase. The library also provides an easy-to-use interface to popular NAS benchmarks (e.g. NAS-Bench-101, 201 and 301), with various discrete (e.g. Random Search, Regularized Evolution) and one-shot (DARTS, GDAS, DrNAS, etc.) optimizers already integrated.
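The snippet below sketches how a search space, an optimizer and a trainer are typically combined, paraphrasing the usage pattern from the NASLib README. Exact module paths and constructor signatures differ between NASLib versions, so treat this as an approximation rather than a definitive API reference.

```python
# Approximate NASLib usage; module paths and signatures may vary by version.
from naslib.defaults.trainer import Trainer
from naslib.optimizers import DARTSOptimizer
from naslib.search_spaces import NasBench201SearchSpace
from naslib.utils import utils

# Read the experiment configuration (dataset, seed, search epochs, ...).
config = utils.get_config_from_args()

# Choose a search space and an optimizer; both are swappable components.
search_space = NasBench201SearchSpace()
optimizer = DARTSOptimizer(config)
optimizer.adapt_search_space(search_space)

# The trainer drives search and final evaluation of the best architecture.
trainer = Trainer(optimizer, config)
trainer.search()
trainer.evaluate()
```

The point of this structure is that swapping the optimizer or the search space is a one-line change, which is what makes fair, controlled comparisons between NAS methods feasible.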

The library has an active developer community that maintains it and continually adds new benchmarks, optimizers and search spaces to the code base.

Auto-PyTorch

Based on the well-known DL framework PyTorch, Auto-PyTorch automatically optimizes both the neural architecture and the hyperparameter configuration. To this end, Auto-PyTorch combines ideas from efficient multi-fidelity optimization, meta-learning and ensembling.
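The following sketch shows a typical Auto-PyTorch workflow for tabular classification, closely following the example in the Auto-PyTorch documentation; argument names and time budgets are illustrative and may differ between releases.

```python
# Approximate Auto-PyTorch usage for tabular classification.
from autoPyTorch.api.tabular_classification import TabularClassificationTask
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

# Small tabular dataset just to illustrate the workflow.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

api = TabularClassificationTask()

# Jointly searches over neural architectures and hyperparameters under a
# time budget, then ensembles the best models found during the search.
api.search(
    X_train=X_train, y_train=y_train,
    X_test=X_test, y_test=y_test,
    optimize_metric="accuracy",
    total_walltime_limit=300,        # seconds for the whole search
    func_eval_time_limit_secs=50,    # seconds per single pipeline evaluation
)

y_pred = api.predict(X_test)
print(api.score(y_pred, y_test))
```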

NAS Benchmarks

Research on NAS is often very expensive because training and evaluating a single deep neural network can take anywhere from minutes to days. Therefore, we provide several benchmark packages for NAS that offer either tabular or surrogate benchmarks, allowing efficient research on NAS.
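The core idea of a tabular benchmark is that the expensive training step is replaced by a table lookup over a fully enumerated search space, while a surrogate benchmark replaces the table with a learned performance predictor. The sketch below uses a small, made-up dictionary purely to show the pattern; real benchmarks such as NAS-Bench-101/201/301 ship their own query APIs and data files.

```python
# Hypothetical tabular NAS benchmark: every architecture in a small search
# space was trained once offline; its metrics are stored keyed by an encoding.
# All numbers below are invented for illustration only.
TABLE = {
    ("conv3x3", "skip_connect", "conv3x3"): {"val_acc": 0.912, "train_seconds": 1310.0},
    ("conv3x3", "conv5x5", "max_pool"):      {"val_acc": 0.897, "train_seconds": 1845.0},
    ("skip_connect", "max_pool", "conv3x3"): {"val_acc": 0.874, "train_seconds": 990.0},
}


def query(architecture):
    """Return precomputed metrics instead of training the network.
    This turns a GPU-hours evaluation into a dictionary lookup, so NAS
    methods can be benchmarked over thousands of repeated runs cheaply.
    A surrogate benchmark would instead call a trained performance model here."""
    return TABLE[architecture]


best = max(TABLE, key=lambda arch: query(arch)["val_acc"])
print(best, query(best))
```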

Best practices for NAS Research

The rapid development of new NAS approaches makes it hard to compare them against each other. To ensure reliable and reproducible results, we also provide best practices for scientific research on NAS and a checklist for new NAS papers. In our CVPR 2021 NAS Workshop paper we discuss practical considerations that help improve the stability, efficiency and overall performance of neural architecture search methods.

Selected NAS Papers

Literature Overview

NAS is one of the booming subfields of AutoML, and the number of papers is increasing rapidly. To give a comprehensive overview of recent trends, we provide the following sources:

One-Shot NAS Methods

Meta Learning of Neural Architectures

Neural Ensemble Search

Joint NAS and Hyperparameter Optimization

Multi-Objective NAS

Application-Specific NAS

Large-scale study of NAS methods