Author Archives: Thomas Elsken

LEMONADE: Efficient Multi-objective Neural Architecture Search with Network Morphisms


Most work on neural architecture search (NAS, see our recent survey) optimizes solely for one criterion: high performance (measured in terms of accuracy). This often results in large and complex network architectures that cannot be used in real-world applications, where several other criteria matter as well, including memory requirements, energy consumption and latency. A second problem with many traditional NAS methods (such as those based on evolution, reinforcement learning or Bayesian optimization) is that they incur enormous computational costs: running these methods entails training hundreds to thousands of neural networks from scratch, making the development phase very expensive and limiting research on them to a few groups with massive compute resources.

Figure 1: In this example, LEMONADE concurrently minimizes two objectives: validation error and number of parameters. Starting from trivial but poorly-performing networks in generation 0, the population improves with respect to both objectives over the course of LEMONADE's run. Each colored line denotes the performance of the population at one iteration / generation.

In our ICLR 2019 paper Efficient Multi-Objective Neural Architecture Search via Lamarckian Evolution, we proposed an algorithm called LEMONADE (Lamarckian Evolutionary algorithm for Multi-Objective Neural Architecture DEsign) to deal with both of these problems. On a high level, LEMONADE is a simple evolutionary algorithm for optimizing multiple objectives such as accuracy, memory requirements and latency.
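The core idea of any such multi-objective method is that there is no single "best" network; instead, the algorithm maintains a set of Pareto-optimal trade-offs, as in Figure 1. As a minimal sketch of this concept (not the actual LEMONADE implementation; the function names and the toy population below are illustrative assumptions), here is how one can select the non-dominated members of a population scored on two objectives to be minimized, validation error and parameter count:

```python
def dominates(a, b):
    """True if candidate a is at least as good as b on every objective
    (lower is better here) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Keep only the candidates that no other candidate dominates."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Toy population of (validation_error, num_parameters) pairs (made up for illustration).
population = [(0.10, 5e6), (0.08, 9e6), (0.12, 2e6), (0.09, 9e6), (0.08, 4e6)]
front = pareto_front(population)
# The surviving trade-offs: a small-but-weaker network and a larger-but-stronger one.
```

An evolutionary multi-objective loop then repeatedly mutates members of this front and re-applies the selection, so the population as a whole pushes toward the lower-left of plots like Figure 1.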
