AutoML.org

Freiburg-Hannover-Tübingen

Wrapping Up AutoML-Conf 2022 and Introducing the 2023 Edition

The inaugural AutoML Conference 2022 was an exciting adventure for us! With 170 attendees at this very first edition, we consider the conference a big success, and it confirmed our belief that the time was right to transition from a workshop series to a full-fledged conference. In this blog post, we will summarize […]

Read More

TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second

A radically new approach to tabular classification: we introduce TabPFN, a new tabular data classification method that takes less than a second and yields state-of-the-art performance, competitive with the best AutoML pipelines given an hour. So far, it is limited in scale, though: it can only tackle problems with up to 1000 training examples, 100 features and […]
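To give a concrete feel for how little code this takes, here is a minimal sketch of running TabPFN on a small dataset, assuming the scikit-learn-style TabPFNClassifier interface from the project's repository (constructor arguments may differ between releases):

```python
# Minimal sketch: TabPFN on a small tabular task (assumes `pip install tabpfn`).
# TabPFNClassifier follows the scikit-learn fit/predict convention; fit() mostly
# stores the data, since TabPFN classifies in a single forward pass at predict time.
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier

# 569 samples, 30 features: within TabPFN's current limits
# (up to 1000 training examples and 100 features).
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

clf = TabPFNClassifier(device="cpu")  # no per-dataset training; use "cuda" if available
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```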

Read More

Deep Learning 2.0: Extending the Power of Deep Learning to the Meta-Level

Deep Learning (DL) has revolutionized learning from raw data (images, text, speech, etc.) by replacing domain-specific hand-crafted features with features that are jointly learned for the particular task at hand. In this blog post, I propose to take deep learning to the next level, by also jointly (meta-)learning other, currently hand-crafted, elements […]

Read More

Introducing Reproducibility Reviews

By Frank Hutter, Isabelle Guyon, Marius Lindauer and Mihaela van der Schaar (general and program chairs of AutoML-Conf 2022). Have you ever tried to reproduce a paper from a top ML conference and failed? You’re not alone! At AutoML-Conf (see automl.cc), we’re aiming for a higher standard: with the papers we publish […]

Read More

Announcing the Automated Machine Learning Conference 2022

Modern machine learning systems come with many design decisions (including hyperparameters, architectures of neural networks and the entire data processing pipeline), and the idea of automating these decisions gave rise to the research field of automated machine learning (AutoML). AutoML has been booming over the last decade, with hundreds of papers published each year now […]

Read More

Auto-PyTorch: Multi-Fidelity MetaLearning for Efficient and Robust AutoDL

Auto-PyTorch is a framework for automated deep learning (AutoDL) that uses BOHB as a backend to optimize the full deep learning pipeline, including data preprocessing, network training techniques and regularization methods. Auto-PyTorch is the successor of AutoNet, which was one of the first frameworks to perform this joint optimization.
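As a rough usage sketch, the following assumes the TabularClassificationTask interface from the Auto-PyTorch repository; exact argument names and defaults vary between releases:

```python
# Minimal sketch: Auto-PyTorch on a tabular classification task
# (assumes `pip install autoPyTorch`; API details may differ by version).
from autoPyTorch.api.tabular_classification import TabularClassificationTask
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

api = TabularClassificationTask()
api.search(
    X_train=X_train, y_train=y_train,
    X_test=X_test, y_test=y_test,
    optimize_metric="accuracy",
    total_walltime_limit=300,      # overall budget for the pipeline search, in seconds
    func_eval_time_limit_secs=50,  # budget per individual pipeline evaluation
)
y_pred = api.predict(X_test)
print(api.score(y_pred, y_test))
```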

Read More

NAS-Bench-301 and the Case for Surrogate NAS Benchmarks

The Need for Realistic NAS Benchmarks: Neural Architecture Search (NAS) is a logical next step in representation learning, as it removes human bias from architecture design, much as deep learning removed human bias from feature engineering. As such, NAS has experienced rapid growth in recent years, leading to state-of-the-art performance on many tasks. However, empirical […]

Read More

Our seven 2019 papers on neural architecture search (NAS)

Neural Architecture Search (NAS) is a very hot topic in AutoML these days, and our group is publishing very actively in this area. We have seven NAS papers in 2019, which may make us one of the world’s most active groups in NAS (only closely surpassed by a small company called Google ;-). Here […]

Read More

RobustDARTS

Understanding and Robustifying Differentiable Architecture Search. Optimizing over neural network architectures was initially defined as a discrete problem, which intrinsically required training and evaluating thousands of networks. This, of course, demanded a huge amount of computational power that only a few institutions could afford. One-shot neural architecture search (NAS) democratized this […]

Read More

Best Practices for Scientific Research on Neural Architecture Search

Neural architecture search (NAS) is currently one of the hottest topics in automated machine learning (see the AutoML book), with a seemingly exponential increase in the number of papers written on the subject (see the figure above). While many NAS methods are fascinating (please see our survey article for an overview of the main trends […]

Read More