tinyML Talks Webcast - recorded March 10, 2021
"Efficient Multi-Objective Neural Architecture Search with Evolutionary Algorithms"
Thomas Elsken
Bosch Center for Artificial Intelligence
Deep learning has enabled remarkable progress in recent years on a variety of tasks such as image recognition. One crucial factor in this progress is the design of novel neural architectures. Currently employed architectures have mostly been developed manually by human experts, which is a time-consuming and error-prone process. This has led to growing interest in neural architecture search (NAS), the process of automatically finding neural network architectures for the task at hand. While recent approaches have achieved state-of-the-art predictive performance, they are problematic under resource constraints for two reasons: (1) the neural architectures found are typically optimized solely for high predictive performance, without penalizing excessive resource consumption; (2) most architecture search methods require vast computational resources, on the order of hundreds of thousands of GPU days.

After giving a short introduction to NAS, we address these issues by proposing LEMONADE, an evolutionary algorithm for multi-objective architecture search that approximates the entire Pareto front of architectures under multiple objectives, such as predictive performance and resource consumption. LEMONADE employs an inheritance mechanism for neural architectures to generate child networks that are warm-started with the predictive performance of their trained parents, avoiding the need for immense computational resources.
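As a rough illustration of the multi-objective evolutionary loop described above (sample parents from the current Pareto front, derive children from them, keep only non-dominated candidates), the toy Python sketch below uses a made-up encoding and stand-in objectives. It is not the LEMONADE implementation: the width-list "architectures" and the evaluate and mutate functions are purely hypothetical placeholders for validation error and resource cost.

    # Toy sketch of multi-objective evolutionary search with a Pareto front.
    # NOT the actual LEMONADE code; objectives and encoding are illustrative only.
    import random

    def evaluate(arch):
        """Return (proxy_error, resource_cost) for a toy 'architecture' (list of layer widths)."""
        params = sum(w * w for w in arch)           # crude stand-in for parameter count
        proxy_error = 1.0 / (1.0 + 0.001 * params)  # toy assumption: bigger nets -> lower error
        return proxy_error, params

    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (both objectives minimized)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(population):
        """Keep only candidates not dominated by any other candidate."""
        return [cand for cand in population
                if not any(dominates(other[1], cand[1])
                           for other in population if other is not cand)]

    def mutate(arch):
        """Child inherits the parent's layers and adds or resizes one (cheap stand-in for warm-starting)."""
        child = list(arch)
        if random.random() < 0.5:
            child.append(random.choice([16, 32, 64]))
        else:
            i = random.randrange(len(child))
            child[i] = max(8, child[i] + random.choice([-16, 16]))
        return child

    def search(generations=20, children_per_gen=8, seed=0):
        random.seed(seed)
        population = [(arch, evaluate(arch)) for arch in ([16], [32, 32])]
        for _ in range(generations):
            front = pareto_front(population)
            parents = [random.choice(front)[0] for _ in range(children_per_gen)]
            children = [(c, evaluate(c)) for c in (mutate(p) for p in parents)]
            population = pareto_front(population + children)
        return population

    if __name__ == "__main__":
        for arch, (err, params) in sorted(search(), key=lambda x: x[1][1]):
            print(f"arch={arch} proxy_error={err:.3f} params={params}")

Running the sketch prints the surviving non-dominated candidates, from cheapest to most expensive, illustrating the accuracy-vs-resources trade-off that the talk's Pareto-front formulation targets; LEMONADE itself differs substantially, in particular in how children inherit (and only fine-tune from) their parents' trained weights.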