Energy-Aware Neural Architecture Search: Leveraging Genetic Algorithms for Balancing Performance and Consumption
Abstract
In recent years, the field of deep learning has been propelled by increasingly complex and resource-intensive neural network models. Despite their impressive performance, training these models consumes substantial energy, with significant environmental impact. This is even more true for neural architecture search (NAS) benchmarks, whose construction can require training hundreds of thousands of models. We propose to build greener benchmarks using genetic algorithms. We leverage our previous CNNGen approach, which produces random CNN topologies from a context-free grammar. We then use the NSGA-II genetic algorithm to evolve topologies that balance performance and energy consumption. To avoid training candidates at every generation, we rely on machine-learning-based predictors that estimate their performance and energy consumption, saving significant computational cost. This paper reports on our experiments and discusses further developments.
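To make the surrogate-driven search loop concrete, the following is a minimal sketch of multi-objective NSGA-II optimization using the pymoo library. The fixed-length real-valued genome and the `predict_error`/`predict_energy` functions are hypothetical stand-ins: in the paper, topologies come from CNNGen's context-free grammar and the objectives are estimated by trained predictors, neither of which is shown here.

```python
# Sketch: NSGA-II over encoded topologies with surrogate objectives (pymoo).
# The genome encoding and both surrogates are hypothetical placeholders.
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize


def predict_error(genome: np.ndarray) -> float:
    # Placeholder surrogate: would map an encoded topology to predicted error.
    return float(np.mean(genome))


def predict_energy(genome: np.ndarray) -> float:
    # Placeholder surrogate: would map an encoded topology to predicted energy.
    return float(np.sum(genome ** 2))


class ArchitectureSearch(ElementwiseProblem):
    """Two minimization objectives: predicted error and predicted energy."""

    def __init__(self, n_genes: int = 16):
        super().__init__(n_var=n_genes, n_obj=2, xl=0.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        # No candidate is trained here; both objectives come from surrogates.
        out["F"] = [predict_error(x), predict_energy(x)]


res = minimize(ArchitectureSearch(), NSGA2(pop_size=50), ("n_gen", 30), seed=1)
print(res.F)  # Pareto front of (predicted error, predicted energy) trade-offs
```

Because every fitness evaluation is a cheap predictor call rather than a full training run, the evolutionary search itself incurs only a small fraction of the energy cost of conventional NAS benchmarking.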