Conference paper, Year: 2024

Energy-Aware Neural Architecture Search: Leveraging Genetic Algorithms for Balancing Performance and Consumption

Abstract

In recent years, the field of deep learning has been propelled forward by increasingly complex and resource-intensive neural network models. Despite their impressive performance, training these models consumes a great deal of energy and incurs significant environmental impacts. This is even more true for neural architecture search (NAS) methods, which train hundreds of thousands of models to build optimization benchmarks. We propose to create greener benchmarks using genetic algorithms. We leverage our previous CNNGen approach, which produces random CNN topologies from our context-free grammar. We then use the NSGA-II genetic algorithm to generate topologies that balance performance and energy consumption. To avoid training candidates during the generations, we rely on machine-learning-based predictors that estimate each candidate's performance and energy consumption, saving significant computational cost. This paper reports on our experiments and discusses further developments.
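
The abstract outlines a multi-objective search in which NSGA-II evolves CNN topologies while two surrogate predictors score each candidate's expected performance and energy consumption, so no candidate has to be trained during the generations. As a purely illustrative sketch of such a loop (not the paper's actual implementation), the snippet below uses the pymoo library's NSGA-II; the genome encoding and the predict_error / predict_energy functions are hypothetical placeholders standing in for CNNGen's grammar-based topologies and the trained predictors.

```python
# Illustrative sketch only: a minimal surrogate-driven, two-objective NAS loop
# using pymoo's NSGA-II. The encoding and the two predictors are placeholders,
# not the CNNGen grammar or the paper's trained performance/energy models.
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize

N_DECISIONS = 10   # assumed number of architectural decisions per topology
N_CHOICES = 5      # assumed number of options per decision

def predict_error(genome):
    # Placeholder surrogate: estimated validation error in [0, 1] (lower is better).
    return float(genome.mean() / (N_CHOICES - 1))

def predict_energy(genome):
    # Placeholder surrogate: estimated training energy in arbitrary units.
    return float(genome.sum())

class EnergyAwareNAS(ElementwiseProblem):
    """Two objectives, both minimized: predicted error and predicted energy."""

    def __init__(self):
        super().__init__(n_var=N_DECISIONS, n_obj=2, xl=0.0, xu=1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        # Decode the continuous genome into discrete architectural choices.
        genome = np.round(x * (N_CHOICES - 1)).astype(int)
        out["F"] = [predict_error(genome), predict_energy(genome)]

res = minimize(EnergyAwareNAS(),
               NSGA2(pop_size=50),
               ("n_gen", 30),
               seed=1,
               verbose=False)

# res.F holds the Pareto front: non-dominated (error, energy) trade-offs.
print(res.F)
```

Because the objectives are surrogate predictions, each generation costs only a handful of predictor evaluations rather than full CNN trainings, which is the source of the computational savings mentioned in the abstract.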

Main file
WIVACE2024_CNNGEN_Postproceedings_Springer_CR.pdf (585.74 KB)
Origin: Files produced by the author(s)
Dates and versions

hal-04957679, version 1 (19-02-2025)

Identifiers

  • HAL Id: hal-04957679, version 1

Cite

Antoine Gratia, Paul Temple, Gilles Perrouin, Pierre-Yves Schobbens. Energy-Aware Neural Architecture Search: Leveraging Genetic Algorithms for Balancing Performance and Consumption. WIVACE 2024 - XVIII International Workshop on Artificial Life and Evolutionary Computation, Sep 2024, Namur, Belgium. pp.1-13. ⟨hal-04957679⟩
94 views
241 downloads
