Gradient-Based Neural Architecture Search: A Comprehensive Evaluation
Authors: Ali, S. and Wani, M.A.
Journal: Machine Learning and Knowledge Extraction
Volume: 5
Issue: 3
Pages: 1176-1194
eISSN: 2504-4990
DOI: 10.3390/make5030060
Abstract: One of the challenges in deep learning is discovering the optimal architecture for a specific task. This is effectively tackled through Neural Architecture Search (NAS). NAS encompasses three prominent approaches (reinforcement learning, evolutionary algorithms, and gradient descent) that have demonstrated noteworthy potential in identifying good candidate architectures. However, approaches based on reinforcement learning and evolutionary algorithms often require extensive computational resources, on the order of hundreds of GPU days or more. We therefore confine this work to the gradient-based approach due to its lower computational demands. Our objective is to identify the best gradient-based NAS method and to pinpoint opportunities for future enhancement. To this end, a comprehensive evaluation of four major gradient-descent-based architecture search methods for discovering the best neural architecture for image classification tasks is provided. An overview of these gradient-based methods, i.e., DARTS, PDARTS, Fair DARTS, and Att-DARTS, is presented. A theoretical comparison, based on search spaces, continuous relaxation strategy, and bi-level optimization, for deriving the best neural architecture is then provided. The strengths and weaknesses of these methods are also listed. Experimental results comparing the error rate and computational cost of these gradient-based methods on the benchmark datasets CIFAR-10, CIFAR-100, and ImageNet are analyzed. The results show that PDARTS is more accurate and faster than the other examined methods, making it a potent candidate for automating Neural Architecture Search. Through this comparative analysis, our research provides valuable insights and future research directions addressing criticisms and gaps in the literature.
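The continuous relaxation mentioned in the abstract is the core idea behind DARTS-style methods: the discrete choice among candidate operations on an edge is replaced by a softmax-weighted mixture, so architecture parameters can be optimized by gradient descent. The following is a minimal toy sketch of that mixed operation; the candidate operations here are hypothetical stand-ins, not the actual DARTS search space (which uses convolutions, pooling, and skip connections).

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of architecture parameters."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Toy stand-ins for the candidate operations on one edge of the cell
# (real DARTS uses ops such as sep_conv_3x3, max_pool_3x3, skip_connect).
ops = [
    lambda x: x,        # identity / skip
    lambda x: 2.0 * x,  # some transformation
    lambda x: 0.0,      # "zero" op (no connection)
]

def mixed_op(x, alphas):
    """Continuous relaxation: softmax-weighted sum of all candidate ops.

    alphas are the learnable architecture parameters; after search,
    the op with the largest weight is kept as the discrete choice.
    """
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, ops))

# With equal alphas, each op contributes 1/3: (3 + 6 + 0) / 3 = 3.0
print(mixed_op(3.0, [0.0, 0.0, 0.0]))
```

Because `mixed_op` is differentiable in `alphas`, architecture and network weights can be optimized jointly via the bi-level scheme the abstract refers to.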
Source: Scopus; Web of Science (Lite)