Tri-level Optimization for Gradient-based Neural Architecture Search
Authors: Ali, S. and Wani, M.A.
Journal: Proceedings of the 2024 International Conference on Machine Learning and Applications (ICMLA 2024)
Pages: 1546-1552
DOI: 10.1109/ICMLA61862.2024.00238
Abstract: In this paper, we introduce a novel tri-level optimization technique for Neural Architecture Search (NAS) that extends the capabilities of existing bi-level optimization techniques, such as those employed in Differentiable Architecture Search (DARTS). Our approach incorporates an additional optimization layer by introducing a parameter associated with nodes within the cell structure, alongside optimizing operations and weights. This tri-level optimization framework aims to improve the search process by allowing the algorithm to dynamically determine the optimal number of nodes within a cell. By including this additional parameter, our method not only optimizes the architectural design but also adapts the complexity of the cell structure to achieve improved performance. We validate our approach through experiments on CIFAR-10 and demonstrate its effectiveness in finding more efficient neural network architectures compared to conventional or bi-level NAS methods. Additionally, we benchmark our proposed method against state-of-the-art NAS methods to underscore its competitive performance.
Source: Scopus
eISSN: 1946-0759
ISBN: 979-8-3503-7489-6
ISSN: 1946-0740
Source: Web of Science (Lite)
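The abstract describes alternating gradient updates over three parameter groups: operation weights, DARTS-style operation-mixing parameters, and an extra node-level parameter that gates how many nodes a cell effectively uses. The following is a minimal, hypothetical sketch of that alternating scheme on a toy scalar objective; the toy "cell", the sigmoid node gates, and the use of numerical gradients are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy "cell" (assumption): each candidate operation is a scalar map w[i] * x;
# alpha[j] mixes operations at node j (DARTS-style), and beta[j] is the extra
# node-level parameter gating whether node j contributes at all.
def forward(x, w, alpha, beta):
    out = 0.0
    for j in range(len(beta)):
        mix = softmax(alpha[j])                       # operation mixing weights
        node = sum(mix[i] * w[i] * x for i in range(len(w)))
        out += sigmoid(beta[j]) * node                # gated node contribution
    return out

def loss(x, y, w, alpha, beta):
    return (forward(x, w, alpha, beta) - y) ** 2

# Central-difference gradient, to keep the sketch dependency-light.
def num_grad(f, p, eps=1e-5):
    g = np.zeros_like(p)
    for i in np.ndindex(p.shape):
        p[i] += eps; fp = f()
        p[i] -= 2 * eps; fm = f()
        p[i] += eps
        g[i] = (fp - fm) / (2 * eps)
    return g

rng = np.random.default_rng(0)
w = rng.normal(size=3)           # level 1: operation weights (training loss)
alpha = rng.normal(size=(2, 3))  # level 2: operation parameters (validation loss)
beta = np.zeros(2)               # level 3: node parameters (validation loss)

x_tr, y_tr = 1.0, 2.0            # toy training sample of y = 2x
x_va, y_va = 1.5, 3.0            # toy validation sample of y = 2x

lr = 0.05
initial = loss(x_va, y_va, w, alpha, beta)
for _ in range(300):
    # Alternate one gradient step per optimization level.
    w -= lr * num_grad(lambda: loss(x_tr, y_tr, w, alpha, beta), w)
    alpha -= lr * num_grad(lambda: loss(x_va, y_va, w, alpha, beta), alpha)
    beta -= lr * num_grad(lambda: loss(x_va, y_va, w, alpha, beta), beta)
final = loss(x_va, y_va, w, alpha, beta)
print(f"validation loss: {initial:.4f} -> {final:.4f}")
```

In a full NAS setting, the training-loss step would update network weights on the training split while the two outer levels update architecture and node parameters on the validation split; after search, the sigmoid gates on beta would indicate which nodes to retain.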