When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search

Guocheng Qian, Xuanyang Zhang, Guohao Li, Chen Zhao, Yukang Chen, Xiangyu Zhang, Bernard Ghanem, Jian Sun

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

The key challenge in neural architecture search (NAS) is how to explore the huge search space wisely. We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures while achieving higher search accuracy. TNAS introduces an architecture tree and a binary operation tree to factorize the search space and substantially reduce the exploration size. TNAS then performs a modified bi-level Breadth-First Search over the proposed trees to discover a high-performance architecture. Impressively, TNAS finds the globally optimal architecture on CIFAR-10 in NAS-Bench-201, with a test accuracy of 94.37%, in four GPU hours. Its average test accuracy is 94.35%, which outperforms the state of the art.
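The abstract only outlines the method at a high level. Below is a minimal, illustrative sketch of the coarse-to-fine, tree-guided search idea: operations are grouped in a toy binary operation tree and refined level by level with a breadth-first pass, keeping the best-scoring candidate at each level. The operation grouping, the evaluate_architecture proxy, and the tree_search driver are hypothetical placeholders chosen for this sketch; the paper's actual algorithm additionally factorizes the cell via an architecture tree, which is omitted here.

```python
# Illustrative sketch only; names and structure are assumptions, not the
# authors' implementation of TNAS.
import itertools
import random

# A toy binary operation tree: internal nodes group operations, leaves are
# concrete operations (loosely inspired by the NAS-Bench-201 operation set).
OP_TREE = {
    "all": ["param", "param_free"],
    "param": ["conv_3x3", "conv_1x1"],
    "param_free": ["skip_connect", "avg_pool_3x3"],
}


def children(node):
    """Return the children of a node in the operation tree, or [] for a leaf."""
    return OP_TREE.get(node, [])


def evaluate_architecture(arch):
    """Hypothetical proxy score for an architecture (edge -> operation map).

    A real search would train or estimate the accuracy of the candidate; a
    deterministic pseudo-random score keeps this sketch runnable.
    """
    rng = random.Random(hash(tuple(sorted(arch.items()))))
    return rng.random()


def refine(arch, edges):
    """One breadth-first refinement step: expand each edge's current operation
    group into its children and keep the best-scoring combination."""
    options = [children(arch[e]) or [arch[e]] for e in edges]  # leaves stay fixed
    best_arch, best_score = None, float("-inf")
    for combo in itertools.product(*options):
        candidate = {e: op for e, op in zip(edges, combo)}
        score = evaluate_architecture(candidate)
        if score > best_score:
            best_arch, best_score = candidate, score
    return best_arch, best_score


def tree_search(edges, depth=2):
    """Coarse-to-fine BFS sketch: every edge starts at the root operation group
    and is refined level by level until concrete operations are reached."""
    arch = {e: "all" for e in edges}
    score = evaluate_architecture(arch)
    for _ in range(depth):
        arch, score = refine(arch, edges)
    return arch, score


if __name__ == "__main__":
    best, score = tree_search(edges=["edge_1", "edge_2", "edge_3"])
    print(best, round(score, 3))
```

The intuition behind the efficiency claim is visible even in this toy version: refining group by group evaluates only a handful of candidates per tree level, rather than enumerating every assignment of concrete operations to edges.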
Original language: English (US)
Title of host publication: 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Publisher: IEEE
ISBN (Print): 978-1-6654-8740-5
DOIs
State: Published - Aug 23 2022
