aoiang / few-shot-NAS


Few-shot Neural Architecture Search

Yiyang Zhao, Linnan Wang, Yuandong Tian, Rodrigo Fonseca, Tian Guo

Introduction

One-shot Neural Architecture Search uses a single supernet to approximate the performance of each architecture. However, this performance estimation can be highly inaccurate due to co-adaption among operations in the supernet. Few-shot NAS instead uses multiple supernets, each with fewer edges (operations), that cover different regions of the search space to alleviate the undesired co-adaption. Compared to one-shot NAS, few-shot NAS greatly improves the accuracy of architecture evaluation with only a small increase in overhead. See our paper linked below.
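The partitioning idea can be sketched with a toy example. This is an illustrative sketch, not the repository's actual code: the operation set, edge count, and function names below are all made up for clarity. Fixing the operation on one edge splits the single one-shot supernet into several sub-supernets whose covered regions are disjoint and jointly cover the whole search space:

```python
from itertools import product

# Hypothetical toy search space: 3 edges, each choosing one of 3 operations.
OPS = ("conv3x3", "conv1x1", "skip")
NUM_EDGES = 3

def all_architectures():
    """Enumerate every architecture (one operation per edge) in the toy space."""
    return list(product(OPS, repeat=NUM_EDGES))

def split_supernet(split_edge=0):
    """Split the one-shot supernet into len(OPS) sub-supernets by fixing
    the operation on `split_edge`.

    Each sub-supernet covers exactly the architectures that use the fixed
    operation on that edge, so the sub-supernets partition the search
    space into disjoint regions (the core idea behind few-shot NAS).
    """
    return {
        op: [arch for arch in all_architectures() if arch[split_edge] == op]
        for op in OPS
    }

subnets = split_supernet()
# The sub-supernets are disjoint and together cover all 3**3 = 27 architectures.
print({op: len(region) for op, region in subnets.items()})
```

Each sub-supernet is then trained like a regular one-shot supernet, but since co-adaption is confined to a smaller region, its performance estimates are more faithful.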

Paper

Few-shot Neural Architecture Search

If you use the few-shot NAS data or code, please cite:

@InProceedings{pmlr-v139-zhao21d,
  title = 	 {Few-Shot Neural Architecture Search},
  author =       {Zhao, Yiyang and Wang, Linnan and Tian, Yuandong and Fonseca, Rodrigo and Guo, Tian},
  booktitle = 	 {Proceedings of the 38th International Conference on Machine Learning},
  pages = 	 {12707--12718},
  year = 	 {2021},
  volume = 	 {139},
  series = 	 {Proceedings of Machine Learning Research},
  month = 	 {18--24 Jul},
  publisher =    {PMLR},
  pdf = 	 {http://proceedings.mlr.press/v139/zhao21d/zhao21d.pdf},
  url = 	 {http://proceedings.mlr.press/v139/zhao21d.html},
}

How to use

Few-shot NAS on NAS-Bench-201

Please refer here to see how few-shot NAS improves the search performance on NAS-Bench-201.

Few-shot NAS on CIFAR-10

Please refer here to evaluate our state-of-the-art models found by few-shot NAS.

Media Coverage

English version

Facebook AI Research blog post

Poster

Chinese version

Synced (机器之心) column: written introduction

Synced (机器之心) column: livestream replay

Bilibili

Languages

Python 95.4%, Shell 4.6%