Neural Prompt Search

S-Lab, Nanyang Technological University

The idea is simple: we view existing parameter-efficient tuning modules, including Adapter, LoRA, and VPT, as prompt modules and propose to search for the optimal configuration via neural architecture search. Our approach is named NOAH (Neural prOmpt seArcH).


[arXiv]
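For intuition, here is a minimal, self-contained sketch of what a NOAH-style search space over prompt modules could look like. The module names mirror the paper, but the option values and function names are illustrative, not the repo's actual configuration.

import random
from typing import Dict, List

# Illustrative search space: for every transformer block, the search decides
# whether to use an Adapter, LoRA, or VPT module and at what capacity.
# The option values are placeholders, not the repo's actual choices.
SEARCH_SPACE = {
    "adapter_dim": [0, 8, 16, 32],  # 0 means the module is disabled
    "lora_rank": [0, 4, 8],
    "vpt_tokens": [0, 5, 10],
}

def sample_config(num_blocks: int = 12) -> List[Dict[str, int]]:
    """Randomly sample one prompt-module configuration per ViT block."""
    return [
        {name: random.choice(options) for name, options in SEARCH_SPACE.items()}
        for _ in range(num_blocks)
    ]

if __name__ == "__main__":
    print(sample_config()[0])  # e.g. {'adapter_dim': 16, 'lora_rank': 0, 'vpt_tokens': 5}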

Updates

[05/2022] The arXiv paper has been released.

Environment Setup

conda create -n NOAH python=3.8
conda activate NOAH
pip install -r requirements.txt
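A quick way to confirm the environment resolved correctly is to import the main deep-learning dependencies. This assumes requirements.txt installs torch and timm, which the code in this repo builds on.

import torch
import timm

# Print versions and CUDA availability to confirm the installation.
print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("timm:", timm.__version__)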

Data Preparation

1. Visual Task Adaptation Benchmark (VTAB)

cd data/vtab-source
python get_vtab1k.py
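After the script finishes, a few lines of Python can sanity-check the download. The data/vtab-1k output directory below is an assumption; adjust it to wherever get_vtab1k.py writes the datasets.

from pathlib import Path

# Assumed output location of get_vtab1k.py; change if the script writes elsewhere.
vtab_root = Path("data/vtab-1k")

for dataset_dir in sorted(p for p in vtab_root.iterdir() if p.is_dir()):
    num_files = sum(1 for f in dataset_dir.rglob("*") if f.is_file())
    print(f"{dataset_dir.name}: {num_files} files")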

2. Few-Shot and Domain Generalization

  • Images

    Please refer to DATASETS.md to download the datasets.

  • Train/Val/Test splits

    Please refer to the files under data/XXX/XXX/annotations for detailed information; a sketch of how such a split file might be parsed follows this list.
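As a rough guide, a split annotation file could be parsed as below. The "image_path label" line format and the commented example path are assumptions; check the actual annotation files first.

from pathlib import Path

def read_split(annotation_file: str):
    """Parse a split file, assuming one 'image_path label' pair per line."""
    samples = []
    for line in Path(annotation_file).read_text().splitlines():
        if not line.strip():
            continue
        image_path, label = line.rsplit(maxsplit=1)
        samples.append((image_path, int(label)))
    return samples

# Hypothetical usage; point it at a real file under data/XXX/XXX/annotations:
# train_samples = read_split("data/XXX/XXX/annotations/train.txt")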

Quick Start For NOAH

We use the VTAB experiments as examples.

1. Downloading the Pre-trained Model

Model       Link
ViT-B/16    link
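A minimal sketch of loading the downloaded checkpoint with timm is shown below. The model name and the nesting of weights under a "model" key are assumptions; the training scripts handle the actual loading themselves.

import torch
import timm

ckpt_path = "PATH-TO-YOUR-PRETRAINED-MODEL"  # placeholder, same as in the commands below

# Build a ViT-B/16 backbone and load the pre-trained weights.
# strict=False tolerates extra keys (e.g. prompt modules) or a missing head.
model = timm.create_model("vit_base_patch16_224", pretrained=False)
state_dict = torch.load(ckpt_path, map_location="cpu")
if isinstance(state_dict, dict) and "model" in state_dict:
    state_dict = state_dict["model"]  # some checkpoints nest weights under 'model'
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", len(missing), "| unexpected keys:", len(unexpected))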

2. Supernet Training

sh configs/NOAH/VTAB/supernet/slurm_train_vtab.sh PATH-TO-YOUR-PRETRAINED-MODEL
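Conceptually, supernet training optimizes all prompt modules with weight sharing while activating a different sub-configuration at every step. The loop below only illustrates that idea; names such as set_active_config and sample_config are not the repo's API.

# Illustrative weight-sharing training loop, not the repo's implementation.
def train_supernet(supernet, loader, optimizer, sample_config, num_epochs=1):
    for _ in range(num_epochs):
        for images, labels in loader:
            config = sample_config()             # random Adapter/LoRA/VPT setting
            supernet.set_active_config(config)   # activate only those modules
            loss = supernet(images, labels)      # assumed to return the loss
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()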

3. Subnet Search

sh configs/NOAH/VTAB/search/slurm_search_vtab.sh PARAMETERS-LIMIT
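The search step looks for the best subnet whose prompt-module parameter count stays under the given limit. Below is a generic sketch of such a constrained evolutionary search, assuming a configuration is a flat dict of choices; the callbacks (sample_config, count_params, evaluate) are placeholders, and the repo's actual search has its own implementation.

import random

def evolutionary_search(sample_config, count_params, evaluate,
                        param_limit, population_size=50, rounds=20):
    """Keep only configurations under param_limit and evolve towards a
    higher validation score. All callbacks are user-supplied placeholders."""
    def random_valid():
        cfg = sample_config()
        while count_params(cfg) > param_limit:
            cfg = sample_config()
        return cfg

    population = [random_valid() for _ in range(population_size)]
    for _ in range(rounds):
        parents = sorted(population, key=evaluate, reverse=True)[:population_size // 2]
        children = []
        while len(children) < population_size - len(parents):
            child = dict(random.choice(parents))
            key = random.choice(list(child))     # mutate one field
            child[key] = sample_config()[key]
            if count_params(child) <= param_limit:
                children.append(child)
        population = parents + children
    return max(population, key=evaluate)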

4. Subnet Retraining

sh configs/NOAH/VTAB/subnet/slurm_retrain_vtab.sh PATH-TO-YOUR-PRETRAINED-MODEL

5. Performance

[Figure: performance results]

Citation

If you use this code in your research, please cite our work.

@misc{zhang2022NOAH,
      title={Neural Prompt Search},
      author={Yuanhan Zhang and Kaiyang Zhou and Ziwei Liu},
      year={2022},
      archivePrefix={arXiv}
}

Acknowledgments

Part of the code is borrowed from CoOp, AutoFormer, timm and mmcv.

Thanks to Zhou Chong (https://chongzhou96.github.io/) for the code for downloading VTAB-1k.


License

This project is released under the MIT License.

