On the Importance of Gradients for Detecting Distributional Shifts in the Wild

This is the source code for our paper On the Importance of Gradients for Detecting Distributional Shifts in the Wild (NeurIPS 2021) by Rui Huang, Andrew Geng, and Sharon Li. The code is modified from Google BiT, ODIN, Outlier Exposure, the deep Mahalanobis detector, Robust OOD Detection, and MOS.

While previous works mainly rely on output space or feature space to detect out-of-distribution (OOD) inputs, this work proposes a novel gradient-based approach for OOD detection.
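
As a rough illustration (not the repository's exact implementation), the GradNorm score can be sketched in PyTorch as below: it is the L1 norm of the gradient, taken with respect to the weights of the final classification layer, of the KL divergence between the softmax prediction and a uniform distribution; higher scores indicate in-distribution inputs. The model.fc attribute and the default temperature are assumptions for this sketch; the actual network in this repository is a BiT ResNet whose final-layer attribute differs.

import torch
import torch.nn.functional as F

def gradnorm_score(model, x, num_classes=1000, temperature=1.0):
    """Sketch of the GradNorm OOD score for a single input (batch size 1).

    Uses the cross-entropy between the softmax prediction and a uniform target,
    which equals the KL divergence to the uniform distribution up to a constant,
    and returns the L1 norm of its gradient on the last layer.
    """
    model.zero_grad()
    logits = model(x) / temperature

    # Uniform target over all classes.
    uniform = torch.full_like(logits, 1.0 / num_classes)
    loss = torch.sum(-uniform * F.log_softmax(logits, dim=-1))
    loss.backward()

    # Assumption: the classifier exposes its final linear layer as `model.fc`;
    # the attribute name depends on the network implementation.
    return model.fc.weight.grad.abs().sum().item()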

Usage

1. Dataset Preparation

In-distribution dataset

Please download ImageNet-1k and place the training data and validation data in ./dataset/id_data/ILSVRC-2012/train and ./dataset/id_data/ILSVRC-2012/val, respectively.

Out-of-distribution dataset

Following MOS, we use the following 4 OOD datasets for evaluation: iNaturalist, SUN, Places, and Textures.

For iNaturalist, SUN, and Places, we have sampled 10,000 images from the selected concepts of each dataset, which can be downloaded from the following links:

wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/iNaturalist.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/SUN.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/Places.tar.gz

For Textures, we use the entire dataset, which can be downloaded from their official website.

Please put all downloaded OOD datasets into ./dataset/ood_data/. For more details about these OOD datasets, please check out the MOS paper.
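
As a rough sketch of how these directories might be consumed (the repository's own data loaders are authoritative and may differ in preprocessing details), one can build standard torchvision ImageFolder loaders over the in-distribution validation split and an OOD dataset:

import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Standard ImageNet-style evaluation preprocessing; the exact transforms used
# by this repository may differ.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

# In-distribution validation images, laid out as described above.
id_set = datasets.ImageFolder('./dataset/id_data/ILSVRC-2012/val', transform=preprocess)

# One of the OOD datasets, e.g. iNaturalist, extracted under ./dataset/ood_data/.
ood_set = datasets.ImageFolder('./dataset/ood_data/iNaturalist', transform=preprocess)

id_loader = DataLoader(id_set, batch_size=1, shuffle=False, num_workers=4)
ood_loader = DataLoader(ood_set, batch_size=1, shuffle=False, num_workers=4)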

2. Pre-trained Model Preparation

We omit the process of pre-training a classification model on ImageNet-1k. For ease of reproduction, we provide our pre-trained network below:

wget http://pages.cs.wisc.edu/~huangrui/finetuned_model/BiT-S-R101x1-flat-finetune.pth.tar

Put the downloaded model in ./checkpoints/pretrained_models.

For a more diverse set of pre-trained models, one can also refer to the BiT-S pre-trained model family.
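
As a minimal, hypothetical sanity check that the downloaded checkpoint is readable (the 'model' state-dict key is an assumption, and the actual model constructor lives in the BiT codebase this repository builds on):

import torch

ckpt_path = './checkpoints/pretrained_models/BiT-S-R101x1-flat-finetune.pth.tar'
ckpt = torch.load(ckpt_path, map_location='cpu')

# Assumption: the checkpoint is either a raw state dict or a dict wrapping one
# under a 'model' key; peek at the keys to see the parameter names.
state_dict = ckpt.get('model', ckpt)
print(list(state_dict.keys())[:5])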

3. OOD Detection Evaluation

To reproduce our GradNorm results, please run:

./scripts/test.sh GradNorm iNaturalist(/SUN/Places/Textures)

To reproduce baseline approaches, please run:

./scripts/test.sh MSP(/ODIN/Energy/Mahalanobis) iNaturalist(/SUN/Places/Textures)

Note for Mahalanobis

Before testing, make sure you have tuned and saved the Mahalanobis hyperparameters by running:

./scripts/tune_mahalanobis.sh

OOD Detection Results

GradNorm achieves state-of-the-art performance averaged over the four OOD datasets.

[Results table: comparison of GradNorm with baseline methods on the four OOD datasets]
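
For reference, the metrics reported in the paper (AUROC and FPR at 95% TPR) can be computed from the two score lists roughly as follows. This is a sanity-check sketch assuming higher scores indicate in-distribution data, not the repository's evaluation code:

import numpy as np
from sklearn.metrics import roc_auc_score

def ood_metrics(id_scores, ood_scores):
    # Treat ID as the positive class; higher score means more in-distribution.
    labels = np.concatenate([np.ones(len(id_scores)), np.zeros(len(ood_scores))])
    scores = np.concatenate([id_scores, ood_scores])
    auroc = roc_auc_score(labels, scores)

    # FPR95: fraction of OOD samples scoring above the threshold that keeps
    # 95% of ID samples.
    threshold = np.percentile(id_scores, 5)
    fpr95 = float(np.mean(np.asarray(ood_scores) >= threshold))
    return auroc, fpr95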

Citation

If you find our codebase useful, please cite our work:

@inproceedings{huang2021importance,
  title={On the Importance of Gradients for Detecting Distributional Shifts in the Wild},
  author={Huang, Rui and Geng, Andrew and Li, Yixuan},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}

License

Apache License 2.0

