SimT: Handling Open-set Noise for Domain Adaptive Semantic Segmentation

by Xiaoqing Guo.

Summary:

Introduction:

This repository is for our CVPR 2022 paper "SimT: Handling Open-set Noise for Domain Adaptive Semantic Segmentation" (a Chinese-language write-up is available on Zhihu).

Framework:
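At a high level (a minimal sketch of my reading of the paper, not the repository's exact implementation): the segmentation network predicts a posterior over both the closed-set classes and a number of modeled open-set classes, and a learnable simplex noise transition matrix T maps this clean posterior to the distribution of the noisy pseudo labels; training then matches the T-corrected predictions against those pseudo labels. All names and shapes below are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimTLoss(nn.Module):
    def __init__(self, num_closed, num_open):
        super().__init__()
        # Unnormalized transition parameters; softmax over dim=1 keeps each
        # row of T on the probability simplex.
        self.trans = nn.Parameter(torch.zeros(num_closed + num_open, num_closed))

    def forward(self, logits, noisy_labels):
        # logits: (N, C+O) scores over closed + open classes
        # noisy_labels: (N,) pseudo labels with values in [0, C)
        T = F.softmax(self.trans, dim=1)        # (C+O, C), rows sum to 1
        clean_post = F.softmax(logits, dim=1)   # clean class posterior
        noisy_post = clean_post @ T             # predicted noisy posterior, (N, C)
        return F.nll_loss(torch.log(noisy_post + 1e-8), noisy_labels)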

Usage:

Requirements:

PyTorch 1.3 and PyTorch 1.7 both work

Python 3.6

Preprocessing:

Clone the repository and run the two training stages:

git clone https://github.com/Guo-Xiaoqing/SimT.git
cd SimT
bash sh_warmup.sh  # stage 1: warmup
bash sh_simt.sh    # stage 2: training with SimT

Data preparation:

The pseudo labels generated by the UDA black-box model, BAPA-Net [1], can be downloaded from Google Drive.

The pseudo labels generated by the SFDA black-box model, SFDASeg [2], can be downloaded from Google Drive.

[1] Yahao Liu, Jinhong Deng, Xinchen Gao, Wen Li, and Lixin Duan. BAPA-Net: Boundary adaptation and prototype alignment for cross-domain semantic segmentation. In ICCV, pages 8801–8811, 2021.

[2] Jogendra Nath Kundu, Akshay Kulkarni, Amit Singh, Varun Jampani, and R. Venkatesh Babu. Generalize then adapt: Source-free domain adaptive semantic segmentation. In ICCV, pages 7046–7056, 2021.
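For a quick sanity check of downloaded pseudo labels, a minimal loading sketch is below; it assumes single-channel PNG label maps with one class index per pixel (the usual format for Cityscapes-style segmentation labels), and the path is a placeholder rather than the repository's actual layout.

import numpy as np
from PIL import Image

# Placeholder path; single-channel class-index PNGs are an assumption.
label = np.array(Image.open("pseudo_labels/example.png"))
print(label.shape, np.unique(label))  # image size and the class ids present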

Pretrained models:

Download the pretrained model, the warmup UDA model, and the warmup SFDA model from Google Drive, then put them in the './snapshots' folder for initialization.
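To verify a downloaded snapshot before training, a standard PyTorch inspection looks like the sketch below; the file name is a placeholder for whichever checkpoint you placed in './snapshots', and the plain state_dict format is an assumption.

import torch

# File name is a placeholder; assumes the snapshot stores a plain state_dict.
state = torch.load("./snapshots/warmup_model.pth", map_location="cpu")
print(len(state), list(state)[:5])  # number of tensors and the first few keys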

Well-trained models:

The well-trained UDA and SFDA models can be downloaded from Google Drive.

Log file:

The log file can be found here.

Citation:

@inproceedings{guo2022simt,
  title={SimT: Handling Open-set Noise for Domain Adaptive Semantic Segmentation},
  author={Guo, Xiaoqing and Liu, Jie and Liu, Tongliang and Yuan, Yixuan},
  booktitle={CVPR},
  year={2022}
}

Questions:

Please contact xiaoqingguo1128@gmail.com.
