Chuan-shanjia / A-loss-function-for-change-detection

1st place solution to the Satellite Remote Sensing Image Change Detection Challenge hosted by SenseTime

Home Page: https://rs.sensetime.com/competition/index.html#/info


A loss function for change detection

Accepted for publication at IGARSS-22, Kuala Lumpur, Malaysia.

Here, we provide the PyTorch implementation of the paper: UAL: UNCHANGED AREA LOSS-FUNCTION FOR CHANGE DETECTION NETWORKS.

Our Method

Task Description

Given two images of the same scene acquired at different times, we are required to mark the changed and unchanged areas. Moreover, for the changed areas, we also need to predict their detailed semantic masks.

The change detection task in this competition can be decomposed into two sub-tasks (a sketch of how their outputs combine follows the list):

  • binary segmentation of changed and unchanged areas.
  • semantic segmentation of changed areas.
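The two sub-task outputs are then merged into the final per-pixel labels. Below is a minimal sketch of that post-processing, assuming the label convention where unchanged pixels are 0 and changed pixels carry their semantic class; the function name and the 0.5 threshold are our own choices for illustration.

```python
import numpy as np

def combine_predictions(change_prob, sem_pred, threshold=0.5):
    """Merge the two sub-task outputs into one label map (illustrative only).

    change_prob: (H, W) probability of "changed" from the binary branch.
    sem_pred:    (H, W) semantic class indices predicted for each pixel.
    Returns an (H, W) map where unchanged pixels are 0 and changed pixels
    keep their semantic class (the label convention assumed here).
    """
    changed = change_prob > threshold
    return np.where(changed, sem_pred, 0).astype(np.uint8)
```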

Model

My Improvement

In this project, we propose a loss function named UAL (Unchanged Area Loss). UAL aims to establish semantic label correspondence within unchanged regions. It is a simple and effective way to improve the feature separability of semantic segmentation and change detection networks.
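The sketch below shows one way such a term can be written, assuming UAL penalizes disagreement between the two dates' semantic predictions inside regions annotated as unchanged; the exact formulation in the paper may differ, and the function and argument names here are our own.

```python
import torch
import torch.nn.functional as F

def unchanged_area_loss(logits_t1, logits_t2, change_mask):
    """Illustrative consistency loss over unchanged pixels (not the paper's exact form).

    logits_t1, logits_t2: (N, C, H, W) semantic logits for the two dates.
    change_mask:          (N, H, W) ground-truth binary mask, 1 = changed, 0 = unchanged.
    """
    # Penalize divergence between the two dates' class distributions,
    # but only where the scene is annotated as unchanged.
    log_p1 = F.log_softmax(logits_t1, dim=1)
    p2 = F.softmax(logits_t2, dim=1)
    per_pixel = F.kl_div(log_p1, p2, reduction="none").sum(dim=1)  # (N, H, W)

    unchanged = (change_mask == 0).float()
    return (per_pixel * unchanged).sum() / unchanged.sum().clamp(min=1.0)
```

In training, a term of this kind would be added to the usual segmentation and change-detection losses with a weighting factor.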

Reproduction

We also reproduce FC-Siam-conc and modify its code to accomplish the two sub-tasks.

We compare our models with FC-Siam-conc and DTCDSCN.
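As an illustration of the kind of modification involved, the sketch below routes a shared decoder feature map through two prediction heads, one per sub-task; the class and layer names are our own and are not taken from the FC-Siam-conc code.

```python
import torch.nn as nn

class DualHead(nn.Module):
    """Illustrative two-head output module: one head per sub-task."""

    def __init__(self, in_channels, num_classes):
        super().__init__()
        # Binary change head: changed vs. unchanged.
        self.change_head = nn.Conv2d(in_channels, 2, kernel_size=1)
        # Semantic head: classes assigned to the changed areas.
        self.semantic_head = nn.Conv2d(in_channels, num_classes, kernel_size=1)

    def forward(self, decoder_features):
        return self.change_head(decoder_features), self.semantic_head(decoder_features)
```

In such a setup, the change head would be supervised on all pixels, while the semantic head would be supervised wherever semantic labels are available.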

Getting Started

Dataset

Description | Download [password: f3qq]

Pretrained Model

resnet-18 | resnet-34 | resnet-50

Final Trained Model

fcn-resnet18 | fcn-resnet34 | pspnet-resnet18 | pspnet-resnet34

File Organization

# store the whole dataset and pretrained backbones
mkdir -p data/dataset ; mkdir -p data/pretrained_models ;

# store the trained models
mkdir -p outdir/models ; 

# store predictions of validation set and testing set
mkdir -p outdir/masks/val/im1 ; mkdir -p outdir/masks/val/im2 ;
mkdir -p outdir/masks/test/im1 ; mkdir -p outdir/masks/test/im2 ;

├── data
    ├── dataset                    # download from the link above
    │   ├── train                  # training set
    │   │   ├── im1
    │   │   └── ...
    │   └── val                    # the final testing set (without labels)
    │
    └── pretrained_models
        ├── resnet18.pth
        ├── resnet34.pth
        └── ...

Training

# Please refer to utils/options.py for more arguments
# If your hardware allows, larger backbones such as resnet50 or resnet101 can also be trained
CUDA_VISIBLE_DEVICES=0,1,2,3 python train.py --backbone "resnet18" --pretrained --model "fcn"

Testing

# Manually modify the backbones, models, and checkpoint paths in L39-44 of test.py according to your saved models
# Or simply use our final trained models
CUDA_VISIBLE_DEVICES=0,1,2,3 python test.py

About


License: MIT License


Languages

Language: Python 100.0%