DeepHash-pytorch

Implementation of some deep hash algorithm baselines, including DPSH, DSH, DHN, HashNet, DSDH, DTSH, DFH, GreedyHash, and CSQ.

How to run

My environment is:

python==3.7.0  torchvision==0.5.0  pytorch==1.4.0  

You can easily train and test any algorithm just by running:

python DSH.py
python DPSH.py
python DHN.py
python DSDH.py
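
Each script builds its configuration as a plain Python dict near the top of the file, so switching datasets or code lengths only requires editing a few entries. Below is a minimal sketch of what such a config typically looks like (illustrative only; the exact keys and defaults vary between scripts):

def get_config():
    # Illustrative config sketch; the real scripts contain more keys (optimizer, network, paths, ...).
    config = {
        "dataset": "cifar10-1",   # e.g. "cifar10", "cifar10-1", "nus_wide_21", "coco", "imagenet", "mirflickr"
        "bit_list": [48],         # hash code lengths to train
        "batch_size": 64,
        "epoch": 150,
    }
    return config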

If you have any problems, feel free to contact me by email (1142732931@qq.com) or raise an issue.

Precision Recall Curve


I added some code to DSH.py:

# After evaluating on cifar10-1 (epoch > 29), dump the precision-recall data
# computed from the retrieval-set and query binary codes.
if "cifar10-1" == config["dataset"] and epoch > 29:
    P, R = pr_curve(trn_binary.numpy(), tst_binary.numpy(), trn_label.numpy(), tst_label.numpy())
    print(f'Precision Recall Curve data:\n"DSH":[{P},{R}],')

To plot the Precision-Recall curve, copy the data generated by the code above into precision_recall_curve.py and run that file:

cd utils
python precision_recall_curve.py
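
For reference, here is a minimal sketch of the kind of computation pr_curve performs (an illustrative re-implementation, not the exact code in utils): for every query, the database is ranked by Hamming distance and precision/recall are accumulated at each retrieval cutoff, then averaged over queries.

import numpy as np

def pr_curve_sketch(db_codes, query_codes, db_labels, query_labels):
    # db_codes: (N, bits) database codes in {-1, +1}; query_codes: (Q, bits)
    # db_labels: (N, C) one-/multi-hot labels;        query_labels: (Q, C)
    num_db, bits = db_codes.shape
    P = np.zeros(num_db)
    R = np.zeros(num_db)
    for q in range(query_codes.shape[0]):
        # Hamming distance via inner product of +/-1 codes
        dist = 0.5 * (bits - db_codes @ query_codes[q])
        order = np.argsort(dist)
        # a database item is relevant if it shares at least one label with the query
        gnd = (db_labels[order] @ query_labels[q] > 0).astype(np.float64)
        total_relevant = gnd.sum()
        if total_relevant == 0:
            continue
        hits = np.cumsum(gnd)
        ranks = np.arange(1, num_db + 1)
        P += hits / ranks            # precision at each retrieval cutoff
        R += hits / total_relevant   # recall at each retrieval cutoff
    return P / query_codes.shape[0], R / query_codes.shape[0]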

Dataset

There are three different configurations for cifar10:

  • config["dataset"]="cifar10" uses 1,000 images (100 per class) as the query set and 5,000 images (500 per class) as the training set; the remaining 54,000 images are used as the database.
  • config["dataset"]="cifar10-1" uses 1,000 images (100 per class) as the query set; the remaining 59,000 images are used as the database, and 5,000 images (500 per class) are randomly sampled from the database as the training set (see the sketch below).
  • config["dataset"]="cifar10-2" uses 10,000 images (1,000 per class) as the query set and 50,000 images (5,000 per class) as both the training set and the database.
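
A rough sketch of how the cifar10-1 split can be built from per-image class labels (illustrative only; the actual split logic and random seeds in this repo may differ):

import numpy as np

def split_cifar10_1(labels, seed=0):
    # labels: (60000,) integer class labels for the full CIFAR-10 set (hypothetical input)
    rng = np.random.default_rng(seed)
    query_idx, train_idx, db_idx = [], [], []
    for c in range(10):
        idx = rng.permutation(np.where(labels == c)[0])
        query_idx.extend(idx[:100])      # 100 images per class -> query set (1,000 total)
        rest = idx[100:]                 # remaining 5,900 per class -> database (59,000 total)
        db_idx.extend(rest)
        train_idx.extend(rest[:500])     # 500 per class sampled from the database -> training set (5,000 total)
    return np.array(query_idx), np.array(train_idx), np.array(db_idx)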

You can download NUS-WIDE here.
Use data/nus-wide/code.py to randomly select 100 images per class as the query set (2,100 images in total). The remaining images are used as the database set, from which we randomly sample 500 images per class as the training set (10,500 images in total).

You can download the ImageNet, NUS-WIDE-m and COCO datasets here, which is also where the data splits are copied from.

NUS-WIDE-m is different from NUS-WIDE, so I distinguish between them.

NUS-WIDE contains 269,648 images, of which 195,834 are associated with the 21 most frequent concepts.

NUS-WIDE-m has 223,496 images and is the version used in HashNet (ICCV2017) and its code (HashNet caffe and pytorch).

Download mirflickr, and use ./data/mirflickr/code.py to randomly select 1,000 images as the query set and 4,000 images as the training set.

Paper And Code

It is difficult to implement everything by myself, so I made some modifications based on the following code bases:
DSH(CVPR2016)
paper Deep Supervised Hashing for Fast Image Retrieval
code DSH-pytorch

DPSH(IJCAI2016)
paper Feature Learning based Deep Supervised Hashing with Pairwise Labels
code DPSH-pytorch

DHN(AAAI2016)
paper Deep Hashing Network for Efficient Similarity Retrieval
code DeepHash-tensorflow

DTSH(ACCV2016)
paper Deep Supervised Hashing with Triplet Labels
code DTSH

HashNet(ICCV2017)
paper HashNet: Deep Learning to Hash by Continuation
code HashNet caffe and pytorch

GreedyHash(NIPS2018)
paper Greedy Hash: Towards Fast Optimization for Accurate Hash Coding in CNN
code GreedyHash

DSDH(NIPS2017)
paper Deep Supervised Discrete Hashing
code DSDH_PyTorch

DFH(BMVC2019)
paper Push for Quantization: Deep Fisher Hashing
code Push-for-Quantization-Deep-Fisher-Hashing

ISDH(arxiv2018)
paper Instance Similarity Deep Hashing for Multi-Label Image Retrieval
code ISDH-Tensorflow

IDHN(TMM2019)
paper Improved Deep Hashing with Soft Pairwise Similarity for Multi-label Image Retrieval
code IDHN-Tensorflow

DBDH(Neurocomputing2020)
paper Deep balanced discrete hashing for image retrieval

ADSH(AAAI2018)
paper Asymmetric Deep Supervised Hashing
code1 ADSH matlab + pytorch
code2 ADSH_pytorch

DAGH(ICCV2019, not implemented here)
paper Deep Supervised Hashing with Anchor Graph
code DAGH-Matlab

DAPH(ACMMM2017, not completely implemented here)
paper Deep Asymmetric Pairwise Hashing

LCDSH(IJCAI2017)
paper Locality-Constrained Deep Supervised Hashing for Image Retrieval

DSHSD(IEEE ACCESS 2019)
paper Deep Supervised Hashing Based on Stable Distribution

CSQ(CVPR2020)
paper Central Similarity Quantization for Efficient Image and Video Retrieval
code CSQ-pytorch

Deep Unsupervised Image Hashing by Maximizing Bit Entropy(AAAI2021)
paper Deep Unsupervised Image Hashing by Maximizing Bit Entropy
code Deep-Unsupervised-Image-Hashing

Mean Average Precision, 48 bits [AlexNet].

| Algorithm | Dataset | This impl. | Paper |
|---|---|---|---|
| DSH | cifar10-1 | 0.800 | 0.6755 |
| DSH | nus_wide_21 | 0.798 | 0.5621 |
| DSH | ms coco | 0.655 | - |
| DSH | imagenet | 0.576 | - |
| DSH | mirflickr | 0.735 | - |
| DPSH | cifar10 | 0.775 | 0.757 |
| DPSH | nus_wide_21 | 0.844 | 0.851 (0.812*) |
| DPSH | imagenet | 0.502 | - |
| DPSH | ms coco | 0.711 | - |
| DPSH | voc2012 | 0.608 | - |
| DPSH | mirflickr | 0.781 | - |
| HashNet | cifar10 | 0.782 | - |
| HashNet | nus_wide_81_m | 0.764 | 0.7114 |
| HashNet | nus_wide_21 | 0.830 | - |
| HashNet | imagenet | 0.644 | 0.6633 |
| HashNet | ms coco | 0.724 | 0.7301 |
| DHN | cifar10 | 0.781 | 0.621 |
| DHN | nus_wide_21 | 0.841 | 0.758 |
| DHN | imagenet | 0.486 | - |
| DHN | ms coco | 0.712 | - |
| DHN | mirflickr | 0.775 | - |
| DSDH | cifar10-1 | 0.790 | 0.820 |
| DSDH | nus_wide_21 | 0.833 | 0.829 |
| DSDH | imagenet | 0.300 | - |
| DSDH | ms coco | 0.681 | - |
| DSDH | mirflickr | 0.765 | - |
| DTSH | cifar10 | 0.800 | 0.774 |
| DTSH | nus_wide_21 | 0.829 | 0.824 |
| DTSH | ms coco | 0.760 | - |
| DTSH | imagenet | 0.631 | - |
| DTSH | mirflickr | 0.753 | - |
| DFH | cifar10-1 | 0.801 | 0.844 |
| DFH | nus_wide_21 | 0.837 | 0.842 |
| DFH | ms coco | 0.717 | - |
| DFH | imagenet | 0.519 | 0.747 |
| DFH | mirflickr | 0.766 | - |
| GreedyHash | cifar10-1 | 0.817 | 0.822 |
| GreedyHash | cifar10-2 | 0.932 | 0.944 |
| GreedyHash | imagenet | 0.678 | 0.688 |
| GreedyHash | ms coco | 0.728 | - |
| GreedyHash | nuswide_21 | 0.793 | - |
| ADSH | cifar10-1 | 0.921 | 0.9390 |
| ADSH | nuswide_21 | 0.622 | 0.9055 |
| CSQ (ResNet50, 64 bits) | coco | 0.883 | 0.861 |
| CSQ (ResNet50, 64 bits) | imagenet | 0.881 | 0.873 |
| CSQ (ResNet50, 64 bits) | nuswide_21_m | 0.844 | 0.839 |

Due to time constraints, I could not test many hyper-parameter settings.
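
For reference, the mAP values above are computed from binary codes roughly as follows (an illustrative sketch rather than the exact evaluation code in this repo; the top-k cutoff, if any, may differ per dataset):

import numpy as np

def mean_average_precision(db_codes, query_codes, db_labels, query_labels, topk=None):
    # codes in {-1, +1}; labels one-/multi-hot; topk=None evaluates over the full database
    num_db, bits = db_codes.shape
    topk = num_db if topk is None else topk
    ap_sum = 0.0
    for q in range(query_codes.shape[0]):
        dist = 0.5 * (bits - db_codes @ query_codes[q])      # Hamming distance to every database item
        order = np.argsort(dist)[:topk]
        gnd = (db_labels[order] @ query_labels[q] > 0).astype(np.float64)
        relevant = gnd.sum()
        if relevant == 0:
            continue
        precision_at_k = np.cumsum(gnd) / np.arange(1, topk + 1)
        ap_sum += np.sum(precision_at_k * gnd) / relevant    # average precision for this query
    return ap_sum / query_codes.shape[0]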
