CVPR16-DeepBit

Learning Compact Binary Descriptors with Unsupervised Deep Neural Networks

Created by Kevin Lin, Jiwen Lu, Chu-Song Chen, and Jie Zhou

Introduction

We propose a new unsupervised deep learning approach to learn compact binary descriptors. We enforce three criteria on the binary codes learned at the top layer of our network: 1) minimal quantization loss, 2) evenly distributed codes, and 3) rotation-invariant bits. We then learn the parameters of the network with back-propagation. Experimental results on three different visual analysis tasks, namely image matching, image retrieval, and object recognition, demonstrate the effectiveness of the proposed approach.
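
For intuition, the NumPy sketch below shows one way the three criteria could be written as loss terms on the real-valued top-layer outputs (assumed to lie in [0, 1] after a sigmoid). It is an illustrative sketch only; the function name, the squared-error form of each term, and the rotated-copy input are assumptions, not the exact formulation or code from the paper.

    import numpy as np

    def deepbit_style_losses(b, b_rot):
        """Illustrative loss terms for a batch of top-layer outputs.

        b     : (N, K) real-valued outputs in [0, 1] for the original images
        b_rot : (N, K) outputs for rotated copies of the same images (assumed input)
        """
        codes = (b > 0.5).astype(np.float32)

        # 1) Minimal quantization loss: outputs should stay close to their binarized values.
        quantization_loss = float(np.mean(np.sum((b - codes) ** 2, axis=1)))

        # 2) Evenly distributed codes: each bit should fire about half of the time,
        #    i.e., the mean activation of every bit should be close to 0.5.
        balance_loss = float(np.sum((b.mean(axis=0) - 0.5) ** 2))

        # 3) Rotation-invariant bits: an image and its rotated copy should produce
        #    similar outputs, so their difference is penalized.
        rotation_loss = float(np.mean(np.sum((b - b_rot) ** 2, axis=1)))

        return quantization_loss, balance_loss, rotation_loss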

The details can be found in the following CVPR 2016 paper.

Citation

If you find DeepBit useful in your research, please consider citing:

Learning Compact Binary Descriptors with Unsupervised Deep Neural Networks
Kevin Lin, Jiwen Lu, Chu-Song Chen and Jie Zhou
IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016

CIFAR10 retrieval results

Performance comparison of different unsupervised hashing algorithms on the CIFAR10 dataset. The table reports the mean average precision (mAP, in %) of the top 1000 returned images for different numbers of hash bits. We provide updated results here (Apr. 2016) that improve on those reported in the paper (Nov. 2015):

Method           |   16 bits   |   32 bits   |   64 bits
-----------------|:-----------:|:-----------:|:-----------:
KMH              | 13.59       | 13.93       | 14.46
SphH             | 13.98       | 14.58       | 15.38
SpeH             | 12.55       | 12.42       | 12.56
SH               | 12.95       | 14.09       | 13.89
PCAH             | 12.91       | 12.60       | 12.10
LSH              | 12.55       | 13.76       | 15.07
PCA-ITQ          | 15.67       | 16.20       | 16.64
Deep Hash        | 16.17       | 16.62       | 16.96
Ours (Nov. 2015) | 19.43       | 24.86       | 27.73
Ours (Apr. 2016) | 20.53       | 25.44       | 29.49
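
For reference, the sketch below shows how mAP over the top 1000 returned images can be computed from binary codes and class labels using Hamming ranking. It is a generic illustration of the metric (the function and variable names are made up here), not the evaluation code in run_cifar10.m.

    import numpy as np

    def map_at_k(query_codes, db_codes, query_labels, db_labels, k=1000):
        """Mean average precision over the top-k Hamming neighbors.

        Codes are 0/1 arrays of shape (num_samples, num_bits);
        labels are 1-D integer class arrays.
        """
        average_precisions = []
        for q, q_label in zip(query_codes, query_labels):
            # Hamming distance from the query to every database code.
            dist = np.count_nonzero(db_codes != q, axis=1)
            top_k = np.argsort(dist, kind="stable")[:k]
            relevant = (db_labels[top_k] == q_label).astype(np.float64)
            if relevant.sum() == 0:
                average_precisions.append(0.0)
                continue
            # Precision at every rank, averaged over the ranks holding a relevant item.
            precision = np.cumsum(relevant) / np.arange(1, len(top_k) + 1)
            average_precisions.append(float((precision * relevant).sum() / relevant.sum()))
        return float(np.mean(average_precisions))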

Prerequisites

  1. MATLAB (tested with 2015a on 64-bit Ubuntu)
  2. Caffe's prerequisites

Installation

Adjust Makefile.config for your environment (for example, set MATLAB_DIR so that matcaffe can build), then run the following commands:

$ make all -j8
$ make matcaffe

The -j8 flag builds in parallel with 8 threads; a good choice for the number of threads is the number of cores in your machine.

Retrieval evaluation on CIFAR10

First, run the following command to download and set up the CIFAR10 dataset, the VGG16 model pre-trained on ILSVRC12, and the DeepBit 32-bit model trained on CIFAR10. The script also rotates the training data and creates the LevelDB files.

$ ./prepare.sh
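
As a rough illustration of the rotation step, the Pillow sketch below generates rotated copies of a single training image. The function name, the angles, and the file handling are assumptions for illustration only; prepare.sh defines the rotations actually used.

    from PIL import Image

    def make_rotated_copies(image_path, angles=(-10, -5, 5, 10)):
        """Return rotated copies of one image (angles chosen only for illustration)."""
        img = Image.open(image_path)
        return [img.rotate(angle) for angle in angles]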

Launch MATLAB and run run_cifar10.m to evaluate precision at k and mean average precision (mAP) at k. We set k = 1000 in the experiments, and the binary codes are 32 bits long.

>> run_cifar10

You should then obtain an mAP result similar to the following:

>> MAP = 0.25446596

Note: The CIFAR10 dataset is split into training and test sets with 50,000 and 10,000 images, respectively. During retrieval, the 50,000 training images are treated as the database and the 10,000 test images are used as the queries.
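
As a small sketch of this protocol, the snippet below binarizes real-valued 32-bit outputs into database and query codes by thresholding at 0.5 (an assumption consistent with the quantization criterion; run_cifar10.m may binarize differently). The resulting codes could then be ranked by Hamming distance as in the earlier mAP sketch.

    import numpy as np

    def binarize(features, threshold=0.5):
        """Threshold real-valued network outputs into 0/1 binary codes."""
        return (features > threshold).astype(np.uint8)

    # Stand-in arrays with the shapes used in the CIFAR10 protocol:
    # 32-dimensional top-layer outputs for the database and the query set.
    rng = np.random.default_rng(0)
    db_codes = binarize(rng.random((50000, 32)))     # 50,000 training images = database
    query_codes = binarize(rng.random((10000, 32)))  # 10,000 test images = queries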

Train DeepBit on CIFAR10

Simply run the following commands to train DeepBit:

$ cd examples/deepbit-cifar10-32
$ ./train.sh

The training process takes a few hours on a desktop with a Titan X GPU. The trained model, DeepBit32_final_iter_1.caffemodel, will be saved under examples/deepbit-cifar10-32/.

To use the model, modify model_file in run_cifar10.m so that it points to your model:

    model_file = './YOUR/MODEL/PATH/filename.caffemodel';

Launch MATLAB and run run_cifar10.m to test the model:

>> run_cifar10

Resources

Note: This documentation may contain links to third party websites, which are provided for your convenience only. Third party websites may be subject to the third party’s terms, conditions, and privacy statements.

If the automatic "fetch_data" fails, you may manually download the resources from:

  1. For ./prepare.sh:

DeepBit models in the paper:

  1. The proposed models trained on CIFAR10:

Experiments on Descriptor Matching and Object Recognition

Coming soon...

Contact

Please feel free to send suggestions or comments to Kevin Lin (kevinlin311.tw@iis.sinica.edu.tw), Jiwen Lu (lujiwen@tsinghua.edu.cn), or Chu-Song Chen (song@iis.sinica.edu.tw).
