JULE-Torch

Project for our CVPR 2016 paper "Joint Unsupervised Learning of Deep Representations and Image Clusters"


Joint Unsupervised Learning (JULE) of Deep Representations and Image Clusters.

Overview

This project is a Torch implementation of our CVPR 2016 paper, which performs joint unsupervised learning of a deep CNN and image clusters. The intuition behind this is that better image representations facilitate clustering, while better clustering results help representation learning. Given an unlabeled dataset, it iteratively learns CNN parameters in an unsupervised manner and clusters images.
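
At a high level, training alternates between a clustering step and a representation-learning step. Below is a minimal Lua sketch of this loop; the helpers agglomerativeCluster and trainCNN are illustrative placeholders, not this repo's actual API.

    -- Illustrative sketch of the joint learning loop (placeholder helpers).
    -- cnn: a torch nn module; images: a tensor of unlabeled images.
    local function jointLearn(cnn, images, numIterations)
       for t = 1, numIterations do
          -- 1) Extract features with the current CNN.
          local features = cnn:forward(images)
          -- 2) Group the features into clusters (the paper uses
          --    agglomerative clustering).
          local labels = agglomerativeCluster(features) -- hypothetical helper
          -- 3) Treat the cluster assignments as pseudo-labels and
          --    update the CNN weights.
          trainCNN(cnn, images, labels)                 -- hypothetical helper
       end
    end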

Disclaimer

This is a Torch reimplementation of the code used in our CVPR paper. There are slight differences between this code and the one used to report the results in the paper. The Caffe version of the code can be found here.

License

This code is released under the MIT License (refer to the LICENSE file for details).

Citation

If you find our code useful in your research, please consider citing:

@inproceedings{yangCVPR2016joint,
    Author = {Yang, Jianwei and Parikh, Devi and Batra, Dhruv},
    Title = {Joint Unsupervised Learning of Deep Representations and Image Clusters},
    Booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
    Year = {2016}
}

Dependencies

  1. Torch. Install Torch by:

    $ curl -s https://raw.githubusercontent.com/torch/ezinstall/master/install-deps | bash
    $ git clone https://github.com/torch/distro.git ~/torch --recursive
    $ cd ~/torch
    $ ./install.sh      # and enter "yes" at the end to modify your bashrc
    $ source ~/.bashrc

    After installing Torch, you may also need to install some packages using LuaRocks:

    $ luarocks install nn
    $ luarocks install image 

    We recommend running the code on a GPU, which requires cunn:

    $ luarocks install cunn
  2. lua-knn. It is used to compute the distances between neighboring samples. Go into the lua-knn folder and compile it manually:

    $ mkdir build && cd build
    $ cmake -D CUDA_TOOLKIT_ROOT_DIR=/your/cuda-toolkit/dir ..
    $ make

    After compilation, you will find libknn.so in the build folder. Copy it to torch/install/lib/lua/5.1, and copy init.lua to torch/install/share/lua/5.1/knn (create the knn folder if it does not exist).

Typically, you can run our code after installing the above two packages; a quick way to verify the setup is shown below. Please let me know if any errors occur.
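
As a quick sanity check (a minimal sketch, assuming a standard Torch install), you can verify from the th REPL that the dependencies load and the GPU is usable:

    -- Sanity check that the dependencies are installed correctly.
    require 'nn'
    require 'image'
    require 'cunn'   -- an error here indicates a broken CUDA setup
    require 'knn'    -- an error here means libknn.so or init.lua was not copied correctly

    -- A tiny CUDA tensor operation to confirm the GPU works.
    local x = torch.CudaTensor(4, 4):fill(1)
    print(x:sum())   -- should print 16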

Train model

  1. Training a model is simple. For example, to train on the USPS dataset, run:

    $ th train.lua -dataset USPS -eta 0.9

    Note that it runs in fast mode by default; you can switch to regular mode by setting "-use_fast 0". In the above command, eta is the unfolding rate. For the face datasets we recommend 0.2, while for the other datasets it is set to 0.9 to save training time. During training, you will see the normalized mutual information (NMI) of the clustering results.

  2. You can train multiple models in parallel by:

    $ th train.lua -dataset USPS -eta 0.9 -num_nets 5

    This way, you will get 5 different models, and thus 5 possibly different results. Statistics such as the mean and standard deviation can be computed over these results (see the sketch after this list).

  3. You can also get the clustering performance when using raw image data and a randomly initialized CNN by:

    $ th train.lua -dataset USPS -eta 0.9 -updateCNN 0
  4. You can also change other hyperparameters for model training, such as K_s, K_c, and the number of epochs in each partially unrolled period.
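
Once a -num_nets run finishes, the per-model NMI scores can be summarized as in this sketch (the values in nmis are placeholders; substitute the scores printed by your own runs):

    require 'torch'

    -- Summarize NMI over multiple independently trained models.
    -- Replace the placeholder values with the NMIs printed by your runs.
    local nmis = torch.Tensor({0.91, 0.89, 0.92, 0.90, 0.88})
    print(string.format('mean NMI:   %.4f', nmis:mean()))
    print(string.format('stddev NMI: %.4f', nmis:std()))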

Datasets

We upload six small datasets: COIL-20, USPS, MNIST-test, CMU-PIE, FRGC, and UMist. The other, larger datasets, COIL-100, MNIST-full, and YTF, can be found in my Google Drive here.

Compared Approaches

We upload the code for the compared approaches in the matlab folder. Please refer to the original papers for details and cite them properly. In this folder, we also attach the evaluation code for two metrics: normalized mutual information (NMI) and clustering accuracy (AC).
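
For reference, NMI can be computed from the joint distribution of predicted clusters and ground-truth labels. Below is a minimal Lua sketch using one common normalization, I(X;Y) / sqrt(H(X) * H(Y)); it mirrors, but is not, the MATLAB evaluation code shipped in this repo.

    require 'torch'

    -- NMI between two 1-indexed integer label Tensors of equal length.
    local function nmi(a, b)
       local n = a:nElement()
       local ka, kb = a:max(), b:max()
       -- Build the joint distribution p(x, y) from label co-occurrences.
       local joint = torch.zeros(ka, kb)
       for i = 1, n do
          joint[a[i]][b[i]] = joint[a[i]][b[i]] + 1
       end
       joint:div(n)
       local pa = joint:sum(2):squeeze(2) -- marginal p(x)
       local pb = joint:sum(1):squeeze(1) -- marginal p(y)
       -- Accumulate mutual information and the two marginal entropies.
       local mi, ha, hb = 0, 0, 0
       for i = 1, ka do
          if pa[i] > 0 then ha = ha - pa[i] * math.log(pa[i]) end
          for j = 1, kb do
             local p = joint[i][j]
             if p > 0 then mi = mi + p * math.log(p / (pa[i] * pb[j])) end
          end
       end
       for j = 1, kb do
          if pb[j] > 0 then hb = hb - pb[j] * math.log(pb[j]) end
       end
       return mi / math.sqrt(ha * hb)
    end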

Extensions

  1. Data Visualization: With a few modifications, the proposed approach can be used to visualize high-dimensional data in low dimensions, e.g., 2D or 3D. As shown in our paper, its visualization performance beats parametric t-SNE on the MNIST dataset. Please refer to the repo for details. More experimental results on both image data and data from other modalities will come soon.

Q&A

You are welcome to send a message to (jw2yang at vt.edu) if you have any issues with this code.
