tkhan3 / fashion-compatibility

Learning Type-Aware Embeddings for Fashion Compatibility

fashion-compatibility contains a PyTorch implementation for our paper. If you find this code or our dataset useful in your research, please consider citing:

@inproceedings{VasilevaECCV18FasionCompatibility,
  author    = {Mariya I. Vasileva and Bryan A. Plummer and Krishna Dusad and Shreya Rajpal and Ranjitha Kumar and David Forsyth},
  title     = {Learning Type-Aware Embeddings for Fashion Compatibility},
  booktitle = {ECCV},
  year      = {2018}
}

This code was tested on an Ubuntu 16.04 system using PyTorch version 0.1.12. It is based on the official implementation of the Conditional Similarity Networks paper.

Usage

You can download the Polyvore Outfits dataset, including the splits and questions for the compatibility and fill-in-the-blank tasks, from here. After unpacking the dataset, make any necessary updates to the data root directory in polyvore_outfits.py.
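As a loose illustration of what the compatibility task measures (a hedged sketch under simplified assumptions, not the repository's evaluation code, which conditions distances on item types), an outfit can be scored by the average pairwise distance between its item embeddings, with smaller distances indicating higher compatibility:

```python
import itertools
import numpy as np

def compatibility_score(embeddings):
    """Average pairwise Euclidean distance between item embeddings.

    Lower is better: compatible items should lie close together in the
    embedding space. (Simplified sketch; the paper's model measures
    distances in type-conditioned subspaces instead of one shared space.)
    """
    dists = [np.linalg.norm(a - b)
             for a, b in itertools.combinations(embeddings, 2)]
    return float(np.mean(dists))

# Two toy "outfits": one with nearby embeddings, one with distant ones.
compatible_outfit = [np.zeros(4), np.full(4, 0.1)]
clashing_outfit = [np.zeros(4), np.full(4, 5.0)]

assert compatibility_score(compatible_outfit) < compatibility_score(clashing_outfit)
```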

Afterwards, you can train the model using python main.py. You can see a listing and description of the many tunable parameters with:

    python main.py --help

For example, to learn the masks used to project into each type-specific embedding, rather than using the default fixed masks, you would run:

    python main.py --name {your experiment name} --learned
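To make the fixed-vs-learned distinction concrete, here is a minimal NumPy sketch of mask-based type-specific projection (an illustration of the idea only, with assumed dimensions, not the repository's PyTorch implementation): a fixed mask selects a disjoint block of the general embedding for each type pair, whereas a learned mask would instead be a free parameter trained with the rest of the network.

```python
import numpy as np

EMBED_DIM = 8       # assumed toy dimensionality
NUM_TYPE_PAIRS = 4  # e.g. (top, bottom), (top, shoe), ...

def fixed_masks(dim=EMBED_DIM, pairs=NUM_TYPE_PAIRS):
    """Fixed masks: each type pair owns a disjoint block of dimensions."""
    masks = np.zeros((pairs, dim))
    block = dim // pairs
    for p in range(pairs):
        masks[p, p * block:(p + 1) * block] = 1.0
    return masks

def project(x, masks, pair):
    """Elementwise mask projects a general embedding x into the
    subspace used to compare items of one type pair."""
    return x * masks[pair]

x = np.arange(EMBED_DIM, dtype=float)  # a toy general embedding
masks = fixed_masks()
y = project(x, masks, 1)  # only dimensions 2-3 survive for pair 1
```

With --learned, the mask values would be trainable parameters (typically initialized and then updated by gradient descent) rather than this hard-coded block structure.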

By default the code outputs the results on the test set after training. However, to re-run the test you must pass the same flags during testing as you used during training. For example, if you trained with the --use_fc flag to learn fully-connected type-specific embeddings rather than a mask, at test time you would use:

    python main.py --test --use_fc --resume runs/{your experiment name}/model_best.pth.tar

About

Learning Type-Aware Embeddings for Fashion Compatibility

License: BSD 3-Clause "New" or "Revised" License


Languages

Language: Python 100.0%