This code accompanies our ECCV 2022 paper "A Sketch Is Worth a Thousand Words: Image Retrieval with Text and Sketch".
This repo is based on the open_clip implementation from https://github.com/mlfoundations/open_clip
```
|---model/     : Contains the trained model*
|---sketches/  : Contains example query sketches
|---images/    : Contains 100 randomly sampled images from the COCO TBIR benchmark
|---notebooks/ : Contains the demo Jupyter notebook
|---code/
    |---training/model_configs/ : Contains the model config file for the network
    |---clip/                    : Contains the source code for running the notebook
```

*Needs to be downloaded first.
Dependencies:
- PyTorch
- ftfy
- Simply open the Jupyter notebook notebooks/Retrieval_Demo.ipynb for an example of how to retrieve images using our model.
- You can use your own set of images and sketches by modifying the images/ and sketches/ folders accordingly (a rough code sketch of the retrieval flow follows this list).
- A Colab version of the notebook is available [here].
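For reference, the snippet below illustrates the general text+sketch retrieval flow that the notebook walks through: encode the gallery images, encode a fused sketch+text query, and rank the gallery by cosine similarity. It is only a minimal sketch; the function names `load_model`, `preprocess_image`, `tokenize`, and `encode_sketch_text`, the checkpoint path, and the example filenames are assumptions standing in for the actual API in `code/clip/`. See `notebooks/Retrieval_Demo.ipynb` for the working version.

```python
# Hypothetical sketch of text+sketch retrieval. The real entry points live in
# code/clip/ and notebooks/Retrieval_Demo.ipynb; the imported names and paths
# below are assumptions, not the repo's actual API.
from pathlib import Path

import torch
from PIL import Image

from clip import load_model, preprocess_image, tokenize  # hypothetical imports

device = "cuda" if torch.cuda.is_available() else "cpu"
model = load_model("model/tsbir_model.pt", device=device)  # placeholder checkpoint path

# Encode the gallery of candidate images (the files in images/).
image_paths = sorted(Path("images").glob("*.jpg"))  # adjust the pattern to your data
with torch.no_grad():
    gallery = torch.cat([
        model.encode_image(preprocess_image(Image.open(p)).unsqueeze(0).to(device))
        for p in image_paths
    ])
    gallery = gallery / gallery.norm(dim=-1, keepdim=True)

# Encode a fused sketch+text query (example filenames/caption are placeholders).
sketch = preprocess_image(Image.open("sketches/example.png")).unsqueeze(0).to(device)
text = tokenize(["a dog chasing a frisbee in a park"]).to(device)
with torch.no_grad():
    query = model.encode_sketch_text(sketch, text)  # hypothetical fused sketch+text encoder
    query = query / query.norm(dim=-1, keepdim=True)

# Rank gallery images by cosine similarity to the query and print the top 5.
scores = (query @ gallery.T).squeeze(0)
for idx in scores.argsort(descending=True)[:5]:
    print(image_paths[idx], float(scores[idx]))
```

The same flow applies to your own data once you drop files into the images/ and sketches/ folders.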
If you find this code useful for your research, please cite:

"A Sketch Is Worth a Thousand Words: Image Retrieval with Text and Sketch"
Patsorn Sangkloy, Wittawat Jitkrittum, Diyi Yang, James Hays. In ECCV, 2022.
```
@article{tsbir2022,
  author  = {Patsorn Sangkloy and Wittawat Jitkrittum and Diyi Yang and James Hays},
  title   = {A Sketch is Worth a Thousand Words: Image Retrieval with Text and Sketch},
  journal = {European Conference on Computer Vision, ECCV},
  year    = {2022},
}
```