- python 3
- pytorch 0.4.0
- nltk
There is a folder `materials/`, which already contains some metadata and programs.
- Download: http://nlp.stanford.edu/data/glove.6B.zip
- Unzip it, find `glove.6B.300d.txt` and put it into `materials/`.
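`glove.6B.300d.txt` is a plain-text file in which each line holds a word followed by its 300 space-separated float components. A minimal parsing sketch (the helper function is ours, not part of the repo):

```python
# Sketch: each line of glove.6B.300d.txt is "<word> <300 space-separated floats>"
def parse_glove_line(line):
    parts = line.rstrip().split(' ')
    return parts[0], [float(x) for x in parts[1:]]

# Tiny synthetic example in the same format
word, vec = parse_glove_line('hello ' + ' '.join(['0.25'] * 300))
print(word, len(vec))  # hello 300
```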
`cd materials/`
- Run `python make_induced_graph.py`, get `imagenet-induced-graph.json`
- Run `python make_dense_graph.py`, get `imagenet-dense-graph.json` (200082 edges)
- Run `python make_dense_grouped_graph.py`, get `imagenet-dense-grouped-graph.json`
`cd materials/`
- Find the construct section in `construct_multi_weight_graph.ipynb`.
`cd materials/`
- Run `python process_resnet.py`, get `fc-weights.json` and `resnet101-base.pth`
Download ImageNet and AwA2 on your own, then create softlinks (command `ln -s`) `materials/datasets/imagenet` and `materials/datasets/awa2`, pointing to the root directory of each dataset.
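For example, assuming the datasets were extracted to `/data/ImageNet` and `/data/AwA2` (placeholder paths, adjust to your setup):

```shell
mkdir -p materials/datasets
# Replace the two target paths with your actual dataset roots
ln -sfn /data/ImageNet materials/datasets/imagenet
ln -sfn /data/AwA2 materials/datasets/awa2
```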
An ImageNet root directory should contain image folders, each folder with the wordnet id of the class.
An AwA2 root directory should contain the folder `JPEGImages`.
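As a sanity check, the expected layouts can be verified with a small sketch (the helper functions are ours, not part of the repo):

```python
import os

def check_imagenet_root(root):
    # every entry should be a class folder named by a wordnet id, e.g. n02121808
    return all(e.startswith('n') and e[1:].isdigit() for e in os.listdir(root))

def check_awa2_root(root):
    return os.path.isdir(os.path.join(root, 'JPEGImages'))

# Demo on synthetic directory structures
os.makedirs('/tmp/fake_imagenet/n02121808', exist_ok=True)
os.makedirs('/tmp/fake_awa2/JPEGImages', exist_ok=True)
print(check_imagenet_root('/tmp/fake_imagenet'))  # True
print(check_awa2_root('/tmp/fake_awa2'))          # True
```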
Make a directory `save/` for saving models.
In most programs, use `--gpu` to specify the devices to run the code (default: gpu 0).
- Only-word-vectors: Run `python train_gcn.py`, get results in `save/gcn`
- DHG: Run `python train_gcn_att.py`, get results in `save/gcn-att`
- DHG-r: Run `python train_gcn_att_r.py`, get results in `save/gcn-att-r`, in which edges are grouped by directions.
In the results folder:
- `*.pth` is the state dict of the Graph Networks model
- `*.pred` is the prediction file, which can be loaded by `torch.load()`. It is a python dict with two keys: `wnids` - the wordnet ids of the predicted classes, and `pred` - the predicted fc weights.
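Since a `.pred` file is just such a dict saved with `torch.save`, it can be inspected like this (the file path, wnids, and weight shape below are dummy values for illustration):

```python
import torch

# Build a toy .pred-style file to show the structure (dummy wnids and weights)
toy = {'wnids': ['n02121808', 'n02084071'], 'pred': torch.zeros(2, 2049)}
torch.save(toy, '/tmp/example.pred')

pred_file = torch.load('/tmp/example.pred')
print(sorted(pred_file.keys()))    # ['pred', 'wnids']
print(pred_file['pred'].shape[0])  # one weight vector per wnid: 2
```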
Run `python train_resnet_fit.py` with the args:
- `--pred`: the `.pred` file for finetuning
- `--train-dir`: the directory containing the 1K ImageNet training classes, each class with a folder named by its wordnet id
- `--save-path`: the folder you want to save the result to, e.g. `save/resnet-fit-xxx`

```
python train_resnet_fit.py --pred save/gcn-dense-att/epoch-3000.pred --train-dir materials/datasets/imagenet --save-path save/resnet-fit
```

(In the paper's setting, `--train-dir` is the folder composed of 1K classes from fall2011.tar, with the missing class "teddy bear" taken from ILSVRC2012.)
Run `python evaluate_imagenet.py` with the args:
- `--cnn`: path to resnet101 weights, e.g. `materials/resnet101-base.pth` or `save/resnet-fit-xxx/x.pth`
- `--pred`: the `.pred` file for testing
- `--test-set`: load the test set in `materials/imagenet-testsets.json`, choices: `[2-hops, 3-hops, all]`

```
python evaluate_imagenet.py --cnn materials/resnet101-base.pth --pred save/gcn-dense-att/epoch-3000.pred --test-set 2-hops
```
- (optional) `--keep-ratio` for the ratio of testing data, `--consider-trains` to include training classes' classifiers, `--test-train` for testing with train-class images only.
Run `python evaluate_awa2.py` with the args:
- `--cnn`: path to resnet101 weights, e.g. `materials/resnet101-base.pth` or `save/resnet-fit-xxx/x.pth`
- `--pred`: the `.pred` file for testing
- (optional) `--consider-trains` to include training classes' classifiers
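For instance, following the pattern of the ImageNet evaluation command above (the `.pred` file name here is a placeholder, use whichever epoch you trained):

```
python evaluate_awa2.py --cnn materials/resnet101-base.pth --pred save/gcn/epoch-3000.pred
```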