Shih-Han Chou, Yi-Chun Chen, Kuo-Hao Zeng, Hou-Ning Hu, Jianlong Fu, Min Sun
Association for the Advancement of Artificial Intelligence (AAAI), 2018
Official implementation of the AAAI 2018 paper "Self-view Grounding Given a Narrated 360° Video" in PyTorch.
Project page: http://aliensunmin.github.io/project/360grounding/
Paper: arXiv, AAAI 2018
- Linux
- NVIDIA GPU + CUDA 7.0 + cuDNN v5.1
- Python 2.7 with NumPy
- PyTorch 0.3.0
- Clone this repo and create the data directories:

```bash
git clone https://github.com/ShihHanChou/360grounding.git
cd 360grounding
mkdir data
mkdir data/trained_model
```
- Download our dataset

Please download the dataset here and place it under `./data`.
- To train a model with the downloaded dataset:

```bash
python main.py --batch_size $batch_size$ --epoches $#epoches$ --save_dir $save_directory$ --mode train --video_len $video_sample_length$ --MAX_LENGTH 33
```
- To test a model with the downloaded dataset:

```bash
python main.py --batch_size $batch_size$ --epoches $which_epoches$ --save_dir $save_directory$ --mode test --MAX_LENGTH 30
```
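The flags in the commands above map onto a standard `argparse` setup. Below is a minimal sketch of how `main.py` might parse them; the flag names come from the commands above, but the defaults and help strings are illustrative assumptions, not the repo's actual values:

```python
import argparse

def build_parser():
    # Flag names taken from the train/test commands above.
    # Defaults are assumptions for illustration only.
    parser = argparse.ArgumentParser(
        description="Self-view grounding on narrated 360 videos")
    parser.add_argument("--batch_size", type=int, default=16,
                        help="mini-batch size")
    parser.add_argument("--epoches", type=int, default=50,
                        help="epochs to train, or which epoch's checkpoint to test")
    parser.add_argument("--save_dir", type=str, default="data/trained_model",
                        help="directory for saving/loading checkpoints")
    parser.add_argument("--mode", choices=["train", "test"], default="train",
                        help="run training or evaluation")
    parser.add_argument("--video_len", type=int, default=10,
                        help="number of sampled frames per video (train mode)")
    parser.add_argument("--MAX_LENGTH", type=int, default=33,
                        help="maximum narration length in tokens")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args.mode, args.batch_size, args.save_dir)
```

Note that `--epoches` is overloaded: in train mode it sets how many epochs to run, while in test mode it selects which saved epoch to evaluate, which is why both commands pass it.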
If you find our code useful for your research, please cite:
```bibtex
@inproceedings{chou2018self,
  title={Self-view grounding given a narrated 360 video},
  author={Chou, Shih-Han and Chen, Yi-Chun and Zeng, Kuo-Hao and Hu, Hou-Ning and Fu, Jianlong and Sun, Min},
  booktitle={Thirty-Second AAAI Conference on Artificial Intelligence},
  year={2018}
}
```