
EIT: Embedded Interaction Transformer

PyTorch implementation of "Seeing the forest and the tree: Building representations of both individual and collective dynamics with transformers" (NeurIPS 2022).

Instructions

Synthetic experiments

Use synthetic_data_threebody.py or synthetic_data_twobody.py to generate synthetic data, then run the corresponding synthetic_exp_EXP.py script (with EXP matching the dataset) to launch the experiments. The models are defined in synthetic_exp_twobody.py.
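For example, a minimal two-body run might look like this (a sketch only; any command-line arguments are defined in the scripts themselves):

python synthetic_data_twobody.py   # generate the two-body synthetic data
python synthetic_exp_twobody.py    # run the corresponding synthetic experiment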

Neural activity experiments

The modified basic transformer operations are inside the my_transformers folder.

The datasets, models, and trainers are inside the neural_kits folder.

The experiments are launched from main.py, where the flag MAEorVIT determines the experiment type; a hypothetical invocation is sketched after the list below. This codebase provides:

  1. Training EIT,
  2. Training the individual module of EIT,
  3. Transferring a pre-trained EIT across different animals,
  4. Transferring a pre-trained EIT across different targets,
  5. Transferring a pre-trained EIT across different timepoints,
  6. Training a vanilla Transformer, either supervised or self-supervised, as a baseline.
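
For example, assuming MAEorVIT is exposed as a command-line argument (it may instead be a variable set inside main.py, and the values shown here are hypothetical):

python main.py --MAEorVIT MAE   # hypothetical value selecting one of the experiment types above
python main.py --MAEorVIT VIT   # hypothetical value selecting another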

Where to find the training datasets: for the Mihi-Chewie datasets, you can check out this; for the Jenkins’ Maze datasets, you can check out this.

Code contributors

  • Ran Liu (Maintainer), github: ranliu98

  • Jingyun Xiao, github: jingyunx

  • Mehdi Azabou, github: mazabou

Citation

If you find the code useful for your research, please consider citing our work:

@inproceedings{liu2022seeing,
  title={Seeing the forest and the tree: Building representations of both individual and collective dynamics with transformers},
  author={Liu, Ran and Azabou, Mehdi and Dabagia, Max and Xiao, Jingyun and Dyer, Eva L},
  booktitle={Advances in Neural Information Processing Systems},
  year={2022}
}

and Swap-VAE:

@article{liu2021drop,
  title={Drop, Swap, and Generate: A Self-Supervised Approach for Generating Neural Activity},
  author={Liu, Ran and Azabou, Mehdi and Dabagia, Max and Lin, Chi-Heng and Gheshlaghi Azar, Mohammad and Hengen, Keith and Valko, Michal and Dyer, Eva},
  journal={Advances in Neural Information Processing Systems},
  volume={34},
  pages={10587--10599},
  year={2021}
}


License

MIT License

