encoding_recurrence_into_transformers

The official guidance for reproducing "Encoding Recurrence into Transformers" (ICLR 2023).

Thank you for your interest in our paper. Unfortunately, due to our collaborators' corporate regulations, we are unable to share the code. Nevertheless, to ensure that our work remains reproducible, we have provided:

- Computation details of our non-dilated and dilated REMs in Appendix D.1.
- The detailed training scheme and hyperparameter settings in Appendix E.2.
- A detailed description of the repositories we used to build our model in the supplementary materials.
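
Since the repository contains no code, the following is only a minimal NumPy sketch, in the spirit of the non-dilated and dilated REMs, of a decay-parameterized relative-position matrix. The specific entry pattern (powers of a decay parameter `lam` along the strict lower triangle, with a dilation factor `d` restricting which offsets are filled) is our assumption for illustration; the authors' exact computation is given in Appendix D.1 and should be treated as the reference.

```python
import numpy as np

def regular_rem(seq_len: int, lam: float) -> np.ndarray:
    """Sketch of a non-dilated REM: strictly lower-triangular matrix with entries lam**(t - s).

    Assumed form only; see Appendix D.1 of the paper for the exact computation.
    """
    t = np.arange(seq_len)
    diff = t[:, None] - t[None, :]                # offset t - s for every query/key pair
    powers = np.clip(diff, 0, None).astype(float) # avoid negative exponents above the diagonal
    return np.where(diff > 0, lam ** powers, 0.0)

def dilated_rem(seq_len: int, lam: float, d: int) -> np.ndarray:
    """Sketch of a dilated REM: decay applied only where the offset is a positive multiple of d."""
    t = np.arange(seq_len)
    diff = t[:, None] - t[None, :]
    mask = (diff > 0) & (diff % d == 0)           # keep only dilated offsets
    powers = np.clip(diff // d, 0, None).astype(float)
    return np.where(mask, lam ** powers, 0.0)

if __name__ == "__main__":
    print(regular_rem(5, 0.9))
    print(dilated_rem(6, 0.9, d=2))
```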

Of course, we welcome any unofficial implementations of our work!
