mit-han-lab / lite-transformer

[ICLR 2020] Lite Transformer with Long-Short Range Attention

Home Page: https://arxiv.org/abs/2004.11886

Could you please point out the core code? There is a lot of fairseq code in the repository. Thank you!

guotong1988 opened this issue · comments

I am new to lite-transformer.
@Michaelvll Thank you!
@chenw23 Thank you!

Thank you for asking! The main logic of our design is in transformer_multibranch_v2.py and multibranch.py.
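
For readers looking for the core idea before diving into those files, below is a minimal sketch of the Long-Short Range Attention (LSRA) design described in the paper: the input features are split along the channel dimension, one half is processed by self-attention (long-range, global context) and the other half by a convolution (short-range, local context), and the two outputs are concatenated. This is an illustrative approximation only, not the repository's actual implementation; the module name, hyperparameters, and layer choices here are hypothetical.

```python
# Sketch of the LSRA two-branch block (illustrative, not the repo code).
import torch
import torch.nn as nn


class LSRABranchSketch(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int = 4, kernel_size: int = 3):
        super().__init__()
        assert embed_dim % 2 == 0, "embed_dim is split evenly across the two branches"
        half = embed_dim // 2
        # Global branch: multi-head self-attention on half of the channels.
        self.attn = nn.MultiheadAttention(half, num_heads, batch_first=True)
        # Local branch: depthwise convolution over the sequence dimension.
        self.conv = nn.Conv1d(half, half, kernel_size,
                              padding=kernel_size // 2, groups=half)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        global_in, local_in = x.chunk(2, dim=-1)
        global_out, _ = self.attn(global_in, global_in, global_in)
        local_out = self.conv(local_in.transpose(1, 2)).transpose(1, 2)
        # Concatenate the long-range and short-range outputs back together.
        return torch.cat([global_out, local_out], dim=-1)


if __name__ == "__main__":
    x = torch.randn(2, 16, 64)            # (batch, seq_len, embed_dim)
    print(LSRABranchSketch(64)(x).shape)  # torch.Size([2, 16, 64])
```

The files mentioned above integrate this multibranch idea into fairseq's transformer layers, so once the split-attend/convolve-concatenate pattern is clear, the rest of the code is mostly fairseq plumbing.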