Official PyTorch implementation of the paper: "Locally Shifted Attention With Early Global Integration"
Repository on GitHub: https://github.com/shellysheynin/Locally-SAG-Transformer