LibertFan / MAN

Mask Attention Networks: Rethinking and Strengthen Transformer (NAACL 2021)


Mask Attention Networks

This repo contains the code and pretrained models for our paper:

Mask Attention Networks: Rethinking and Strengthen Transformer

The two sub-directories include the reproducible code and instructions for machine translation and abstractive summarization. Please see the README in each sub-directory for detailed reproduction instructions.

About

Mask Attention Networks: Rethinking and Strengthen Transformer (NAACL 2021)

License: MIT License


Languages

Python 96.2%, Shell 2.6%, Lua 0.6%, C++ 0.5%