Awesome-Transformers

A curated list of papers on Transformer models

  1. Attention Is All You Need, NIPS 2017 (paper) the original Transformer paper, built on scaled dot-product attention (see the sketch after this list)

  2. Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context, ACL 2019 (paper) (pytorch & tensorflow code) segment-level recurrence and relative position encoding (see the recurrence sketch after this list)

  3. COMET: Commonsense Transformers for Automatic Knowledge Graph Construction, ACL 2019 (paper) (pytorch code) generates commonsense knowledge graph tuples

  4. Adaptive Attention Span in Transformers, ACL 2019 (paper) (pytorch code) learns the attention span of each head with a soft mask (see the sketch after this list)

  5. XLNet: Generalized Autoregressive Pretraining for Language Understanding, arxiv 2019 (paper) (tensorflow code) permutation language model

  6. Syntactically Supervised Transformers for Faster Neural Machine Translation, ACL 2019 (paper) (pytorch code) non-autoregressive decoding

  7. Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction, ACL 2019 (paper) (pytorch code) relation extraction

  8. Learning Deep Transformer Models for Machine Translation, ACL 2019 (paper) (pytorch code) pre-norm residual connections for training deep encoders

  9. Large Batch Optimization for Deep Learning: Training BERT in 76 Minutes, arxiv 2019 (paper) large-batch training with the LAMB optimizer

  10. Universal Transformers, ICLR 2019 (paper) (tensorflow code) recurrent Transformer blocks with weights shared across depth (see the sketch after this list)

  11. Lattice Transformer for Speech Translation, ACL 2019 (paper) attention over lattice inputs (directed acyclic graphs, i.e., DAGs)

  12. ERNIE: Enhanced Language Representation with Informative Entities, ACL 2019 (paper) (code) injects knowledge graph entities into language representations
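
The entries above call out specific mechanisms; the short Python sketches below illustrate a few of them. They are minimal illustrations under assumed shapes and hyperparameters, not the papers' official code.

Scaled dot-product attention (item 1): the core operation softmax(QK^T / sqrt(d_k)) V, shown here for a single head without masking.

```python
import torch

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / d_k ** 0.5            # pairwise query-key similarities
    weights = torch.softmax(scores, dim=-1)  # each query's distribution over the keys
    return weights @ V                       # weighted sum of the value vectors

# Toy self-attention: 4 tokens, width 8 (shapes are illustrative).
x = torch.randn(4, 8)
out = scaled_dot_product_attention(x, x, x)
```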
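
Segment-level recurrence (item 2): a sketch assuming single-head attention and omitting Transformer-XL's relative position encoding; the previous segment's hidden states are cached and reused as extra context.

```python
import torch

def attend_with_memory(h, mem, W_q, W_k, W_v):
    """h: current segment states (cur_len, d); mem: cached states from the previous
    segment (mem_len, d). Queries come from the current segment only, while keys and
    values span [mem, h], so context reaches beyond the segment boundary."""
    context = torch.cat([mem, h], dim=0)               # (mem_len + cur_len, d)
    q, k, v = h @ W_q, context @ W_k, context @ W_v
    scores = q @ k.T / k.shape[-1] ** 0.5
    out = torch.softmax(scores, dim=-1) @ v
    new_mem = h.detach()                               # cached without back-propagation
    return out, new_mem

# Toy usage: width 16, memory of 8 cached states, current segment of 4 tokens.
d = 16
W_q, W_k, W_v = (torch.randn(d, d) for _ in range(3))
out, mem = attend_with_memory(torch.randn(4, d), torch.zeros(8, d), W_q, W_k, W_v)
```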
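
Adaptive attention span (item 4): the paper's soft mask m_z(x) = clamp((R + z - x) / R, 0, 1), which lets each head learn how far back it attends; z is a learned span and R is a ramp hyperparameter.

```python
import torch

def adaptive_span_mask(distances, z, R=32.0):
    """Soft mask over key distances: 1 within the learned span z, decaying to 0 over
    a ramp of width R. It multiplies the attention weights, which are renormalized."""
    return torch.clamp((R + z - distances) / R, min=0.0, max=1.0)

# Toy usage: one query over 128 past positions with a learned span z of about 20.
distances = torch.arange(128, dtype=torch.float32)
mask = adaptive_span_mask(distances, z=torch.tensor(20.0))
```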
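
Recurrent Transformer blocks (item 10): the same block applied T times with tied weights instead of a stack of distinct layers; the paper's dynamic halting (ACT) is omitted here.

```python
import torch
import torch.nn as nn

class UniversalEncoderSketch(nn.Module):
    """One self-attention + feed-forward block, reused T times across depth."""
    def __init__(self, d_model=64, n_heads=4, T=6):
        super().__init__()
        self.block = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.T = T

    def forward(self, x):
        for _ in range(self.T):   # recurrence over depth with shared weights
            x = self.block(x)
        return x

# Toy usage: batch of 2 sequences, length 10, width 64.
y = UniversalEncoderSketch()(torch.randn(2, 10, 64))
```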
