Attention Pipelines Are All You Need

A TFX implementation of the transformer architecture from the paper Attention Is All You Need.

To Understand Transformers

I have written a blog post explaining what Transformers themselves do. In this example, I have followed the paper exactly and have not used teacher forcing during training. Anyone looking for an example with teacher forcing should refer to the official TensorFlow guide on Transformers.
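To make the distinction concrete, below is a minimal, framework-agnostic sketch of the two training regimes. This is not code from this repository: model is a hypothetical stand-in for the transformer, assumed to return one predicted token id per decoder position.

def train_step_teacher_forcing(model, src, tgt):
    # Teacher forcing: the decoder is fed the ground-truth target shifted
    # right, so position t is always predicted from the true tokens 0..t-1
    predictions = model(src, tgt[:-1])
    return predictions, tgt[1:]  # the loss compares predictions with tgt[1:]

def train_step_free_running(model, src, tgt, start_token=0):
    # No teacher forcing: the decoder consumes its own earlier predictions,
    # one token at a time, just as it would at inference
    decoder_input = [start_token]
    for _ in range(len(tgt) - 1):
        step_predictions = model(src, decoder_input)
        decoder_input.append(step_predictions[-1])  # feed prediction back in
    return decoder_input[1:], tgt[1:]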

There are also some amazing resources I found on Transformers and Attention in general. These two should suffice for an overview of the whole concept.

To Understand TFX Pipelines

I have not described the functioning of the pipeline here, but anyone inclined to understand it can refer to an older repository I built: example-tfx-pipeline-text-classifier

It is an example of text classification, but the difference between that and neural machine translation should become apparent once you start reading the code carefully. Wherever needed, I have added additional comments in the code for better understanding; feel free to drop in questions if you have any.
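For orientation, here is a minimal sketch of what a locally run TFX pipeline of this kind looks like. The component list, names, paths, and step counts are illustrative assumptions, not the exact contents of pipeline.py in this repository.

from tfx import v1 as tfx

def create_pipeline(data_root, pipeline_root, module_file, metadata_path):
    # Ingest the CSV training data
    example_gen = tfx.components.CsvExampleGen(input_base=data_root)

    # Train the model defined in a user-supplied module file
    trainer = tfx.components.Trainer(
        module_file=module_file,
        examples=example_gen.outputs['examples'],
        train_args=tfx.proto.TrainArgs(num_steps=1000),  # placeholder step counts
        eval_args=tfx.proto.EvalArgs(num_steps=100))

    return tfx.dsl.Pipeline(
        pipeline_name='attention_pipeline',  # hypothetical name
        pipeline_root=pipeline_root,
        metadata_connection_config=(
            tfx.orchestration.metadata.sqlite_metadata_connection_config(
                metadata_path)),
        components=[example_gen, trainer])

# Execute every component in order on the local machine
tfx.orchestration.LocalDagRunner().run(
    create_pipeline('data_root', 'pipelines', 'model.py', 'metadata.db'))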

Prerequisites

python3

TFX:

pip install tfx

TFX - Basic Shared Libraries:

pip install tfx-bsl
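A quick way to confirm both installs succeeded (my suggestion, not part of the original setup):

# Both imports should succeed without error after installation
import tfx
import tfx_bsl

print(tfx.__version__)  # prints the installed TFX version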

Execution

  1. Store your data in CSV format in a folder titled data_root, at the same level of the file hierarchy as pipeline.py
  2. Run python3 pipeline.py

If you want to change any of the defaults, do so by modifying the code.
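For illustration, here is one hypothetical way to prepare the data_root folder that step 1 expects. The column names and contents below are made-up placeholders; the actual schema depends on your dataset and on what pipeline.py is written to consume.

import csv
import os

# Create the folder next to pipeline.py, as the pipeline expects
os.makedirs('data_root', exist_ok=True)

# Write a tiny example CSV; 'source' and 'target' are hypothetical columns
with open(os.path.join('data_root', 'train.csv'), 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['source', 'target'])
    writer.writerow(['hello world', 'hallo welt'])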

Citation

@misc{vaswani2017attention,
      title={Attention Is All You Need}, 
      author={Ashish Vaswani and Noam Shazeer and Niki Parmar and Jakob Uszkoreit and Llion Jones and Aidan N. Gomez and Lukasz Kaiser and Illia Polosukhin},
      year={2017},
      eprint={1706.03762},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}


License

MIT License

