NMT by Jointly Learning to Align and Translate (Using Seq2seq + Attention)
A sequence-to-sequence network uses two recurrent neural networks (an encoder and a decoder) to turn one sequence into another; in this case, French sentences are translated into English.
This code uses modified parts of the PyTorch tutorial "Translation with a Sequence to Sequence Network and Attention", available at the following link:
https://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html
The Seq2seq model architecture used in this project closely follows the above tutorial.
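As a rough illustration of that architecture, below is a minimal sketch of a GRU-based encoder and an additive (Bahdanau-style) attention decoder. This is not the project's exact code; class names, hidden_size, and the vocabulary sizes are placeholders chosen for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EncoderRNN(nn.Module):
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(input_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src):                          # src: (batch, src_len)
        embedded = self.embedding(src)               # (batch, src_len, hidden)
        outputs, hidden = self.gru(embedded)         # outputs: (batch, src_len, hidden)
        return outputs, hidden                       # hidden: (1, batch, hidden)

class AttnDecoderRNN(nn.Module):
    def __init__(self, hidden_size, output_size):
        super().__init__()
        self.embedding = nn.Embedding(output_size, hidden_size)
        # Additive (Bahdanau) attention: score = v^T tanh(Wa q + Ua k)
        self.Wa = nn.Linear(hidden_size, hidden_size)
        self.Ua = nn.Linear(hidden_size, hidden_size)
        self.va = nn.Linear(hidden_size, 1)
        self.gru = nn.GRU(2 * hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, output_size)

    def forward(self, prev_token, hidden, encoder_outputs):
        # prev_token: (batch, 1), hidden: (1, batch, hidden)
        embedded = self.embedding(prev_token)                        # (batch, 1, hidden)
        query = hidden.permute(1, 0, 2)                              # (batch, 1, hidden)
        scores = self.va(torch.tanh(self.Wa(query) + self.Ua(encoder_outputs)))
        weights = F.softmax(scores.squeeze(-1), dim=-1)              # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs)   # (batch, 1, hidden)
        rnn_input = torch.cat((embedded, context), dim=-1)           # (batch, 1, 2*hidden)
        output, hidden = self.gru(rnn_input, hidden)
        logits = self.out(output.squeeze(1))                         # (batch, output_size)
        return logits, hidden, weights

# Tiny smoke test with placeholder vocabulary sizes
if __name__ == "__main__":
    enc = EncoderRNN(input_size=1000, hidden_size=256)
    dec = AttnDecoderRNN(hidden_size=256, output_size=1200)
    src = torch.randint(0, 1000, (2, 7))           # a batch of 2 source sentences
    enc_out, enc_hidden = enc(src)
    tgt_tok = torch.zeros(2, 1, dtype=torch.long)  # SOS token id assumed to be 0
    logits, hidden, attn = dec(tgt_tok, enc_hidden, enc_out)
    print(logits.shape, attn.shape)                # (2, 1200), (2, 7)
```

At each decoding step the decoder attends over all encoder outputs, concatenates the resulting context vector with the embedded previous token, and feeds that into its GRU, which is the core idea of the Bahdanau et al. approach referenced below.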
References
For further reading on neural machine translation and sequence-to-sequence learning, see the papers below:
- Neural Machine Translation by Jointly Learning to Align and Translate (Bahdanau et al.)
- Sequence to Sequence Learning with Neural Networks (Sutskever et al.)
Run
python3 train.py