# DS1011-NLP Final Project: Neural Machine Translation
- Recurrent neural network based encoder-decoder without attention
  - vi-en: model_1_simple_vi_en.ipynb
  - zh-en: model_1_simple_zh_en.ipynb
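Without attention, the encoder compresses the entire source sentence into a single vector on which the decoder is conditioned. A minimal NumPy sketch of that bottleneck (toy sizes and random weights, not the actual model in the notebooks):

```python
import numpy as np

def rnn_step(x, h, Wx, Wh, b):
    """One vanilla (tanh) RNN step."""
    return np.tanh(x @ Wx + h @ Wh + b)

rng = np.random.default_rng(0)
d_in, d_h = 4, 8                    # toy embedding / hidden sizes
src = rng.normal(size=(5, d_in))    # 5 source "token embeddings"

# Encoder: run the RNN over the source; keep only the final hidden state.
Wx_e, Wh_e, b_e = rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h)
h = np.zeros(d_h)
for x in src:
    h = rnn_step(x, h, Wx_e, Wh_e, b_e)
context = h                         # the whole source is squeezed into this one vector

# Decoder: initialized from the context vector; without attention it never
# looks back at the individual encoder states.
Wx_d, Wh_d, b_d = rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h)
W_out = rng.normal(size=(d_h, 10))  # project to a toy 10-word target vocabulary
s, y = context, np.zeros(d_in)      # y stands in for the previous output token's embedding
for _ in range(3):                  # emit 3 target steps greedily
    s = rnn_step(y, s, Wx_d, Wh_d, b_d)
    logits = s @ W_out
    print(int(np.argmax(logits)))   # predicted target token id
```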
- Recurrent neural network based encoder-decoder with attention
  - (a) Dot-product based attention
    - vi-en: dotpro_atten_vi_en.ipynb
    - zh-en: dotpro_atten_zh_en.ipynb
  - (b) Neural net based attention
    - vi-en: neural_attent_vi_en.ipynb
    - zh-en: neural_attent_zh_en.ipynb
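The two attention variants differ only in how the score between the current decoder state and each encoder state is computed. A minimal NumPy sketch of both scoring rules (random weights for illustration, not the notebooks' trained parameters):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d = 8
enc_states = rng.normal(size=(5, d))   # encoder hidden states h_1..h_5
dec_state = rng.normal(size=(d,))      # current decoder hidden state s_t

# (a) Dot-product attention: score(s_t, h_i) = s_t . h_i
scores_dot = enc_states @ dec_state
alpha_dot = softmax(scores_dot)        # attention weights over source positions
ctx_dot = alpha_dot @ enc_states       # weighted sum of encoder states

# (b) Neural-net (additive / MLP) attention:
#     score(s_t, h_i) = v . tanh(W1 s_t + W2 h_i)
W1, W2, v = rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=(d,))
scores_mlp = np.tanh(dec_state @ W1 + enc_states @ W2) @ v
alpha_mlp = softmax(scores_mlp)
ctx_mlp = alpha_mlp @ enc_states
```

In both cases the context vector is fed to the decoder at each step, so the model can look back at individual source positions instead of relying on a single compressed sentence vector.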
- Replace the recurrent encoder with a convolution-based encoder
  - Convolution-based dot-product attention: conv_dot_product_attention.ipynb
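A convolutional encoder computes all source positions in parallel from fixed-width windows, instead of the sequential recurrence of an RNN. A hypothetical NumPy sketch of a one-layer 1-D convolution with "same" padding over token embeddings (toy sizes, not the notebook's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 8, 3                          # hidden size, kernel width
src = rng.normal(size=(5, d))        # 5 source token embeddings

# 1-D convolution over the token axis: each output position mixes a k-token
# window centered on it, and every position is computed independently, so the
# whole source is encoded in parallel (unlike an RNN, which is sequential).
W = rng.normal(size=(k, d, d)) * 0.1 # one (d, d) weight matrix per kernel offset
pad = np.vstack([np.zeros((k // 2, d)), src, np.zeros((k // 2, d))])
conv_states = np.stack([
    np.tanh(sum(pad[i + j] @ W[j] for j in range(k)))
    for i in range(len(src))
])
# conv_states now plays the role the RNN encoder states played: the decoder
# applies the same dot-product attention over them.
```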
- Transformer, a fully self-attention-based translation system
  - Transformer.ipynb
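The Transformer removes recurrence entirely: each position attends to every other position via scaled dot-product self-attention. A minimal single-head NumPy sketch (the full model adds multiple heads, positional encodings, and feed-forward layers):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n, d = 5, 8
X = rng.normal(size=(n, d))           # one sentence of 5 token embeddings

# Self-attention: queries, keys, and values all come from the same sequence,
# so every pair of positions interacts directly in a single step.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv
A = softmax(Q @ K.T / np.sqrt(d))     # (n, n) attention weights; rows sum to 1
out = A @ V                           # each position becomes a mixture of all values
```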
- Preprocessing
  - Preprocessing.ipynb: preprocessing for the dot-product attention, convolution-based dot-product attention, and Transformer models
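Preprocessing for such models typically means tokenization, building a vocabulary with special tokens, and converting sentences to padded index sequences. The sketch below is a hypothetical illustration of those steps, not the contents of Preprocessing.ipynb (`build_vocab` and `encode` are made-up helper names):

```python
from collections import Counter

PAD, UNK, SOS, EOS = "<pad>", "<unk>", "<sos>", "<eos>"

def build_vocab(sentences, max_size=10000):
    """Map the most frequent tokens to ids, reserving the special tokens."""
    counts = Counter(tok for s in sentences for tok in s.split())
    words = [w for w, _ in counts.most_common(max_size)]
    return {w: i for i, w in enumerate([PAD, UNK, SOS, EOS] + words)}

def encode(sentence, vocab, max_len=10):
    """Wrap with <sos>/<eos>, map OOV words to <unk>, pad to a fixed length."""
    ids = [vocab[SOS]] + [vocab.get(t, vocab[UNK]) for t in sentence.split()] + [vocab[EOS]]
    ids = ids[:max_len]
    return ids + [vocab[PAD]] * (max_len - len(ids))

corpus = ["toi yeu hoc may", "hoc may rat vui"]   # toy source-side sentences
vocab = build_vocab(corpus)
print(encode("toi thich hoc may", vocab))         # "thich" is OOV, maps to <unk>
```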