abulice / transformer-multi30k

NMT with Convolutional Seq2Seq and Transformer


Dependencies

  • Python version: 3.7.0
  • Install the required modules with the `sudo pip3 install -r requirements.txt` command.

Model Training and Testing:

To train and test the model, run: `python3 train_and_test.py`

Model Parameters:

For Conv-seq2seq model:

  • Embedding dimension = 256
  • Hidden dimension = 512
  • Number of encoder layers = 10
  • Number of decoder layers = 10
  • Encoder kernel size = 3
  • Decoder kernel size = 3
  • Encoder dropout = 0.25
  • Decoder dropout = 0.25
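One way to read these encoder/decoder settings: with stride-1 convolutions, each extra layer widens a position's receptive field by (kernel size − 1) tokens, so depth controls how much source context each output sees. A small sketch of that calculation (the function name is illustrative, not taken from the repo):

```python
def conv_receptive_field(num_layers: int, kernel_size: int) -> int:
    """Receptive field of a stack of stride-1 convolutional layers."""
    # Each layer extends the field by (kernel_size - 1) positions.
    return 1 + num_layers * (kernel_size - 1)

# With the settings above (10 layers, kernel size 3),
# each position attends over 1 + 10 * 2 = 21 tokens.
print(conv_receptive_field(10, 3))
```

This is why the conv-seq2seq model stacks 10 layers despite the small kernel: depth, not kernel width, buys long-range context.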

For Transformer:

  • Hidden dimension = 512
  • Number of layers = 6
  • Number of heads = 8
  • Position feedforward layer dimension = 2048
  • Dropout = 0.1


