mini seq2seq

Minimal Seq2Seq model with attention for neural machine translation in PyTorch.

This implementation focuses on the following features:

  • Modular structure that can be reused in other projects
  • Minimal code for readability
  • Full utilization of batches and the GPU

This implementation relies on torchtext to minimize the dataset management and preprocessing code.
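
For illustration, a torchtext-based loading pipeline might look roughly as follows. This sketch uses the classic Field/BucketIterator API; the Multi30k dataset, the special tokens, and the batch size are assumptions made for the example, not necessarily what this repository uses.

import torch
from torchtext.data import Field, BucketIterator
from torchtext.datasets import Multi30k

# Whitespace tokenization keeps the sketch self-contained; in practice the
# spaCy tokenizers set up under "Requirements" below would be plugged in here.
DE = Field(tokenize=str.split, init_token='<sos>', eos_token='<eos>', lower=True)
EN = Field(tokenize=str.split, init_token='<sos>', eos_token='<eos>', lower=True)

# Multi30k is a small German-English corpus bundled with torchtext
train, val, test = Multi30k.splits(exts=('.de', '.en'), fields=(DE, EN))
DE.build_vocab(train, min_freq=2)
EN.build_vocab(train, min_freq=2)

# BucketIterator groups sentences of similar length to limit padding per batch on the GPU
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
train_iter, val_iter, test_iter = BucketIterator.splits(
    (train, val, test), batch_size=32, device=device)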

Model description
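
At a high level, the model is an encoder-decoder with attention: a recurrent encoder reads the source sentence, and a recurrent decoder generates the translation while attending over the encoder outputs at every step. A minimal sketch of such a model is given below; the module names, layer choices (bidirectional GRU encoder, single-layer GRU decoder, concatenation-based attention scores), and dimensions are illustrative assumptions, not necessarily what this repository implements.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.gru = nn.GRU(embed_dim, hidden_dim, bidirectional=True)

    def forward(self, src):                       # src: (src_len, batch)
        outputs, hidden = self.gru(self.embed(src))
        # sum the two directions so the decoder sees hidden_dim-sized states
        outputs = outputs[:, :, :outputs.size(2) // 2] + outputs[:, :, outputs.size(2) // 2:]
        return outputs, hidden[-1].unsqueeze(0)   # (src_len, batch, hidden), (1, batch, hidden)

class Attention(nn.Module):
    def __init__(self, hidden_dim):
        super().__init__()
        self.score = nn.Linear(hidden_dim * 2, 1)

    def forward(self, dec_hidden, enc_outputs):   # (1, batch, hidden), (src_len, batch, hidden)
        src_len = enc_outputs.size(0)
        dec = dec_hidden.repeat(src_len, 1, 1)
        # score each source position against the current decoder state
        energy = self.score(torch.cat((dec, enc_outputs), dim=2)).squeeze(2)  # (src_len, batch)
        weights = F.softmax(energy, dim=0)
        # weighted sum of encoder outputs -> context vector
        context = (weights.unsqueeze(2) * enc_outputs).sum(dim=0, keepdim=True)
        return context                             # (1, batch, hidden)

class Decoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.attention = Attention(hidden_dim)
        self.gru = nn.GRU(embed_dim + hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token, hidden, enc_outputs):     # token: (1, batch)
        embedded = self.embed(token)                    # (1, batch, embed)
        context = self.attention(hidden, enc_outputs)   # (1, batch, hidden)
        output, hidden = self.gru(torch.cat((embedded, context), dim=2), hidden)
        return self.out(output.squeeze(0)), hidden      # (batch, vocab), (1, batch, hidden)

A full Seq2Seq wrapper would then loop over the target sequence, feeding either the ground-truth token (teacher forcing) or the model's own previous prediction into the decoder at each step.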

Requirements

  • GPU & CUDA
  • Python3
  • PyTorch
  • torchtext
  • Spacy
  • numpy
  • Visdom (optional)

Download the spaCy tokenizers as follows:

python -m spacy download de
python -m spacy download en
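
Once downloaded, the models resolve through spaCy's shortcut names (on the older spaCy releases that support these shortcut links) and can be wrapped into the tokenizer functions the data fields expect, roughly like this:

import spacy

# 'de' and 'en' are the shortcut links created by the download commands above
spacy_de = spacy.load('de')
spacy_en = spacy.load('en')

def tokenize_de(text):
    return [tok.text for tok in spacy_de.tokenizer(text)]

def tokenize_en(text):
    return [tok.text for tok in spacy_en.tokenizer(text)]

print(tokenize_de('Guten Morgen!'))   # e.g. ['Guten', 'Morgen', '!']
print(tokenize_en('Good morning!'))   # e.g. ['Good', 'morning', '!']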

References

Based on the following implementations:

License: MIT

