abhishek-niranjan / tensorflow2-rnn-tutorials

TensorFlow 2.0 tutorials for RNN-based architectures for textual problems

The tutorial notebooks in this repository are meant to help beginners program basic recurrent neural networks (RNNs) for textual problems in TensorFlow 2.

Prerequisites

  • Understanding of basic text-processing methods, e.g. familiarity with tokenization
  • Working of basic RNN, GRU, and LSTM cells
  • Fundamentals of RNN-based encoder-decoder architectures
  • Attention mechanisms, mainly Bahdanau attention and Luong attention
  • A basic idea of the beam search algorithm

Contents

  1. The utils directory contains helper classes and functions.

    • utils/dataset.py contains the NMTDataset class, which creates training and validation tf.data.Dataset splits and also returns the input- and target-side tokenizers (tf.keras.preprocessing.text.Tokenizer). The working of utils/dataset.py is explained in the first notebook, on text processing; a rough sketch of the same pipeline follows this list.
    • utils/attention.py contains the BahdanauAttention and LuongAttention classes. These attention mechanisms are also explained in the encoder-decoder-with-attention notebooks (lessons 4 and 5); see the second sketch after this list.
  2. The tutorial-notebooks directory contains all the Jupyter notebooks.
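
As a rough illustration of what NMTDataset produces (not its actual interface, which is documented in the text-processing notebook), the sketch below builds the same kind of objects from a toy parallel corpus: Keras tokenizers fitted on the input and target sides, and a batched tf.data.Dataset of padded sequence pairs. The toy sentences and the build_tokenizer helper are made up for the example.

```python
import tensorflow as tf

# Toy parallel corpus; NMTDataset builds the same objects from a real NMT corpus.
inp_texts = ['how are you', 'good morning']
targ_texts = ['<start> como estas <end>', '<start> buenos dias <end>']

def build_tokenizer(texts):
    # Keras tokenizer with no character filtering, so the <start>/<end> markers survive.
    tok = tf.keras.preprocessing.text.Tokenizer(filters='')
    tok.fit_on_texts(texts)
    return tok

inp_tokenizer = build_tokenizer(inp_texts)
targ_tokenizer = build_tokenizer(targ_texts)

# Convert text to padded integer sequences.
inp_tensor = tf.keras.preprocessing.sequence.pad_sequences(
    inp_tokenizer.texts_to_sequences(inp_texts), padding='post')
targ_tensor = tf.keras.preprocessing.sequence.pad_sequences(
    targ_tokenizer.texts_to_sequences(targ_texts), padding='post')

# Batched tf.data.Dataset of (input, target) pairs, ready for an encoder-decoder training loop.
train_dataset = (tf.data.Dataset.from_tensor_slices((inp_tensor, targ_tensor))
                 .shuffle(len(inp_texts))
                 .batch(2, drop_remainder=True))
```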
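For the attention classes, the following is a minimal Bahdanau (additive) attention layer in TensorFlow 2. The repository's utils/attention.py may differ in naming and details; this only sketches the standard mechanism, where the score is v^T tanh(W1·query + W2·values).

```python
import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    """Additive attention: score = v^T tanh(W1 * decoder_state + W2 * encoder_outputs)."""

    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)  # projects the decoder state (query)
        self.W2 = tf.keras.layers.Dense(units)  # projects the encoder outputs (values)
        self.V = tf.keras.layers.Dense(1)       # collapses each source position to a scalar score

    def call(self, query, values):
        # query: (batch, hidden) decoder state; values: (batch, src_len, hidden) encoder outputs
        query_with_time_axis = tf.expand_dims(query, 1)                      # (batch, 1, hidden)
        score = self.V(tf.nn.tanh(
            self.W1(query_with_time_axis) + self.W2(values)))                # (batch, src_len, 1)
        attention_weights = tf.nn.softmax(score, axis=1)                     # normalize over source steps
        context_vector = tf.reduce_sum(attention_weights * values, axis=1)   # (batch, hidden)
        return context_vector, attention_weights

# Example shapes: batch of 4, 10 source steps, 16-dim hidden states.
attn = BahdanauAttention(units=8)
context, weights = attn(tf.random.normal([4, 16]), tf.random.normal([4, 10, 16]))
```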

Languages

  • Jupyter Notebook: 98.4%
  • Python: 1.6%