A crash course on deep learning for NLP at ABBYY.
The textbook: Neural Network Methods in Natural Language Processing by Yoav Goldberg
A short overview of the most popular deep learning architectures.
A brief introduction to the Keras framework.
Examples of models for sentiment analysis on the IMDB movie review dataset.
The Colab notebook.
Additional materials and deadlines.
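Before a Keras model can classify IMDB reviews, the raw text has to become fixed-length sequences of word indices. A minimal sketch of that preprocessing step, in plain Python (the vocabulary, special tokens, and padding scheme here are illustrative, not the exact pipeline from the notebook):

```python
# Turn tokenized reviews into fixed-length index sequences,
# the input format sequence models over text expect.
def build_vocab(texts):
    # Index 0 is reserved for padding, 1 for out-of-vocabulary words.
    vocab = {"<pad>": 0, "<unk>": 1}
    for text in texts:
        for token in text.split():
            vocab.setdefault(token, len(vocab))
    return vocab

def encode(text, vocab, max_len):
    ids = [vocab.get(tok, vocab["<unk>"]) for tok in text.split()]
    ids = ids[:max_len]                             # truncate long reviews
    ids += [vocab["<pad>"]] * (max_len - len(ids))  # pad short ones
    return ids

reviews = ["great movie", "terrible plot and acting"]
vocab = build_vocab(reviews)
print(encode("great acting", vocab, max_len=4))  # [2, 7, 0, 0]
```

The resulting integer matrix is what goes into an embedding layer; unknown test-time words fall back to the `<unk>` index.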
An introduction to the PyTorch framework.
An example of a simple bag-of-words model for sentiment analysis.
Examples of Word2Vec models: CBOW, Skip-Gram and Negative Sampling.
The Colab notebook.
Additional materials and deadlines.
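To make the Word2Vec objectives concrete: CBOW predicts a center word from its surrounding context. Here is a small sketch of how such training pairs are extracted from a token stream (the window size and corpus are illustrative):

```python
def cbow_pairs(tokens, window):
    # For each position, the surrounding words form the context
    # and the center word is the prediction target.
    pairs = []
    for i, target in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        pairs.append((context, target))
    return pairs

tokens = "the cat sat on the mat".split()
for context, target in cbow_pairs(tokens, window=1):
    print(context, "->", target)  # e.g. ['the', 'sat'] -> cat
```

Skip-Gram inverts the direction (the center word predicts each context word), and Negative Sampling replaces the full softmax over the vocabulary with a binary objective against a few sampled "noise" words.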
A simple implementation of a vanilla RNN and an illustration of the vanishing gradient problem.
Examples of character-level LSTMs for:
- Text classification: predicting a surname's language from its spelling;
- Conditional text generation: generating a surname given its language.
The Colab notebook.
Additional materials and deadlines.
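The vanishing gradient problem can be seen even without a full model: backpropagating through T steps of a vanilla RNN multiplies the gradient by the recurrent weight times the tanh derivative at every step, and since tanh'(x) ≤ 1, the product decays exponentially once the weight is below 1. A one-dimensional sketch (the weight and pre-activation values are illustrative):

```python
import math

def gradient_through_time(w, steps, pre_act=0.5):
    # Backprop through a 1-d vanilla RNN: each step multiplies the
    # gradient by w * tanh'(pre_act), where tanh'(x) = 1 - tanh(x)**2.
    grad = 1.0
    for _ in range(steps):
        grad *= w * (1.0 - math.tanh(pre_act) ** 2)
    return grad

print(gradient_through_time(0.9, 10))  # already small
print(gradient_through_time(0.9, 50))  # essentially zero
```

This is why gated architectures such as the LSTM, with their additive cell-state updates, are the default choice for longer sequences.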
An implementation of a character-level BiLSTM POS-tagger.
An overview of word-level language models: N-gram, fully-connected, and recurrent.
The Colab notebook.
Additional materials and deadlines.
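For the N-gram part, a bigram language model reduces to counting co-occurrences and normalizing per preceding word. A minimal sketch (the toy corpus and sentence-boundary tokens are illustrative):

```python
from collections import Counter, defaultdict

def train_bigram(sentences):
    # Count bigram occurrences, then normalize per preceding word
    # to estimate P(next | previous).
    counts = defaultdict(Counter)
    for sent in sentences:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return {prev: {w: c / sum(ctr.values()) for w, c in ctr.items()}
            for prev, ctr in counts.items()}

model = train_bigram(["the cat sat", "the dog sat"])
print(model["the"])  # P(cat|the) = P(dog|the) = 0.5
```

The fully-connected and RNN language models replace these count-based estimates with a learned softmax over the vocabulary, which lets them share statistical strength across similar contexts.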
An implementation of a simple Seq2Seq model for machine translation.
An example of an attention-based machine translation model.
The Colab notebook.
Additional materials and deadlines.
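The core of the attention mechanism is a softmax over query-key scores followed by a weighted sum of the values. A pure-Python sketch of single-query dot-product attention (the vectors and dimensions are illustrative):

```python
import math

def attention(query, keys, values):
    # Score each key against the query with a dot product,
    # softmax the scores, then mix the values by those weights.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    peak = max(scores)                      # subtract max for stability
    exps = [math.exp(s - peak) for s in scores]
    weights = [e / sum(exps) for e in exps]
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return weights, context

keys = [[1.0, 0.0], [0.0, 1.0]]
values = [[10.0, 0.0], [0.0, 10.0]]
weights, context = attention([2.0, 0.0], keys, values)
print(weights)  # the first key dominates
```

In a Seq2Seq translator, the decoder state plays the role of the query and the encoder states supply the keys and values, so each output word can look back at the relevant source positions.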
An example of a character-level convolutional neural network for word classification.
An explanation of the similarities and differences between convolutions and attention.
An overview of the Transformer architecture.
The Colab notebook.
Additional materials and deadlines.
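A character-level CNN slides a fixed filter over the character sequence and max-pools the responses into one feature per filter. A minimal 1-d sketch over scalar character features (the filter and feature values are illustrative):

```python
def conv1d_maxpool(seq, kernel):
    # Slide the kernel over the sequence (valid convolution),
    # then max-pool the responses into a single feature,
    # as a char-CNN does for each of its filters.
    k = len(kernel)
    responses = [sum(w * x for w, x in zip(kernel, seq[i:i + k]))
                 for i in range(len(seq) - k + 1)]
    return max(responses)

# One scalar "embedding" per character; this filter responds to a rising edge.
features = [0.0, 0.1, 0.9, 1.0, 0.2]
print(conv1d_maxpool(features, kernel=[-1.0, 1.0]))
```

The contrast with attention is visible here: a convolution mixes a fixed local window with position-independent weights, while attention mixes all positions with weights computed from the content itself, which is what the Transformer builds on.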