Repositories under the attention-seq2seq topic:
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need
Image to LaTeX (Seq2seq + Attention with Beam Search) - TensorFlow
TensorFlow implementation of Match-LSTM and Answer Pointer for the popular SQuAD dataset.
Fully batched seq2seq example based on practical-pytorch, with additional features.
A collection of TensorFlow deep learning models.
Attention-based end-to-end ASR on TIMIT in PyTorch
Text Summarizer implemented in PyTorch
Generates a summary of a given news article using an attention-based seq2seq encoder-decoder model.
A T5-based seq2seq model that generates titles for machine learning papers from their abstracts
Convolutional sequence-to-sequence models for handwritten text recognition
Summaries and notes on deep learning research papers in the natural language processing (NLP) domain.
A search tool based on image caption generation
Analysis of 'Attention is not Explanation' performed for the University of Amsterdam's Fairness, Accountability, Confidentiality and Transparency in AI Course Assignment, January 2020
Built with the TensorFlow and Keras frameworks
Basic seq2seq models, including the simplest encoder-decoder and attention-based variants
TensorFlow 2.0 implementation of neural machine translation with Bahdanau attention (see the attention sketch after this list)
A simple attention-based deep learning model that answers questions about a given video, returning the most relevant video intervals as answers.
Generates short descriptions of news articles
Seq2Seq model that restores punctuation in English input text.
Vietnamese and Chinese to English translation
Abstractive text summarization
Three different implementations of neural machine translation
English-Hindi translation with attention (work in progress)
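Most of the repositories above implement some variant of attention-based sequence-to-sequence decoding. The sketch below shows only the Bahdanau (additive) attention step that many of them share; it is a minimal NumPy illustration, not code from any listed project, and the function name, argument names, and random parameters are all illustrative stand-ins for trained weights.

```python
import numpy as np

def bahdanau_attention(decoder_state, encoder_outputs, W_dec, W_enc, v):
    """Compute a context vector as an attention-weighted sum of encoder outputs.

    decoder_state:    (hidden,)          current decoder hidden state
    encoder_outputs:  (src_len, hidden)  one vector per source position
    W_dec, W_enc:     (hidden, attn)     learned projections (random stand-ins here)
    v:                (attn,)            learned scoring vector (random stand-in here)
    """
    # Additive ("concat") scoring: score_i = v^T tanh(W_dec s + W_enc h_i)
    scores = np.tanh(decoder_state @ W_dec + encoder_outputs @ W_enc) @ v
    # Softmax over source positions gives the attention weights
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Context vector: weighted sum of encoder outputs, fed to the decoder step
    context = weights @ encoder_outputs
    return context, weights

# Toy usage with random values standing in for a trained encoder and decoder.
rng = np.random.default_rng(0)
hidden, attn, src_len = 8, 6, 5
ctx, w = bahdanau_attention(
    rng.normal(size=hidden),
    rng.normal(size=(src_len, hidden)),
    rng.normal(size=(hidden, attn)),
    rng.normal(size=(hidden, attn)),
    rng.normal(size=attn),
)
print(w.round(3), ctx.shape)
```

In a full seq2seq model this step runs once per decoder time step, and the resulting context vector is concatenated with the previous output embedding before the decoder RNN (or fed into the output projection), which is the pattern the translation, summarization, and ASR repositories above build on.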