Repositories under the luong-attention topic:
This repository contains various attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
TensorFlow 2.0 tutorials for RNN-based architectures for textual problems.
My current project is about building a chatbot with deep neural networks. This repo shows my chatbot code for deployment purposes.
Simple and easy-to-understand NLP teaching materials.
A Seq2Seq model implemented in PyTorch, using Bahdanau attention and Luong attention.
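Several of the repositories above implement both Luong and Bahdanau attention. As a rough framework-free illustration of how the two scoring functions differ (a minimal sketch; the function names and toy weight matrices are assumptions for illustration, not code from any listed repo), Luong's "dot" score is a dot product between the decoder and encoder hidden states, while Bahdanau's additive score passes both states through a small tanh feed-forward layer:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def luong_dot_score(dec, enc):
    # Luong "dot" scoring: score(h_t, h_s) = h_t . h_s
    return sum(d * e for d, e in zip(dec, enc))

def bahdanau_score(dec, enc, w_dec, w_enc, v):
    # Bahdanau additive scoring: score = v . tanh(W_dec h_t + W_enc h_s)
    # w_dec and w_enc are weight matrices (lists of rows); v is a vector.
    hidden = [math.tanh(
                  sum(wd * d for wd, d in zip(w_dec[i], dec)) +
                  sum(we * e for we, e in zip(w_enc[i], enc)))
              for i in range(len(v))]
    return sum(vi * h for vi, h in zip(v, hidden))

def attend(dec, encoder_states, score_fn):
    # Attention weights via softmax over scores, then the context
    # vector as the weighted sum of encoder states.
    weights = softmax([score_fn(dec, h) for h in encoder_states])
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(len(encoder_states[0]))]
    return weights, context
```

With dot scoring, the encoder state most aligned with the current decoder state receives the largest attention weight; the additive form instead learns the alignment through the extra weight layer, which is why it also handles encoder and decoder states of different sizes.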
Image captioning is the process of generating a textual description of an image. It combines natural language processing and computer vision to generate the captions.
Sequence-to-sequence with attention mechanisms in TensorFlow v2.
This repository contains TensorFlow/Keras models for implementing an Encoder-Decoder architecture for sequence-to-sequence tasks. It includes components such as Encoder, Decoder, Embedding Layer, LSTM Layer, Attention Mechanism, and more.