There are 28 repositories under the attention-lstm topic.
Plots vector graphs for attention-based text visualisation.
This repository contains PyTorch implementations of four different models for speech emotion classification.
Can we use explanations to improve hate speech models? Our paper, accepted at AAAI 2021, explores that question.
This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
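For reference, a minimal additive (Bahdanau-style) attention module in PyTorch might look like the sketch below; the layer names and dimensions are illustrative, not taken from the repository.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""

    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.w_query = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_key = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.w_query(query).unsqueeze(1) + self.w_key(keys)))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)        # attention weights over the sequence
        context = (weights * keys).sum(dim=1)         # weighted sum of keys: (batch, key_dim)
        return context, weights.squeeze(-1)

# Example: attend over 10 encoder states of size 64 with a 32-dim decoder state (toy values).
attn = AdditiveAttention(query_dim=32, key_dim=64, hidden_dim=48)
context, weights = attn(torch.randn(2, 32), torch.randn(2, 10, 64))
```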
[AAAI SAP 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
IPython notebooks for solving problems such as classification, segmentation, and generation, using recent deep learning algorithms on publicly available text and image datasets.
Forex price movement forecast
A TensorFlow 2 (Keras) implementation of DA-RNN (A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction, arXiv:1704.02971)
An implementation that integrates a simple but efficient attention block into a CNN + bidirectional LSTM for video classification.
An attention mechanism for Keras, usable like the built-in Dense and RNN layers.
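As a rough illustration of dropping an attention layer into a Keras model like any other built-in layer, the sketch below pools BiLSTM outputs with `tf.keras.layers.AdditiveAttention`; the input shape and classifier head are hypothetical, not the repository's own API.

```python
import tensorflow as tf

# Hypothetical sequence-classification setup: 100-step sequences of 16 features.
inputs = tf.keras.Input(shape=(100, 16))
h = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32, return_sequences=True))(inputs)

# Built-in additive attention: use the mean state as the query over all timesteps.
query = tf.keras.layers.GlobalAveragePooling1D()(h)        # (batch, 64)
query = tf.keras.layers.Reshape((1, 64))(query)            # (batch, 1, 64)
context = tf.keras.layers.AdditiveAttention()([query, h])  # (batch, 1, 64)
context = tf.keras.layers.Flatten()(context)

outputs = tf.keras.layers.Dense(1, activation="sigmoid")(context)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```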
SERVER: Multi-modal Speech Emotion Recognition using Transformer-based and Vision-based Embeddings
Generating text sequences using attention-based Bi-LSTM
Deep representation of visual and textual descriptions using StackGAN
This repository contains implementations of the attention mechanism in TensorFlow, with various examples.
Experiments with Deep Learning for generating music
📃 | Deep Text Recognition Implementation using PyTorch
A search system based on image caption generation.
Keyword spotting using an RNN with attention layers.
Detects anomalies in embedded-system logs using an RNN with an attention layer.
Prepares summaries of text reviews.
An NMT model using LSTMs to translate sentences from a source language (Spanish) to a target language (English).
Automatic Essay Grader implemented using Keras.
Sequence-to-sequence with attention mechanisms in TensorFlow 2.
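A compact sketch of a teacher-forced seq2seq model with Luong-style dot-product attention in TensorFlow 2 Keras is shown below; the vocabulary sizes and layer widths are placeholder values, not those of the repository.

```python
import tensorflow as tf

# Hypothetical vocabulary sizes and dimensions for illustration.
src_vocab, tgt_vocab, emb_dim, units = 8000, 8000, 128, 256

# Encoder: embeds source tokens and returns per-step outputs plus the final state.
enc_in = tf.keras.Input(shape=(None,), dtype="int32")
enc_emb = tf.keras.layers.Embedding(src_vocab, emb_dim)(enc_in)
enc_out, enc_h, enc_c = tf.keras.layers.LSTM(
    units, return_sequences=True, return_state=True)(enc_emb)

# Decoder with teacher forcing: initialized from the encoder's final state.
dec_in = tf.keras.Input(shape=(None,), dtype="int32")
dec_emb = tf.keras.layers.Embedding(tgt_vocab, emb_dim)(dec_in)
dec_out = tf.keras.layers.LSTM(units, return_sequences=True)(
    dec_emb, initial_state=[enc_h, enc_c])

# Dot-product attention over encoder outputs for every decoder step.
context = tf.keras.layers.Attention()([dec_out, enc_out])
merged = tf.keras.layers.Concatenate()([dec_out, context])
logits = tf.keras.layers.Dense(tgt_vocab)(merged)

model = tf.keras.Model([enc_in, dec_in], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
```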
A conversational agent model for (person A - person B - person A) dialogue turns, based on a seq2seq model with global attention and an anti-language model.
An encoder-decoder LSTM network for English->German translation.
TensorFlow 1.11 implementation of the Neural Sign Language Translation CVPR 2018 paper
Deep neural network for sequential data
PyTorch implementation of MirrorGAN