There are 13 repositories under the attention-model topic.
DeepFill v1/v2 with Contextual Attention and Gated Convolution, CVPR 2018, and ICCV 2019 Oral
Keras Attention Layer (Luong and Bahdanau scores).
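As a rough sketch of what such a layer computes (plain NumPy, not the library's actual API; all names here are illustrative): Luong attention scores a query against each key with a dot product, while Bahdanau attention uses an additive feed-forward score with learned parameters.

```python
import numpy as np

def luong_score(query, keys):
    # Multiplicative (dot-product) score: s_i = q . k_i
    return keys @ query

def bahdanau_score(query, keys, W_q, W_k, v):
    # Additive score: s_i = v . tanh(W_q q + W_k k_i)
    return np.tanh(keys @ W_k.T + query @ W_q.T) @ v

def attend(scores, values):
    # Softmax over the scores, then a weighted sum of the values
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ values

rng = np.random.default_rng(0)
d = 4
keys = rng.normal(size=(5, d))   # 5 encoder states
query = rng.normal(size=d)       # current decoder state
W_q = rng.normal(size=(d, d))
W_k = rng.normal(size=(d, d))
v = rng.normal(size=d)
ctx_luong = attend(luong_score(query, keys), keys)
ctx_bahdanau = attend(bahdanau_score(query, keys, W_q, W_k, v), keys)
```

Both variants return a context vector of the same dimensionality as the encoder states; only the scoring function differs.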
Use of Attention Gates in a Convolutional Neural Network / Medical Image Classification and Segmentation
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Awesome List of Attention Modules and Plug&Play Modules in Computer Vision
Text classification using deep learning models in PyTorch
Implementation of the Swin Transformer in PyTorch.
AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation
A Structured Self-attentive Sentence Embedding
Camouflaged Object Detection, CVPR 2020 (Oral)
A PyTorch reimplementation for paper Generative Image Inpainting with Contextual Attention (https://arxiv.org/abs/1801.07892)
Attention OCR based on TensorFlow
This repository implements an encoder-decoder model with attention for OCR
PyTorch implementation of batched bi-RNN encoder and attention-decoder.
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
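Beam search, as used in captioners like the one above, keeps the top-k highest-scoring partial sequences at each decoding step instead of greedily taking the single best token. A minimal generic sketch (the `step_fn` callback and the toy bigram table below are illustrative assumptions, not this repository's code):

```python
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=10):
    """Generic beam search. step_fn(seq) returns {token: log_prob} for the
    next token; the beam keeps the beam_width best partial sequences."""
    beams = [([start_token], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, logp in step_fn(seq).items():
                if tok == end_token:
                    finished.append((seq + [tok], score + logp))
                else:
                    candidates.append((seq + [tok], score + logp))
        if not candidates:
            break
        # Keep only the beam_width best partial hypotheses
        beams = sorted(candidates, key=lambda x: x[1], reverse=True)[:beam_width]
    finished.extend(beams)
    return max(finished, key=lambda x: x[1])[0]

# Toy bigram "language model" for demonstration only
table = {
    "<s>": {"a": math.log(0.6), "b": math.log(0.4)},
    "a":   {"b": math.log(0.9), "</s>": math.log(0.1)},
    "b":   {"</s>": math.log(1.0)},
}
best = beam_search(lambda seq: table[seq[-1]], "<s>", "</s>")
print(best)  # ['<s>', 'a', 'b', '</s>']
```

In a captioner, `step_fn` would run the RNN decoder one step conditioned on the CNN image features and return the log-softmax over the vocabulary.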
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Bidirectional LSTM network for speech emotion recognition.
Seq2SeqSharp is a tensor-based, fast and flexible deep neural network framework written in .NET (C#). It has many highlighted features, such as automatic differentiation, different network types (Transformer, LSTM, BiLSTM and so on), multi-GPU support, cross-platform operation (Windows, Linux, x86, x64, ARM), multimodal models for text and images, and so on.
Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras
Code for "Relation Classification via Multi-Level Attention CNNs"
Soft attention mechanism for video caption generation
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
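The core operation described above can be sketched as scaled dot-product attention of the current state over a window of the most recent past cell states, split into heads. A minimal NumPy sketch under those assumptions (function and parameter names are illustrative, not the LARNN repository's API):

```python
import numpy as np

def windowed_multihead_attention(query, past_states, window=5, n_heads=2):
    """Attend the current state (query) over the last `window` past cell
    states with multi-head scaled dot-product attention.
    query: (d,), past_states: (t, d); d must be divisible by n_heads."""
    keys = past_states[-window:]                  # (w, d) most recent states
    d_head = query.shape[0] // n_heads
    q = query.reshape(n_heads, d_head)            # (h, d_head)
    k = keys.reshape(len(keys), n_heads, d_head)  # (w, h, d_head)
    scores = np.einsum("hd,whd->hw", q, k) / np.sqrt(d_head)
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)             # softmax per head
    ctx = np.einsum("hw,whd->hd", w, k)           # values = keys here
    return ctx.reshape(-1)                        # concat heads -> (d,)

rng = np.random.default_rng(1)
states = rng.normal(size=(10, 8))  # 10 past cell states of dim 8
q = rng.normal(size=8)
out = windowed_multihead_attention(q, states)
```

With a window of one, the attention weights collapse to 1 and the context is just that single past state, which is a handy sanity check.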
Notes about "Attention is all you need" video (https://www.youtube.com/watch?v=bCz4OMemCcA)
ECG Classification
A Keras-based library for analysis of time series data using deep learning algorithms.
This repository contains various types of attention mechanisms, such as Bahdanau attention, soft attention, additive attention, hierarchical attention, etc., in PyTorch, TensorFlow, and Keras
Image Captioning based on Bottom-Up and Top-Down Attention model
PyTorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
HBAM: Hierarchical Bi-directional Word Attention Model
Deep Visual Attention Prediction (TIP18)