There are 34 repositories under the attention-model topic.
DeepFill v1/v2 with Contextual Attention and Gated Convolution (CVPR 2018 and ICCV 2019 Oral)
Attention mechanism implementation for Keras.
Use of Attention Gates in a Convolutional Neural Network / Medical Image Classification and Segmentation
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Text classification using deep learning models in PyTorch
Implementation of the Swin Transformer in PyTorch.
Neural Machine Translation with Keras
The implementation of "End-to-End Multi-Task Learning with Attention" [CVPR 2019].
A Structured Self-attentive Sentence Embedding (a minimal sketch of its self-attention step appears after this list)
AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation
:punch: Commonly used attention modules in computer vision, plug-and-play modules, and ViT models. A PyTorch implementation collection of attention modules and plug-and-play modules
Attention OCR based on TensorFlow
Camouflaged Object Detection, CVPR 2020 (Oral)
A PyTorch reimplementation for paper Generative Image Inpainting with Contextual Attention (https://arxiv.org/abs/1801.07892)
This repository implements an encoder-decoder model with attention for OCR
PyTorch implementation of batched bi-RNN encoder and attention-decoder.
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Bidirectional LSTM network for speech emotion recognition.
Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras
A neural network that generates captions for an image using a CNN and an RNN with beam search (a minimal beam-search sketch appears after this list).
Code for "Relation Classification via Multi-Level Attention CNNs"
Code & data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases"
Soft attention mechanism for video caption generation
LARNN: a recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop over the cell state, just like any other RNN (a minimal sketch appears after this list).
A Keras-based library for analysis of time series data using deep learning algorithms.
Seq2SeqSharp is a tensor-based, fast and flexible encoder-decoder deep neural network framework written in .NET (C#). Its highlights include automatic differentiation, many encoder/decoder types (Transformer, LSTM, BiLSTM, and so on), and multi-GPU support.
Image Captioning based on Bottom-Up and Top-Down Attention model
This repository contains various attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, in PyTorch, TensorFlow, and Keras (a minimal additive-attention sketch appears after this list)
PyTorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Deep Visual Attention Prediction (TIP18)
The implementation of "Gated Attentive-Autoencoder for Content-Aware Recommendation"
The implementation of "Point-of-Interest Recommendation: Exploiting Self-Attentive Autoencoders with Neighbor-Aware Influence"
A chatbot in Russian with speech recognition using PocketSphinx and speech synthesis using RHVoice. It uses an AttentionSeq2Seq model, implemented in Python 3 with TensorFlow and Keras.
Official PyTorch implementation of our paper "Video Person Re-ID: Fantastic Techniques and Where to Find Them"
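
As promised above, here is a minimal sketch of additive (Bahdanau-style) attention, the scoring scheme collected in the multi-framework attention repository; the class, its layer names, and the tensor shapes are illustrative assumptions, not code from any listed repository:

```python
# Minimal additive (Bahdanau-style) attention sketch in PyTorch.
# All names (AdditiveAttention, w_query, w_key, v) are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    def __init__(self, query_dim: int, key_dim: int, hidden_dim: int):
        super().__init__()
        self.w_query = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_key = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        # score_i = v^T tanh(W_q q + W_k k_i)
        scores = self.v(torch.tanh(self.w_query(query).unsqueeze(1) + self.w_key(keys)))
        weights = F.softmax(scores.squeeze(-1), dim=-1)   # (batch, seq_len)
        context = torch.bmm(weights.unsqueeze(1), keys)   # weighted sum of keys
        return context.squeeze(1), weights
```

Soft attention for video caption generation (also listed above) applies the same weighted-sum idea over per-frame features.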
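For the Structured Self-attentive Sentence Embedding entry, the paper defines the annotation matrix A = softmax(W_s2 tanh(W_s1 H^T)) and the embedding M = A H. Below is a minimal PyTorch sketch of those formulas; variable names are mine, not the repository's:

```python
import torch
import torch.nn as nn

class StructuredSelfAttention(nn.Module):
    # A = softmax(W_s2 tanh(W_s1 H^T)); M = A H (Lin et al., 2017).
    def __init__(self, hidden_dim: int, attn_dim: int, num_hops: int):
        super().__init__()
        self.w_s1 = nn.Linear(hidden_dim, attn_dim, bias=False)
        self.w_s2 = nn.Linear(attn_dim, num_hops, bias=False)

    def forward(self, h):
        # h: (batch, seq_len, hidden_dim), the BiLSTM outputs H
        a = torch.softmax(self.w_s2(torch.tanh(self.w_s1(h))), dim=1)  # attend over seq_len
        m = torch.bmm(a.transpose(1, 2), h)  # (batch, num_hops, hidden_dim)
        return m, a
```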
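The CNN-RNN captioner above decodes with beam search; the sketch below shows the generic procedure, where step_fn is an assumed one-step decoder interface, not the repository's actual API:

```python
def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=20):
    """Keep the beam_width most probable partial captions at each step.
    step_fn(tokens) -> list of (next_token, log_prob) candidates; this
    interface is an assumption standing in for one RNN decoder step."""
    beams = [(0.0, [start_token])]  # (cumulative log-prob, token sequence)
    for _ in range(max_len):
        candidates = []
        for score, tokens in beams:
            if tokens[-1] == end_token:           # finished captions pass through
                candidates.append((score, tokens))
                continue
            for token, log_prob in step_fn(tokens):
                candidates.append((score + log_prob, tokens + [token]))
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
        if all(tokens[-1] == end_token for _, tokens in beams):
            break
    return beams[0][1]  # best-scoring caption
```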
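And the LARNN sketch referenced above: an LSTM cell whose fresh cell state attends over a sliding window of its own past cell states with multi-head attention. How the attended context is folded back into the cell state (a residual add here) is an assumption, and all names are illustrative:

```python
import torch
import torch.nn as nn

class WindowedCellAttention(nn.Module):
    def __init__(self, hidden_dim: int, num_heads: int = 4, window: int = 16):
        super().__init__()
        # hidden_dim must be divisible by num_heads for multi-head attention
        self.cell = nn.LSTMCell(hidden_dim, hidden_dim)
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.window = window

    def forward(self, x, state, past_cells):
        # x: (batch, hidden_dim); state: (h, c); past_cells: list of (batch, hidden_dim)
        h, c = self.cell(x, state)
        past_cells = (past_cells + [c])[-self.window:]   # sliding window of cell states
        memory = torch.stack(past_cells, dim=1)          # (batch, window, hidden_dim)
        context, _ = self.attn(h.unsqueeze(1), memory, memory)  # query the window with h
        c = c + context.squeeze(1)                       # fold context into the cell state
        return h, c, past_cells
```

As the entry says, the cell is used inside an ordinary loop over time steps, carrying (h, c) and the cell-state window along.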