There are 34 repositories under the attention-mechanism topic.
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
All kinds of text classification models, and more, with deep learning
A collection of important graph embedding, classification and representation learning papers with implementations.
Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
A TensorFlow implementation of the Transformer from "Attention Is All You Need" (a minimal sketch of its core attention operation follows this list)
Attention mechanism implementation for Keras.
Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC)
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903); a minimal sketch of the GAT attention computation follows this list
Show, Attend and Tell | a PyTorch Tutorial to Image Captioning
Reformer, the efficient Transformer, in PyTorch
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
To eventually become an unofficial PyTorch implementation / replication of AlphaFold2, as details of the architecture get released
Text classifier based on "Hierarchical Attention Networks for Document Classification"
A chatbot for the finance and judicial domains (with some chit-chat capability). Its main modules include information extraction, NLU, NLG, and a knowledge graph; the front end is integrated via Django, and RESTful interfaces for the nlp and kg modules have already been wrapped.
TensorFlow implementation of "Show, Attend and Tell"
An implementation of Performer, a linear attention-based transformer, in PyTorch
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Implementation of Perceiver, General Perception with Iterative Attention, in PyTorch
Visualizing RNNs using the attention mechanism
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
This repository contains my full work and notes on Coursera's NLP Specialization (Natural Language Processing), taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai
Attention mechanism for processing sequential data that considers the context for each timestep.
MORAN: A Multi-Object Rectified Attention Network for Scene Text Recognition
Implementation of Bottleneck Transformer in PyTorch
Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
End-to-end ASR/LM implementation with PyTorch
A PyTorch implementation of "Neural Speech Synthesis with Transformer Network"
A comprehensive paper list on Vision Transformers and attention, including papers, code, and related websites
A Structured Self-attentive Sentence Embedding
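
Nearly every Transformer repository above builds on the same core primitive: scaled dot-product attention from "Attention Is All You Need". As a quick orientation, here is a minimal PyTorch sketch; the function name and tensor shapes are illustrative and not taken from any repository listed here.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, as in Vaswani et al. (2017)."""
    d_k = q.size(-1)
    # Similarity score between every query and every key, scaled by sqrt(d_k)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    if mask is not None:
        # Positions where mask == 0 are excluded from the attention distribution
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)  # one distribution over keys per query
    return weights @ v, weights

# Example: a batch of 2 sequences, 5 tokens each, model dimension 16
q = k = v = torch.randn(2, 5, 16)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)  # torch.Size([2, 5, 16]) torch.Size([2, 5, 5])
```

Multi-head attention simply runs this operation in parallel over several learned projections of Q, K, and V and concatenates the results.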
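For the graph-structured entries (the GAT implementations above), attention is computed over each node's neighborhood rather than over sequence positions. A minimal single-head sketch under the same caveat: the class name and the dense-adjacency formulation are illustrative, and it assumes adj already includes self-loops (otherwise an isolated node's softmax row is undefined).

```python
import torch
import torch.nn as nn

class GATLayer(nn.Module):
    """Single-head graph attention layer (Veličković et al., 2017), dense-adjacency sketch."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)  # shared linear projection W
        self.a = nn.Linear(2 * out_dim, 1, bias=False)   # attention scoring vector a
        self.leaky_relu = nn.LeakyReLU(0.2)              # slope 0.2, as in the paper

    def forward(self, h, adj):
        # h: (N, in_dim) node features; adj: (N, N) binary adjacency with self-loops
        z = self.W(h)                                    # (N, out_dim)
        N = z.size(0)
        # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]) for every node pair
        pairs = torch.cat([z.unsqueeze(1).expand(N, N, -1),
                           z.unsqueeze(0).expand(N, N, -1)], dim=-1)
        e = self.leaky_relu(self.a(pairs)).squeeze(-1)   # (N, N)
        # Restrict attention to each node's neighborhood before normalizing
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)                 # attention coefficients α_ij
        return alpha @ z                                 # h_i' = Σ_j α_ij · Wh_j

layer = GATLayer(8, 16)
h, adj = torch.randn(4, 8), torch.eye(4)  # self-loops only, just for a shape check
print(layer(h, adj).shape)                # torch.Size([4, 16])
```

The listed repositories replace the dense (N, N) pair tensor with sparse edge-indexed computation, which is what makes GAT practical on large graphs.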