There are 22 repositories under the attention-network topic.
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
TF2 Deep FloorPlan Recognition using a Multi-task Network with Room-boundary-Guided Attention. Enables TensorBoard, quantization, Flask, TFLite, Docker, GitHub Actions, and Google Colab.
This repository contains various attention mechanisms, such as Bahdanau, soft, additive, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
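Not the repository's own code, but a minimal NumPy sketch of the additive (Bahdanau-style) scoring it implements: a score v^T tanh(W_q q + W_k k_t) per encoder state, softmaxed into weights for a context vector. The parameter names `W_q`, `W_k`, and `v` are hypothetical learned weights, assumed here for illustration.

```python
import numpy as np

def additive_attention(query, keys, W_q, W_k, v):
    """Additive (Bahdanau-style) attention sketch.

    query: (d_q,) decoder state; keys: (T, d_k) encoder states.
    W_q: (d_q, h), W_k: (d_k, h), v: (h,) are learned parameters.
    Returns the context vector (weighted sum over keys) and the weights.
    """
    scores = np.tanh(query @ W_q + keys @ W_k) @ v   # (T,) one score per key
    weights = np.exp(scores - scores.max())          # stable softmax
    weights /= weights.sum()
    context = weights @ keys                         # (d_k,) weighted sum
    return context, weights
```

In a full seq2seq model the context vector would be concatenated with the decoder state before predicting the next token.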
PyTorch implementation of "Adaptive Co-attention Network for Named Entity Recognition in Tweets" (AAAI 2018)
[CoRL 2023] Context-Aware Deep Reinforcement Learning for Autonomous Robotic Navigation in an Unknown Area: public code and model
Google Research 3rd YouTube-8M Video Understanding Challenge (2019): temporal localization of topics within videos. Presented at the International Conference on Computer Vision (ICCV) 2019.
Image captioning using a beam search heuristic on top of an encoder-decoder architecture
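A hedged sketch of the beam-search heuristic such a decoder might use: keep the `beam_width` highest-scoring partial captions, extend each, and return the best completed sequence. `step_logprobs` is a hypothetical stand-in for the decoder's next-token log-probability function, not an API from this repo.

```python
import math

def beam_search(step_logprobs, start_token, end_token, beam_width=3, max_len=10):
    """Generic beam-search decoding sketch.

    step_logprobs(seq) -> {token: log_prob} for the next token given a
    partial sequence (stands in for the trained decoder).
    """
    beams = [([start_token], 0.0)]       # (sequence, cumulative log-prob)
    completed = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, lp in step_logprobs(seq).items():
                candidates.append((seq + [tok], score + lp))
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates:
            if seq[-1] == end_token:
                completed.append((seq, score))   # finished hypothesis
            else:
                beams.append((seq, score))       # keep expanding
            if len(beams) == beam_width:
                break
        if not beams:
            break
    return max(completed + beams, key=lambda c: c[1])[0]
```

Unlike greedy decoding, the beam retains lower-scoring prefixes that may lead to a better overall caption.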
Python 3-compatible version of DySAT
This is the official source code of our IEA/AIE 2021 paper
locality-aware invariant Point Attention-based RNA ScorEr
Speech recognition model for recognising spoken Macedonian.
This work proposes an end-to-end tracking framework built around a feature-refinement module. The module strengthens the target's feature representation, helping the network capture salient information to locate the target, while an attention module inside the refinement mechanism improves the network's discriminative power in challenging tracking scenarios.
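The paper's exact attention module is not specified here; as one common pattern for attention-based feature refinement, a squeeze-and-excitation-style channel gate can be sketched as follows (all names and shapes are illustrative assumptions, not the authors' code):

```python
import numpy as np

def channel_attention(feat, W1, W2):
    """Squeeze-and-excitation-style channel gating sketch.

    feat: (C, H, W) feature map. Each channel is re-weighted by a gate
    computed from its globally pooled statistic, emphasizing channels
    that carry salient target information.
    """
    squeeze = feat.mean(axis=(1, 2))               # (C,) global average pool
    hidden = np.maximum(0.0, W1 @ squeeze)         # ReLU bottleneck, (C//r,)
    gates = 1.0 / (1.0 + np.exp(-(W2 @ hidden)))   # sigmoid gates in (0, 1)
    return feat * gates[:, None, None]             # channel-wise re-weighting
```

In a tracker, such gates would be learned so that channels responding to the target are amplified relative to background clutter.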
Gated-ViGAT. Code and data for our paper: N. Gkalelis, D. Daskalakis, V. Mezaris, "Gated-ViGAT: Efficient bottom-up event recognition and explanation using a new frame selection policy and gating mechanism", IEEE International Symposium on Multimedia (ISM), Naples, Italy, Dec. 2022.
Using an attention network to extend image quality assessment algorithms to video quality assessment
Sequence-to-sequence with attention mechanisms in TensorFlow v2
Deep learning model for non-coding regulatory variants
A TensorFlow 2.0 implementation of the Transformer ("Attention Is All You Need")
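The Transformer's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch of that formula (not the repo's TF 2.0 code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention sketch.

    Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values.
    Returns (n_q, d_v): each query's softmax-weighted mix of values.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k)
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # (n_q, d_v)
```

Multi-head attention runs this in parallel over several learned projections of Q, K, and V and concatenates the results.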
Graphs are a general language for describing and analyzing entities with relations/interactions.
An implementation of Transformer Networks using Chainer
A customized version of the Relational Aware Graph Attention Network for large scale EHR records.
An attention network for predicting peptide lengths (and other features) from mass spectrometry data.