Repositories under the transformer-encoder topic:
[NeurIPS'2021] "TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up", Yifan Jiang, Shiyu Chang, Zhangyang Wang
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Implementation of the Transformer model in TensorFlow
[IGARSS'22]: A Transformer-Based Siamese Network for Change Detection
Seq2SeqSharp is a tensor-based, fast and flexible deep neural network framework written in .NET (C#). It has many notable features, such as automatic differentiation, multiple network types (Transformer, LSTM, BiLSTM, and so on), multi-GPU support, cross-platform operation (Windows, Linux, x86, x64, ARM), a multimodal model for text and images, and more.
Multi-module Recurrent Convolutional Neural Network with Transformer Encoder for ECG Arrhythmia Classification
This repository contains PyTorch implementations of the models from the paper "MIME: MIMicking Emotions for Empathetic Response Generation".
This repo contains a heart disease classification project using Transformer encoders in PyTorch.
This project aims to implement Transformer encoder blocks with various positional encoding methods (a minimal sinusoidal-encoding sketch appears after this list).
Temporarily removes unused tokens during training to save RAM and speed up training.
Generating English Rock lyrics using BERT
Question knowledge-point prediction and annotation.
Official PyTorch implementation of "Roles and Utilization of Attention Heads in Transformer-based Neural Language Models" (ACL 2020)
Transformer Encoder with Multiscale Deep Learning for Pain Classification Using Physiological Signals
Vision Transformer Implementation in TensorFlow
Code for the ACL 2019 paper "Observing Dialogue in Therapy: Categorizing and Forecasting Behavioral Codes"
Contextual embedding for text blobs.
PyTorch implementation of "RealFormer: Transformer Likes Residual Attention" (the residual-attention idea is sketched after this list)
Transformer OCR is an Optical Character Recognition toolkit built for researchers working on OCR for both Vietnamese and English. This project focuses only on variants of the vanilla Transformer (Conformer) and CNN-based feature extraction.
:sparkles: Solve the multi-dimensional multiple knapsack problem using state-of-the-art reinforcement learning algorithms and Transformers
Contains PyTorch implementations of the Transformer and EvolvedTransformer architectures. WIP
Repository for a transformer I coded from scratch and trained on the tiny-shakespeare dataset.
Detection of MBTI personality types with NLP and deep learning
A Transformer implementation that is easy to understand and customize.
PyTorch implementation of a text classification algorithm based on a Transformer encoder with attention (a minimal classifier sketch appears after this list)
French-English machine translation. Natural language processing (NLP) Transformer model from "Attention Is All You Need"
Co-Driven Recognition of Semantic Consistency via the Fusion of Transformer and HowNet Sememes Knowledge
Topic modeling using a combination of BERT and LDA.
A deep learning classification tool for anomalous diffusion trajectories.
Semantic textual similarity between two documents (a minimal cosine-similarity sketch appears after this list)
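
The sketches below illustrate a few of the techniques referenced in the list above. They are minimal, hedged examples: module names, hyperparameters, and architectures are illustrative assumptions, not code taken from any of the listed repositories.

First, a sinusoidal positional encoding added to token embeddings before a stack of standard PyTorch Transformer encoder layers, as referenced by the positional-encoding entry:

# Minimal sketch: sinusoidal positional encoding + nn.TransformerEncoder.
# Vocabulary size, model dimension, and layer counts are illustrative.
import math
import torch
import torch.nn as nn

class SinusoidalPositionalEncoding(nn.Module):
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        position = torch.arange(max_len).unsqueeze(1)                 # (max_len, 1)
        div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)                  # even dimensions
        pe[:, 1::2] = torch.cos(position * div_term)                  # odd dimensions
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); add the matching slice of encodings
        return x + self.pe[: x.size(1)]

d_model = 128
embed = nn.Embedding(10000, d_model)
pos_enc = SinusoidalPositionalEncoding(d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2
)

tokens = torch.randint(0, 10000, (8, 32))      # (batch=8, seq_len=32)
hidden = encoder(pos_enc(embed(tokens)))       # (8, 32, 128)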
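
The core idea behind RealFormer's residual attention, assuming a single head and no masking for brevity, is to add the previous layer's pre-softmax attention scores to the current layer's scores:

# Minimal sketch of residual attention; shapes and names are illustrative,
# not the RealFormer repository's actual API.
import math
import torch
import torch.nn as nn

class ResidualAttention(nn.Module):
    """Self-attention that adds the previous layer's pre-softmax scores."""
    def __init__(self, d_model: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.scale = math.sqrt(d_model)

    def forward(self, x, prev_scores=None):
        # Raw attention logits for this layer: (batch, seq, seq)
        scores = self.q(x) @ self.k(x).transpose(-2, -1) / self.scale
        if prev_scores is not None:
            scores = scores + prev_scores      # the "residual attention" link
        out = torch.softmax(scores, dim=-1) @ self.v(x)
        return out, scores                     # pass scores on to the next layer

x = torch.randn(2, 16, 64)
layer1, layer2 = ResidualAttention(64), ResidualAttention(64)
h1, s1 = layer1(x)
h2, s2 = layer2(h1, prev_scores=s1)            # layer 2 reuses layer 1's scores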
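
A Transformer-encoder text classifier with attention pooling, matching the shape of the text classification entry above; the two-class output and dimensions are assumptions:

# Minimal sketch: Transformer encoder + attention pooling for classification.
import torch
import torch.nn as nn

class EncoderClassifier(nn.Module):
    def __init__(self, vocab_size=10000, d_model=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.attn_score = nn.Linear(d_model, 1)   # attention-pooling weights
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, tokens):                    # tokens: (batch, seq_len)
        h = self.encoder(self.embed(tokens))      # (batch, seq_len, d_model)
        weights = torch.softmax(self.attn_score(h), dim=1)   # (batch, seq_len, 1)
        pooled = (weights * h).sum(dim=1)         # weighted sum over tokens
        return self.classifier(pooled)            # (batch, num_classes)

logits = EncoderClassifier()(torch.randint(0, 10000, (4, 32)))   # (4, 2)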
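
Finally, a bare-bones way to score semantic textual similarity between two documents: encode each one, mean-pool the hidden states, and compare with cosine similarity. A pretrained encoder such as BERT would normally replace the toy encoder used here:

# Minimal sketch: document similarity via mean pooling + cosine similarity.
# The untrained encoder and random token ids are placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

d_model = 128
embed = nn.Embedding(10000, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True), num_layers=2
)

def document_vector(token_ids: torch.Tensor) -> torch.Tensor:
    """Encode one document (1, seq_len) and mean-pool it to a single vector."""
    hidden = encoder(embed(token_ids))            # (1, seq_len, d_model)
    return hidden.mean(dim=1)                     # (1, d_model)

doc_a = torch.randint(0, 10000, (1, 40))          # placeholder token ids, doc A
doc_b = torch.randint(0, 10000, (1, 55))          # placeholder token ids, doc B
similarity = F.cosine_similarity(document_vector(doc_a), document_vector(doc_b))
print(similarity.item())                          # value in [-1, 1]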