lgstd's repositories

adaptive-transformers-in-rl

Adaptive Attention Span for Reinforcement Learning

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

adaptive_transformer

Code for the paper "Adaptive Transformers for Learning Multimodal Representations" (ACL SRW 2020)

Language: Jupyter Notebook · Stargazers: 0 · Issues: 1 · Issues: 0

AST

Adversarial Sparse Transformer for Time Series Forecasting

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

bert_seq2seq

A PyTorch implementation of BERT for seq2seq tasks using the UniLM scheme; now also supports NLU tasks such as text classification and sentiment analysis.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

ContinuousParetoMTL

[ICML 2020] PyTorch Code for "Efficient Continuous Pareto Exploration in Multi-Task Learning"

Language: Jupyter Notebook · Stargazers: 0 · Issues: 1 · Issues: 0

DeBERTa

The implementation of DeBERTa

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

Deformable-DETR

Deformable DETR: Deformable Transformers for End-to-End Object Detection.

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

DLinear

Official implementation of "Are Transformers Effective for Time Series Forecasting?"

Language: Python · Stargazers: 0 · Issues: 0 · Issues: 0

fastai1

v1 of the fastai library. v2 is the current version. v1 is still supported for bug fixes, but will not receive new features.

Language: Jupyter Notebook · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

Highway-Transformer

[ACL 2020] Highway Transformer demo code.

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

Informer2020

The GitHub repository for "Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting" (AAAI 2021).

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

NeZha_Chinese_PyTorch

NEZHA: Neural Contextualized Representation for Chinese Language Understanding

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

Nonstationary_Transformers

Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022), https://arxiv.org/abs/2205.14415

Language: Python · License: MIT · Stargazers: 0 · Issues: 0 · Issues: 0

Optimus

Optimus: the first large-scale pre-trained VAE language model

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

pytorch-softdtw-cuda

Fast CUDA implementation of (differentiable) soft dynamic time warping for PyTorch using Numba

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0
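The dynamic-programming recursion behind soft-DTW (Cuturi & Blondel, 2017), which this repository accelerates with Numba/CUDA, can be sketched in plain Python. This is a minimal reference version for intuition only, not the repository's API; function names here are illustrative:

```python
import math

def softmin(values, gamma):
    """Smoothed minimum: -gamma * log(sum(exp(-v / gamma))), computed stably."""
    z = [-v / gamma for v in values]
    m = max(z)
    return -gamma * (m + math.log(sum(math.exp(v - m) for v in z)))

def soft_dtw(x, y, gamma=1.0):
    """Soft-DTW between 1-D sequences x and y with squared-error cost.

    R[i][j] holds the soft-minimal alignment cost of the prefixes
    x[:i], y[:j]; as gamma -> 0 this recovers classic DTW.
    """
    n, m = len(x), len(y)
    inf = float("inf")
    R = [[inf] * (m + 1) for _ in range(n + 1)]
    R[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            R[i][j] = cost + softmin(
                (R[i - 1][j], R[i][j - 1], R[i - 1][j - 1]), gamma)
    return R[n][m]
```

Because every step uses a smoothed minimum rather than a hard one, the result is differentiable in the inputs, which is what makes it usable as a training loss; the repository's contribution is running this recursion in parallel on GPU.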

pytorchic-bert

PyTorch implementation of Google BERT

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0

recurrent-transformer

[ACL 2020] PyTorch code for MART: Memory-Augmented Recurrent Transformer for Coherent Video Paragraph Captioning

Language: Jupyter Notebook · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

reformer-pytorch

Reformer, the efficient Transformer, in PyTorch

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

RoFormer_pytorch

PyTorch implementation of RoFormer, the rotary-position-embedding Transformer.

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

segatron_aaai

Code and pre-trained models for the paper "Segatron: Segment-aware Transformer for Language Modeling and Understanding"

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

Tianchi2020ChineseMedicineQuestionGeneration

2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge

Language: Python · Stargazers: 0 · Issues: 1 · Issues: 0

transformer

Neutron: A PyTorch-based implementation of the Transformer and its variants.

Language: Python · License: GPL-3.0 · Stargazers: 0 · Issues: 1 · Issues: 0

Transformers-RL

An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"

Language: Python · License: MIT · Stargazers: 0 · Issues: 1 · Issues: 0

trax

Trax — your path to advanced deep learning

Language: Python · License: Apache-2.0 · Stargazers: 0 · Issues: 1 · Issues: 0