LyapunovStability's repositories
Awesome-Diffusion-Models
A collection of resources and papers on Diffusion Models and Score-based Models, a dark horse in the field of generative models
mvts_transformer
Multivariate Time Series Transformer (public version): A Transformer-Based Framework for Multivariate Time Series Representation Learning
partial-encoder-decoder
An encoder-decoder framework for learning from incomplete data
TNformer-MP
Temporal Neighboring Multi-Modal Transformer with Missingness-Aware Prompt for Hepatocellular Carcinoma Prediction
AirEvaluation
GreenEyes: An Air Quality Level Fitting Model based on WaveNet
Autoformer
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
awesome-domain-adaptation
A collection of AWESOME things about domain adaptation
Awesome-TimeSeries-SpatioTemporal-LM-LLM
A professional list on Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
denoising-diffusion-gan
Tackling the Generative Learning Trilemma with Denoising Diffusion GANs https://arxiv.org/abs/2112.07804
DTW-Pool
Implementation of dynamic temporal pooling (DTP) for time series classification
FiLM
FiLM: Frequency improved Legendre Memory Model for Long-term Time Series Forecasting
LTSF-Linear
This is the official implementation for AAAI-23 paper "Are Transformers Effective for Time Series Forecasting?"
MIMIC_Extract
MIMIC-Extract: A Data Extraction, Preprocessing, and Representation Pipeline for MIMIC-III
N-BEATS
N-BEATS is a neural-network-based model for univariate time series forecasting. N-BEATS is a ServiceNow Research project that was started at Element AI.
PatchTST
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." https://arxiv.org/abs/2211.14730
pytorch-adapt
A PyTorch package for domain adaptation
SCINet
The GitHub repository for the paper "Time Series is a Special Sequence: Forecasting with Sample Convolution and Interaction" (NeurIPS 2022)
tab-transformer-pytorch
Implementation of TabTransformer, an attention network for tabular data, in PyTorch
TARNet
Task-Aware Reconstruction for Time-Series Transformer
torchdiffeq
Differentiable ODE solvers with full GPU support and O(1)-memory backpropagation.
transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, and tutorials.
UDA
ToAlign: Task-oriented Alignment for Unsupervised Domain Adaptation
wandb_tutorial
A tutorial on how to use wandb (Weights & Biases)