Mingyue-Cheng / Awesome-FM4TS

Awesome Self-supervised Foundation Models and Transfer Learning for Time Series Representation

Foundation Model

  1. Das A, Kong W, Sen R, et al. A decoder-only foundation model for time-series forecasting[J]. arXiv preprint arXiv:2310.10688, 2023.

Self-supervised Time Series Representation

  1. Liu J, Chen S. TimesURL: Self-supervised Contrastive Learning for Universal Time Series Representation Learning[J]. arXiv preprint arXiv:2312.15709, 2023.
  2. Cheng M, Liu Q, Liu Z, et al. TimeMAE: Self-Supervised Representations of Time Series with Decoupled Masked Autoencoders[J]. arXiv preprint arXiv:2303.00320, 2023.
  3. Rasul K, Ashok A, Williams A R, et al. Lag-Llama: Towards Foundation Models for Time Series Forecasting[J]. arXiv preprint arXiv:2310.08278, 2023.
  4. Garza A, Mergenthaler-Canseco M. TimeGPT-1[J]. arXiv preprint arXiv:2310.03589, 2023.
  5. Zhang K, Wen Q, Zhang C, et al. Self-Supervised Learning for Time Series Analysis: Taxonomy, Progress, and Prospects[J]. arXiv preprint arXiv:2306.10125, 2023.
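To give a flavor of the masked-modeling family of methods listed above (e.g. TimeMAE), here is a minimal toy sketch of the pretext task: split a series into patches, hide some of them, and score reconstruction only on the hidden patches. The patch length, mask ratio, and the mean-of-visible-patches "predictor" are all illustrative assumptions, not any paper's actual method; a real approach trains an encoder (typically a Transformer) in place of that line.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy univariate series split into non-overlapping patches.
# Patch length and mask ratio are illustrative choices.
series = np.sin(np.linspace(0, 8 * np.pi, 128))
patch_len, mask_ratio = 8, 0.5
patches = series.reshape(-1, patch_len)          # (16, 8)

# Randomly mask a subset of patches; the pretext task is to
# reconstruct the masked patches from the visible ones.
n_masked = int(len(patches) * mask_ratio)
masked_idx = rng.choice(len(patches), n_masked, replace=False)
visible = np.ones(len(patches), dtype=bool)
visible[masked_idx] = False

# Placeholder "model": predict every masked patch as the mean of the
# visible patches. A trained encoder would replace this line.
pred = np.tile(patches[visible].mean(axis=0), (n_masked, 1))

# Reconstruction loss is computed on the masked patches only.
mse = np.mean((pred - patches[masked_idx]) ** 2)
print(f"masked-patch MSE: {mse:.4f}")
```

The key design point shared across these methods is that the loss is restricted to masked positions, which forces the encoder to infer temporal structure rather than copy its input.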

Transfer Learning from Pre-trained Time Series Models

Parameter-Efficient Tuning

Prompt Tuning
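The core idea of prompt tuning is to keep the pre-trained backbone frozen and learn only a small set of "soft prompt" vectors prepended to the input sequence. The sketch below illustrates this with a stand-in linear backbone; the backbone, all dimensions, and the prompt initialization scale are placeholder assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen backbone: a fixed linear map over embeddings.
# In practice this would be a pre-trained Transformer kept frozen.
d_model, n_prompt, seq_len = 16, 4, 32
W_frozen = rng.standard_normal((d_model, d_model))

def frozen_backbone(x):
    """Stand-in for a frozen pre-trained encoder (weights never updated)."""
    return x @ W_frozen

# Only the soft prompt is trainable: a small matrix of "virtual tokens"
# prepended to every input sequence.
prompt = 0.01 * rng.standard_normal((n_prompt, d_model))  # trainable

x = rng.standard_normal((seq_len, d_model))               # input embeddings
h = frozen_backbone(np.concatenate([prompt, x], axis=0))  # (36, 16)
print(h.shape)
```

In this toy setup only the prompt's 4 × 16 = 64 parameters would receive gradients, versus the 16 × 16 frozen backbone weights, which is what makes the approach parameter-efficient.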
