
LLM4TS: Large Language Models for Time Series

This project collects papers and code on Large Language Models (LLMs) and Foundation Models (FMs) for Time Series (TS). We hope it helps you understand how LLMs and FMs are applied to TS tasks.

🦙 LLMs for Time Series

After the success of BERT, GPT, and other LLMs in NLP, researchers have proposed applying LLMs to Time Series (TS) tasks. These methods fine-tune pretrained LLMs on TS datasets and achieve state-of-the-art results. A minimal sketch of this recipe follows the paper list below.

  • PromptCast: A New Prompt-based Learning Paradigm for Time Series Forecasting, in arXiv 2022. [Paper]

  • One Fits All: Power General Time Series Analysis by Pretrained LM, in arXiv 2023. [Paper]

  • Temporal Data Meets LLM -- Explainable Financial Time Series Forecasting, in arXiv 2023. [Paper]

  • TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series. [Paper]

  • LLM4TS: Two-Stage Fine-Tuning for Time-Series Forecasting with Pre-Trained LLMs. [Paper]

  • The first step is the hardest: Pitfalls of Representing and Tokenizing Temporal Data for Large Language Models. [Paper]

  • Large Language Models Are Zero-Shot Time Series Forecasters. [Paper]

  • TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting. [Paper]

  • Time-LLM: Time Series Forecasting by Reprogramming Large Language Models. [Paper]

  • S2IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting. [Paper]
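
A common way to implement the recipe above is to freeze most of a pretrained LM and train only lightweight input/output layers on TS data. Below is a minimal sketch in that spirit (loosely following One Fits All / LLM4TS); the patch sizes, freezing policy, and class name are illustrative assumptions, not any paper's reference code.

```python
# Minimal sketch: a frozen GPT-2 backbone for TS forecasting.
# Illustrative only -- patching, freezing policy, and sizes are
# assumptions in the spirit of One Fits All / LLM4TS, not the
# authors' code. Input normalization (e.g. RevIN) is omitted.
import torch
import torch.nn as nn
from transformers import GPT2Model

class GPT2Forecaster(nn.Module):
    def __init__(self, patch_len=16, n_patches=24, horizon=96):
        super().__init__()
        self.patch_len = patch_len
        self.gpt2 = GPT2Model.from_pretrained("gpt2")  # d_model = 768
        # Freeze everything except the layer norms, which stay trainable.
        for name, p in self.gpt2.named_parameters():
            p.requires_grad = "ln" in name
        d = self.gpt2.config.n_embd
        self.embed = nn.Linear(patch_len, d)   # patch -> token embedding
        self.head = nn.Linear(n_patches * d, horizon)

    def forward(self, x):
        # x: (batch, lookback) with lookback = patch_len * n_patches
        patches = x.unfold(-1, self.patch_len, self.patch_len)
        tokens = self.embed(patches)           # (B, n_patches, 768)
        hidden = self.gpt2(inputs_embeds=tokens).last_hidden_state
        return self.head(hidden.flatten(1))    # (B, horizon)

model = GPT2Forecaster()
y_hat = model(torch.randn(8, 16 * 24))  # forecast shape: (8, 96)
```

Only `embed`, `head`, and the layer norms are updated during fine-tuning, so the pretrained sequence-modeling ability of GPT-2 is reused almost unchanged.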

📍 Survey

  • Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook. [Survey]

  • Position Paper: What Can Large Language Models Tell Us about Time Series Analysis. [Survey]

  • Foundation Models for Time Series Analysis: A Tutorial and Survey. [Survey]

📍 Similar Things

  • Large Language Models are Few-Shot Health Learners, in arXiv 2023. [Paper]

  • Frozen Language Model Helps ECG Zero-Shot Learning, in arXiv 2023. [Paper]

🧱 Foundation Models for Time Series

Recently, several Foundation Models (FMs) for Time Series (TS) have been proposed. These FMs aim to learn time-series representations from large datasets and then transfer them to downstream tasks. Unlike TS-LLMs, these methods do not depend on pretrained LLMs. A zero-shot usage sketch follows the paper list below.

  • Tiny Time Mixers (TTMs): Fast Pretrained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series. [Paper]

  • A decoder-only foundation model for time-series forecasting. [Paper]

  • TimeGPT-1. [Paper]

  • Lag-Llama: Towards Foundation Models for Time Series Forecasting. [Paper]

  • Unified Training of Universal Time Series Forecasting Transformers. [Paper]

  • MOMENT: A Family of Open Time-series Foundation Models. [Paper]

  • Chronos: Learning the Language of Time Series. [Paper] [GitHub]
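
Because these FMs are pretrained on large TS corpora, they can forecast new series zero-shot, with no gradient updates. The sketch below uses the chronos-forecasting package from the Chronos GitHub link above; the checkpoint name and calls follow its README, but treat them as assumptions and check the repo for the current API.

```python
# Zero-shot forecasting with a pretrained TS foundation model.
# Sketch based on the chronos-forecasting README; checkpoint name
# and API are assumptions and may change between releases.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",   # pretrained checkpoint on Hugging Face
    device_map="cpu",
    torch_dtype=torch.float32,
)

context = torch.randn(512).cumsum(0)  # any 1-D history works zero-shot
forecast = pipeline.predict(context, prediction_length=24)
# forecast: (n_series=1, num_samples, 24) sample paths
low, median, high = forecast[0].quantile(
    torch.tensor([0.1, 0.5, 0.9]), dim=0
)
```

The same pretrained checkpoint serves any series without fine-tuning, which is the main practical difference from the TS-LLM recipes above.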

🔗 Related Fields

This section lists related fields. They are not the main focus of this project, but they are helpful for understanding how LLMs are applied to fields other than NLP, and how FMs are developed for specific domains.

📍 Pre-Trained Time Series

  • A Survey on Time-Series Pre-Trained Models, in arXiv 2023. [Paper]

  • Transfer learning for Time Series Forecasting. [GitHub]

  • TST: A Transformer-based Framework for Multivariate Time Series Representation Learning. [Paper]

  • Ti-MAE: Self-Supervised Masked Time Series Autoencoders. [Paper]

  • SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling. [Paper]

  • CoST: Contrastive Learning of Disentangled Seasonal-Trend Representations for Time Series Forecasting. [Paper]

  • TS2Vec: Towards Universal Representation of Time Series. [Paper]

📍 LLM for Recommendation Systems

  • Recommendation as Language Processing (RLP): A Unified Pretrain, Personalized Prompt & Predict Paradigm (P5), in arXiv 2022. [Paper]
  • LLM4Rec. [GitHub]

📍 LLM/FM for Tabular Data

  • AnyPredict: Foundation Model for Tabular Prediction, in arXiv 2023. [Paper]
  • XTab: Cross-table Pretraining for Tabular Transformers, in ICML 2023. [Paper]

📍 LLM in Production (LLMOps)
