
Time Series Foundation Model

TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting

The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)".

TEMPO is one of the first open-source time series foundation models for forecasting (v1.0).

Please try our foundation model demo [here].

Build the environment

conda create -n tempo python=3.8
conda activate tempo
pip install -r requirements.txt
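
To verify the environment, you can check that PyTorch imports correctly (this assumes requirements.txt includes torch, which the model depends on):

python -c "import torch; print(torch.__version__)"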

Get Data

Download the data from [Google Drive] or [Baidu Drive], and place the downloaded data in the folder ./dataset. You can also download the STL results from [Google Drive], and place the downloaded data in the folder ./stl.
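
If you would rather compute the STL decomposition yourself instead of downloading the precomputed results, the sketch below shows one way to do it with statsmodels; the file name, target column, and seasonal period are illustrative assumptions, not the repository's exact preprocessing:

# Sketch: STL decomposition of one series. Assumptions: ETTh1.csv layout,
# "OT" target column, hourly data with a daily seasonal period of 24.
import pandas as pd
from statsmodels.tsa.seasonal import STL

df = pd.read_csv("./dataset/ETTh1.csv")
result = STL(df["OT"], period=24).fit()
trend, seasonal, resid = result.trend, result.seasonal, result.resid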

Run TEMPO

Training Stage

bash [ecl, etth1, etth2, ettm1, ettm2, traffic, weather].sh
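
For example, to launch training on ETTh1, pick the matching script from the list above:

bash etth1.sh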

Test

After training, we can test the TEMPO model under the zero-shot setting:

bash [ecl, etth1, etth2, ettm1, ettm2, traffic, weather]_test.sh
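
For example, zero-shot evaluation with the ETTh1 configuration:

bash etth1_test.sh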

Pre-trained Models

You can download the pre-trained model from [Google Drive] and then run the test script to try it out.
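
Before wiring a downloaded checkpoint into the test scripts, you can inspect it as an ordinary PyTorch object; the file name below is a placeholder for whatever you downloaded:

# Sketch: inspect a downloaded checkpoint (the file name is a placeholder;
# the test scripts handle the actual model construction and loading).
import torch

state_dict = torch.load("./checkpoints/TEMPO.pth", map_location="cpu")
print(list(state_dict.keys())[:5])  # peek at the first few parameter names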

TETS dataset

Here are the prompts used to generate the corresponding textual information for the time series via the [OPENAI ChatGPT-3.5 API]:
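
As a rough illustration of how such prompts can be issued programmatically, here is a minimal sketch using the legacy openai Python client (v0.x); the prompt text is a placeholder, not the actual TETS prompt:

# Sketch: query gpt-3.5-turbo with the legacy openai client (v0.x API).
# The prompt below is a placeholder, not the actual TETS prompt.
import openai

openai.api_key = "YOUR_API_KEY"
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize this quarter's news for the company."}],
)
print(response["choices"][0]["message"]["content"])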

The time series data come from [S&P 500]. Here is the EBITDA case for one company from the dataset:

Example of the generated contextual information for the company marked above:

You can download the processed data with GPT-2 text embeddings from [TETS].
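
If you want to reproduce text embeddings like these yourself, a minimal sketch with Hugging Face transformers is below; mean-pooling over the last hidden state is an assumption, not necessarily how the released TETS embeddings were produced:

# Sketch: embed a text snippet with GPT-2 via Hugging Face transformers.
# Mean-pooling is an assumption; the released embeddings may differ.
import torch
from transformers import GPT2Tokenizer, GPT2Model

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2Model.from_pretrained("gpt2")
inputs = tokenizer("Quarterly EBITDA rose on strong demand.", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
embedding = hidden.mean(dim=1)                  # (1, 768) pooled text embedding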

Feel free to contact DefuCao@USC.EDU / YanLiu.CS@USC.EDU if you're interested in applying TEMPO to your real-world applications.

Cite

@inproceedings{cao2024tempo,
  title={{TEMPO}: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting},
  author={Defu Cao and Furong Jia and Sercan O Arik and Tomas Pfister and Yixiang Zheng and Wen Ye and Yan Liu},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=YH5w12OUuU}
}

@article{Jia_Wang_Zheng_Cao_Liu_2024,
  title={GPT4MTS: Prompt-based Large Language Model for Multimodal Time-series Forecasting},
  author={Jia, Furong and Wang, Kevin and Zheng, Yixiang and Cao, Defu and Liu, Yan},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={38},
  number={21},
  pages={23343--23351},
  year={2024},
  month={Mar.},
  url={https://ojs.aaai.org/index.php/AAAI/article/view/30383},
  DOI={10.1609/aaai.v38i21.30383}
}
