Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah


iTransformer

This repo is the official implementation for the paper: iTransformer: Inverted Transformers Are Effective for Time Series Forecasting. [Slides], [Poster].

Updates

🚩 News (2024.03) An introduction to our work in Chinese is available.

🚩 News (2024.02) iTransformer has been accepted as ICLR 2024 Spotlight.

🚩 News (2023.12) iTransformer is available in GluonTS with a probabilistic emission head and support for static covariates.

🚩 News (2023.12) We have received many valuable suggestions; a revised version (24 pages) is now available.

🚩 News (2023.10) iTransformer has been included in [Time-Series-Library] and achieves state-of-the-art in Lookback-$96$ forecasting.

🚩 News (2023.10) All the scripts for the experiments in our paper are available.

Introduction

🌟 Considering the characteristics of multivariate time series, iTransformer breaks the conventional structure without modifying any Transformer modules. Inverted Transformer is all you need in MTSF.

πŸ† iTransformer achieves the comprehensive state-of-the-art in challenging multivariate forecasting tasks and solves several pain points of Transformer on extensive time series data.

Overall Architecture

iTransformer regards independent time series as variate tokens, capturing multivariate correlations by attention and utilizing layer normalization and feed-forward networks to learn series representations.

The pseudo-code of iTransformer is as simple as the following:
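The original README renders the pseudo-code as an image. As a stand-in, here is a shape-level NumPy sketch of the inverted workflow (hypothetical function and weight names, not the official PyTorch implementation): each variate's whole lookback series is embedded as one token, attention runs over variate tokens, and a projection maps back to the prediction horizon.

```python
import numpy as np

def itransformer_forward(x, d_model=64, horizon=96, rng=None):
    """Shape-level sketch of iTransformer (illustrative, not the official code).

    x: (B, T, N) multivariate series with lookback length T and N variates.
    Each variate's whole series becomes one token, so attention runs over
    the N variate tokens instead of the T time steps.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    B, T, N = x.shape

    # 1. Invert: embed each variate's full series as one token -> (B, N, d_model)
    W_embed = rng.standard_normal((T, d_model)) / np.sqrt(T)
    tokens = x.transpose(0, 2, 1) @ W_embed

    # 2. Self-attention over variate tokens captures multivariate correlations
    scores = tokens @ tokens.transpose(0, 2, 1) / np.sqrt(d_model)  # (B, N, N)
    attn = np.exp(scores - scores.max(-1, keepdims=True))
    attn /= attn.sum(-1, keepdims=True)
    tokens = attn @ tokens

    # 3. Project each variate token to the prediction horizon S -> (B, S, N)
    W_proj = rng.standard_normal((d_model, horizon)) / np.sqrt(d_model)
    return (tokens @ W_proj).transpose(0, 2, 1)

y = itransformer_forward(np.random.randn(2, 96, 7))
print(y.shape)  # (2, 96, 7)
```

Layer normalization and the feed-forward network of the real model are omitted here; only the token inversion and the attention over variates are shown.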

Usage

  1. Install PyTorch and the necessary dependencies.

     pip install -r requirements.txt

  2. The datasets can be obtained from Google Drive or Tsinghua Cloud.

  3. Train and evaluate the model. We provide scripts for all of the above tasks under the folder ./scripts/. You can reproduce the results as in the following examples:

# Multivariate forecasting with iTransformer
bash ./scripts/multivariate_forecasting/Traffic/iTransformer.sh

# Compare the performance of Transformer and iTransformer
bash ./scripts/boost_performance/Weather/iTransformer.sh

# Train the model with partial variates, and generalize on the unseen variates
bash ./scripts/variate_generalization/ECL/iTransformer.sh

# Test the performance on the enlarged lookback window
bash ./scripts/increasing_lookback/Traffic/iTransformer.sh

# Utilize FlashAttention for acceleration
bash ./scripts/efficient_attentions/iFlashTransformer.sh

Main Result of Multivariate Forecasting

We evaluate iTransformer on challenging multivariate forecasting benchmarks (generally with hundreds of variates). It achieves comprehensively strong performance (MSE/MAE $\downarrow$).

Challenging Multivariate Time Series Forecasting Benchmarks (Avg Results)

Online Transaction Load Prediction of Alipay Trading Platform (Avg Results)

General Performance Boosting on Transformers

By introducing the proposed framework, Transformer and its variants achieve significant performance improvements, demonstrating the generality of the iTransformer approach and its ability to benefit from efficient attention mechanisms.

Zero-shot Generalization on Variates

Technically, iTransformer can forecast with an arbitrary number of variates. We train iTransformers on a subset of the variates, and they forecast unseen variates with good generalizability.
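The reason this works can be seen from the parameter shapes alone. In the following sketch (hypothetical names, not the official code), the weights depend only on the lookback length and the model width, never on the number of variates, so a model fitted on one variate count runs unchanged on another:

```python
import numpy as np

# Because each variate is embedded independently as a token, the weight
# shapes involve only the lookback T, the width d_model, and the horizon S.
T, d_model, S = 96, 64, 96
rng = np.random.default_rng(0)
W_embed = rng.standard_normal((T, d_model))   # fixed once at training time
W_proj = rng.standard_normal((d_model, S))    # independent of variate count N

def forecast(x):                               # x: (B, T, N)
    tokens = x.transpose(0, 2, 1) @ W_embed    # (B, N, d_model)
    return (tokens @ W_proj).transpose(0, 2, 1)  # (B, S, N)

print(forecast(np.zeros((1, T, 20))).shape)   # (1, 96, 20)  trained variate count
print(forecast(np.zeros((1, T, 321))).shape)  # (1, 96, 321) unseen variate count
```

The same property holds with attention layers in between, since attention is likewise agnostic to the number of tokens.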

Better Utilization of Lookback Windows

While previous Transformers do not benefit from enlarged lookback windows, iTransformers show a surprising improvement as the lookback window lengthens.

Model Analysis

Benefiting from inverted Transformer modules:

  • (Left) Inverted Transformers learn better time series representations (more similar CKA) favored by forecasting.
  • (Right) The inverted self-attention module learns interpretable multivariate correlations.

  • Visualization of the variates from Market and the learned multivariate correlations. Each variate represents the monitored interface values of an application, and the applications can be further grouped into refined categories.

Model Ablations

iTransformer, which applies attention on the variate dimension and the feed-forward network on the temporal dimension, generally achieves the best performance. The vanilla Transformer (the third row) performs the worst among these designs, indicating a mismatch of responsibilities when the conventional architecture is adopted.

Model Efficiency

We propose a training strategy for high-dimensional time series: only a sampled ratio of the variates of each batch is trained on. While the performance (Left) remains stable across sampled ratios, the memory footprint (Right) of the training process is reduced significantly.
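A minimal sketch of that sampling step (hypothetical helper, not the official code): each training batch keeps only a random subset of variate tokens, which shrinks the N×N attention map and its memory cost, while all variates are still visited across batches.

```python
import numpy as np

def sample_variates(batch, ratio=0.5, rng=None):
    """Keep a random subset of variates for one training step.

    batch: (B, T, N). Returns the reduced batch and the kept indices, so the
    loss can be computed against the matching target variates.
    """
    rng = np.random.default_rng() if rng is None else rng
    B, T, N = batch.shape
    keep = rng.choice(N, size=max(1, int(N * ratio)), replace=False)
    return batch[:, :, keep], keep

batch = np.random.randn(32, 96, 862)            # e.g. Traffic has 862 variates
sub, keep = sample_variates(batch, ratio=0.25)
print(sub.shape)  # (32, 96, 215)
```

Because the model's weights are independent of the variate count (variates are tokens), the fully trained model can still be applied to all N variates at inference time.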

Citation

If you find this repo helpful, please cite our paper.

@article{liu2023itransformer,
  title={iTransformer: Inverted Transformers Are Effective for Time Series Forecasting},
  author={Liu, Yong and Hu, Tengge and Zhang, Haoran and Wu, Haixu and Wang, Shiyu and Ma, Lintao and Long, Mingsheng},
  journal={arXiv preprint arXiv:2310.06625},
  year={2023}
}

Acknowledgement

We greatly appreciate the following GitHub repos for their valuable code and effort.

This work was supported by Ant Group through the CCF-Ant Research Fund.

Contact

If you have any questions or want to use the code, feel free to contact:


License: MIT License

