
tsai


Description

State-of-the-art Deep Learning library for Time Series and Sequences.

tsai is an open-source deep learning package built on top of Pytorch & fastai, focused on state-of-the-art techniques for time series tasks such as classification, regression, forecasting, and imputation.

tsai is currently under active development by timeseriesAI.

What's new:

March 2022

  • ⚡️ Starting with tsai 0.3.0, you'll get faster installs and imports thanks to a better use of dependencies.
  • New visualization methods: learn.feature_importance() and learn.step_importance() will help you gain better insights into how your models work.
  • New calibration method: learn.calibrate_model() for time series classification tasks (see the sketch after this list).
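
A minimal sketch of how these new methods can be called on a trained classification learner (the dataset, architecture and training settings below are illustrative only, not part of the announcement):

from tsai.all import *
# train a small classifier first (same API as the examples further below)
X, y, splits = get_classification_data('ECG200', split_data=False)
learn = TSClassifier(X, y, splits=splits, arch=InceptionTimePlus, metrics=accuracy)
learn.fit_one_cycle(10, 1e-3)
# new interpretability and calibration methods
learn.feature_importance()   # importance of each input variable
learn.step_importance()      # importance of each time step
learn.calibrate_model()      # calibrate predicted probabilities (classification only)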

November, 2021

  • ✅ Implemented some of the learnings from reviewing Kaggle's latest time series competition (see the Medium blog post for more details), such as:
    • improved RNN initialization (based on a kernel shared by https://www.kaggle.com/junkoda)
    • added the option to pass a feature extractor to RNNPlus & TSiT (Transformer) models.
    • created a MultiConv layer that allows the concatenation of original features with the output of one or multiple convolution layers in parallel.

September, 2021

  • See our new tutorial notebook on how to track your experiments with Weights & Biases.

  • tsai just got easier to use with the new sklearn-like APIs: TSClassifier, TSRegressor, and TSForecaster!! See the examples below for more info.

  • New tutorial notebook on how to train your model with larger-than-memory datasets in less time, achieving up to 100% GPU usage!!

  • tsai now supports more input formats: np.array, np.memmap, zarr, xarray, dask, list, L, ... (a minimal sketch follows).
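
As a minimal sketch of this flexibility (the file names below are hypothetical, and get_splits is tsai's train/valid split utility), an np.memmap can be passed directly wherever an array is expected:

from tsai.all import *
import numpy as np
# hypothetical arrays previously saved to disk
X = np.load('X.npy', mmap_mode='r')   # np.memmap: the data stays on disk
y = np.load('y.npy')
splits = get_splits(y, valid_size=0.2)
clf = TSClassifier(X, y, splits=splits, arch=InceptionTimePlus, metrics=accuracy)
clf.fit_one_cycle(5, 1e-3)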

Previously

  • MINIROCKET, a SOTA time series classification model (now available in Pytorch): you can now check MiniRocket's performance in our new tutorial notebook.

"Using this method, it is possible to train and test a classifier on all of 109 datasets from the UCR archive to state-of-the-art accuracy in less than 10 minutes." A. Dempster et al. (Dec 2020)

  • Multi-class and multi-label time series classification: you can also check our new tutorial notebook.

  • Self-supervised learning: learn how to leverage your unlabeled datasets.

  • New visualization: we've also added a new PredictionDynamics callback that displays the model's predictions during training, for example in a classification task (a minimal usage sketch follows).
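
A minimal usage sketch (illustrative dataset and settings), showing how the callback is passed to a learner:

from tsai.all import *
X, y, splits = get_classification_data('ECG200', split_data=False)
learn = TSClassifier(X, y, splits=splits, arch=InceptionTimePlus, metrics=accuracy,
                     cbs=PredictionDynamics())  # plots the predictions as training progresses
learn.fit_one_cycle(25, 1e-3)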

Installation

You can install the latest stable version from pip using:

pip install tsai

Or you can install the cutting-edge version of this library from GitHub by running:

pip install -Uqq git+https://github.com/timeseriesAI/tsai.git

Once the install is complete, just run:

from tsai.all import *

Note: starting with tsai 0.3.0, tsai will only install hard dependencies. Other soft dependencies (which are only required for selected tasks) will not be installed by default. This is the recommended approach: if you need a dependency that is not installed, tsai will ask you to install it when necessary. If you still want to install tsai with all its dependencies, you can do so by running:

pip install tsai[extras]

Documentation

Here's the link to the documentation.

Available models:

tsai includes many state-of-the-art models. Those used in this README include InceptionTimePlus, TSTPlus, TSiTPlus, RNNPlus, MiniRocket and the ROCKET family, among others! See the documentation for the full list.

How to start using tsai?

To get to know the tsai package, we'd suggest you start with this notebook in Google Colab: 01_Intro_to_Time_Series_Classification. It provides an overview of a time series classification task.

We have also developed many other tutorial notebooks.

To use tsai in your own notebooks, the only thing you need to do after you have installed the package is to run this:

from tsai.all import *

Examples

These are just a few examples of how you can use tsai:

Binary, univariate classification

Training:

from tsai.all import *
X, y, splits = get_classification_data('ECG200', split_data=False)
batch_tfms = TSStandardize()
clf = TSClassifier(X, y, splits=splits, path='models', arch=InceptionTimePlus, batch_tfms=batch_tfms, metrics=accuracy, cbs=ShowGraph())
clf.fit_one_cycle(100, 3e-4)
clf.export("clf.pkl") 

Inference:

from tsai.inference import load_learner
clf = load_learner("models/clf.pkl")
probas, target, preds = clf.get_X_preds(X[splits[0]], y[splits[0]])

Multi-class, multivariate classification

Training:

from tsai.all import *
X, y, splits = get_classification_data('LSST', split_data=False)
batch_tfms = TSStandardize(by_sample=True)
mv_clf = TSClassifier(X, y, splits=splits, path='models', arch=InceptionTimePlus, batch_tfms=batch_tfms, metrics=accuracy, cbs=ShowGraph())
mv_clf.fit_one_cycle(10, 1e-2)
mv_clf.export("mv_clf.pkl")

Inference:

from tsai.inference import load_learner
mv_clf = load_learner("models/mv_clf.pkl")
probas, target, preds = mv_clf.get_X_preds(X[splits[0]], y[splits[0]])

Multivariate Regression

Training:

from tsai.all import *
X, y, splits = get_regression_data('AppliancesEnergy', split_data=False)
batch_tfms = TSStandardize(by_sample=True)
reg = TSRegressor(X, y, splits=splits, path='models', arch=TSTPlus, batch_tfms=batch_tfms, metrics=rmse, cbs=ShowGraph(), verbose=True)
reg.fit_one_cycle(100, 3e-4)
reg.export("reg.pkl")

Inference:

from tsai.inference import load_learner
reg = load_learner("models/reg.pkl")
raw_preds, target, preds = reg.get_X_preds(X[splits[0]], y[splits[0]])

The ROCKETs (RocketClassifier, RocketRegressor, MiniRocketClassifier, MiniRocketRegressor, MiniRocketVotingClassifier and MiniRocketVotingRegressor) are somewhat different models: they are not actually deep learning models (although they use convolutions) and are used in a different way.

⚠️ You'll also need to install sktime to be able to use them. You can install it separately or use:

pip install tsai[extras]

Training:

from tsai.all import *
from sklearn.metrics import make_scorer, mean_squared_error
X_train, y_train, X_test, y_test = get_regression_data('AppliancesEnergy')
rmse_scorer = make_scorer(mean_squared_error, greater_is_better=False, squared=False)  # squared=False -> RMSE
mr_reg = MiniRocketRegressor(scoring=rmse_scorer)
mr_reg.fit(X_train, y_train)
mr_reg.save("minirocket_regressor")

Inference:

from tsai.all import *
from sklearn.metrics import mean_squared_error
mr_reg = load_rocket("minirocket_regressor")
y_pred = mr_reg.predict(X_test)
mean_squared_error(y_test, y_pred, squared=False)

Forecasting

You can use tsai for forecasting in the following scenarios:

  • univariate or multivariate time series input
  • univariate or multivariate time series output
  • single or multi-step ahead

You'll need to:

  • prepare X (time series input) and the target y (see documentation)
  • select one of tsai's models ending in Plus (TSTPlus, InceptionTimePlus, TSiTPlus, etc.). The model will auto-configure a head to yield an output with the same shape as the target y.

Single step

Training:

from tsai.all import *
ts = get_forecasting_time_series("Sunspots").values
X, y = SlidingWindow(60, horizon=1)(ts)
splits = TimeSplitter(235)(y) 
batch_tfms = TSStandardize()
fcst = TSForecaster(X, y, splits=splits, path='models', batch_tfms=batch_tfms, bs=512, arch=TSTPlus, metrics=mae, cbs=ShowGraph())
fcst.fit_one_cycle(50, 1e-3)
fcst.export("fcst.pkl")

Inference:

from tsai.inference import load_learner
fcst = load_learner("models/fcst.pkl", cpu=False)
raw_preds, target, preds = fcst.get_X_preds(X[splits[0]], y[splits[0]])
raw_preds.shape

output: torch.Size([2940, 1])

Multi-step

This example shows how to build a 3-step ahead univariate forecast.

Training:

from tsai.all import *
ts = get_forecasting_time_series("Sunspots").values
X, y = SlidingWindow(60, horizon=3)(ts)
splits = TimeSplitter(235)(y) 
batch_tfms = TSStandardize()
fcst = TSForecaster(X, y, splits=splits, path='models', batch_tfms=batch_tfms, bs=512, arch=TSTPlus, metrics=mae, cbs=ShowGraph())
fcst.fit_one_cycle(50, 1e-3)
fcst.export("fcst.pkl")

Inference:

from tsai.inference import load_learner
fcst = load_learner("models/fcst.pkl", cpu=False)
raw_preds, target, preds = fcst.get_X_preds(X[splits[0]], y[splits[0]])
raw_preds.shape

output: torch.Size([2938, 3])

Input data format

The input format for all time series models and image models in tsai is the same: an np.ndarray (or an array-like object such as zarr, etc.) with 3 dimensions:

[# samples x # variables x sequence length]
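
For illustration, here is a minimal sketch with random (hypothetical) data in the expected shape; get_splits is tsai's train/valid split utility:

from tsai.all import *
import numpy as np
n_samples, n_vars, seq_len = 128, 3, 50
X = np.random.rand(n_samples, n_vars, seq_len).astype(np.float32)  # [# samples x # variables x sequence length]
y = np.random.randint(0, 2, n_samples)                             # one label per sample
splits = get_splits(y, valid_size=0.2)
clf = TSClassifier(X, y, splits=splits, arch=InceptionTimePlus, metrics=accuracy)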

The input format for tabular models in tsai (like TabModel, TabTransformer and TabFusionTransformer) is a pandas dataframe. See example.

How to contribute to tsai?

We welcome contributions of all kinds: development of enhancements, bug fixes, documentation, tutorial notebooks, ...

We have created a guide to help you start contributing to tsai. You can read it here.

Citing tsai

If you use tsai in your research, please use the following BibTeX entry:

@Misc{tsai,
    author =       {Ignacio Oguiza},
    title =        {tsai - A state-of-the-art deep learning library for time series and sequential data},
    howpublished = {Github},
    year =         {2022},
    url =          {https://github.com/timeseriesAI/tsai}
}

About

State-of-the-art Deep Learning library for Time Series and Sequences in Pytorch / fastai.

Home page: https://timeseriesai.github.io/tsai/

License: Apache License 2.0

Languages

Language:Jupyter Notebook 97.4%Language:Python 2.6%Language:Makefile 0.0%