
Perpetual

A self-generalizing gradient boosting machine which doesn't need hyperparameter optimization

PerpetualBooster is a gradient boosting machine (GBM) algorithm that, unlike other GBM algorithms, doesn't need hyperparameter optimization. Similar to AutoML libraries, it has a single budget parameter: increasing the budget increases the predictive power of the algorithm and gives better results on unseen data. Start with a small budget (e.g. 1.0) and increase it (e.g. to 2.0) once you are confident in your features. If further increasing the budget brings no improvement, you are already extracting the most predictive power out of your data.
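The stopping rule above ("increase the budget until results stop improving") can be sketched generically. The tolerance and the third MSE value here are illustrative assumptions, not part of the library; the first two MSEs are borrowed from the benchmark table below:

```python
def budget_helped(prev_mse, new_mse, tol=1e-3):
    """Return True if raising the budget improved MSE by more than tol."""
    return prev_mse - new_mse > tol

# Budget 1.0 -> 1.5 lowered MSE from 0.192 to 0.188 (benchmark table): keep going.
assert budget_helped(0.192, 0.188)

# A further increase yielding only 0.188 -> 0.1875 (illustrative value)
# is a negligible gain: stop here.
assert not budget_helped(0.188, 0.1875)
```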

Benchmark

Hyperparameter optimization with a plain GBM algorithm usually takes around 100 iterations; PerpetualBooster achieves the same accuracy in a single run. This yields roughly a 100x speed-up at equal accuracy, across budget levels and datasets. The exact factor might be slightly lower or significantly higher than 100x depending on the dataset.

The following table summarizes the results for the California Housing dataset (regression):

| Perpetual budget | LightGBM n_estimators | Perpetual MSE | LightGBM MSE | Perpetual CPU time (s) | LightGBM CPU time (s) | Speed-up |
| --- | --- | --- | --- | --- | --- | --- |
| 1.0 | 100 | 0.192 | 0.192 | 7.6 | 978 | 129x |
| 1.5 | 300 | 0.188 | 0.188 | 21.8 | 3066 | 141x |
| 2.1 | 1000 | 0.185 | 0.186 | 86.0 | 8720 | 101x |

You can reproduce the results using the scripts in the examples folder.
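The Speed-up column is simply the ratio of the two CPU-time columns, rounded to the nearest integer. A quick check against the table:

```python
# (Perpetual CPU time, LightGBM CPU time) pairs from the benchmark table above.
rows = [(7.6, 978), (21.8, 3066), (86.0, 8720)]

speedups = [round(lightgbm / perpetual) for perpetual, lightgbm in rows]
print(speedups)  # -> [129, 141, 101], matching the Speed-up column
```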

Usage

You can use the algorithm as in the example below. Check the examples folder for both Rust and Python usage.

```python
from perpetual import PerpetualBooster

model = PerpetualBooster(objective="SquaredLoss")
model.fit(X, y, budget=1.0)
```

Documentation

Documentation is available for both the Python API and the Rust API.

Installation

The package can be installed directly from PyPI:

```shell
pip install perpetual
```

To use it in a Rust project, add the following to your Cargo.toml file to get the package from crates.io:

```toml
[dependencies]
perpetual = "0.2.0"
```

Paper

PerpetualBooster prevents overfitting with a generalization algorithm. A paper explaining how the algorithm works is in progress; in the meantime, check our blog post for a high-level introduction.

About


https://perpetual-ml.com/

License: GNU Affero General Public License v3.0


Languages

Rust 74.3%, Python 20.3%, Jupyter Notebook 5.2%, PowerShell 0.1%, Shell 0.0%