
AugBoost: Gradient Boosting Enhanced with Step-Wise Feature Augmentation (2019 IJCAI paper)


AugBoost

Gradient Boosting Enhanced with Step-Wise Feature Augmentation. Based on the 2019 IJCAI paper by Philip Tannor and Lior Rokach.

About

The code in this repository is based heavily on scikit-learn's 'gradient_boosting.py'. We started this as a fork of sklearn, but split off into a standalone project when that proved more convenient. Thanks! =]

Prerequisites

  • Python 3.6.5 (Ubuntu)
  • sklearn.__version__ == '0.19.1'
  • keras.__version__ == '2.2.0'
  • tensorflow.__version__ == '1.8.0'

Plus a number of small packages that are included in Anaconda. The sklearn version is probably the most important prerequisite, although we haven't checked which of these versions are strictly necessary.
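To see which of these prerequisites are present locally, the pinned versions above can be checked with a small snippet like this (a convenience sketch, not part of AugBoost; packages that aren't installed are reported as None instead of raising):

```python
# Check installed versions of AugBoost's pinned prerequisites.
# A missing or broken package is recorded as None rather than raising.
EXPECTED = {"sklearn": "0.19.1", "keras": "2.2.0", "tensorflow": "1.8.0"}

def installed_versions(names=EXPECTED):
    found = {}
    for name in names:
        try:
            module = __import__(name)
            found[name] = getattr(module, "__version__", None)
        except Exception:
            found[name] = None
    return found

for name, version in installed_versions().items():
    print(name, version, "(expected %s)" % EXPECTED[name])
```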

Getting Started

After cloning the repository, the two classes in it can be imported with these lines of code:

from AugBoost import AugBoostClassifier as ABC
from AugBoost import AugBoostRegressor as ABR  # the regression module has an issue and doesn't work yet

For now, only the code for classification tasks works =[

Create your model using code that looks like this:

model = ABC(n_estimators=10, max_epochs=1000, learning_rate=0.1,
    n_features_per_subset=round(len(X_train.columns)/3),
    trees_between_feature_update=10, augmentation_method='nn',
    save_mid_experiment_accuracy_results=False)
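For intuition, the roles of `n_features_per_subset`, `trees_between_feature_update`, and `augmentation_method` can be sketched in a toy boosting loop. This is a conceptual illustration only, not the authors' implementation: PCA stands in for the neural-network embeddings that `augmentation_method='nn'` would produce, and all data and hyperparameter values are made up.

```python
# Toy sketch of step-wise feature augmentation in gradient boosting:
# every `trees_between_feature_update` rounds, a transformer is fit on a
# random feature subset and its outputs are appended as new features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 9))
y = (X[:, 0] + X[:, 1] * X[:, 2] > 0).astype(float)

n_estimators = 10
trees_between_feature_update = 5
n_features_per_subset = 3

X_aug = X.copy()
pred = np.full(len(y), y.mean())  # initial prediction: the base rate
for i in range(n_estimators):
    if i % trees_between_feature_update == 0:
        # Augmentation step: embed a random feature subset and append it.
        subset = rng.choice(X.shape[1], size=n_features_per_subset, replace=False)
        emb = PCA(n_components=2).fit_transform(X[:, subset])
        X_aug = np.hstack([X_aug, emb])
    # Ordinary boosting step: fit a tree to the residuals on the
    # augmented feature matrix and shrink its contribution.
    residual = y - pred
    tree = DecisionTreeRegressor(max_depth=3).fit(X_aug, residual)
    pred += 0.1 * tree.predict(X_aug)
```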

And then train and predict like this:

model.fit(X_train, y_train)
model.predict(X_val)
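Since AugBoost is built on sklearn's gradient boosting, it follows the same fit/predict convention. A minimal end-to-end run looks like the sketch below, which uses sklearn's `GradientBoostingClassifier` as a stand-in for `ABC` (same interface; the shared hyperparameters are shown and the AugBoost-specific ones are omitted), with synthetic data in place of a real dataset:

```python
# End-to-end fit/predict example. GradientBoostingClassifier stands in
# for AugBoostClassifier here; the two share the sklearn interface.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=9, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(n_estimators=10, learning_rate=0.1)
model.fit(X_train, y_train)
preds = model.predict(X_val)
print("validation accuracy:", (preds == y_val).mean())
```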

The file 'notebook for experiments.ipynb' contains example code for running experiments with AugBoost.


License: Other

