stackgbm offers a minimalist, research-oriented implementation of model stacking (Wolpert, 1992) for gradient boosted tree models built by xgboost (Chen and Guestrin, 2016), lightgbm (Ke et al., 2017), and catboost (Prokhorenkova et al., 2018).
Install from GitHub:

```r
remotes::install_github("nanxstats/stackgbm")
```
To install all of the dependencies, follow the instructions in the "manage dependencies" guide.
stackgbm implements a classic two-layer stacking model: the first layer consists of gradient boosted tree models whose predictions serve as "features"; the second layer is a logistic regression that takes these features as inputs and produces the final prediction.
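The workflow above can be sketched as follows. This is a minimal, illustrative example: the tuning helpers `cv_xgboost()`, `cv_lightgbm()`, and `cv_catboost()` and the exact arguments of `stackgbm()` and `predict()` are assumed from the package's interface and may differ slightly from your installed version; check the package documentation before running.

```r
# A minimal sketch of fitting a two-layer stack with stackgbm.
# Function and argument names are assumptions; consult ?stackgbm.
library(stackgbm)

# Simulated binary classification data (x: feature matrix, y: 0/1 labels)
set.seed(42)
x <- matrix(rnorm(1000 * 10), nrow = 1000)
y <- as.integer(x[, 1] + x[, 2] > 0)

# First layer: tune each boosted tree model by cross-validation
params_xgb <- cv_xgboost(x, y)
params_lgb <- cv_lightgbm(x, y)
params_cat <- cv_catboost(x, y)

# Fit the stack: the boosted tree predictions become the "features"
# consumed by the second-layer logistic regression
model <- stackgbm(x, y, params = list(params_xgb, params_lgb, params_cat))

# Predict on new data
pred <- predict(model, newx = x)
```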
For a more comprehensive and flexible implementation of model stacking, see the stacks package in tidymodels or StackingClassifier in scikit-learn.
Please note that the stackgbm project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.