Boost Loss


Utilities for easy use of custom losses in CatBoost, LightGBM, XGBoost. This sounds very simple, but in reality it took a lot of work.

Installation

Install this via pip (or your favourite package manager):

pip install boost-loss

Usage

Basic Usage

import numpy as np

from boost_loss import LossBase
from numpy.typing import NDArray


class L2Loss(LossBase):
    def loss(self, y_true: NDArray, y_pred: NDArray) -> NDArray:
        return (y_true - y_pred) ** 2 / 2

    def grad(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # dL/dy_pred
        return -(y_true - y_pred)

    def hess(self, y_true: NDArray, y_pred: NDArray) -> NDArray:  # d^2L/dy_pred^2
        return np.ones_like(y_true)
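
For reference, the gradient and Hessian above are just the calculus of the loss: for L = (y_true - y_pred)^2 / 2, dL/dy_pred = -(y_true - y_pred) and d^2L/dy_pred^2 = 1. The loss can then be attached to a scikit-learn-compatible estimator with apply_custom_loss: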
import lightgbm as lgb

from boost_loss import apply_custom_loss
from sklearn.datasets import load_diabetes  # load_boston was removed in scikit-learn 1.2


X, y = load_diabetes(return_X_y=True)
apply_custom_loss(lgb.LGBMRegressor(), L2Loss()).fit(X, y)
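
The same loss instance should also work with the other supported boosters. A minimal sketch for XGBoost, assuming only that xgboost is installed (apply_custom_loss is the same entry point as above):

import xgboost as xgb

# Sketch: apply the identical LossBase subclass to an XGBoost estimator.
apply_custom_loss(xgb.XGBRegressor(), L2Loss()).fit(X, y)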

Built-in losses are also available:[^1]

from boost_loss.regression import LogCoshLoss
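
A sketch of plugging a built-in loss into the same workflow, assuming LogCoshLoss needs no constructor arguments (check its signature in the docs):

# Sketch: built-in losses drop into the same apply_custom_loss workflow.
apply_custom_loss(lgb.LGBMRegressor(), LogCoshLoss()).fit(X, y)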

torch.autograd Loss[^2]

import torch

from boost_loss.torch import TorchLossBase


class L2LossTorch(TorchLossBase):
    def loss_torch(self, y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
        return (y_true - y_pred) ** 2 / 2
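
Here grad and hess do not have to be written by hand; as the section title suggests, TorchLossBase derives them from loss_torch via torch.autograd. Usage then mirrors the NumPy version; a sketch under the same assumptions as above:

# Sketch: the autograd-backed loss plugs into the same entry point.
apply_custom_loss(lgb.LGBMRegressor(), L2LossTorch()).fit(X, y)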

Contributors ✨

Thanks goes to these wonderful people (emoji key):

34j

💻 🤔 📖

This project follows the all-contributors specification. Contributions of any kind welcome!

Footnotes

[^1]: Inspired by orchardbirds/bokbokbok

[^2]: Inspired by TomerRonen34/treeboost_autograd
