embed

Introduction

The embed package provides extra steps for the recipes package that embed predictors into one or more numeric columns. Almost all of the preprocessing methods are supervised.

These steps live in a separate package because their dependencies (rstanarm, lme4, and keras) are fairly heavy.

Some steps handle categorical predictors:

  • step_lencode_glm(), step_lencode_bayes(), and step_lencode_mixed() estimate the effect of each factor level on the outcome, and these estimates become the new encoding. The estimates come from a generalized linear model, fit either without pooling (via glm) or with partial pooling (stan_glm or lmer). Currently implemented for numeric and two-class outcomes (see the sketch after this list).

  • step_embed() uses keras::layer_embedding to translate the original C factor levels into a smaller set of D new variables (D < C). The model fitting routine optimizes which factor levels are mapped to each of the new variables as well as the corresponding regression coefficients (i.e., neural network weights) that will be used as the new encodings.

  • step_woe() creates new variables based on weight of evidence encodings.

  • step_feature_hash() can create indicator variables using feature hashing.
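
Below is a minimal sketch of an effect-encoding step, using step_lencode_glm() on the ames data from the modeldata package; the dataset and column names are illustrative choices, not part of this package:

library(recipes)
library(embed)
library(dplyr)   # for vars()

data(ames, package = "modeldata")

# Replace the Neighborhood factor with a numeric column holding each level's
# estimated effect on Sale_Price, fit by a linear model without pooling.
rec <- recipe(Sale_Price ~ Neighborhood + Gr_Liv_Area, data = ames) %>%
  step_lencode_glm(Neighborhood, outcome = vars(Sale_Price)) %>%
  prep()

bake(rec, new_data = NULL) %>% head()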

For numeric predictors:

  • step_umap() applies a nonlinear transformation similar to t-SNE, but the fitted transformation can also be projected onto new data. Both supervised and unsupervised variants are available (see the sketch after this list).

  • step_discretize_xgb() and step_discretize_cart() can make binned versions of numeric predictors using supervised tree-based models.

  • step_pca_sparse() and step_pca_sparse_bayes() conduct feature extraction with sparsity of the component loadings.
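
As an example, here is a minimal sketch of supervised UMAP on the built-in iris data; step_umap() relies on the uwot package, and the dataset and settings below are illustrative only:

library(recipes)
library(embed)
library(dplyr)   # for vars()

# Replace the four numeric measurements with two UMAP components,
# using the species label to supervise the embedding.
set.seed(11)
rec <- recipe(Species ~ ., data = iris) %>%
  step_normalize(all_numeric_predictors()) %>%
  step_umap(all_numeric_predictors(), outcome = vars(Species), num_comp = 2) %>%
  prep()

bake(rec, new_data = NULL) %>% head()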

Getting Started

There are two articles on the package website (https://embed.tidymodels.org) that walk through how to use these embedding steps, using generalized linear models and neural networks built via TensorFlow.

Installation

To install the package:

install.packages("embed")

Note that to use some steps, you will also have to install other packages such as rstanarm and lme4. For all of the steps to work, you may want to use:

install.packages(c("rpart", "xgboost", "rstanarm", "lme4"))

To get a bug fix or to use a feature from the development version, you can install the development version of this package from GitHub.

# install.packages("pak")
pak::pak("tidymodels/embed")

Contributing

This project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.
