
Smooth ReLU in PyTorch

License: MIT


Unofficial PyTorch reimplementation of the Smooth ReLU (SmeLU) activation function proposed in the paper Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations (https://arxiv.org/pdf/2202.06499.pdf) by Gil I. Shamir and Dong Lin.

This repository includes an easy-to-use pure PyTorch implementation of the Smooth ReLU.

In case you run into performance issues with this implementation, please have a look at my Triton SmeLU implementation.

Installation

SmeLU can be installed by using pip:

pip install git+https://github.com/ChristophReich1996/SmeLU

Example Usage

SmeLU can simply be used as a standard nn.Module:

import torch
import torch.nn as nn
from smelu import SmeLU

# Small network with a SmeLU activation between two linear layers
network: nn.Module = nn.Sequential(
    nn.Linear(2, 2),
    SmeLU(),
    nn.Linear(2, 2)
)

# Forward pass with a random batch of 16 samples
output: torch.Tensor = network(torch.rand(16, 2))

For a more detailed example of how to use this implementation, please refer to the example file (requires Matplotlib to be installed).
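A minimal plotting sketch in that spirit (assuming Matplotlib is installed and that SmeLU accepts the beta argument listed in the parameter table below; the actual example file may differ):

import torch
import matplotlib.pyplot as plt
from smelu import SmeLU

# Plot the SmeLU activation for a few beta values
x = torch.linspace(-6.0, 6.0, steps=1000)
for beta in (0.5, 1.0, 2.0, 4.0):
    activation = SmeLU(beta=beta)  # beta keyword assumed per the parameter table
    plt.plot(x.numpy(), activation(x).numpy(), label=f"beta={beta}")
plt.xlabel("input")
plt.ylabel("SmeLU(input)")
plt.legend()
plt.show()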

The SmeLU takes the following parameter:

Parameter | Description | Type
beta | Beta value of the SmeLU activation function. Default: 2. | float
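For reference, the paper defines SmeLU piecewise: zero for inputs below -beta, a quadratic transition (x + beta)^2 / (4 * beta) for inputs between -beta and beta, and the identity above beta. Below is a minimal functional sketch of that definition; the nn.Module in this repository computes the same function, but its internals may differ:

import torch

def smelu_reference(x: torch.Tensor, beta: float = 2.0) -> torch.Tensor:
    # 0 for x <= -beta, (x + beta)^2 / (4 * beta) for |x| <= beta, x for x >= beta
    quadratic = (x + beta) ** 2 / (4.0 * beta)
    return torch.where(x >= beta, x, torch.where(x <= -beta, torch.zeros_like(x), quadratic))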

Reference

@article{Shamir2022,
    title={{Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations}},
    author={Shamir, Gil I. and Lin, Dong},
    journal={arXiv preprint arXiv:2202.06499},
    year={2022}
}
