Pe4enIks / TrainableActivation

Implementation for the article "Trainable Activations for Image Classification"

Home Page: https://doi.org/10.20944/preprints202301.0463.v1


Trainable Activations for Image Classification

We propose a set of trainable activation functions: Cosinu-Sigmoidal Linear Unit (CosLU), DELU, Linear Combination (LinComb), Normalized Linear Combination (NormLinComb), Rectified Linear Unit N (ReLUN), Scaled Soft Sign (ScaledSoftSign), and Shifted Rectified Linear Unit (ShiLU).

Pretrained weights.

CosLU

$$CosLU(x) = (x + \alpha \cos(\beta x))\sigma(x)$$

$$\sigma(x) = \frac{1}{1 + e^{-x}}$$

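A minimal PyTorch sketch of CosLU with trainable α and β (the repository's actual implementation is in activation.py; initializing both parameters to 1 is an assumption):

```python
import torch
import torch.nn as nn

class CosLU(nn.Module):
    """CosLU(x) = (x + alpha * cos(beta * x)) * sigmoid(x)."""

    def __init__(self):
        super().__init__()
        # Initial values are an assumption, not taken from the paper.
        self.alpha = nn.Parameter(torch.tensor(1.0))
        self.beta = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        return (x + self.alpha * torch.cos(self.beta * x)) * torch.sigmoid(x)
```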

DELU

$$DELU(x) = \begin{cases} SiLU(x), & x \leqslant 0 \\ (n + 0.5)x + |e^{-x} - 1|, & x > 0 \end{cases}$$

$$SiLU(x) = x\sigma(x)$$

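A sketch of DELU along the same lines; n is treated as a trainable scalar, and its initial value here is an assumption:

```python
import torch
import torch.nn as nn

class DELU(nn.Module):
    """DELU(x) = SiLU(x) for x <= 0, (n + 0.5) * x + |exp(-x) - 1| for x > 0."""

    def __init__(self, n: float = 1.0):  # initial n is an assumption
        super().__init__()
        self.n = nn.Parameter(torch.tensor(n))

    def forward(self, x):
        silu = x * torch.sigmoid(x)  # SiLU branch, used for x <= 0
        return torch.where(
            x <= 0,
            silu,
            (self.n + 0.5) * x + torch.abs(torch.exp(-x) - 1),
        )
```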

LinComb

$$LinComb(x) = \sum\limits_{i=0}^{n} w_i \mathcal{F}_i(x)$$

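A sketch of LinComb; the README does not fix the basis functions $\mathcal{F}_i$, so the set used below (ReLU, sigmoid, tanh, softsign) is purely illustrative:

```python
import torch
import torch.nn as nn

class LinComb(nn.Module):
    """Trainable linear combination of fixed activation functions."""

    def __init__(self, activations=None):
        super().__init__()
        # The basis set is an assumption; swap in whatever F_i you need.
        self.activations = nn.ModuleList(
            activations or [nn.ReLU(), nn.Sigmoid(), nn.Tanh(), nn.Softsign()]
        )
        self.weights = nn.Parameter(torch.ones(len(self.activations)))

    def forward(self, x):
        return sum(w * f(x) for w, f in zip(self.weights, self.activations))
```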

NormLinComb

$$NormLinComb(x) = \frac{\sum\limits_{i=0}^{n} w_i \mathcal{F}_i(x)}{\|W\|}$$

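NormLinComb differs from LinComb only by dividing by the norm of the weight vector, so the sketch changes in a single line (same illustrative basis as above):

```python
import torch
import torch.nn as nn

class NormLinComb(nn.Module):
    """LinComb normalized by the Euclidean norm of the weight vector."""

    def __init__(self, activations=None):
        super().__init__()
        # The basis set is an assumption, as in the LinComb sketch.
        self.activations = nn.ModuleList(
            activations or [nn.ReLU(), nn.Sigmoid(), nn.Tanh(), nn.Softsign()]
        )
        self.weights = nn.Parameter(torch.ones(len(self.activations)))

    def forward(self, x):
        out = sum(w * f(x) for w, f in zip(self.weights, self.activations))
        return out / torch.linalg.norm(self.weights)
```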

ReLUN

$$ReLUN(x) = \min(\max(0, x), n)$$

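A sketch of ReLUN with a trainable cap n; initializing n to 6 (as in ReLU6) is an assumption:

```python
import torch
import torch.nn as nn

class ReLUN(nn.Module):
    """ReLUN(x) = min(max(0, x), n) with trainable n."""

    def __init__(self, n: float = 6.0):  # initial n is an assumption
        super().__init__()
        self.n = nn.Parameter(torch.tensor(n))

    def forward(self, x):
        return torch.minimum(torch.relu(x), self.n)
```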

ScaledSoftSign

$$ScaledSoftSign(x) = \frac{\alpha x}{\beta + |x|}$$

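A sketch of ScaledSoftSign; initializing α and β to 1 is an assumption:

```python
import torch
import torch.nn as nn

class ScaledSoftSign(nn.Module):
    """ScaledSoftSign(x) = (alpha * x) / (beta + |x|)."""

    def __init__(self):
        super().__init__()
        # Initial values are an assumption.
        self.alpha = nn.Parameter(torch.tensor(1.0))
        self.beta = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        return (self.alpha * x) / (self.beta + torch.abs(x))
```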

ShiLU

$$ShiLU(x) = \alpha ReLU(x) + \beta$$

$$ReLU(x) = \max(0, x)$$

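A sketch of ShiLU; initializing α to 1 and β to 0 (so the function starts as plain ReLU) is an assumption:

```python
import torch
import torch.nn as nn

class ShiLU(nn.Module):
    """ShiLU(x) = alpha * ReLU(x) + beta."""

    def __init__(self):
        super().__init__()
        # Starting at alpha=1, beta=0 makes the initial function plain ReLU;
        # this initialization is an assumption.
        self.alpha = nn.Parameter(torch.tensor(1.0))
        self.beta = nn.Parameter(torch.tensor(0.0))

    def forward(self, x):
        return self.alpha * torch.relu(x) + self.beta
```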

INSTALLATION

Create a virtual environment.

python3 -m venv venv

Activate the virtual environment.

source venv/bin/activate

Install the dependencies.

pip install -r requirements.txt

PROJECT STRUCTURE

There are three main files: train.py, test.py, and plot.py. Run train.py first, then test.py, then plot.py.

Use whatever configuration you want to test. Configurations can be found in the configs folder, and train.py and test.py use the same config file. Several plot configurations are in the configs/plot folder.

There are many predefined run scripts in the scripts folder; just run one of them as a shell script. scripts/train.sh and scripts/test.sh train and test all possible configurations, and scripts/plot.sh plots the results after training and testing.

All results of the train and test phases are written to the logs folder.

All proposed trainable activations are in activation.py.

HOW TO RUN

Suppose you want to train and test the ResNet-8 model with the CosLU trainable activation on the CIFAR-10 dataset:

python train.py --config configs/coslu/cifar10/resnet8.yaml
python test.py --config configs/coslu/cifar10/resnet8.yaml

If you want to train and test all proposed trainable activations with a specific model and dataset, use the corresponding script from the scripts folder. For example, to train and test the DNN2 model on the MNIST dataset:

sh scripts/dnn2_mnist.sh

Train and test all possible configurations:

sh scripts/train.sh
sh scripts/test.sh

Plot graphs for all configurations; this works even if some configurations have not been trained:

sh scripts/plot.sh

CITATION

Project CITATION.

LICENSE

The project is distributed under the MIT License.
