There are 7 repositories under the activation-function-exploration topic.
This repository features hands-on Jupyter Notebooks covering everything from fundamental concepts to advanced neural network architectures.
We introduce two novel hybrid activation functions: S3 (Sigmoid-Softsign) and its improved version, S4 (Smoothed S3).
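The repository defines S3 and S4 precisely; as a rough, non-authoritative sketch, here are the two base functions being hybridized. The `s3_hybrid` blend below is purely hypothetical and is not the repository's actual formula:

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid: 1 / (1 + e^-x), bounded in (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def softsign(x):
    # Softsign: x / (1 + |x|), bounded in (-1, 1) with gentler saturation.
    return x / (1.0 + np.abs(x))

def s3_hybrid(x):
    # Illustrative blend only; the actual S3 definition may combine
    # the two bases differently (e.g., piecewise by sign of x).
    return 0.5 * (sigmoid(x) + softsign(x))

print(s3_hybrid(np.array([-2.0, 0.0, 2.0])))
```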
Graphs plotted with Matplotlib
Linear and non-linear activation functions: Linear, ReLU, Sigmoid, Softmax, and Tanh
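A minimal sketch of these five functions in NumPy, with the elementwise ones plotted via Matplotlib (softmax is left out of the plot because it maps a whole vector to a probability distribution rather than acting pointwise):

```python
import numpy as np
import matplotlib.pyplot as plt

def linear(x):
    return x                             # identity: unbounded, no non-linearity

def relu(x):
    return np.maximum(0.0, x)            # zero for negatives, identity for positives

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))      # squashes to (0, 1)

def tanh(x):
    return np.tanh(x)                    # squashes to (-1, 1), zero-centered

def softmax(x):
    e = np.exp(x - np.max(x))            # shift for numerical stability
    return e / e.sum()                   # normalizes a vector to probabilities

x = np.linspace(-5, 5, 200)
for fn in (linear, relu, sigmoid, tanh):
    plt.plot(x, fn(x), label=fn.__name__)
plt.legend()
plt.title("Common activation functions")
plt.show()
```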
Fake/AI-generated image detection using transfer learning
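The repository's exact pipeline is not shown here; the following is a minimal Keras sketch of transfer learning for real-vs-AI image classification, assuming MobileNetV2 as the frozen pretrained base (the base model, input size, and dataset names are illustrative choices, not the repository's):

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Frozen ImageNet-pretrained base; only the new classification head is trained.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(1, activation="sigmoid"),  # real (0) vs. AI-generated (1)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)  # datasets assumed to exist
```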
A custom-built neural network that detects handwritten digits from image inputs. It uses ReLU activations in the hidden layers and a softmax output layer for classification, and is trained with backpropagation to minimize a loss function, achieving over 99% prediction accuracy.
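A minimal NumPy sketch of the forward pass such a network uses; the layer sizes and random weights below are illustrative, and the backpropagation training loop is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))  # numerically stable softmax
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative shapes: 784-pixel input (28x28 image), one hidden layer, 10 digit classes.
W1, b1 = rng.normal(0, 0.01, (784, 128)), np.zeros(128)
W2, b2 = rng.normal(0, 0.01, (128, 10)), np.zeros(10)

def forward(x):
    h = relu(x @ W1 + b1)           # hidden layer with ReLU activation
    return softmax(h @ W2 + b2)     # class probabilities over the 10 digits

probs = forward(rng.random(784))
print(probs.argmax(), probs.sum())  # predicted digit; probabilities sum to 1
```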
A Benchmark for Activation Function Exploration for Neural Architecture Search (NAS)