Repositories under the activation-function topic:
Artificial Intelligence Learning Notes.
[ECCV2024 - Oral] Adaptive Parametric Activation
ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani
[ICLR 2024] Dynamic Neural Response Tuning
Co-VeGAN: Complex-Valued Generative Adversarial Network for Compressive Sensing MR Image Reconstruction
JavaScript implementation of some activation functions.
Source for the paper "Universal Activation Function for machine learning"
This program implements logistic regression from scratch using the gradient descent algorithm in Python to predict whether customers will purchase a new car based on their age and salary.
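A minimal sketch of that setup, assuming standardized age/salary features and a plain batch-gradient-descent loop (the data and learning rate below are illustrative, not the repository's):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, epochs=1000):
    """Fit weights with batch gradient descent on the mean log-loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)           # predicted purchase probability
        grad_w = X.T @ (p - y) / len(y)  # gradient of mean log-loss w.r.t. w
        grad_b = np.mean(p - y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Illustrative data: [age, salary] (standardized), label = bought a car
X = np.array([[-1.0, -0.8], [-0.5, -0.2], [0.4, 0.3], [1.2, 1.1]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w, b = train_logistic_regression(X, y)
print(sigmoid(X @ w + b) >= 0.5)  # predicted classes
```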
[NeurIPS 2024] Dual-Perspective Activation: Efficient Channel Denoising via Joint Forward-Backward Criterion for Artificial Neural Networks
Neural_Networks_From_Scratch
Implementation of an image-classification neural network for the Street View House Numbers (SVHN) dataset.
m-arcsinh: A Reliable and Efficient Function for Supervised Machine Learning (scikit-learn, TensorFlow, and Keras) and Feature Extraction (scikit-learn)
WraLU is an artifact for the paper "ReLU Hull Approximation" (POPL'24), which provides a sound but incomplete neural network verifier by over-approximating the ReLU function hull.
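For background, the classic single-ReLU "triangle" relaxation is the simplest such over-approximation; the multi-neuron hulls WraLU targets generalize it. A sketch, assuming known pre-activation bounds l < 0 < u:

```python
def relu_triangle_relaxation(l, u):
    """Return linear constraints over-approximating y = max(0, x)
    on the input interval [l, u], assuming l < 0 < u.
    Each constraint is (a, b, c), meaning a*x + b*y <= c."""
    assert l < 0 < u
    slope = u / (u - l)
    return [
        (0.0, -1.0, 0.0),           # -y <= 0       i.e. y >= 0
        (1.0, -1.0, 0.0),           #  x - y <= 0   i.e. y >= x
        (-slope, 1.0, -slope * l),  #  y <= slope * (x - l), the upper chord
    ]

print(relu_triangle_relaxation(-1.0, 1.0))
```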
Hyper-Flexible Convolutional Neural Networks Based on Generalized Lehmer and Power Means
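The Lehmer and power means themselves are standard; as a reference point (how the paper wires them into convolutions is not reproduced here):

```python
import numpy as np

def lehmer_mean(x, p):
    """Lehmer mean L_p(x) = sum(x**p) / sum(x**(p-1)); p = 1 gives the
    arithmetic mean, and large p shifts toward max(x) for positive x."""
    x = np.asarray(x, dtype=float)
    return np.sum(x ** p) / np.sum(x ** (p - 1))

def power_mean(x, p):
    """Power (generalized) mean M_p(x) = (mean(x**p))**(1/p)."""
    x = np.asarray(x, dtype=float)
    return np.mean(x ** p) ** (1.0 / p)

x = [1.0, 2.0, 4.0]
print(lehmer_mean(x, 1), power_mean(x, 1))  # both equal the arithmetic mean
print(lehmer_mean(x, 5), power_mean(x, 5))  # both shift toward max(x)
```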
A small walk-through showing why ReLU is non-linear.
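The gist of such a walk-through is a two-line check: a linear map must satisfy f(a + b) = f(a) + f(b) and f(c·a) = c·f(a), and ReLU violates both (the numbers below are illustrative):

```python
def relu(x):
    return max(0.0, x)

a, b = 2.0, -3.0
# Additivity fails: relu(2 + (-3)) = 0, but relu(2) + relu(-3) = 2
print(relu(a + b), relu(a) + relu(b))
# Homogeneity fails for negative scalars: relu(-1 * 2) = 0, but -1 * relu(2) = -2
print(relu(-1 * a), -1 * relu(a))
```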
:package: Unofficial SPOCU activation function implementation for PyTorch and TensorFlow.
ReLU++: A modified ReLU activation function with enhanced performance for deep learning models.
Logit-space logical activation functions for PyTorch.
PyTorch implementation of the Leaky Hardtanh activation function
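The repository's exact definition isn't quoted here; one natural reading, combining Hardtanh's saturation points with LeakyReLU-style slopes outside them, might look like this (the `slope` parameter and piecewise form are assumptions):

```python
import torch

def leaky_hardtanh(x, min_val=-1.0, max_val=1.0, slope=0.01):
    # Assumed form: identity inside [min_val, max_val] (as in Hardtanh);
    # outside, keep a small non-zero slope instead of clipping flat.
    below = x < min_val
    above = x > max_val
    out = x.clone()
    out[below] = min_val + slope * (x[below] - min_val)
    out[above] = max_val + slope * (x[above] - max_val)
    return out

print(leaky_hardtanh(torch.tensor([-3.0, -0.5, 0.5, 3.0])))
```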
This repository offers a Python Package for the PyTorch implementation of the APTx activation function, as introduced in the paper "APTx: Better Activation Function than MISH, SWISH, and ReLU's Variants used in Deep Learning".
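A minimal PyTorch sketch of APTx, f(x) = (α + tanh(βx))·γx per the paper; treat the default parameter values below as assumptions and refer to the package for the authoritative implementation:

```python
import torch
import torch.nn as nn

class APTx(nn.Module):
    """APTx activation: f(x) = (alpha + tanh(beta * x)) * gamma * x.
    Defaults below are assumed, not taken from the package."""
    def __init__(self, alpha=1.0, beta=1.0, gamma=0.5):
        super().__init__()
        self.alpha, self.beta, self.gamma = alpha, beta, gamma

    def forward(self, x):
        return (self.alpha + torch.tanh(self.beta * x)) * self.gamma * x

x = torch.linspace(-3, 3, 7)
print(APTx()(x))
```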
WraAct is an artifact for the paper "Convex Hull Approximation for Activation Functions" (OOPSLA'25), which provides a sound but incomplete neural network verifier by over-approximating the function hulls of various activation functions (including leaky ReLU, ReLU, sigmoid, tanh, and maxpool).
Comparative Analysis of Activation Functions in Shallow Neural Networks for Multi-Class Image Classification Using MNIST Digits and CIFAR-10 Datasets with Fixed Architectural Parameters
WraAct is a tool to construct the convex hull of various activation functions.
This is a repository for Multi-Layer Perceptron and Logistic Regression. It contains a logistic regression function, some analysis of it, and a comparison with scikit-learn's logistic regression, along with a plot of the decision boundary for the classification. The second part is a basic neural network: a class and a function were written for it and applied to digit classification on the MNIST dataset.
Everything about artificial neural networks, from basic to advanced.
Design of a CNN (Convolutional Neural Networks) to classify CIFAR-10 images
A feedforward multilayer perceptron with gradient descent & backpropagation written from scratch in Java
Practice with deep learning concepts using the CIFAR-10 dataset.
Implementation of backpropagation in a deep neural network (DNN).
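A compact NumPy sketch of backpropagation through one hidden layer with sigmoid activations and a squared-error loss (layer sizes and data are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((8, 3))                      # 8 samples, 3 features
y = rng.integers(0, 2, (8, 1)).astype(float)         # binary targets

W1, b1 = rng.standard_normal((3, 4)), np.zeros(4)    # hidden layer
W2, b2 = rng.standard_normal((4, 1)), np.zeros(1)    # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(500):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule through the squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(np.mean((out - y) ** 2))  # final training loss
```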
Step-by-step tutorial for ANNs.
This repository provides a comprehensive machine learning course with theoretical concepts and practical implementations
Predicting patient attendance at Bay Clinic using 'medicalcentre.csv'. SVM, Decision Tree, and DNN models are compared on accuracy, sensitivity, and specificity, with ROC analysis. Part of a Data Science course in my master's program at the University of Ottawa, 2023.
This project demonstrates how to build and optimize a CNN for classifying images from the Fashion MNIST dataset using TensorFlow, Keras, and Keras Tuner.
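A minimal sketch of that workflow with `keras_tuner`; the search space and trial budget below are illustrative, not the project's actual configuration:

```python
import keras_tuner as kt
from tensorflow import keras

def build_model(hp):
    # Search over the number of conv filters and the dense width.
    model = keras.Sequential([
        keras.Input(shape=(28, 28, 1)),
        keras.layers.Conv2D(hp.Int("filters", 32, 128, step=32), 3,
                            activation="relu"),
        keras.layers.MaxPooling2D(),
        keras.layers.Flatten(),
        keras.layers.Dense(hp.Int("units", 64, 256, step=64),
                           activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

(x_train, y_train), _ = keras.datasets.fashion_mnist.load_data()
x_train = x_train[..., None] / 255.0

tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=3)
tuner.search(x_train, y_train, validation_split=0.1, epochs=2)
print(tuner.get_best_hyperparameters(1)[0].values)
```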