Repositories under the activation-functions topic:
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
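For reference, the Mish paper defines the activation as x · tanh(softplus(x)); a minimal scalar sketch in plain Python:

```python
import math

def softplus(x: float) -> float:
    # softplus(x) = ln(1 + e^x), written in a numerically stable form
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x: float) -> float:
    # Mish(x) = x * tanh(softplus(x)): smooth, non-monotonic, unbounded above
    return x * math.tanh(softplus(x))
```

For large positive inputs Mish approaches the identity, while negative inputs are softly damped rather than zeroed out as in ReLU.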
PyTorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions
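SIREN layers apply a sine to a scaled affine map, sin(ω₀ · (Wx + b)); a single-unit sketch, assuming the ω₀ = 30 used for the paper's first layer:

```python
import math

OMEGA_0 = 30.0  # frequency factor used for SIREN's first layer in the paper

def siren_neuron(x: float, w: float, b: float, omega_0: float = OMEGA_0) -> float:
    # One SIREN unit: a sine applied to a scaled affine transform of the input
    return math.sin(omega_0 * (w * x + b))
```

The periodic activation lets the network represent high-frequency detail in implicit representations; SIREN pairs it with a specific weight-initialization scheme not shown here.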
Rethinking Image Inpainting via a Mutual Encoder Decoder with Feature Equalizations. ECCV 2020 Oral
Deep learning systems notes, covering the mathematical foundations of deep learning, detailed explanations of basic neural network components, model training strategies, model compression algorithms, and a hands-on implementation of a deep learning inference framework.
All the code files related to the deep learning course from PadhAI
Korean OCR Model Design (한글 OCR 모델 설계)
Unofficial implementation of 'Implicit Neural Representations with Periodic Activation Functions'
Intro to Deep Learning by National Research University Higher School of Economics
Implementing activation functions from scratch in TensorFlow.
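In the same from-scratch spirit, the most common activations take only a few lines each (plain Python here rather than TensorFlow):

```python
import math

def relu(x: float) -> float:
    # ReLU(x) = max(0, x)
    return max(0.0, x)

def sigmoid(x: float) -> float:
    # sigma(x) = 1 / (1 + e^{-x}), branched to avoid overflow for large |x|
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)
```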
Image-to-image translation using conditional GANs (Pix2Pix), implemented in TensorFlow 2.0
:poop: Sigmoid Colon: The biologically inspired activation function.
Reservoir computing library for .NET. Enables ESN, LSM, and hybrid RNNs using analog and spiking neurons working together.
PyTorch reimplementation of the Smooth ReLU activation function proposed in the paper "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations" [arXiv 2022].
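SmeLU replaces ReLU's kink at zero with a quadratic transition of half-width β; a sketch of the piecewise form from the paper (β = 1.0 is an assumed default here, not one prescribed by the repo):

```python
def smelu(x: float, beta: float = 1.0) -> float:
    # Smooth ReLU: zero for x <= -beta, identity for x >= beta,
    # and a quadratic segment joining the two continuously in between.
    if x <= -beta:
        return 0.0
    if x >= beta:
        return x
    return (x + beta) ** 2 / (4.0 * beta)
```

The quadratic piece matches both the value and the first derivative of the linear pieces at ±β, which is the smoothness property the paper ties to reproducibility.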
An easy-to-use library for GLU (Gated Linear Units) and GLU variants in TensorFlow.
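A GLU splits its input in half along the feature dimension and uses one half to gate the other: GLU(a, b) = a ⊙ σ(b). A minimal list-based sketch (independent of the library's own TensorFlow API):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def glu(x: list[float]) -> list[float]:
    # Gated Linear Unit: the first half of the input carries values,
    # the second half gates them elementwise through a sigmoid.
    assert len(x) % 2 == 0, "GLU needs an even-sized input"
    half = len(x) // 2
    a, b = x[:half], x[half:]
    return [ai * sigmoid(bi) for ai, bi in zip(a, b)]
```

GLU variants (GEGLU, SwiGLU, ReGLU, …) swap the sigmoid gate for other nonlinearities while keeping the same split-and-gate structure.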
[TCAD 2018] Code for “Design Space Exploration of Neural Network Activation Function Circuits”
ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani
Code for 'Periodic Activation Functions Induce Stationarity' (NeurIPS 2021)
Unofficial PyTorch implementation of the Piecewise Linear Unit dynamic activation function
Implementation for the article "Trainable Activations for Image Classification"
A PyTorch implementation of the funnel activation: https://arxiv.org/pdf/2007.11824.pdf
Predicting Indian stock prices with a stacked LSTM model, analysing Reliance, Tata Steel, HDFC Bank, and Infosys data: data preparation, EDA, and hyperparameter tuning.
Official PyTorch implementation of the paper "ProbAct: A Probabilistic Activation Function for Deep Neural Networks".
Binary classification to filter and block unsolicited NSFW content from annoying coworkers.
Source for the paper "Universal Activation Function for machine learning"
QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow and Keras
Awesome papers on Neural Networks and Deep Learning
Explains the impact of different activation functions on CNN performance, with visualizations of activations, convnet filters, and class-activation heatmaps to make it easier to understand how CNNs work.
A multilayer neural network framework for classification and regression tasks. Supports multiple activation functions, with backpropagation based on an autograd library; includes a polynomial activation function for regression.
PyTorch reimplementation of the paper "Padé Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks" [ICLR 2020].
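Padé Activation Units parameterize the activation as a ratio of learned polynomials; in the "safe" variant the denominator is kept positive via 1 + |·| so the function has no poles. A sketch with illustrative coefficients (the degrees and values here are hypothetical, not the paper's learned ones):

```python
def pau(x: float, a: list[float], b: list[float]) -> float:
    # Safe Pade Activation Unit: P(x) / (1 + |Q(x)|), where P has
    # coefficients a (constant term first) and Q has coefficients b
    # for powers x^1, x^2, ... so the denominator is always >= 1.
    num = sum(aj * x ** j for j, aj in enumerate(a))
    den = 1.0 + abs(sum(bk * x ** (k + 1) for k, bk in enumerate(b)))
    return num / den
```

Because both numerator and denominator coefficients are trained end to end, a single PAU can approximate and move between familiar shapes such as ReLU-like or tanh-like curves.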
Neural_Networks_From_Scratch
Official source code for "Deep Learning with Swift for TensorFlow" 📖