There are 33 repositories under the activation-functions topic.
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
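Mish has a closed-form definition, Mish(x) = x · tanh(softplus(x)), given in the paper itself. A minimal NumPy sketch (not the official repository's code):

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def mish(x):
    # Mish(x) = x * tanh(softplus(x)) -- smooth, non-monotonic,
    # bounded below and unbounded above
    return x * np.tanh(softplus(x))
```

For large positive inputs Mish behaves like the identity, and for large negative inputs it saturates near zero, which is what makes it a smooth ReLU-like alternative.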
PyTorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions
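The core idea in SIREN is a sine activation applied to a scaled affine transform, sin(ω₀(Wx + b)), with ω₀ = 30 as the paper's default first-layer scale. A minimal NumPy sketch of one such layer (the function name and shapes here are illustrative, not taken from the repository):

```python
import numpy as np

def siren_layer(x, w, b, w0=30.0):
    # SIREN layer: sin(w0 * (W x + b)).
    # w0 scales the affine output so the sine spans multiple periods,
    # which lets the network represent high-frequency signals.
    return np.sin(w0 * (x @ w + b))
```

The paper also pairs this activation with a specific uniform weight initialization (roughly ±sqrt(6/fan_in)/ω₀ for hidden layers) so that activations stay well-distributed through depth.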
Deep learning system notes, covering the mathematical foundations of deep learning, detailed explanations of basic neural network components, model-training strategies, and model compression algorithms.
Rethinking Image Inpainting via a Mutual Encoder Decoder with Feature Equalizations. ECCV 2020 Oral
All the code files related to the deep learning course from PadhAI
Korean OCR Model Design (한글 OCR 모델 설계)
Unofficial implementation of 'Implicit Neural Representations with Periodic Activation Functions'
Intro to Deep Learning by National Research University Higher School of Economics
Implementing activation functions from scratch in Tensorflow.
Image to Image Translation using Conditional GANs (Pix2Pix) implemented using Tensorflow 2.0
Predicting Indian stock prices with a stacked LSTM model, analysing Reliance, Tata Steel, HDFC Bank, and Infosys data; covers data preparation, EDA, and hyperparameter tuning.
ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani
:poop: Sigmoid Colon: The biologically inspired activation function.
Implementation for the article "Trainable Activations for Image Classification"
Reservoir computing library for .NET. Enables ESN, LSM, and hybrid RNNs using analog and spiking neurons working together.
PyTorch reimplementation of the Smooth ReLU activation function proposed in the paper "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations" [arXiv 2022].
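SmeLU (Smooth ReLU) replaces ReLU's kink at zero with a quadratic bridge on [-β, β]. A minimal NumPy sketch of my understanding of the paper's piecewise definition (hedged: check the paper for the exact parameterization):

```python
import numpy as np

def smelu(x, beta=1.0):
    # SmeLU: 0 for x <= -beta, identity for x >= beta,
    # and a quadratic bridge (x + beta)^2 / (4*beta) in between,
    # which makes the function continuously differentiable.
    x = np.asarray(x, dtype=float)
    return np.where(x <= -beta, 0.0,
           np.where(x >= beta, x, (x + beta) ** 2 / (4.0 * beta)))
```

The bridge is chosen so value and first derivative match ReLU at both ends of the transition region, which the paper argues improves reproducibility of large recommendation models.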
An easy-to-use library for GLU (Gated Linear Units) and GLU variants in TensorFlow.
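A GLU splits its input in half along a feature axis and gates one half with the sigmoid of the other: GLU(a, b) = a ⊙ σ(b). A minimal NumPy sketch of the mechanism (not this library's API):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def glu(x, axis=-1):
    # Split the input into two equal halves along `axis`,
    # then gate the first half with the sigmoid of the second:
    # GLU(a, b) = a * sigmoid(b)
    a, b = np.split(x, 2, axis=axis)
    return a * sigmoid(b)
```

GLU variants (GEGLU, SwiGLU, ReGLU, etc.) keep the same split-and-gate structure but swap the sigmoid for another nonlinearity.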
Code for 'Periodic Activation Functions Induce Stationarity' (NeurIPS 2021)
Unofficial pytorch implementation of Piecewise Linear Unit dynamic activation function
[TCAD 2018] Code for “Design Space Exploration of Neural Network Activation Function Circuits”
Interactive visualizations and demos that are used in a blog post I wrote about logic in the context of neural networks
QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
A PyTorch implementation of funnel activation https://arxiv.org/pdf/2007.11824.pdf
Official PyTorch implementation of the paper "ProbAct: A Probabilistic Activation Function for Deep Neural Networks".
Binary classification to filter and block unsolicited NSFW content from annoying coworkers...
Advanced deep learning techniques: layers, activations, and loss functions, all in Keras/TensorFlow.
Source for the paper "Universal Activation Function for machine learning"
My Daily Neural Network Exercise
Awesome papers on Neural Networks and Deep Learning
This GitHub repository explains the impact of different activation functions on CNN performance, with visualizations of activations, convnet filters, and class-activation heatmaps to make how a CNN works easier to understand.
PyTorch reimplementation of the paper "Padé Activation Units: End-to-end Learning of Flexible Activation Functions in Deep Networks" [ICLR 2020].
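Padé Activation Units learn a rational function P(x)/Q(x); the "safe" variant in the paper keeps the denominator ≥ 1 so the function has no poles. A minimal NumPy sketch of that idea (coefficient names `a`, `b` are illustrative; in practice they are learned per layer):

```python
import numpy as np

def pau(x, a, b):
    # Safe Pade Activation Unit:
    #   P(x) = a0 + a1*x + ... + am*x^m
    #   Q(x) = 1 + |b1*x + ... + bn*x^n|
    # The absolute value keeps Q(x) >= 1, avoiding division by zero.
    x = np.asarray(x, dtype=float)
    num = sum(aj * x ** j for j, aj in enumerate(a))
    den = 1.0 + np.abs(sum(bk * x ** (k + 1) for k, bk in enumerate(b)))
    return num / den
```

With suitable coefficients a PAU can approximate ReLU, Leaky ReLU, or tanh, which is why the paper initializes the coefficients to match a known activation before end-to-end training.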
Triton reimplementation of the Smooth ReLU activation function proposed in the paper "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations" [arXiv 2022].
Official source code for "Deep Learning with Swift for TensorFlow" 📖