There are 26 repositories under the batchnorm topic.
A CV toolkit for my papers.
How to use Cross-Replica / Synchronized BatchNorm in PyTorch
MXNet Gluon Synchronized Batch Normalization Preview
Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers https://arxiv.org/abs/1802.00124
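The core idea behind this paper is that a channel whose learned BatchNorm scale gamma has been driven toward zero contributes little and can be pruned away. A minimal sketch of just the selection step in PyTorch (the helper name and `keep_ratio` are illustrative; real pruning must also rewire the adjacent conv layers):

```python
import torch
import torch.nn as nn

def channels_to_keep(bn: nn.BatchNorm2d, keep_ratio: float = 0.5) -> torch.Tensor:
    """Rank channels by |gamma| and keep the largest fraction (sketch)."""
    gamma = bn.weight.detach().abs()                  # per-channel BN scale
    n_keep = max(1, int(keep_ratio * gamma.numel()))
    keep = torch.argsort(gamma, descending=True)[:n_keep]
    return torch.sort(keep).values                    # indices of surviving channels

bn = nn.BatchNorm2d(64)
print(channels_to_keep(bn, keep_ratio=0.25))          # 16 channel indices
```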
Deep learning projects including applications (face recognition, neural style transfer, autonomous driving, sign language reading, music generation, translation, speech recognition and NLP) and theories (CNNs, RNNs, LSTM, Adam, Dropout, BatchNorm, Xavier/He initialization, hyperparameter tuning, regularization, optimization, Residual Networks). Deep Learning Specialization by Andrew Ng, deeplearning.ai
Review materials for the TWiML Study Group. Contains annotated versions of the original Jupyter notebooks (look for names like *_jcat.ipynb), slide decks from weekly Zoom meetups, etc.
Synchronized BatchNorm in PyTorch 1.0
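Both PyTorch entries above predate native support; since roughly PyTorch 1.1 the same behavior ships in core as `torch.nn.SyncBatchNorm`. A minimal sketch of the conversion (the DistributedDataParallel wrapping is only indicated, since it needs an initialized process group):

```python
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1),
    nn.BatchNorm2d(16),   # plain BN: statistics computed per process
    nn.ReLU(),
)

# Swap every BatchNorm*d for SyncBatchNorm so mean/variance are
# reduced across all processes in the (default) process group.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)

# Then, inside an initialized distributed job:
# model = nn.parallel.DistributedDataParallel(model.cuda(), device_ids=[rank])
```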
Cross-platform mobile neural network C library for on-device training and inference. CPU only; suited to time-series data.
MNIST classification using a neural network and backpropagation. Written in Python and depends only on NumPy.
Partial transfusion: on the expressive influence of trainable batch norm parameters for transfer learning. TL;DR: fine-tuning only the batch norm affine parameters leads to performance similar to fine-tuning all of the model parameters.
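A minimal sketch of that fine-tuning regime, assuming torchvision's resnet18 as a stand-in for the paper's models (the same recipe applies to the "Training BatchNorm and Only BatchNorm" experiments listed further below): freeze everything, then re-enable only the BN affine parameters.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(weights="IMAGENET1K_V1")   # any pretrained network works

for p in model.parameters():                # freeze every parameter ...
    p.requires_grad = False

bn_params = []
for m in model.modules():                   # ... except BN's gamma and beta
    if isinstance(m, nn.BatchNorm2d):
        m.weight.requires_grad = True
        m.bias.requires_grad = True
        bn_params += [m.weight, m.bias]

optimizer = torch.optim.Adam(bn_params, lr=1e-3)
```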
Playground repository highlighting a problem with BatchNorm layers, for a blog article
Code to fold the batch norm layers of a DNN model in PyTorch
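Folding works because BN in eval mode is an affine map of the conv output: with scale s = gamma / sqrt(var + eps), the fused weights are w' = s·w and b' = s·(b − mean) + beta. A minimal PyTorch sketch (inference only):

```python
import torch
import torch.nn as nn

@torch.no_grad()
def fold_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Return a single conv equivalent to conv followed by bn (eval mode)."""
    fused = nn.Conv2d(conv.in_channels, conv.out_channels, conv.kernel_size,
                      conv.stride, conv.padding, conv.dilation,
                      conv.groups, bias=True)
    s = bn.weight / torch.sqrt(bn.running_var + bn.eps)   # gamma / sqrt(var + eps)
    fused.weight.copy_(conv.weight * s.reshape(-1, 1, 1, 1))
    b = conv.bias if conv.bias is not None else torch.zeros(conv.out_channels)
    fused.bias.copy_((b - bn.running_mean) * s + bn.bias)
    return fused

conv, bn = nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8)
bn.eval()
x = torch.randn(1, 3, 16, 16)
print(torch.allclose(bn(conv(x)), fold_bn(conv, bn)(x), atol=1e-5))  # True
```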
A C# WGAN.
This repository contains different implementations of deep learning models using NumPy.
Digit recognition neural network using the MNIST dataset. Features include a full GUI, convolution, pooling, momentum, Nesterov momentum, RMSProp, batch normalization, and deep networks.
Implementation of a Fully Connected Neural Network, Convolutional Neural Network (CNN), and Recurrent Neural Network (RNN) from Scratch, using NumPy.
Part of a larger project; this portion focuses on implementing MLPs and batch normalization with NumPy and Python only.
Batch normalization from scratch on LeNet, using tensorflow.keras on the MNIST dataset. The goal is to learn and characterize batch normalization's impact on network performance.
Neural networks from scratch using NumPy: a multi-layer NN with BatchNorm and a character-level RNN.
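The from-scratch entries above all reimplement the same core operation. A minimal NumPy sketch of the training-time forward pass (running-statistics bookkeeping and the backward pass omitted):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """x: (N, D) batch; gamma, beta: (D,) learned scale and shift."""
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # ~zero mean, unit variance
    return gamma * x_hat + beta             # learned affine transform

x = np.random.randn(128, 10) * 3.0 + 5.0
out = batchnorm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
print(out.mean(axis=0).round(3), out.std(axis=0).round(3))  # ~0s and ~1s
```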
A set of experiments inspired by the paper "Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs" by Jonathan Frankle, David J. Schwab, and Ari S. Morcos.
MXNet implementation of the Filter Response Normalization (FRN) layer published at CVPR 2020
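That repository targets MXNet; for consistency with the other sketches here, this is a hedged PyTorch rendition of the FRN + TLU computation from the paper: normalize by the mean squared activation over the spatial dimensions (no batch statistics), then apply a learned thresholded linear unit.

```python
import torch
import torch.nn as nn

class FRN(nn.Module):
    """Filter Response Normalization + Thresholded Linear Unit (sketch)."""
    def __init__(self, channels: int, eps: float = 1e-6):
        super().__init__()
        shape = (1, channels, 1, 1)
        self.gamma = nn.Parameter(torch.ones(shape))
        self.beta = nn.Parameter(torch.zeros(shape))
        self.tau = nn.Parameter(torch.zeros(shape))   # TLU threshold
        self.eps = eps

    def forward(self, x):
        nu2 = x.pow(2).mean(dim=(2, 3), keepdim=True)  # mean squared activation
        x = x * torch.rsqrt(nu2 + self.eps)            # batch-independent norm
        return torch.max(self.gamma * x + self.beta, self.tau)

print(FRN(8)(torch.randn(2, 8, 16, 16)).shape)  # torch.Size([2, 8, 16, 16])
```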
Implements a GAN (Generative Adversarial Network) on the MNIST dataset, varying the hyperparameters and analyzing the corresponding results.
Built CNN models with different numbers of layers, with added dropout, on MNIST data
Built MLPs with ReLU and Adam optimization at 2, 3, and 5 layers and observed how they perform.
I implemented a classifier using the BatchNorm provided by TensorFlow-Slim and analyzed the results using TensorBoard.
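A hedged sketch of what that wiring looked like in the TF 1.x / TF-Slim era; the classic pitfall is that BatchNorm's moving averages live in UPDATE_OPS and must be attached to the train step explicitly, as at the end:

```python
import tensorflow as tf                       # TF 1.x
import tensorflow.contrib.slim as slim

images = tf.placeholder(tf.float32, [None, 28, 28, 1])
labels = tf.placeholder(tf.int64, [None])
is_training = tf.placeholder(tf.bool)

# slim attaches batch norm to the conv through normalizer_fn.
net = slim.conv2d(images, 32, [3, 3],
                  normalizer_fn=slim.batch_norm,
                  normalizer_params={"is_training": is_training})
logits = slim.fully_connected(slim.flatten(net), 10, activation_fn=None)
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))

# BatchNorm's moving mean/variance updates are collected in UPDATE_OPS;
# without this dependency they are never run and eval accuracy suffers.
update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
with tf.control_dependencies(update_ops):
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)
```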
Solutions for Andrej Karpathy's "Neural Networks: Zero to Hero" course