Repositories under the elu-activation topic:
My extensive work on multiclass image classification, based on the Intel Image Classification dataset from Kaggle and implemented in PyTorch 🔦
The Activation Functions project repository contains implementations of various activation functions commonly used in neural networks.
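For reference, the ELU itself is simple to write down; a minimal plain-Python sketch (the `alpha` saturation parameter is the usual convention, not taken from the repository):

```python
import math

def elu(x, alpha=1.0):
    # ELU: identity for positive inputs, alpha * (exp(x) - 1) for negative ones;
    # smoothly saturates toward -alpha as x -> -inf
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(elu(1.0))   # positive inputs pass through unchanged
print(elu(-1.0))  # a negative input, pulled toward -alpha
```

Unlike ReLU, the negative side has a nonzero gradient, which is the property these repositories exploit.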
• Trained the network on the MNIST dataset. • Implemented a neural network on MNIST using Sigmoid, ReLU, and ELU as the activation functions. • Analyzed the network's running time, error rate, efficiency, and accuracy.
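The activation comparison described above can be sketched as one forward pass with a swappable activation (a NumPy sketch with hypothetical layer sizes for a 784-dimensional MNIST input; not the repository's code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def forward(x, w1, w2, act):
    # one hidden layer; `act` is the activation under comparison
    return act(x @ w1) @ w2

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 784))          # one flattened 28x28 MNIST image
w1 = rng.normal(size=(784, 128)) * 0.01  # assumed hidden width of 128
w2 = rng.normal(size=(128, 10)) * 0.01   # 10 digit classes

for act in (sigmoid, relu, elu):
    logits = forward(x, w1, w2, act)
    print(act.__name__, logits.shape)
```

Swapping only `act` while holding the weights fixed is what makes the running-time and accuracy comparison fair.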
• Designed a deep residual learning model with the exponential linear unit (ELU) for higher-accuracy image classification. • Reduced the error rate to 5.62% on CIFAR-10 and 26.55% on CIFAR-100, outperforming the most competitive previously published approaches. • Published a research paper on this work on 21st Sept 2016 at an ACM conference.
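A residual block with ELU in place of the usual ReLU might look like the following (a NumPy sketch under assumed layer shapes and a pre-activation layout; not the published architecture):

```python
import numpy as np

def elu(x, alpha=1.0):
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def residual_block(x, w1, w2):
    # pre-activation residual block: activation, then weights, twice,
    # with an identity shortcut added back at the end
    h = elu(x) @ w1
    h = elu(h) @ w2
    return x + h  # identity shortcut

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 64))             # assumed feature width of 64
w1 = rng.normal(size=(64, 64)) * 0.01
w2 = rng.normal(size=(64, 64)) * 0.01
out = residual_block(x, w1, w2)
print(out.shape)
```

The shortcut requires the block to preserve the feature width, which is why both weight matrices here are square.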
Train a car on a known track to generate a dataset that includes the steering angle and the car's view from three different camera angles. Use this dataset to drive the car on an unknown track. Also learn to identify 43 different traffic signs using an existing dataset.
Advanced deep learning techniques applied to the MNIST dataset, achieving over 98% validation-set accuracy.
Deep learning concepts practice using the CIFAR-10 dataset