There are 35 repositories under the relu-activation topic.
With this package, you can generate mixed-integer linear programming (MIP) models of trained artificial neural networks (ANNs) that use the rectified linear unit (ReLU) activation function. At the moment, only TensorFlow sequential models are supported. Interfaces to both the Pyomo and Gurobi modeling environments are offered.
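To illustrate the kind of formulation such a tool produces, here is a minimal Pyomo sketch of the standard big-M MIP encoding of a single ReLU neuron, y = max(0, w·x + b); the variable names, bounds, and big-M value are illustrative assumptions, not this package's actual API.

```python
# Sketch of the standard big-M MIP encoding of one ReLU neuron
# y = max(0, w.x + b). Names and the bound M are illustrative only.
from pyomo.environ import (ConcreteModel, Var, Constraint, Objective,
                           maximize, Binary, NonNegativeReals, Reals)

w, b, M = [1.0, -2.0], 0.5, 100.0  # assumed weights and a valid big-M bound

m = ConcreteModel()
m.x = Var(range(2), within=Reals, bounds=(-10, 10))  # network inputs
m.pre = Var(within=Reals)             # pre-activation w.x + b
m.y = Var(within=NonNegativeReals)    # post-activation output (>= 0)
m.z = Var(within=Binary)              # 1 if the neuron is active

m.affine = Constraint(expr=m.pre == sum(w[i] * m.x[i] for i in range(2)) + b)
m.lb = Constraint(expr=m.y >= m.pre)                  # y >= pre
m.ub_active = Constraint(expr=m.y <= m.pre + M * (1 - m.z))  # z=1: y = pre
m.ub_inactive = Constraint(expr=m.y <= M * m.z)              # z=0: y = 0

m.obj = Objective(expr=m.y, sense=maximize)  # e.g., maximize the activation
```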
QReLU and m-QReLU: Two novel quantum activation functions for Deep Learning in TensorFlow, Keras, and PyTorch
A feed-forward neural network with ReLU activation, cross-entropy loss, and the Adam optimizer.
Implementation of a neural network from scratch using only NumPy (Conv, FC, and MaxPool layers, optimizers, and activation functions).
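For a flavor of what such a from-scratch implementation involves, here is a minimal NumPy sketch of a fully connected layer with ReLU, forward and backward (the names are illustrative, not this repository's actual code):

```python
import numpy as np

# Fully connected layer with ReLU. Shapes: x is (batch, n_in), W is (n_in, n_out).
def fc_relu_forward(x, W, b):
    pre = x @ W + b            # affine pre-activation
    out = np.maximum(0, pre)   # ReLU
    return out, (x, W, pre)

def fc_relu_backward(dout, cache):
    x, W, pre = cache
    dpre = dout * (pre > 0)    # ReLU gradient: pass-through where pre > 0
    dW = x.T @ dpre
    db = dpre.sum(axis=0)
    dx = dpre @ W.T
    return dx, dW, db

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 3))
W = rng.standard_normal((3, 2)) * 0.1
b = np.zeros(2)
out, cache = fc_relu_forward(x, W, b)
dx, dW, db = fc_relu_backward(np.ones_like(out), cache)
```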
Training Mario using a genetic algorithm and an artificial neural network.
Python from-scratch implementation of a Neural Network Classifier. Dive into the fundamentals of approximation, non-linearity, regularization, gradients, and backpropagation.
Explain fully connected ReLU neural networks using rules
This is a Feed-Forward Neural Network with back-propagation written in C++ from scratch with no external libraries.
Hackathon project that uses object detection and neural networks in Python to identify and classify waste in images.
Image Compression using one hidden layer Neural Network
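For flavor, a one-hidden-layer autoencoder compresses each image patch into a smaller bottleneck code and reconstructs it on the way out; the following NumPy sketch assumes 8×8 patches and a 16-unit bottleneck, which are illustrative choices rather than this repository's actual settings.

```python
import numpy as np

# Hypothetical sketch of one-hidden-layer image compression: the k-unit
# bottleneck code is the compressed representation of an n-value patch.
n, k = 64, 16  # 8x8 patch flattened to 64 values, compressed to 16
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((n, k)) * 0.1, np.zeros(k)
W2, b2 = rng.standard_normal((k, n)) * 0.1, np.zeros(n)

def encode(x):                            # x: (batch, 64), values in [0, 1]
    return np.maximum(0, x @ W1 + b1)     # ReLU hidden layer = compressed code

def decode(code):
    return code @ W2 + b2                 # linear reconstruction of the patch

patches = rng.random((10, n))
recon = decode(encode(patches))           # train W1, W2 to minimize MSE
```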
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
This repository helps in understanding the vanishing gradient problem through visualization.
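The core effect is easy to reproduce: the sigmoid's derivative never exceeds 0.25, so a chain of layer-wise derivatives shrinks geometrically with depth, while ReLU's derivative is 1 on the active side. A standalone sketch (not this repository's code):

```python
import numpy as np

# Gradient magnitude through n stacked layers at a fixed pre-activation.
# Sigmoid chains shrink geometrically; a ReLU chain (derivative 1 when
# active) preserves the signal.
def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1 - s)

x = 0.5
for n in (1, 5, 10, 20):
    print(n, sigmoid_grad(x) ** n, 1.0 ** n)  # sigmoid chain vs ReLU chain
```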
Extensive work on multiclass image classification using the Intel Image Classification dataset from Kaggle, implemented in PyTorch 🔦
This project builds an artificial neural network using the PyTorch library; the code and output are included.
Open-source AI library (audio to text, simple NLP, and common algorithms)
A minimal, feature-limited deep learning library, created to better understand the field.
Predicts whether a patient has heart disease.
A multi-task deep learning model that solves multiple tasks concurrently. Specifically, the model is designed to be effective in multiple contexts (e.g., classification, segmentation, regression) simultaneously.
Noise2Noise is an AI denoiser trained with noisy images only. We implemented a lighter version that trains faster on smaller pictures without losing performance, and an even simpler one in which every low-level component was implemented from scratch, including a reimplementation of autograd.
Text Generation
A deep learning framework built on NumPy, written to learn how everything works under the hood.
Good Seed employed data science for alcohol-law compliance. My role included using specialized checkout cameras for alcohol purchases, applying advanced computer vision for age verification, and designing a model to confirm age. I built a model based on ResNet50 with ReLU activations and a single output neuron.
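A plausible Keras sketch of such a transfer-learning setup follows; the head sizes, input shape, and loss are assumptions, not the project's exact architecture:

```python
import tensorflow as tf

# Hypothetical sketch: ResNet50 backbone with a ReLU head and a single
# output neuron for age regression. Layer sizes are assumptions.
backbone = tf.keras.applications.ResNet50(
    include_top=False, weights='imagenet', input_shape=(224, 224, 3))

model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation='relu'),
    tf.keras.layers.Dense(1),  # single neuron outputs the predicted age
])
model.compile(optimizer='adam', loss='mse', metrics=['mae'])
```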
Here we provide a PyTorch regime for solving the heat equation, a partial differential equation, via the Deep Kolmogorov Method of Beck et al.
NU Bootcamp Module 21
Experimenting different neural network architectures for detecting spam emails
A deep-learning neural network to analyze and classify the success of charitable donations.
A small web app for visualizing activation functions.
This project uses classical neural networks to compute solutions of the heat equation with Neumann boundary conditions and a Gaussian distribution as the initial condition.
Learning Python, day 11.
American Sign Language (ASL) Detection using CNN
Prediction of the absolute surface temperature of a star using neural networks.
This project trains and tests a CNN model that classifies cat and dog images. The model is built with the Keras library on the TensorFlow backend.
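A minimal Keras sketch of such a binary classifier might look like the following; the layer sizes and input shape are assumptions, not this project's exact model:

```python
import tensorflow as tf

# Hypothetical minimal CNN for binary cat-vs-dog classification.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(150, 150, 3)),
    tf.keras.layers.Conv2D(32, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(1, activation='sigmoid'),  # cat vs. dog
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])
```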
The main aim of this project is to build a predictive model on G Store data that predicts the total revenue per customer, helping make better use of the marketing budget.
Neural network implemented with different activation functions (sigmoid, ReLU, leaky ReLU, softmax) and different optimizers (gradient descent, AdaGrad, RMSProp, Adam). You can choose among different loss functions as well: cross-entropy loss, hinge loss, and mean squared error (MSE).
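As an example of one such optimizer, here is the standard Adam update written in plain NumPy (the function name and defaults are illustrative, not necessarily this repository's code):

```python
import numpy as np

# One Adam step on parameter w given gradient g (standard update rule).
def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g           # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g       # second-moment (variance) estimate
    m_hat = m / (1 - b1 ** t)           # bias correction, t starts at 1
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```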
"A TensorFlow-based neural network model for classifying handwritten digits from the MNIST dataset."