Repositories under the tanh topic:
Implementation of CORDIC Algorithms Using Verilog
Deep Learning
Simple multi-layer perceptron application using the feed-forward back-propagation algorithm
A neural network (NN) with two hidden layers is implemented, in addition to the input and output layers. The code lets the user choose sigmoid, tanh, or relu as the activation function. Prediction accuracy is computed at the end.
Data classification using an MLP
Classes Angle, GeoPos, UTM32 and some other Math functions
Faster Java implementations of hypot, expm1, cos, sinh, cosh, tanh, asin, acos, atan and atan2
Neural Network from scratch without any machine learning libraries
The Activation Functions project repository contains implementations of various activation functions commonly used in neural networks.
Implementation of an ANN for recognition of the Iris plant family
Modifies a neural network's hyperparameters, activation functions, cost functions, and regularization methods to improve training performance and generalization.
Neural network with 2 hidden layers
Artificial Neural Networks Activation Functions
Lightweight neural network library written in ANSI C supporting prediction and backpropagation for convolutional and fully connected neural networks
Generic L-layer 'straight in Python' fully connected Neural Network implementation using numpy.
Compute the hyperbolic cotangent of a number.
Compute the hyperbolic tangent of a number.
Create an iterator which evaluates the hyperbolic tangent for each iterated value.
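The two hyperbolic functions described above are related by a simple identity: coth(x) is the reciprocal of tanh(x). A minimal Python sketch using only the standard library (the `coth` helper is illustrative, not from any listed repository):

```python
import math

def coth(x: float) -> float:
    """Hyperbolic cotangent: cosh(x) / sinh(x), the reciprocal of tanh(x).

    Undefined at x = 0, where sinh(x) = 0.
    """
    return math.cosh(x) / math.sinh(x)

x = 1.5
t = math.tanh(x)  # hyperbolic tangent, always in the open interval (-1, 1)
c = coth(x)       # hyperbolic cotangent

# Check the reciprocal identity coth(x) = 1 / tanh(x)
assert abs(c - 1.0 / t) < 1e-12
```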
This repository contains Python code that generates visualizations for various activation functions commonly used in neural networks.
Exploration of teamwork in neural networks
Implementation of a Neural Network in C++
Predicting Song Popularity Using Neural Networks with Backpropagation Algorithm Based on Audio Features
This repository delves into the role of activation functions in perceptron-based classification models. It features a comprehensive Jupyter notebook demonstrating different activation functions, their mathematical foundations, and their impact on model performance.
Revising concepts of CNN by building them from scratch using NumPy.
Time series forecast using RNN and LSTM
This repo is created for learning about computer vision and pattern recognition
Deep Learning model for predicting success after donation coded in Google Colab
An activation function in the context of neural networks is a mathematical function applied to the output of a neuron. The purpose of an activation function is to introduce non-linearity into the model, allowing the network to learn and represent complex patterns in the data.
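As a quick sketch of the idea above, here are the three activation functions mentioned across these projects (sigmoid, tanh, relu) written with NumPy; the function names are illustrative and not taken from any listed repository:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: squashes inputs into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Hyperbolic tangent: squashes inputs into (-1, 1), zero-centered."""
    return np.tanh(x)

def relu(x):
    """Rectified linear unit: passes positive inputs, zeroes out negatives."""
    return np.maximum(0.0, x)

# Without a non-linearity, stacked layers collapse into a single linear map;
# applying any of these functions to a neuron's output breaks that linearity.
x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))
print(tanh(x))
print(relu(x))
```

tanh is often preferred over sigmoid for hidden layers because its output is zero-centered, while relu is cheap to compute and avoids saturation for positive inputs.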