There are 27 repositories under the tanh topic.
Implementation of CORDIC Algorithms Using Verilog
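The CORDIC repository above targets Verilog; as an illustration of how the hyperbolic rotation mode of CORDIC yields tanh, here is a minimal Python sketch (the function name and iteration count are my own, not taken from the repo):

```python
import math

def cordic_tanh(z, n=24):
    # Hyperbolic-mode CORDIC in rotation mode: drives the angle z to 0
    # with shift-and-add micro-rotations. Converges for |z| < ~1.118.
    x, y = 1.0, 0.0
    i, k = 1, 0
    repeats = {4, 13}  # these indices must be iterated twice for convergence
    while k < n:
        for _ in range(2 if i in repeats else 1):
            d = 1.0 if z >= 0 else -1.0
            x, y, z = (x + d * y * 2.0 ** -i,
                       y + d * x * 2.0 ** -i,
                       z - d * math.atanh(2.0 ** -i))
            k += 1
            if k >= n:
                break
        i += 1
    # the CORDIC gain cancels in the ratio: y/x -> sinh(z0)/cosh(z0)
    return y / x
```

In hardware the `atanh(2^-i)` values would come from a small lookup table and the multiplications by `2^-i` become bit shifts, which is the point of CORDIC.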
Deep Learning
Simple multi-layer perceptron application using the feed-forward backpropagation algorithm
Classes Angle, GeoPos, UTM32 and some other math functions
Neural Network from scratch without any machine learning libraries
A neural network (NN) with two hidden layers, in addition to the input and output layers, is implemented. The code lets the user choose sigmoid, tanh, or ReLU as the activation function. Prediction accuracy is computed at the end.
Data classification using an MLP
Faster Java implementations of hypot, expm1, cos, sinh, cosh, tanh, asin, acos, atan and atan2
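The repo above provides fast Java math routines; one standard trick for a cheap tanh is a low-order Padé approximant. A Python sketch of the idea (this is an illustration of the technique, not the repo's actual method):

```python
def fast_tanh(x):
    # [3/2] Pade approximant of tanh: x*(15 + x^2) / (15 + 6*x^2).
    # Error is ~3e-4 at |x| = 1 and grows beyond, so the result is
    # clamped; real fast-math code would add proper range reduction.
    x2 = x * x
    t = x * (15.0 + x2) / (15.0 + 6.0 * x2)
    return max(-1.0, min(1.0, t))
```

The approximant replaces the exponential with a few multiplies and one divide, which is where the speedup comes from.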
Implementation of an ANN for recognition of the Iris plant family
Neural network with 2 hidden layers
This repo is created for learning about computer vision and pattern recognition
The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks.
Artificial Neural Networks Activation Functions
Compute the hyperbolic cotangent of a number.
Compute the hyperbolic tangent of a number.
Create an iterator which evaluates the hyperbolic tangent for each iterated value.
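The three entries above describe small math-utility packages. A Python sketch of the same functionality, assuming the usual definitions (the identity tanh(x) = (e^{2x} - 1)/(e^{2x} + 1) and coth = 1/tanh; names here are illustrative):

```python
import math

def tanh(x):
    # tanh(x) = (e^(2x) - 1) / (e^(2x) + 1); expm1 keeps precision near 0
    if abs(x) > 350.0:          # avoid overflow in expm1; tanh is +/-1 here
        return math.copysign(1.0, x)
    e = math.expm1(2.0 * x)
    return e / (e + 2.0)

def coth(x):
    # hyperbolic cotangent: reciprocal of tanh (undefined at x = 0)
    return 1.0 / tanh(x)

def itertanh(iterable):
    # iterator that lazily evaluates tanh for each iterated value
    for v in iterable:
        yield tanh(v)
```

Using `expm1` rather than `exp` avoids catastrophic cancellation for small arguments, which is why careful math libraries compute tanh this way.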
Comparison of common activation functions on MNIST dataset using PyTorch.
2nd Project of Course 'Machine Learning' of the SMARTNET programme. Taken at the National and Kapodistrian University of Athens.
Exploration of teamwork in neural networks
Some AI projects implemented from scratch, without explicit use of built-in libraries, added to this repo.
Simple self-written ANN powered by NumPy to classify handwritten digits of the famous MNIST Dataset. ✍️
Developed neural networks (NN) with one, two, and four hidden layers, in addition to the input and output layers. Tested with sigmoid, tanh, and ReLU activation functions. Used scikit-learn for pre-processing data.
Feed-forward neural network to classify FB post likes into low, moderate, or high classes; backpropagation is implemented with a decaying learning rate.
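The entry above mentions training with a decaying learning rate. One common schedule is inverse-time decay; the repo's exact formula is unknown, so this Python sketch is only an illustration of the idea:

```python
def decayed_lr(lr0, epoch, decay=0.01):
    # inverse-time decay: lr = lr0 / (1 + decay * epoch), so the
    # step size shrinks smoothly as training progresses
    return lr0 / (1.0 + decay * epoch)

# each epoch's weight update would then use the decayed rate, e.g.:
#   w -= decayed_lr(0.1, epoch) * gradient
```

Shrinking the step size lets training take large steps early and settle into a minimum later, which is the motivation for decay schedules in backpropagation.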
Neural network implementation from scratch, along with analysis of different activation functions and variations in hidden-layer size and depth.
Time series forecast using RNN and LSTM
Deep Learning model for predicting success after donation coded in Google Colab
Advance Machine Learning (CSL 712) Course Lab Assignments