There are 34 repositories under the backpropagation-algorithm topic.
A TensorFlow-inspired neural network library built from scratch in C# 7.3 for .NET Standard 2.0, with GPU support through cuDNN
Thoroughly understand the backpropagation (BP) algorithm in one article: theoretical derivation + worked data examples + a hands-on project
A tiny neural network 🧠
Training spiking networks with hybrid ANN-SNN conversion and spike-based backpropagation
Artificial intelligence/machine learning course at UCF in Spring 2020 (Fall 2019 and Spring 2019)
matrix square root and its gradient
Backpropagation in Python
Efficiently performs automatic differentiation on arbitrary functions. Basically a rudimentary version of TensorFlow.
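The core idea behind a library like this can be sketched in a few lines. Below is a hedged, minimal reverse-mode automatic-differentiation example (the `Value` class and its methods are illustrative, not this repository's actual API): each operation records its inputs and a local backward rule, and `backprop` applies the chain rule in reverse topological order.

```python
# Minimal reverse-mode autodiff sketch. The Value class is hypothetical,
# written only to illustrate the technique, not any repo's real API.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            self.grad += out.grad      # d(a+b)/da = 1
            other.grad += out.grad     # d(a+b)/db = 1
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = backward
        return out

    def backprop(self):
        # Build a topological order, then apply the chain rule backwards.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# f(x, y) = x * y + x  →  df/dx = y + 1,  df/dy = x
x, y = Value(3.0), Value(4.0)
f = x * y + x
f.backprop()
print(x.grad, y.grad)  # 5.0 3.0
```

Real frameworks add tensors, broadcasting, and many more operations, but the record-then-replay structure is the same.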
A framework for building neural networks and committees (ensembles), and for creating agents with parallel computation.
MATLAB implementations of a variety of machine learning/signal processing algorithms.
Minimalist deep learning library with first and second-order optimization algorithms made for educational purpose
Neural Network in pure PHP
Neural network and backpropagation implemented from scratch to recognize MNIST digits
Training an artificial neural network using back-propagation on MNIST dataset
MNIST classification using a neural network and backpropagation. Written in Python and depends only on NumPy
Multi-layered Convolutional Neural Network written in C++11
A basic implementation of a neural network in java, with back-propagation. Created for learning purposes.
BSc Thesis at FER-2019/20 led by doc. dr. sc. Marko Čupić
VTU Machine Learning lab programs in Python (2015 SCHEME)
Implementation of the backpropagation algorithm using only the linear algebra and other mathematical tools available in NumPy and SciPy.
Java implementation of a backpropagation feedforward neural network with more than one hidden layer
A simple two-layer neural network with NumPy as its only dependency
🤖 Artificial intelligence (neural network) proof of concept that solves the classic XOR problem. It applies well-known neural-network techniques such as gradient descent, feed-forward, and backpropagation.
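The XOR setup described above fits in a short NumPy script. This is a hedged sketch of the general technique (layer sizes, learning rate, and iteration count are illustrative choices, not this repository's code): feed-forward through a small hidden layer, backpropagate the squared error, and take gradient-descent steps.

```python
import numpy as np

# Hypothetical minimal XOR network: 2 inputs, 4 hidden sigmoid units,
# 1 sigmoid output, trained with full-batch gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

for _ in range(10000):
    # Feed forward
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backpropagation of squared error (sigmoid' = s * (1 - s))
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent step
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(np.round(out.ravel(), 2))  # typically close to [0, 1, 1, 0]
```

XOR is the classic demonstration because it is not linearly separable: a single-layer perceptron cannot solve it, so the hidden layer is essential.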
This is a repo for the projects I completed during my Deep Learning Udacity Nanodegree.
Backpropagation algorithm with one hidden layer, trained on MNIST handwritten digits
Implementation of a neural network with backpropagation algorithm
These Codes are written as part of Neural Networks and Deep learning course at UCLA.
Teaching material for my two-hour graduate-level crash course on artificial neural networks
Created with CodeSandbox
Training a multilayer perceptron with one hidden layer via error backpropagation.
Automatic backpropagation implemented in NumPy
Demonstration of the mini-lab (practical) component activities conducted for the course of Neural Networks and Deep Learning (19CSE456).
Simple multi-layer perceptron (MLP) with forward and backward propagation
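A forward-and-backward MLP like the ones in several entries above is easiest to trust when the analytic gradients are checked against finite differences. The sketch below is hedged and illustrative (function names and shapes are assumptions, not any listed repository's code); it implements one hidden tanh layer and verifies the backpropagated gradient numerically.

```python
import numpy as np

# Illustrative one-hidden-layer MLP with forward/backward passes,
# verified by a central finite-difference gradient check.
def forward(params, X):
    W1, b1, W2, b2 = params
    h = np.tanh(X @ W1 + b1)   # hidden activations
    out = h @ W2 + b2          # linear output layer
    return h, out

def loss(params, X, y):
    _, out = forward(params, X)
    return 0.5 * np.sum((out - y) ** 2)

def backward(params, X, y):
    W1, b1, W2, b2 = params
    h, out = forward(params, X)
    d_out = out - y                        # dL/d_out for squared error
    dW2 = h.T @ d_out
    db2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)    # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_h
    db1 = d_h.sum(0)
    return dW1, db1, dW2, db2

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 3))
y = rng.normal(size=(5, 2))
params = [rng.normal(size=(3, 4)), np.zeros(4),
          rng.normal(size=(4, 2)), np.zeros(2)]

grads = backward(params, X, y)

# Central finite-difference check on one entry of W1
eps = 1e-5
plus = [p.copy() for p in params]; plus[0][0, 0] += eps
minus = [p.copy() for p in params]; minus[0][0, 0] -= eps
num = (loss(plus, X, y) - loss(minus, X, y)) / (2 * eps)
print(abs(num - grads[0][0, 0]) < 1e-5)  # True
```

Gradient checking like this catches most sign and transpose mistakes in a hand-written backward pass, which is where from-scratch implementations usually go wrong.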