There are 32 repositories under the neural-networks-from-scratch topic.
Data science teaching materials
A complete neural network built entirely in x86 assembly language that learns to recognize handwritten digits from the MNIST dataset. No frameworks, no high-level languages, just pure assembly; ~5.3× faster than NumPy.
A neural network library written from scratch in Rust, along with a web-based application for building and training neural networks and visualizing their outputs
Detailed Python notes and code for the lectures and exercises of Andrej Karpathy's course "Neural Networks: Zero to Hero," which focuses on building neural networks from scratch.
🤖 A TypeScript version of karpathy/micrograd — a tiny scalar-valued autograd engine and a neural net on top of it
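The core idea behind micrograd-style engines can be sketched in a few dozen lines: each value records its inputs and a local backward rule, and `backward()` applies the chain rule in reverse topological order. The class and method names below are illustrative assumptions, not micrograd's actual API.

```python
# Minimal scalar autograd sketch (micrograd-style); names are assumptions.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad      # d(a+b)/da = 1
            other.grad += out.grad     # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply local rules in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a      # c = 2*3 + 2 = 8
c.backward()       # dc/da = b + 1 = 4, dc/db = a = 2
```

Building a neural net on top of this only requires composing such `Value` operations into neurons and layers.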
An Open Convolutional Neural Network Framework in C++ From Scratch
Neural networks from scratch in Python, written for use as teaching material in graduate courses (Deep Learning, Deep Learning for Computer Vision) taught by Minh-Chien Trinh at Jeonbuk National University.
Unsupervised Deep Learning-based Pansharpening with Jointly-Enhanced Spectral and Spatial Fidelity
A step-by-step walkthrough of the inner workings of a simple neural network. The goal is to demystify the calculations behind neural networks by breaking them down into understandable components, including forward propagation, backpropagation, gradient calculations, and parameter updates.
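The components that entry names can be condensed into a single training step. The sketch below, under assumed shapes and a tanh hidden layer, shows forward propagation, backpropagation via the chain rule, gradient calculation, and a plain gradient-descent parameter update; all names and hyperparameters are illustrative.

```python
# One training step of a one-hidden-layer network (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))          # 4 samples, 3 features
y = rng.normal(size=(4, 1))          # regression targets
W1 = rng.normal(size=(3, 5)) * 0.1   # input -> hidden weights
b1 = np.zeros(5)
W2 = rng.normal(size=(5, 1)) * 0.1   # hidden -> output weights
b2 = np.zeros(1)
lr = 0.1

# Forward propagation
h = np.tanh(X @ W1 + b1)             # hidden activations
y_hat = h @ W2 + b2                  # predictions
loss = np.mean((y_hat - y) ** 2)     # mean squared error

# Backpropagation: chain rule applied layer by layer
d_yhat = 2 * (y_hat - y) / len(X)    # dL/d(y_hat)
dW2 = h.T @ d_yhat
db2 = d_yhat.sum(axis=0)
d_h = d_yhat @ W2.T * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
dW1 = X.T @ d_h
db1 = d_h.sum(axis=0)

# Parameter updates (plain gradient descent)
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

Repeating this step over many iterations is all that "training" means at this level of abstraction.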
Learn machine learning the hard way
My first ML sandbox
Implementation of a feedforward neural network with backpropagation, from scratch
Let's build Neural Networks from scratch.
Neural is a domain-specific language (DSL) designed for defining, training, debugging, and deploying neural networks. With declarative syntax, cross-framework support, and built-in execution tracing (NeuralDbg), it simplifies deep learning development.
Lightweight, easy-to-use micro neural network framework written in Rust, with no Python dependencies
Matrix-vector library designed for neural network construction, with CUDA (GPU) support, OpenMP (multithreaded CPU) support, partial BLAS support, an expression-template-based implementation, PTX code generation identical to hand-written kernels, and support for auto-differentiation
Learn to build neural networks from scratch, simply. No autograd, no deep learning libraries, just NumPy.
Neural nets for high accuracy multivariable nonlinear regression.
My first deep learning project: an MNIST handwritten-digit classifier implemented completely from scratch, without prebuilt frameworks such as TensorFlow or PyTorch (TensorFlow is imported only to load the MNIST dataset). The model uses two hidden layers with Adam (Adaptive Moment Estimation) optimization and dropout regularization.
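The Adam rule referenced in that entry maintains running estimates of the gradient's first and second moments and applies bias-corrected updates. A sketch of one update for a single parameter array, using the common default hyperparameters (which are assumptions, not necessarily the repository's settings):

```python
# Adam update for one parameter array (illustrative sketch, default hyperparameters).
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
grad = np.array([0.5, -0.5])
w, m, v = adam_step(w, grad, m, v, t=1)   # first step moves each weight by ~lr
```

Because of bias correction, the very first step has magnitude close to `lr` regardless of the gradient's scale, which is one reason Adam is popular for from-scratch projects.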
Neural network with VHDL and MATLAB
An XOR gate whose output is predicted by a neural network :fire:
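XOR is the classic minimal problem that a single-layer network cannot solve but a network with one hidden layer can. A self-contained training sketch, with an assumed architecture (four tanh hidden units, sigmoid output, binary cross-entropy loss) that may differ from the repository's:

```python
# Training a tiny network on XOR (illustrative sketch; architecture assumed).
import numpy as np

rng = np.random.default_rng(42)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # 2 inputs -> 4 hidden units
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # 4 hidden units -> 1 output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)                     # forward pass
    p = sigmoid(h @ W2 + b2)
    losses.append(-np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9)))
    d_z = p - y                                  # BCE gradient w.r.t. pre-sigmoid logits
    dW2 = h.T @ d_z; db2 = d_z.sum(axis=0)
    d_h = d_z @ W2.T * (1 - h ** 2)              # backprop through tanh
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)
    W1 -= 0.1 * dW1; b1 -= 0.1 * db1
    W2 -= 0.1 * dW2; b2 -= 0.1 * db2

preds = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

The hidden layer is what makes the non-linearly-separable XOR mapping learnable at all.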
Artificial Neural Networks (ANNs) Projects
Pure Python Simple Neural Network (SNN) library
A set of Jupyter notebooks implementing simple neural networks described in Michael Nielsen's book.
To understand neural networks thoroughly, I implemented them from scratch in C++; this repository contains that implementation.
Deep Learning course, Sharif University of Technology, Dr. Soleymani, Spring 2024
Neural Networks and optimizers from scratch in NumPy, featuring newer optimizers such as DemonAdam or QHAdam.
Multilayer perceptron from scratch in Python
Implementation of George Hotz's tinygrad.
Code for my youtube video: Neural Network Crash Course, Ep 1
Implementing neural networks using maths and NumPy only
This repository contains an implementation of a neural network from scratch using only NumPy, a fundamental library for numerical computing in Python. The network is designed for tasks such as classification, regression, and other supervised learning problems.