franalgaba / neural-network-cairo

Neural Network implementation from scratch for MNIST using Cairo 1.0

Neural Network for MNIST in Cairo 1.0

Implementation of a Neural Network from scratch using Cairo 1.0 for MNIST predictions.

The NN has a simple two-layer architecture, sketched in code after this list:

  • Input layer a[0] has 784 units, corresponding to the 784 pixels in each 28x28 input image.
  • A hidden layer a[1] has 10 units with ReLU activation.
  • Output layer a[2] has 10 units corresponding to the ten digit classes, with softmax activation.
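
For reference, the forward pass of this architecture corresponds to the following NumPy sketch (names and shapes here are illustrative, not the repo's API; the Cairo implementation runs the same computation over quantized integer tensors rather than floats):

    import numpy as np

    def relu(z):
        return np.maximum(0, z)

    def softmax(z):
        e = np.exp(z - z.max())  # subtract the max for numerical stability
        return e / e.sum()

    def forward(x, w1, b1, w2, b2):
        # x: flattened 28x28 image, shape (784,)
        # w1: (10, 784), b1: (10,)  -> hidden layer a[1]
        # w2: (10, 10),  b2: (10,)  -> output layer a[2]
        a1 = relu(w1 @ x + b1)
        a2 = softmax(w2 @ a1 + b2)
        return a2

    def predict(x, w1, b1, w2, b2):
        # The predicted digit is the index of the largest output unit.
        return int(np.argmax(forward(x, w1, b1, w2, b2)))

Training happens offline in TensorFlow; the learned weights are quantized and loaded into the Cairo NN, which only needs this forward/predict path at inference time.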

Functionalities implemented in Cairo 1.0:

  • Vector implementation with operations: sum, max, min, argmax.
  • Matrix implementation with operations: get, dot, add, len.
  • Tensor implementation.
  • 8-bit weight quantization based on ONNX quantization (sketched after this list).
  • ReLU activation.
  • Forward propagation for the NN.
  • Predict method for the NN.
  • Pseudo-softmax activation optimized for quantized values (also sketched after this list).
  • Weight loading into the Cairo NN from a trained TensorFlow NN.
  • MNIST inference using the Cairo NN.
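
The 8-bit quantization follows ONNX's linear scale/zero-point scheme. As a rough sketch of that scheme (assuming asymmetric uint8 quantization in the style of ONNX QuantizeLinear; the Cairo version presumably stores the resulting integers as field elements):

    import numpy as np

    def quantize_8bit(w):
        # q = clip(round(w / scale) + zero_point, 0, 255)
        qmin, qmax = 0, 255
        scale = (w.max() - w.min()) / (qmax - qmin)
        zero_point = int(np.clip(np.round(qmin - w.min() / scale), qmin, qmax))
        q = np.clip(np.round(w / scale) + zero_point, qmin, qmax).astype(np.uint8)
        return q, scale, zero_point

    def dequantize_8bit(q, scale, zero_point):
        # Approximate recovery of the original floats.
        return (q.astype(np.int32) - zero_point) * scale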
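
The pseudo-softmax is a cheap stand-in for the real softmax on quantized logits. Its exact form isn't documented here; one common integer-friendly surrogate (shown purely as an assumption) replaces e^x with 2^x, which is monotone, so the argmax, and hence the prediction, matches the true softmax:

    import numpy as np

    def pseudo_softmax(q_logits):
        # Shift so the largest logit maps to exponent 0 (others negative).
        shifted = q_logits.astype(np.int64) - int(q_logits.max())
        # 2**x instead of e**x: powers of two are cheap in integer
        # arithmetic, and the output ordering is unchanged.
        e = 2.0 ** shifted
        return e / e.sum()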

Working with the project

Currently supports building and testing contracts.

Build

Build the contracts.

$ make build

Test

Run the tests in src/test:

$ make test

Format

Format the Cairo source code (using Scarb):

$ make fmt

Credits

Built with auditless/cairo-template

License: MIT License