dmhacker / cyberdyne

Feedforward neural network in a TI-84 graphing calculator

cyberdyne

As a fun side project, I programmed a simple feedforward neural network into my graphing calculator. This is in no way efficient or practical, but it was interesting to work within the memory and runtime constraints I had.

These images are of a neural network trained on the AND function. From left to right: main menu screen, backpropagation, network test execution, error plot.

Inside the ti-84/ folder there are six separate programs, each handling a different aspect of a basic feedforward neural network with tanh as the activation function.

NNET.8xp is the core program; it contains the code for running the network on a given input vector. The input and output vectors are stored in TI-Basic lists, and the neuron/connection information is stored in a 2D matrix (TI-Basic does not support 3D matrices, so the layered weight information is flattened into two dimensions).
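As a rough illustration (not the actual NNET.8xp listing, which packs every layer into one flattened matrix), the forward pass for a single layer in TI-Basic might look like the sketch below. The variable names are hypothetical. Lines beginning with a quote are bare strings, which TI-Basic evaluates harmlessly, so they serve as comments.

    :"FORWARD PASS, ONE LAYER (SKETCH)
    :"L1 = INPUT ACTIVATIONS, L2 = OUTPUTS
    :"[A](I,J) = WEIGHT FROM INPUT I TO NEURON J
    :For(J,1,dim(L2))
    :0→S
    :For(I,1,dim(L1))
    :S+[A](I,J)*L1(I)→S
    :End
    :tanh(S)→L2(J)
    :End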

NNETEXE.8xp is a GUI wrapper around NNET.8xp. It lets the user provide an input vector, calls NNET.8xp to generate the output vector, and then prints the output so the user can read it.
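The wrapper's core could be as simple as the hypothetical sketch below, where the user types a list such as {0,1} at the prompt:

    :"GUI WRAPPER (SKETCH)
    :Input "INPUT VECTOR:",L1
    :prgmNNET
    :Disp "OUTPUT:",L2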

NNETGEN.8xp generates the neural network. The calculator can only hold one network at a time (due to the number of matrices a network requires), so this program will overwrite any previously trained network unless its weight matrix was backed up elsewhere. It takes in a list, with each integer specifying how many neurons are in the respective layer; e.g., {2, 3, 1} would create a three-layer network with two inputs, one output, and one hidden layer of three neurons.
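A hypothetical generator sketch: randM( fills a matrix with random integers between -9 and 9, so scaling by 0.1 gives small initial weights. The flattening scheme here (one row per layer transition, padded columns) is only one possibility and not necessarily the one cyberdyne uses.

    :"GENERATE NETWORK (SKETCH)
    :{2,3,1}→L1
    :"ONE ROW PER LAYER TRANSITION, PADDED COLUMNS
    :dim(L1)-1→R
    :max(L1)²→C
    :"SMALL RANDOM STARTING WEIGHTS IN [-.9,.9]
    :.1randM(R,C)→[A]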

NNETLRN.8xp is by far the bulkiest of these programs, mainly because it contains all of the code for backpropagation. Users can control how many cycles to train the network for, which input-output pairs to train it on (provided in a matrix), and what learning rate to use. The backpropagation algorithm itself is the standard, widely used one, but because the calculator takes so long to run it, especially when learning non-linearly-separable functions, I modified the algorithm to use the Fahlman constant: a small number (k = 0.1) added to the activation derivative when computing the initial delta error. This keeps training moving when that derivative is at or near 0, a flat spot commonly encountered in the test case of the XOR function.
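In code, the flat-spot fix is essentially one extra term. For tanh, the activation derivative is 1 - a², so a hypothetical output-layer delta computation with the Fahlman constant looks like:

    :"OUTPUT DELTAS WITH FAHLMAN CONSTANT (SKETCH)
    :"L2 = OUTPUTS, L3 = TARGETS, L4 = DELTAS
    :.1→K
    :dim(L2)→dim(L4)
    :For(J,1,dim(L2))
    :(L3(J)-L2(J))*(1-L2(J)²+K)→L4(J)
    :End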

NNETPLOT.8xp provides a visualization of the network's overall error as you continue to train it. The quantity plotted is the total sum squared error of the network, averaged across the entire training set. If the network is learning correctly, you should see a roughly exponential decrease in error, with the error approaching 0 as the number of training cycles grows.
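The plotted quantity is easy to express in TI-Basic, since list arithmetic is elementwise; a hypothetical sketch, with P patterns and N the current training cycle:

    :"AVG SUM SQUARED ERROR OVER P PATTERNS (SKETCH)
    :"FOR EACH PATTERN: L2 = OUTPUT, L3 = TARGET
    :0→E
    :For(I,1,P)
    :"...RUN NETWORK ON PATTERN I HERE...
    :E+sum((L3-L2)²)→E
    :End
    :"PLOT MEAN ERROR AT TRAINING CYCLE N
    :Pt-On(N,E/P)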

NNETMAIN.8xp is the GUI portal program that ties together NNETEXE.8xp, NNETGEN.8xp, NNETLRN.8xp, and NNETPLOT.8xp. It is recommended that you use this program to access anything related to the neural network, rather than calling each of those programs individually from the EXEC screen.
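The portal maps naturally onto TI-Basic's Menu( command; a hypothetical sketch:

    :"MAIN MENU (SKETCH)
    :Menu("CYBERDYNE","RUN NETWORK",R,"GENERATE",G,"TRAIN",T,"PLOT ERROR",P,"QUIT",Q)
    :Lbl R
    :prgmNNETEXE
    :Stop
    :Lbl G
    :prgmNNETGEN
    :Stop
    :Lbl T
    :prgmNNETLRN
    :Stop
    :Lbl P
    :prgmNNETPLOT
    :Stop
    :Lbl Q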


License: MIT License

