SpydazWebAI-NLP / BasicNeuralNetWork2023

A basic multi-layered neural network with attention-masking features

Home Page: https://spydazwebai-nlp.github.io/BasicNeuralNetWork2023/


Single Layer Neural Network (VB.NET)

This repository contains an implementation of a single-layer neural network in Visual Basic (.NET). The neural network is designed for simple classification tasks and demonstrates the principles of forward and backward propagation.
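To give a feel for what a single-layer forward and backward pass involves, here is a minimal sketch in VB.NET: one sigmoid output unit trained on a squared-error loss. This is illustrative only and does not mirror the repository's actual class layout.

```vb.net
Imports System

Module SingleLayerSketch

    Function Sigmoid(x As Double) As Double
        Return 1.0 / (1.0 + Math.Exp(-x))
    End Function

    Sub Main()
        ' Toy example: two inputs, one sigmoid output, squared-error loss.
        Dim inputs() As Double = {0.5, 0.1}
        Dim weights() As Double = {0.4, -0.2}
        Dim bias As Double = 0.1
        Dim target As Double = 1.0
        Dim learningRate As Double = 0.5

        ' Forward pass: weighted sum of the inputs, passed through the sigmoid.
        Dim weightedSum As Double = bias
        For i As Integer = 0 To inputs.Length - 1
            weightedSum += inputs(i) * weights(i)
        Next
        Dim output As Double = Sigmoid(weightedSum)

        ' Backward pass: gradient of the squared error with respect to each weight,
        ' using the sigmoid derivative output * (1 - output).
        Dim delta As Double = (output - target) * output * (1.0 - output)
        For i As Integer = 0 To inputs.Length - 1
            weights(i) -= learningRate * delta * inputs(i)
        Next
        bias -= learningRate * delta

        Console.WriteLine("Output after one step: " & output)
    End Sub

End Module
```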

Features

- Implements a single-layer neural network with customizable activation functions.
- Supports various transfer functions, including Sigmoid, Hyperbolic Tangent, Rectified Linear Unit (ReLU), and Softmax.
- Provides functions for training the neural network using different learning algorithms, such as Sigmoid Training and Softmax Training.
- Includes utility functions for matrix operations, error calculations, and weight initialization.
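For reference, the transfer functions listed above are conventionally defined as follows. This is a sketch of the standard formulas in VB.NET, not necessarily the exact code used in this repository.

```vb.net
Imports System
Imports System.Linq

Module TransferFunctions

    Function Sigmoid(x As Double) As Double
        Return 1.0 / (1.0 + Math.Exp(-x))
    End Function

    Function HyperbolicTangent(x As Double) As Double
        Return Math.Tanh(x)
    End Function

    Function ReLU(x As Double) As Double
        Return Math.Max(0.0, x)
    End Function

    ' Softmax operates on a whole vector: each output is the exponential of one
    ' input divided by the sum of exponentials over all inputs.
    Function Softmax(values() As Double) As Double()
        Dim maxValue As Double = values.Max() ' subtract the max for numerical stability
        Dim result(values.Length - 1) As Double
        Dim total As Double = 0.0
        For i As Integer = 0 To values.Length - 1
            result(i) = Math.Exp(values(i) - maxValue)
            total += result(i)
        Next
        For i As Integer = 0 To values.Length - 1
            result(i) /= total
        Next
        Return result
    End Function

End Module
```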

Getting Started

To get started with the single-layer neural network, follow these steps:

Prerequisites

Visual Studio or any compatible VB.NET development environment.

Installation

1. Clone the repository: git clone https://github.com/your-username/single-layer-neural-network.git
2. Open the project in your VB.NET development environment.

Usage

The main code file, SingleLayerNeuralNetwork.vb, contains the implementation of the single-layer neural network class. The class provides various functions for training and using the neural network.

Here are the main functions:

- TrainSigmoid: Trains the neural network using the Sigmoid activation function.
- TrainSoftMax: Trains the neural network using the Softmax activation function.
- TrainRNN: Trains the neural network using a Recurrent Neural Network (RNN) architecture.
- Forward: Performs the forward pass of the neural network.
- Backward: Performs the backward pass of the neural network to update weights.
- ComputeTotalError: Calculates the total error of the neural network.

Refer to the code comments and function signatures for more details on how to use these functions.
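A typical call sequence might look like the sketch below. The constructor and method signatures shown here are assumptions made for illustration; consult SingleLayerNeuralNetwork.vb for the actual parameter lists and return types.

```vb.net
Imports System

Module UsageExample

    Sub Main()
        ' Assumed constructor: number of inputs and number of outputs.
        Dim network As New SingleLayerNeuralNetwork(3, 2)

        Dim inputs() As Double = {0.2, 0.7, 0.1}
        Dim targets() As Double = {1.0, 0.0}

        ' Assumed signature: train with the Sigmoid activation and a learning rate of 0.1.
        For epoch As Integer = 1 To 1000
            network.TrainSigmoid(inputs, targets, 0.1)
        Next

        ' Assumed signature: Forward returns the output vector for the given inputs.
        Dim outputs() As Double = network.Forward(inputs)
        Console.WriteLine(String.Join(", ", outputs))
    End Sub

End Module
```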

Contributing

Contributions to this project are welcome! If you find any issues or have suggestions for improvements, feel free to open an issue or submit a pull request.

To contribute to this project, follow these steps:

1. Fork this repository.
2. Create a new branch: git checkout -b feature/your-feature
3. Make your changes and commit them: git commit -m 'Add your feature'
4. Push to the branch: git push origin feature/your-feature
5. Submit a pull request.

License

This project is licensed under the MIT License.

Contact

For any inquiries or questions, please contact LeroySamuelDyer@hotmail.co.uk.
