ARP.jl

An implementation of the Auto-Rotating Perceptrons in Julia


This repository contains a Flux.jl implementation of the Auto-Rotating Perceptrons (Saromo, Villota, and Villanueva) for dense layers of artificial neural networks.

What is an Auto-Rotating Perceptron?

An ARP is a generalization of the perceptron unit that aims to avoid the vanishing gradient problem by keeping the activation function's input close to zero, without altering the inference structure learned by the perceptron: the pre-activation is only rescaled, so the hyperplane the unit has learned is preserved.

Classic perceptron: y = f(w ⋅ x + b). Auto-Rotating Perceptron: y = f(ρ (w ⋅ x + b)), where the factor ρ rescales the pre-activation so that the input of f stays small.

Hence, a classic perceptron is the particular case of an ARP with ρ = 1.
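
To make the mechanics concrete, here is a minimal, self-contained sketch of an auto-rotating dense layer in Flux. This is not this package's code: the layer name `ARPDense`, the hyperparameters `x_Q` (target bound on the activation's input) and `x_max` (assumed upper bound of each input feature), and the particular rule used to compute `ρ` below (bounding |w ⋅ x + b| over the input domain and scaling it down to x_Q) are illustrative assumptions in the spirit of the paper; consult the reference paper and the package source for the exact definitions.

```julia
using Flux

# Sketch of an auto-rotating dense layer: y = σ.(ρ .* (W*x .+ b)).
# ρ is recomputed from the current weights on every forward pass, so a
# trained unit keeps its decision hyperplane while the activation input
# stays inside [-x_Q, x_Q]. The bound L assumes inputs lie in [0, x_max].
struct ARPDense{M,V,F}
    W::M
    b::V
    σ::F
    x_Q::Float32    # target bound on the activation function's input (assumed name)
    x_max::Float32  # assumed upper bound of each input feature (assumed name)
end

ARPDense(in::Integer, out::Integer, σ = tanh; x_Q = 1f0, x_max = 1f0) =
    ARPDense(Flux.glorot_uniform(out, in), zeros(Float32, out), σ,
             Float32(x_Q), Float32(x_max))

Flux.@functor ARPDense (W, b)  # only W and b are trainable parameters

function (l::ARPDense)(x)
    z = l.W * x .+ l.b
    # Per-unit upper bound of |z| over the input domain [0, x_max]^n,
    # so that |ρ .* z| ≤ x_Q. Setting ρ = 1 would recover a classic Dense layer.
    L = l.x_max .* vec(sum(abs, l.W; dims = 2)) .+ abs.(l.b)
    ρ = l.x_Q ./ max.(L, eps(Float32))
    l.σ.(ρ .* z)
end

# Hypothetical usage: drop-in replacement for Dense inside a Chain.
model = Chain(ARPDense(4, 8, tanh), ARPDense(8, 2, identity))
ŷ = model(rand(Float32, 4, 16))  # forward pass on a batch of 16 samples
```

Note that ρ depends only on the weights, not on the individual sample, so each unit keeps the hyperplane it has learned; only the sharpness of the activation changes.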

The information above is adapted from the original repository; see the reference paper for details.


License: MIT

