neuron

Scala library for neural networks

This project is a work-in-progress that provides a flexible framework for experimenting with and running neural networks of heterogeneous topologies. A preliminary release will be available soon. neuron is written in Scala and adopts the so-called "define-by-run" scheme, in which the network graph is built by running ordinary code.
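
As a taste of the define-by-run style, here is a minimal sketch (not taken from the library's documentation) in which the topology is assembled by ordinary Scala code at runtime. It relies only on the layer classes, the ** stacking operator and the create() call that appear in the MNIST example below; the layer widths and the boolean switch are hypothetical.

package neuron.examples

import neuron.core._

object DefineByRunSketch extends Workspace {
  // The hidden width is decided at runtime; in a ** b the right-hand network
  // is applied to the input first, so the chain reads 784 -> hidden -> 10.
  def buildMLP(wideHidden: Boolean) = {
    val hidden = if (wideHidden) 400 else 200
    (new RegularizedLinearNN(hidden, 10, 1E-4) **  // hidden -> 10 output units
     new SingleLayerNeuralNetwork(hidden) **       // nonlinearity on the hidden layer
     new RegularizedLinearNN(784, hidden, 1E-4)    // 784 inputs -> hidden
    ).create()                                     // instantiate the template as a module
  }
}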

Features

  • template vs. module
  • neural network operators
  • autoencoders (with or without tiled weights)
  • activation functions: logistic, tanh, ReLU, softplus
  • metrics (L1, L2, Mahalanobis, Softmax)
  • regularization: weight decay, activation sparsity, dropout, maxout
  • data parallel framework: atomic parameters + distributed states
  • optimization: LBFGS, SGD, SAGD, SGD with momentum
  • recursive neural networks

Documentation

The simplest example trains a regularized multi-layer perceptron to predict handwritten digits from the MNIST dataset; it takes around ten minutes to run.

package neuron.examples

import neuron.core._
import neuron.math._

object MLP_MNIST extends Workspace with Optimizable {
    def main(args: Array[String]): Unit = {
      // set @MLP=784-200-10, @weight_decay=1E-4
      nn = (new RegularizedLinearNN(200, 10, 1E-4) **
            new SingleLayerNeuralNetwork(200) **
            new RegularizedLinearNN(784, 200, 1E-4)).create() // nn is declared in trait @Optimizable
            
      // load standard MNIST training data
      val (xData, yData) = LoadData.mnistDataM("std", "train")
      
      // generate random weight and initialize
      val theta0 = nn.getRandomWeights("get random weights").toWeightVector()
      nn.setWeights("set weight", theta0)
      
      // full-batch training (@maxIter=200, @distance=SoftMaxDistance)
      val (_, theta) = trainx(xData, yData, theta0, 200, SoftMaxDistance)
      
      // load standard MNIST testing data
      val (xDataTest, yDataTest) = LoadData.mnistDataM("std", "t10k")
      
      // estimate accuracy
      val accuracy = (yDataTest.argmaxCol() countEquals nn(xDataTest, null).argmaxCol()) / xDataTest.cols.toDouble
      println(accuracy)
    }
}
/* Accuracy: 0.9806 */
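
A note on reading the example: in a ** b, the right-hand network is applied to the input first, so the stack is read from bottom to top as a regularized 784-to-200 linear layer, a 200-unit activation layer, and a regularized 200-to-10 linear layer, matching the 784-200-10 comment. trainx then performs full-batch training for 200 iterations with SoftMaxDistance as the error metric, and the trained network is applied directly to the test matrix to estimate accuracy.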

Also have a look at

  • Basics: explains the most fundamental ideas behind neuron and why they make it a good choice for prototyping neural networks with flexibility, simplicity and efficiency.
  • Auto-Encoder: a special family of unsupervised neural networks.
  • Examples: more examples are available under src/main/scala/neuron/tutorials/
  • Scaladoc: TBA

FAQ

  • How is neuron different from other deep learning libraries (such as Theano, Torch7, etc.), besides being Scala based?

    We argue that the representational power of a neural network comes not only from its number of parameters but also from its infrastructure (network topology, training strategy, etc.). neuron focuses on fast prototyping of novel network architectures. Using Scala, we attempt to implement neural networks in a mixed functional and imperative style ... though neuron is not yet mature enough to be considered industrially proven.

  • How fast is neuron?

    neuron is currently backed by Breeze for numerical computation, which should be fast, and the extra cost of data control flow is kept to a minimum. neuron also provides convenient data parallelization.

Reference

  • Breeze and Nak: a set of libraries for machine learning and numerical computing
  • UFLDL Tutorial: a Stanford course, find solutions at Github
  • ScalaSTM: a lightweight software transactional memory for Scala

The MIT License (MIT)

Copyright (c) 2014 - 2015 Jianbo Ye

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
