Meganet.m

A fresh approach to deep learning written in MATLAB

Home page: http://www.xtract.ai/

Reporting Bugs

We are just getting started, so please be patient with us. If you find a bug, please report it by opening an issue or by emailing lruthotto@emory.edu. In either case, include a small example that helps us reproduce the error. We will work on it as quickly as possible.

Getting started

  1. Clone or download the code
  2. Add the folder to your MATLAB path (see the sketch after this list)
  3. (optional) run KernelTypes/mexcuda/make_cuda.m for fast CNNs using CuDNN
  4. (optional) download the test data or binary files described below
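
As a rough sketch of steps 2 and 3, the commands below add the code to the MATLAB path and (optionally) build the CUDA kernels. The clone location ~/Meganet.m is only an assumption; adjust it to where you put the code.

  % Hedged sketch of steps 2 and 3; the clone location is an assumption.
  cd ~/Meganet.m                                         % folder containing the cloned code
  addpath(genpath(pwd));                                 % add Meganet.m and all subfolders to the path
  % (optional) build the CuDNN-based convolution kernels
  run(fullfile('KernelTypes','mexcuda','make_cuda.m'));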

Optional Binary Files

The convMCN kernel type and the average pooling layer require compiled binaries from the MatConvNet package. Please follow the MatConvNet installation instructions and add the compiled files for vl_nnconv, vl_nnconvt, and vl_nnpool to your MATLAB path.

For best performance, compile these files with GPU or CuDNN support.
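
For example, adding the binaries to the path might look like the sketch below. The MatConvNet location is an assumption and depends on where you built it; the compiled MEX files normally live in its matlab/mex folder.

  % Hedged sketch: the MatConvNet location is an assumption; point it at your build.
  matconvnetDir = fullfile(getenv('HOME'),'matconvnet');        % where MatConvNet was compiled
  addpath(fullfile(matconvnetDir,'matlab','mex'));              % contains vl_nnconv, vl_nnconvt, vl_nnpool
  y = vl_nnconv(randn(8,8,1,1,'single'),randn(3,3,1,8,'single'),[]);  % quick smoke test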

Additional Test Data

Some examples use the following benchmark datasets; a sketch for reading the raw MNIST images follows the list.

  1. MNIST
  2. CIFAR10
  3. STL-10
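
A hedged sketch for reading the raw MNIST training images in MATLAB, assuming train-images-idx3-ubyte (the file name used on the MNIST website) has been downloaded and unzipped into the current folder:

  % The MNIST files use the big-endian IDX format.
  fid   = fopen('train-images-idx3-ubyte','r','ieee-be');
  magic = fread(fid,1,'int32');                        % should equal 2051
  nImg  = fread(fid,1,'int32');                        % number of images (60000)
  nRow  = fread(fid,1,'int32');                        % 28
  nCol  = fread(fid,1,'int32');                        % 28
  Y     = fread(fid,[nRow*nCol, nImg],'uint8');        % one image per column (returned as double)
  fclose(fid);
  Y     = Y/255;                                       % scale pixel values to [0,1]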

References

The implementation is based on the ideas presented in:

  1. Haber E, Ruthotto L: Stable Architectures for Deep Neural Networks, Inverse Problems, 2017
  2. Chang B, Meng L, Haber E, Ruthotto L, Begert D, Holtham E: Reversible Architectures for Arbitrarily Deep Residual Neural Networks, AAAI Conference on Artificial Intelligence 2018
  3. Haber E, Ruthotto L, Holtham E, Jun SH: Learning across scales - A multiscale method for Convolution Neural Networks, AAAI Conference on Artificial Intelligence 2018
