Knet

Knet (pronounced "kay-net") is the Koç University deep learning framework implemented in Julia by Deniz Yuret and collaborators. It supports GPU operation and automatic differentiation using dynamic computational graphs for models defined in plain Julia; a short gradient example follows the list of starting points below. You can install Knet by typing the following at the Julia prompt: using Pkg; Pkg.add("Knet"). Some starting points:

  • Tutorial: introduces Julia and Knet via examples.
  • Documentation: installation, introduction, design, implementation, full reference and deep learning chapters.
  • Examples: more tutorials and example models.
  • Benchmarks: comparison of Knet's speed with TensorFlow, PyTorch, DyNet etc.
  • Paper: Yuret, D. "Knet: beginning deep learning with 100 lines of Julia." In Machine Learning Systems Workshop at NIPS 2016.
  • KnetML: GitHub organization with Knet repos of models, tutorials, layer collections and other resources.
  • Images: Knet machine images are available for AWS, Singularity and Docker.
  • Issues: if you find a bug, please open a GitHub issue.
  • knet-users: if you need help or would like to request a feature, please join this mailing list.
  • knet-dev: if you would like to contribute to Knet development, please join this mailing list and check out these tips.
  • knet-slack: Slack channel for Knet.
  • Related work: Please check out Flux, Mocha, JuliaML, JuliaDiff, JuliaGPU, JuliaOpt for related packages.
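
The automatic differentiation mentioned above is provided by the AutoGrad package that Knet builds on. Here is a minimal gradient sketch (not part of the original list; the toy loss is illustrative) using Param, @diff, value and grad, which Knet re-exports:

using Knet                    # re-exports AutoGrad's Param, @diff, value and grad

x = Param([1.0, 2.0, 3.0])    # mark x as a parameter to differentiate with respect to
J = @diff sum(abs2.(x))       # run the computation while recording it on a tape
value(J)                      # 14.0: the value of sum(abs2.(x))
grad(J, x)                    # [2.0, 4.0, 6.0]: the gradient of the result w.r.t. x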

Example

Here is a simple example where we define, train and test the LeNet model on the MNIST handwritten digit recognition dataset from scratch, using about 15 lines of code and 30 seconds of GPU computation.

using Knet

# Define convolutional layer:
struct Conv; w; b; f; end
(c::Conv)(x) = c.f.(pool(conv4(c.w, x) .+ c.b))
Conv(w1,w2,cx,cy,f=relu) = Conv(param(w1,w2,cx,cy), param0(1,1,cy,1), f)
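# (param creates a learnable array with Xavier-style random initialization; param0 creates a zero-initialized one. Both are tracked for automatic differentiation.)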

# Define dense layer:
struct Dense; w; b; f; end
(d::Dense)(x) = d.f.(d.w * mat(x) .+ d.b)
Dense(i::Int,o::Int,f=relu) = Dense(param(o,i), param0(o), f)

# Define a chain of layers and a loss function:
struct Chain; layers; end
(c::Chain)(x) = (for l in c.layers; x = l(x); end; x)
(c::Chain)(x,y) = nll(c(x),y)
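# (nll computes the negative log likelihood, i.e. softmax cross-entropy, of the scores c(x) against the gold labels y.)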

# Load MNIST data:
include(Knet.dir("data","mnist.jl"))
dtrn, dtst = mnistdata()
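# (mnistdata returns minibatch iterators over the training and test sets as (input, label) pairs.)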

# Define, train and test LeNet (about 30 secs on a GPU to reach 99% accuracy):
LeNet = Chain((Conv(5,5,1,20), Conv(5,5,20,50), Dense(800,500), Dense(500,10,identity)))
adam!(LeNet, repeat(dtrn,10))
accuracy(LeNet, dtst)
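
Once trained, the model can be used and saved directly. The following sketch is not part of the original example: it classifies one test minibatch and then persists the model, assuming Knet's JLD2-based Knet.save and Knet.load helpers; the file name is illustrative.

# Classify the first test minibatch with the trained model:
x, y = first(dtst)                                     # one (input, label) minibatch
scores = LeNet(x)                                      # 10×batchsize matrix of class scores
preds = vec(mapslices(argmax, Array(scores), dims=1))  # best-scoring class index per column

# Save and later reload the trained model:
Knet.save("lenet.jld2", "model", LeNet)
LeNet2 = Knet.load("lenet.jld2", "model")
accuracy(LeNet2, dtst)                                 # should match the accuracy above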

Contributing

Knet is an open-source project, and we are always open to new contributions: bug reports and fixes, feature requests, new machine learning models and operators, inspiring examples, and benchmarking results are all welcome.

Current contributors:

  • Can Gümeli
  • Carlo Lucibello
  • Ekin Akyürek
  • Ekrem Emre Yurdakul
  • Emre Ünal
  • Emre Yolcu
  • Enis Berk
  • Erenay Dayanık
  • İlker Kesen
  • Kai Xu
  • Meriç Melike Softa
  • Mike Innes
  • Onur Kuru
  • Ozan Arkan Can
  • Ömer Kırnap
  • Phuoc Nguyen
  • Rene Donner
  • Tim Besard
  • Zhang Shiwei
