The goal of this crate is to provide a Rust neural network library which is simple, easy to use, minimal, fast and production ready.
This is still experimental! Seriously, I mean it.
Currently this only works on Rust nightly and requires Python 3.7, TensorFlow 1.13 and NumPy 1.15 at runtime, as it uses TensorFlow Keras (through Python!). It might work with older versions, but there are no guarantees.
Warning: TensorFlow 1.12 (and possibly 1.11) has a broken dropout layer; don't use those versions.
- Support TensorFlow Keras as a backend (due to how buggy TensorFlow is, this is only temporarily the default)
- Support basic layer types:
  - Fully connected
  - Dropout
  - Activation:
    - Logistic
    - TanH
    - ReLU
    - LeakyReLU
    - ELU
    - Softmax
  - Convolutional
  - Max pooling
  - Add
  - Mul
- Support building sequential models
- Support building graph models
- Add a MNIST example
- Add a CIFAR-10 example
  - Train it to a reasonable accuracy
- Add LSUV-like weight initialization
- Replace the random orthogonal matrix generator with a pure Rust one
- Add automatic input normalization (zero mean, unit variance)
- Add a native backend (reimplement the compute parts using pure Rust code)
  - Use multiple threads
  - Use SIMD
- Make the Python + TensorFlow dependency optional (compile time)
- Add full API documentation and add `#![deny(missing_docs)]`
- Support more layer types:
  - Batch normalization (maybe)
  - RNN (maybe)
- Add other backends:
  - Figure out a compute abstraction; either:
    - a) Define a custom Rust-like compute language and a source-level translator
    - b) Write the compute parts in Rust, compile them to WASM, and write a recompiler of the WASM bytecode
    - c) Write a SPIR-V recompiler, and write the compute parts in something that can compile to SPIR-V
  - OpenCL
  - Vulkan
  - WebGL (?)
- Export a C API
- Compile to WebAssembly and publish to NPM as a JS library
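As a rough illustration of the automatic input normalization mentioned in the roadmap (zero mean, unit variance), here is a minimal pure-Rust sketch; the function name is hypothetical and not part of this crate's API:

```rust
/// Normalizes `data` in place to zero mean and unit variance.
/// (Illustrative sketch only; not this crate's actual API.)
fn normalize(data: &mut [f32]) {
    let n = data.len() as f32;
    let mean = data.iter().sum::<f32>() / n;
    let variance = data.iter().map(|x| (x - mean).powi(2)).sum::<f32>() / n;
    // Guard against division by zero for constant inputs.
    let std_dev = variance.sqrt().max(f32::EPSILON);
    for x in data.iter_mut() {
        *x = (*x - mean) / std_dev;
    }
}
```

Normalizing inputs like this typically makes training converge faster, since the weights don't have to compensate for wildly different input scales.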
In a nutshell a neural network is a mathematical construct which can automatically learn how to transform data from one representation into another one. Say, for example, that you have an old black and white photo of your grandma, and you'd like to colorize it. A neural network can help you with that!
What you would do in such a case is to find a bunch of photos which are already in color and convert them to black and white. You've just got yourself a data set for training a neural network! Now you can tell a neural network - hey, here's a bunch of black and white photos, and here are the same photos but they're in color; please learn how to transform the black and white photos into the color ones!
And that's exactly what the neural network will do. If properly trained, it will learn how to generalize; in other words, it will be able to turn more-or-less any black and white photo into a color one, even if it has never seen that particular photo before!
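The dataset preparation step described above (converting color photos to black and white) can be sketched in plain Rust. The function below is purely illustrative and not part of this crate's API; it uses the standard ITU-R BT.601 luma weights for the conversion:

```rust
/// Converts interleaved RGB pixels ([r, g, b, r, g, b, ...]) into
/// single-channel grayscale using the ITU-R BT.601 luma weights.
/// (Illustrative sketch only; not this crate's actual API.)
fn rgb_to_grayscale(rgb: &[u8]) -> Vec<u8> {
    rgb.chunks_exact(3)
        .map(|px| {
            let (r, g, b) = (px[0] as f32, px[1] as f32, px[2] as f32);
            (0.299 * r + 0.587 * g + 0.114 * b).round() as u8
        })
        .collect()
}
```

Each grayscale image produced this way becomes a training input, with the original color image as its target output.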
If you want to contribute something significant (especially to expose new functionality from TensorFlow) please create an issue first to discuss it. This is not a research-level library, nor is it supposed to be a huge kitchen sink.
Licensed under either of
- Apache License, Version 2.0, (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)
at your option.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.