shahidash / nn

🧠 Minimal implementations of neural network architectures and layers in PyTorch with side-by-side notes

Home Page: http://lab-ml.com/labml_nn/index.html




This is a collection of simple PyTorch implementations of neural networks and related algorithms. Each implementation is documented with explanations, and the website renders the code and notes as side-by-side formatted pages. We believe these will help you understand the algorithms better.

We are actively maintaining this repo and adding new implementations.

Modules

✨ Transformers module contains implementations of multi-head attention and relative multi-head attention.

✨ LSTM

✨ Sketch RNN

✨ Optimizers
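As a quick illustration of the kind of layer covered by the Transformers module, here is a minimal self-attention example using plain PyTorch's built-in `torch.nn.MultiheadAttention`. This is a sketch for orientation only, not labml_nn's own API; the tensor sizes are arbitrary choices for the example.

```python
import torch
import torch.nn as nn

# Example dimensions (arbitrary, chosen for illustration).
seq_len, batch, d_model, heads = 10, 2, 64, 8

# Standard PyTorch multi-head attention layer.
mha = nn.MultiheadAttention(embed_dim=d_model, num_heads=heads)

# Default input layout is (seq_len, batch, d_model).
x = torch.randn(seq_len, batch, d_model)

# Self-attention: query, key, and value are all the same tensor.
out, attn_weights = mha(x, x, x)

print(out.shape)           # (10, 2, 64): same shape as the input
print(attn_weights.shape)  # (2, 10, 10): weights averaged over heads
```

The annotated implementations in the repo build this layer from scratch (projections, scaled dot-product attention, head splitting) with notes explaining each step.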

Installation

pip install labml_nn

Citing LabML

If you use LabML for academic research, please cite the library using the following BibTeX entry.

@misc{labml,
 author = {Varuna Jayasiri and Nipun Wijerathne},
 title = {LabML: A library to organize machine learning experiments},
 year = {2020},
 url = {https://lab-ml.com/},
}

About


License: MIT License


Languages

Python 72.6%, Jupyter Notebook 27.2%, Makefile 0.2%