jchuter's starred repositories
hearthstone-battlegrounds-simulator
A simulator for battles in Hearthstone Battlegrounds
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
stanford-cs-221-artificial-intelligence
VIP cheatsheets for Stanford's CS 221 Artificial Intelligence
OpenNMT-py
Open Source Neural Machine Translation and (Large) Language Models in PyTorch
Style-Transfer-in-Text
Paper List for Style Transfer in Text
react-basic
A description of the conceptual model of React without implementation burden.
functional_widget
A code generator to write widgets as functions without losing the benefits of classes.
Awesome-pytorch-list
A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc.
quickdraw-dataset
Documentation on how to access and use the Quick, Draw! Dataset.
immutable-js
Immutable persistent data collections for JavaScript which increase efficiency and simplicity.
redux-observable
RxJS middleware for action side effects in Redux using "Epics"
awesome-flutter
An awesome list that curates the best Flutter libraries, tools, tutorials, articles and more.
pytorch-CycleGAN-and-pix2pix
Image-to-Image Translation in PyTorch
react-native
A framework for building native applications using React
stanford-cs-229-machine-learning
VIP cheatsheets for Stanford's CS 229 Machine Learning
the-incredible-pytorch
The Incredible PyTorch: a curated list of tutorials, papers, projects, communities and more relating to PyTorch.
SfmLearner-Pytorch
PyTorch version of SfmLearner from Tinghui Zhou et al.
Pytorch-UNet
PyTorch implementation of the U-Net for semantic image segmentation with high-quality images
minimal-nmt
A minimal NMT example to serve as a seq2seq + attention reference.
pytorch-unet
Tunable U-Net implementation in PyTorch
Linear-Attention-Recurrent-Neural-Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop on the cell state, just like any other RNN cell. (LARNN)
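The core idea — an RNN cell that attends over a window of its own past cell states — can be sketched roughly as follows. This is a toy single-head NumPy illustration, not the repository's implementation: the actual LARNN uses multi-head attention, BN-LSTM-style normalization, and positional encodings, and all names here (`WindowedAttentionCell`, `attend`, `step`) are hypothetical.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class WindowedAttentionCell:
    """Toy RNN cell that, like the LARNN, queries a window of its own
    past cell states (single-head dot-product attention, for brevity)."""

    def __init__(self, input_size, hidden_size, window=5, seed=0):
        rng = np.random.default_rng(seed)
        self.window = window
        self.hidden_size = hidden_size
        # combined LSTM-style gate weights over [input, hidden, attended past]
        d = input_size + 2 * hidden_size
        self.W = rng.normal(0.0, 0.1, (4 * hidden_size, d))
        self.b = np.zeros(4 * hidden_size)
        self.past_cells = []  # rolling window of previous cell states

    def attend(self, h):
        # query = current hidden state; keys/values = windowed past cell states
        if not self.past_cells:
            return np.zeros(self.hidden_size)
        K = np.stack(self.past_cells[-self.window:])
        scores = K @ h / np.sqrt(self.hidden_size)
        return softmax(scores) @ K

    def step(self, x, h, c):
        a = self.attend(h)  # attention summary of past cell states
        z = self.W @ np.concatenate([x, h, a]) + self.b
        i, f, o, g = np.split(z, 4)
        c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h_new = sigmoid(o) * np.tanh(c_new)
        self.past_cells.append(c_new)  # remember for future queries
        return h_new, c_new

# used inside a plain loop on the cell state, just like any other RNN cell
cell = WindowedAttentionCell(input_size=3, hidden_size=4)
h, c = np.zeros(4), np.zeros(4)
for t in range(10):
    h, c = cell.step(np.ones(3), h, c)
```

The only structural difference from a vanilla LSTM cell is the extra `attend` term concatenated into the gate pre-activation, which is what lets the cell reach back past its immediate predecessor state.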
conv_arithmetic
A technical report on convolution arithmetic in the context of deep learning
pytorch-CortexNet
PyTorch implementation of the CortexNet predictive model
Convolutional_LSTM_PyTorch
Multi-layer convolutional LSTM with PyTorch