Raphael Shu's repositories
neuralcompressor
Embedding Quantization (Compress Word Embeddings)
cmlm
A masked language modeling objective that trains a model to predict any subset of the target words, conditioned on both the input text and a partially masked target translation.
Language: Python · License: NOASSERTION
DRL-Labs-Solutions
Personal solutions for DRL bootcamp labs.
fairseq_shu
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
neovim-config
Neovim configuration
OpenNMT-py
Open Source Neural Machine Translation in PyTorch
pytorch-openai-transformer-lm
A PyTorch implementation of OpenAI's fine-tuned transformer language model, with a script to import the weights pre-trained by OpenAI.
smux
Keep tmux sessions open with autossh.
Language: Shell · License: Artistic-2.0
tensorflow
Computation using data flow graphs for scalable machine learning