David Stap's repositories
encoder-decoder
Encoder-decoder model with Luong attention: two LSTM layers with 500 hidden units on both the encoder and decoder side. The source (English) and target (Dutch) vocabularies each contain 50,000 tokens. The model is trained on the train split of the TED dataset (https://wit3.fbk.eu/mt.php?release=2017-01-trnmted) with a maximum sequence length of 50.
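A minimal PyTorch sketch of the architecture this description outlines, not the repository's actual code: a 2-layer LSTM encoder-decoder with Luong's "general" attention score. The vocabulary sizes, layer count, and hidden size follow the description; the embedding size and all names are assumptions.

import torch
import torch.nn as nn

SRC_VOCAB = 50_000   # English vocabulary size (from the description)
TGT_VOCAB = 50_000   # Dutch vocabulary size (from the description)
HIDDEN = 500         # hidden units per LSTM layer (from the description)
LAYERS = 2           # encoder and decoder depth (from the description)

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, HIDDEN)  # embedding dim = HIDDEN is an assumption
        self.lstm = nn.LSTM(HIDDEN, HIDDEN, LAYERS, batch_first=True)

    def forward(self, src):                       # src: (B, S) token ids
        outputs, state = self.lstm(self.embed(src))
        return outputs, state                     # outputs: (B, S, H)

class LuongAttention(nn.Module):
    """Luong 'general' score: h_t^T W h_s against all encoder states."""
    def __init__(self):
        super().__init__()
        self.W = nn.Linear(HIDDEN, HIDDEN, bias=False)

    def forward(self, dec_h, enc_out):            # dec_h: (B, 1, H)
        scores = dec_h @ self.W(enc_out).transpose(1, 2)  # (B, 1, S)
        weights = torch.softmax(scores, dim=-1)
        return weights @ enc_out                  # context: (B, 1, H)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, HIDDEN)
        self.lstm = nn.LSTM(HIDDEN, HIDDEN, LAYERS, batch_first=True)
        self.attn = LuongAttention()
        self.out = nn.Linear(2 * HIDDEN, TGT_VOCAB)

    def forward(self, tgt_tok, state, enc_out):   # one decoding step, tgt_tok: (B, 1)
        dec_h, state = self.lstm(self.embed(tgt_tok), state)
        context = self.attn(dec_h, enc_out)
        logits = self.out(torch.cat([dec_h, context], dim=-1))
        return logits, state                      # logits: (B, 1, TGT_VOCAB)

# Smoke test on random token ids; real training would cap sequences
# at 50 tokens, per the description.
enc, dec = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (4, 50))
enc_out, state = enc(src)
tok = torch.zeros(4, 1, dtype=torch.long)         # e.g. a <bos> id
logits, state = dec(tok, state, enc_out)
print(logits.shape)                               # torch.Size([4, 1, 50000])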
celeba-hq-modified
Modified h5tool.py to make generating the CelebA-HQ dataset easier.
acl-anthology
Data and software for building the ACL Anthology.
Cross-Modal-Projection-Learning
TensorFlow Implementation of Deep Cross-Modal Projection Learning
InfoGAN-PyTorch
PyTorch Implementation of InfoGAN
lafand-mt-srn
MAFAND-MT
mosesdecoder
Moses, the machine translation system
mt-archive
Project for the ACL: converting all the data from http://www.mt-archive.info into the ACL's standard XML format.
mteb
MTEB: Massive Text Embedding Benchmark
natural-instructions-expansion
Expanding natural instructions
Pytorch_InfoGAN
Easy InfoGAN implementation in PyTorch
sat_ml_sudoku
Knowledge Representation project (MSc Artificial Intelligence): using machine learning to predict SAT solver statistics