JP's starred repositories
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
applied-ml
📚 Papers & tech blogs by companies sharing their work on data science & machine learning in production.
speechbrain
A PyTorch-based Speech Toolkit
transformer-deploy
Efficient, scalable and enterprise-grade CPU/GPU inference server for 🤗 Hugging Face transformer models 🚀
transformers-interpret
Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
DataAug4NLP
Collection of papers and resources for data augmentation for NLP.
transformers-tutorials
GitHub repo with tutorials to fine-tune transformers for different NLP tasks
NL-Augmenter
NL-Augmenter 🦎 → 🐍 A Collaborative Repository of Natural Language Transformations
ISIC-Archive-Downloader
A script to download the ISIC Archive of lesion images
label-studio-transformers
Label data using HuggingFace's transformers and automatically get a prediction service
diffbot-python-client
Python Diffbot API Client
EUNN-tensorflow
Efficient Unitary Neural Network (EUNN) implementation in TensorFlow
FrenchLefffLemmatizer
A French Lemmatizer in Python based on the LEFFF
torch_eunn
A PyTorch implementation of an efficient unitary neural network (https://arxiv.org/abs/1612.05231)
benchmark-for-transformers
Tool for easily comparing and evaluating the performance of transformers under different scenarios.
nlu_data_diets
NLU on Data Diets
toxic-comment-server
Models to detect hateful comments, trained on Kaggle's Toxic Comment Classification Challenge dataset and served with Flask.
covid19-transmission-ukf
With this repository, I derive the time-dependent R0 coefficient of COVID-19 with the Unscented Kalman Filter from data gathered by Johns Hopkins, assuming the SEIR model.
pure-matrix
Pure-Python matrix code for linear algebra, with PCA (naive power iteration) and k-means (random initialization) implementations.
orsum2020_collaborative_datasets
Anonymized train and test sets used in the RecSys 2020 experiment to optimize hyperparameters.