Moin Nadeem's repositories
exllama
A more memory-efficient rewrite of the HF transformers implementation of Llama for use with quantized weights.
FasterTransformer
Transformer-related optimizations, including BERT and GPT
companion-app
AI companions with memory: a lightweight stack to create and host your own AI companions
composer
A library of algorithms to speed up neural network training
DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
loky
Robust and reusable Executor for joblib
pytorch-pretrained-BERT
The Big-&-Extending-Repository-of-Transformers: PyTorch pretrained models for Google's BERT, OpenAI GPT & GPT-2 and Google/CMU Transformer-XL.
stargazers
Analyze GitHub stars
apex
A PyTorch extension: tools for easy mixed-precision and distributed training in PyTorch
bounded_pool_executor
Bounded process & thread pool executor
ReviewAdvisor
Heavy workload from reviewing papers? ReviewAdvisor helps out
openie6
OpenIE6 system
characterizing-sampling-algorithms
The official codebase for "A Systematic Characterization of Sampling Algorithms for Open-ended Language Generation"
website
📝 Easily create a beautiful website using Academic, Hugo, and Netlify
coqa-baselines
The baselines used in the CoQA paper
nlp-qa
Gathering information about question-answering systems
oh-my-fish
The Fish Shell Framework
e2e-coref
End-to-end Neural Coreference Resolution
sent-bias
Code and test data for "On Measuring Bias in Sentence Encoders", to appear at NAACL 2019.
LearnedBloomFilters
Implementing Various Learned Bloom Filters in PyTorch
NLP-Paper-Summaries
Summaries of recent papers in AI.
StanceDetectionDemo
A demonstration of Brian's stance detection model for fake news detection.
Lucene_Document_Retrieval
Document retrieval using classical IR algorithms with Lucene
loss-landscape
Code for visualizing the loss landscape of neural nets