Daniel King's repositories
DeeperSpeed
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
allennlp
An open-source NLP research library, built on PyTorch.
blanc
Human-free quality estimation of document summaries
catalogue
Super lightweight function registries for your library
composer
Train neural networks up to 7x faster
examples
Fast and flexible reference benchmarks
databricks-sdk-py
Databricks SDK for Python (Beta)
gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
install.python-poetry.org
The official Poetry installation script
langchain
⚡ Building applications with LLMs through composability ⚡
llm-analysis
Latency and Memory Analysis of Transformer Models for Training and Inference
pytextrank
Python implementation of TextRank for text document NLP parsing and summarization
pytorch-lightning
The lightweight PyTorch wrapper for ML researchers. Scale your models. Write less boilerplate
rich-context-competition
Supporting materials for submitting an entry to the inaugural Coleridge Rich Context Competition - http://coleridgeinitiative.org/richcontextcompetition
scikit-learn
scikit-learn: machine learning in Python
scispacy
This repository contains custom pipes and models related to using spaCy for scientific documents.
sgtb
Structured Gradient Tree Boosting
spaCy
💫 Industrial-strength Natural Language Processing (NLP) with Python and Cython
streaming
A Data Streaming Library for Efficient Neural Network Training
textrank
TextRank implementation for Python 3.
transformers
🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0.
trlx
A repo for distributed training of language models with Reinforcement Learning from Human Feedback (RLHF)