Abacus.AI's repositories
Long-Context
Code and tooling for the Abacus.AI LLM Context Expansion project, along with evaluation scripts and benchmark tasks that measure a model's information-retrieval capabilities under context expansion. Key experimental results are included, with instructions for reproducing and building on them.
intraprocessing_debiasing
Code accompanying the NeurIPS paper at https://arxiv.org/abs/2006.08564
EasyContext
Memory-optimization and training recipes for extrapolating language models' context length to 1 million tokens on minimal hardware.
detect-pretrain-code-contamination
Runs the pretraining-data detection tests from https://swj0419.github.io/detect-pretrain.github.io/ to check for contamination.
transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
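
A minimal usage sketch of the library's pipeline API (not taken from this fork's docs); the gpt2 model choice is an arbitrary example:

```python
# Load a small text-generation pipeline and generate a continuation.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # model choice is illustrative
print(generator("Long-context models can", max_new_tokens=20)[0]["generated_text"])
```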
min-tfs-client
A lightweight Python gRPC client for communicating with TensorFlow Serving.
transformers-bloom-inference
Fast Inference Solutions for BLOOM
peft
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
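
A minimal sketch of what parameter-efficient fine-tuning looks like, assuming the standard LoraConfig/get_peft_model entry points; the gpt2 base model and hyperparameters are illustrative:

```python
# Wrap a causal LM with a LoRA adapter; only the small adapter matrices train.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # base model is illustrative
config = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=16, lora_dropout=0.05)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # reports the tiny trainable fraction
```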
Puzzle-Generator-and-Solver
A puzzle generator, plus solvers for Einstein's Riddle, the Zebra Puzzle, and the Blood Donation Puzzle. For non-commercial use only.
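
Not this repository's code, but a toy sketch of the brute-force search such Einstein/Zebra-style puzzles admit, enumerating permutations over a hypothetical 3-house instance with made-up clues:

```python
# Solve a tiny Zebra-style puzzle by checking every assignment of colors and pets.
from itertools import permutations

houses = [0, 1, 2]  # positions, left to right
for colors in permutations(["red", "green", "blue"]):
    for pets in permutations(["cat", "dog", "fish"]):
        if (colors.index("red") == 0                       # clue: red house is leftmost
                and pets[colors.index("green")] == "dog"   # clue: dog lives in the green house
                and abs(colors.index("blue") - pets.index("cat")) == 1):  # clue: cat is next to the blue house
            print(dict(zip(houses, zip(colors, pets))))
```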
vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
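
A minimal offline-generation sketch using vLLM's LLM and SamplingParams entry points; the model name and sampling settings are illustrative:

```python
# Batched offline generation; vLLM handles KV-cache paging and scheduling internally.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # model choice is illustrative
params = SamplingParams(temperature=0.8, max_tokens=32)
for out in llm.generate(["The capital of France is"], params):
    print(out.outputs[0].text)
```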