Taco Cohen's starred repositories

AlphaCodium

Official implementation for the paper: "Code Generation with AlphaCodium: From Prompt Engineering to Flow Engineering"

Language: Python · License: AGPL-3.0 · Stargazers: 3310 · Issues: 0

USACO

Can Language Models Solve Olympiad Programming?

Language: Python · Stargazers: 87 · Issues: 0

Awesome-LLM-Uncertainty-Reliability-Robustness

Awesome-LLM-Robustness: a curated list of Uncertainty, Reliability and Robustness in Large Language Models

License: MIT · Stargazers: 582 · Issues: 0

aideml

AIDE: the Machine Learning CodeGen Agent

Language: Python · License: MIT · Stargazers: 296 · Issues: 0

InfiniteBench

Code for the paper "∞Bench: Extending Long Context Evaluation Beyond 100K Tokens": https://arxiv.org/abs/2402.13718

Language: Python · License: MIT · Stargazers: 215 · Issues: 0

OpenDevin

🐚 OpenDevin: Code Less, Make More

Language: Python · License: MIT · Stargazers: 28798 · Issues: 0

EfficientZero

Open-source codebase for EfficientZero, from "Mastering Atari Games with Limited Data" at NeurIPS 2021.

Language: Python · License: GPL-3.0 · Stargazers: 847 · Issues: 0

GPUAR

A CUDA implementation of Arithmetic Coding

Language: C++ · Stargazers: 14 · Issues: 0

LLM-PuzzleTest

This repository releases the dataset and models for multimodal puzzle reasoning.

Language: Python · Stargazers: 29 · Issues: 0

gpt-fast

Simple and efficient PyTorch-native transformer text generation in <1000 LOC of Python.

Language: Python · License: BSD-3-Clause · Stargazers: 5375 · Issues: 0

vllm

A high-throughput and memory-efficient inference and serving engine for LLMs

Language: Python · License: Apache-2.0 · Stargazers: 23282 · Issues: 0

CodeXGLUE

CodeXGLUE: a benchmark dataset for code intelligence.

Language: C# · License: MIT · Stargazers: 1481 · Issues: 0

MultiPL-E

A multi-programming language benchmark for LLMs

Language: Python · License: NOASSERTION · Stargazers: 170 · Issues: 0

A-Fast-Transformer-based-General-Purpose-LosslessCompressor

This repository contains the source code and dataset link for the WWW 2022 paper "TRACE: A Fast Transformer-based General-Purpose Lossless Compressor".

Language: Python · Stargazers: 24 · Issues: 0

torchac

Entropy coding / arithmetic coding for PyTorch

Language: Python · License: GPL-3.0 · Stargazers: 228 · Issues: 0

Reference-arithmetic-coding

Clear implementation of arithmetic coding for educational purposes in Java, Python, C++.

Language: Java · Stargazers: 361 · Issues: 0
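Several entries in this list (GPUAR, torchac, Reference-arithmetic-coding, TRACE) revolve around arithmetic coding. The core idea can be sketched in a few lines of Python; this is an illustrative toy using exact fractions and a static symbol model, not code from any of these repositories:

```python
from fractions import Fraction

def build_model(symbols):
    # Map each symbol to a cumulative-probability interval [low, high).
    freq = {}
    for s in symbols:
        freq[s] = freq.get(s, 0) + 1
    total, low, ranges = len(symbols), Fraction(0), {}
    for s in sorted(freq):
        high = low + Fraction(freq[s], total)
        ranges[s] = (low, high)
        low = high
    return ranges

def encode(symbols, ranges):
    # Narrow [low, high) once per symbol; any number inside the final
    # interval identifies the whole message.
    low, high = Fraction(0), Fraction(1)
    for s in symbols:
        span = high - low
        lo, hi = ranges[s]
        low, high = low + span * lo, low + span * hi
    return (low + high) / 2

def decode(code, ranges, n):
    # Invert encode: find the symbol whose interval contains the code,
    # emit it, then rescale the code into that interval.
    out = []
    for _ in range(n):
        for s, (lo, hi) in ranges.items():
            if lo <= code < hi:
                out.append(s)
                code = (code - lo) / (hi - lo)
                break
    return "".join(out)

msg = "abracadabra"
model = build_model(msg)
assert decode(encode(msg, model), model, len(msg)) == msg
```

Production coders replace the unbounded fractions with fixed-precision integer arithmetic and bit-by-bit renormalization, which is the part these repositories implement (and, in GPUAR's case, run in CUDA).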

NeuralCompression

A collection of tools for neural compression enthusiasts.

Language: Python · License: MIT · Stargazers: 486 · Issues: 0

TinyLlama

The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.

Language: Python · License: Apache-2.0 · Stargazers: 7370 · Issues: 0

branches

Prototype advanced LLM algorithms for reasoning and planning.

Language: TypeScript · License: MIT · Stargazers: 78 · Issues: 0

lmql

A language for constraint-guided and efficient LLM programming.

Language: Python · License: Apache-2.0 · Stargazers: 3509 · Issues: 0

guidance

A guidance language for controlling large language models.

Language: Jupyter Notebook · License: MIT · Stargazers: 18259 · Issues: 0

cleanrl

High-quality single-file implementations of Deep Reinforcement Learning algorithms with research-friendly features (PPO, DQN, C51, DDPG, TD3, SAC, PPG)

Language: Python · License: NOASSERTION · Stargazers: 4954 · Issues: 0

anthology-of-modern-ml

Collection of important articles to be treated as a textbook

Language: Jupyter Notebook · License: MIT · Stargazers: 520 · Issues: 0

outlines

Structured Text Generation

Language: Python · License: Apache-2.0 · Stargazers: 7266 · Issues: 0
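lmql, guidance, and outlines all build on the same core idea: restrict each decoding step to tokens that keep the output inside a constraint. A minimal sketch in plain Python, with a toy vocabulary and scoring function standing in for a real tokenizer and model (both are assumptions for illustration, not any of these libraries' APIs):

```python
def constrained_generate(step_scores, vocab, is_valid_prefix, eos, max_len=16):
    """Greedy decoding restricted to tokens that keep the prefix valid."""
    out = []
    for _ in range(max_len):
        scores = step_scores(out)  # toy stand-in for model logits
        # Mask the vocabulary down to tokens that satisfy the constraint.
        candidates = [t for t in vocab if t == eos or is_valid_prefix(out + [t])]
        best = max(candidates, key=lambda t: scores[t])
        if best == eos:
            break
        out.append(best)
    return out

# Toy example: the raw scores prefer "a", but the constraint only admits digits.
vocab = ["a", "1", "2", "<eos>"]
digits_only = lambda seq: all(t in "0123456789" for t in seq)
scores = lambda out: {"a": 3, "1": 2, "2": 1, "<eos>": len(out)}
print(constrained_generate(scores, vocab, digits_only, "<eos>"))  # ['1', '1', '1']
```

The real libraries do the masking over logits inside the sampling loop and compile constraints (regexes, grammars, types) into efficient per-step token masks, but the filter-then-sample structure is the same.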

xlang-paper-reading

Paper collection on building and evaluating language model agents via executable language grounding

Stargazers: 324 · Issues: 0

xformers

Hackable and optimized Transformers building blocks, supporting composable construction.

Language: Python · License: NOASSERTION · Stargazers: 8069 · Issues: 0

equiformer_v2

[ICLR'24] EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations

Language: Python · License: MIT · Stargazers: 167 · Issues: 0

fairscale

PyTorch extensions for high performance and large scale training.

Language: Python · License: NOASSERTION · Stargazers: 2997 · Issues: 0