# open-llms

🤖 A list of open LLMs available for commercial use.


## Open LLMs

These LLMs are all licensed for commercial use (e.g., Apache 2.0, MIT, OpenRAIL-M). Contributions welcome!

| Language Model | Checkpoints | Paper/Blog | Size | Context Length | Licence |
| --- | --- | --- | --- | --- | --- |
| T5 | T5 & Flan-T5, Flan-T5-xxl (HF) | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | 60M - 11B | 512 | Apache 2.0 |
| UL2 | UL2 & Flan-UL2, Flan-UL2 (HF) | UL2 20B: An Open Source Unified Language Learner | 20B | 512, 2048 | Apache 2.0 |
| Cerebras-GPT | Cerebras-GPT | Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models (Paper) | 111M - 13B | 2048 | Apache 2.0 |
| Pythia | pythia 70M - 12B | Pythia: A Suite for Analyzing Large Language Models Across Training and Scaling | 70M - 12B | 2048 | Apache 2.0 |
| Dolly | dolly-v2-12b | Free Dolly: Introducing the World's First Truly Open Instruction-Tuned LLM | 3B, 7B, 12B | 2048 | MIT |
| RWKV | RWKV, ChatRWKV | The RWKV Language Model (and my LM tricks) | 100M - 14B | infinity (RNN) | Apache 2.0 |
| GPT-J-6B | GPT-J-6B, GPT4All-J | GPT-J-6B: 6B JAX-Based Transformer | 6B | 2048 | Apache 2.0 |
| GPT-NeoX-20B | GPT-NEOX-20B | GPT-NeoX-20B: An Open-Source Autoregressive Language Model | 20B | 2048 | Apache 2.0 |
| Bloom | Bloom | BLOOM: A 176B-Parameter Open-Access Multilingual Language Model | 176B | 2048 | OpenRAIL-M v1 |
| StableLM-Alpha | StableLM-Alpha | Stability AI Launches the First of its StableLM Suite of Language Models | 3B - 65B | 4096 | CC BY-SA-4.0 |
| FastChat-T5 | fastchat-t5-3b-v1.0 | We are excited to release FastChat-T5: our compact and commercial-friendly chatbot! | 3B | 512 | Apache 2.0 |
| h2oGPT | h2oGPT | Building the World's Best Open-Source Large Language Model: H2O.ai's Journey | 12B - 20B | 256 - 2048 | Apache 2.0 |
| MPT-7B | MPT-7B, MPT-7B-Instruct | Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs | 7B | 84k (ALiBi) | Apache 2.0 |
| RedPajama-INCITE | RedPajama-INCITE | Releasing 3B and 7B RedPajama-INCITE family of models including base, instruction-tuned & chat models | 3B - 7B | ? | Apache 2.0 |
| OpenLLaMA | OpenLLaMA-7b-preview-300bt | OpenLLaMA: An Open Reproduction of LLaMA | 7B | 2048 | Apache 2.0 |
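Most of the checkpoints above are hosted on the Hugging Face Hub and can be tried with the `transformers` library. A minimal sketch, assuming `transformers` (with a backend such as PyTorch) is installed; the model name here is one of the smaller Apache-2.0 Flan-T5 checkpoints from the table:

```python
# Minimal sketch: run one of the commercially licensed models listed above.
# The first call downloads the checkpoint from the Hugging Face Hub.
from transformers import pipeline

# "text2text-generation" is the pipeline task for encoder-decoder models like T5.
generator = pipeline("text2text-generation", model="google/flan-t5-small")
result = generator("Translate English to German: How old are you?")[0]["generated_text"]
print(result)
```

The same pattern works for the decoder-only models in the table (e.g. Pythia or GPT-J-6B) by switching the task to `"text-generation"`; note that model outputs are not deterministic across versions, so treat the result as illustrative.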

## LLMs for code

| Language Model | Checkpoints | Paper/Blog | Size | Context Length | Licence |
| --- | --- | --- | --- | --- | --- |
| SantaCoder | santacoder | SantaCoder: don't reach for the stars! | 1.1B | ? | OpenRAIL-M v1 |
| StarCoder | starcoder | StarCoder: A State-of-the-Art LLM for Code, StarCoder: May the source be with you! | 15B | 8192 | OpenRAIL-M v1 |
| Replit Code | replit-code-v1-3b | Training a SOTA Code LLM in 1 week and Quantifying the Vibes — with Reza Shabani of Replit | 2.7B | infinity? (ALiBi) | CC BY-SA-4.0 |

## Evals on open LLMs

## LLM datasets for fine-tuning

PENDING

Want to contribute? Just add a row above.


## What do the licences mean?

- **Apache 2.0**: Allows users to use the software for any purpose, to distribute it, to modify it, and to distribute modified versions of the software under the terms of the license, without concern for royalties.
- **MIT**: Similar to Apache 2.0 but shorter and simpler. Also, in contrast to Apache 2.0, it does not require stating significant changes made to the original code.
- **CC BY-SA-4.0**: Allows (i) copying and redistributing the material and (ii) remixing, transforming, and building upon the material for any purpose, even commercially. But if you do the latter, you must distribute your contributions under the same license as the original. (Thus, it may not be viable for internal teams.)
- **OpenRAIL-M v1**: Allows royalty-free access and flexible downstream use and sharing of the model and modifications of it, but comes with a set of use restrictions (see Attachment A of the license).

Disclaimer: The information provided in this repo does not, and is not intended to, constitute legal advice. Maintainers of this repo are not responsible for the actions of third parties who use the models. Please consult an attorney before using models for commercial purposes.


## Improvements

- Complete the context-length entries, and verify entries marked with `?`
- Add the number of training tokens? (see considerations)
- Add (links to) training code?
- Add (links to) eval benchmarks?

## About


License: Apache License 2.0