Chroma

Chroma is the open-source embedding database. Chroma makes it easy to build LLM apps by making knowledge, facts, and skills pluggable for LLMs.

ChatGPT for ______

For example, the "Chat your data" use case (a minimal code sketch follows the steps below):

  1. Add documents to your database. You can pass in your own embeddings, embedding function, or let Chroma embed them for you.
  2. Query relevant documents with natural language.
  3. Compose documents into the context window of an LLM like GPT3 for additional summarization or analysis.
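
A minimal sketch of those three steps, assuming Chroma's bundled default embedding function and a hypothetical ask_llm() placeholder standing in for a call to GPT-3 or another LLM:

import chromadb

client = chromadb.Client()
collection = client.create_collection("my-notes")

# 1. Add documents; with no embeddings supplied, Chroma embeds them
#    using its default embedding function.
collection.add(
    documents=["Chroma is an open-source embedding database."],
    metadatas=[{"source": "readme"}],
    ids=["doc-1"],
)

# 2. Query relevant documents with natural language.
results = collection.query(query_texts=["What is Chroma?"], n_results=1)

# 3. Compose the retrieved documents into the LLM's context window.
context = "\n".join(results["documents"][0])
prompt = f"Answer using only this context:\n{context}\n\nQuestion: What is Chroma?"
# answer = ask_llm(prompt)  # ask_llm() is a hypothetical stand-in for your LLM call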

Features

  • Simple: Fully-typed, fully-tested, fully-documented == happiness
  • Integrations: 🦜️🔗 LangChain and more soon
  • Dev, Test, Prod: the same API that runs in your Python notebook scales to your cluster
  • Feature-rich: Queries, filtering, density estimation and more (see the filtering sketch after this list)
  • Free: Apache 2.0 Licensed
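
For instance, query results can be narrowed with metadata filters. A small sketch, assuming Chroma's default embedding function and the where filter parameter of collection.query:

import chromadb

client = chromadb.Client()
collection = client.create_collection("filtered-docs")
collection.add(
    documents=["meeting notes from the design review", "quarterly planning doc"],
    metadatas=[{"source": "notion"}, {"source": "google-docs"}],
    ids=["n/1", "gd/1"],
)

# Only documents whose metadata matches the filter are considered.
results = collection.query(
    query_texts=["notes from the design review"],
    n_results=1,
    where={"source": "notion"},
)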

Get up and running

pip install chromadb

# Create an in-memory Chroma client and a collection to hold embeddings.
import chromadb
client = chromadb.Client()
collection = client.create_collection("all-my-documents")

# Add pre-computed embeddings along with metadata and unique ids.
collection.add(
    embeddings=[[1.5, 2.9, 3.4], [9.8, 2.3, 2.9]],
    metadatas=[{"source": "notion"}, {"source": "google-docs"}],
    ids=["n/102", "gd/972"],
)

# query_embeddings takes a list of query embeddings; n_results is the
# number of nearest neighbours to return for each one.
results = collection.query(
    query_embeddings=[[1.5, 2.9, 3.4]],
    n_results=2,
)
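
The query result is a dict of parallel lists with one entry per query embedding. A short sketch of reading it, assuming the result keys used by chromadb (ids, metadatas, distances):

# Each top-level list has one entry per query embedding passed in.
print(results["ids"][0])        # e.g. ["n/102", "gd/972"]
print(results["metadatas"][0])  # metadata of the matched items
print(results["distances"][0])  # smaller distance == closer match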

Get involved

Chroma is a rapidly developing project. We welcome PR contributors and ideas for how to improve the project.

Embeddings?

What are embeddings?

  • Read the guide from OpenAI
  • Literal: Embedding something turns it from image/text/audio into a list of numbers. 🖼️ or 📄 => [1.2, 2.1, ...]. This process makes documents "understandable" to a machine learning model.
  • By analogy: An embedding represents the essence of a document. This enables documents and queries with the same essence to be "near" each other and therefore easy to find.
  • Technical: An embedding is the latent-space position of a document at a layer of a deep neural network. For models trained specifically to embed data, this is the last layer.
  • A small example: if you search your photos for "famous bridge in San Francisco", embedding the query and comparing it to the embeddings of your photos and their metadata should return photos of the Golden Gate Bridge (see the sketch below).
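
"Near" is typically measured with a vector distance such as cosine distance between embeddings. A toy illustration with made-up 3-dimensional vectors (real embedding models produce hundreds or thousands of dimensions):

import numpy as np

def cosine_similarity(a, b):
    a, b = np.asarray(a), np.asarray(b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = [0.9, 0.1, 0.3]         # embedding of "famous bridge in San Francisco"
golden_gate = [0.8, 0.2, 0.35]  # embedding of a Golden Gate Bridge photo
beach_picnic = [0.1, 0.9, 0.4]  # embedding of an unrelated photo

# The semantically similar photo scores higher, so it would be returned first.
print(cosine_similarity(query, golden_gate))   # ~0.99
print(cosine_similarity(query, beach_picnic))  # ~0.32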

License

Apache 2.0
