🚅 LiteLLM.js

JavaScript implementation of LiteLLM.

Usage

npm install litellm

import { completion } from 'litellm';
process.env['OPENAI_API_KEY'] = 'your-openai-key';

const response = await completion({
  model: 'gpt-3.5-turbo',
  messages: [{ content: 'Hello, how are you?', role: 'user' }],
});

// or stream the results
const stream = await completion({
  model: 'gpt-3.5-turbo',
  messages: [{ content: 'Hello, how are you?', role: 'user' }],
  stream: true,
});

for await (const part of stream) {
  process.stdout.write(part.choices[0]?.delta?.content || "");
}
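When streaming, it is often useful to keep the full text as well as printing it. The helper below drains a stream into one string, reusing the chunk shape from the example above (`choices[0]?.delta?.content`); the stream itself is mocked here so the snippet runs without an API key.

```javascript
// Collect a streamed completion into a single string.
// Works with any async iterable of OpenAI-style chunks, such as the
// object returned by `completion({ ..., stream: true })`.
async function collectStream(stream) {
  let text = '';
  for await (const part of stream) {
    text += part.choices[0]?.delta?.content || '';
  }
  return text;
}

// Mock stream so the example runs offline (hypothetical chunks).
async function* mockStream() {
  for (const content of ['Hello', ', ', 'world']) {
    yield { choices: [{ delta: { content } }] };
  }
}

collectStream(mockStream()).then((text) => {
  console.log(text); // "Hello, world"
});
```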

Features

We aim to support all features that the LiteLLM Python package supports.

  • Standardised completions
  • Standardised embeddings
  • Standardised input params 🚧 - List is here
  • Caching ❌
  • Proxy ❌

Supported Providers

| Provider | Completion | Streaming | Embedding |
| --- | --- | --- | --- |
| openai | ✅ | ✅ | ✅ |
| cohere | ✅ | ✅ | ❌ |
| anthropic | ✅ | ✅ | ❌ |
| ollama | ✅ | ✅ | ✅ |
| ai21 | ✅ | ✅ | ❌ |
| replicate | ✅ | ✅ | ❌ |
| deepinfra | ✅ | ✅ | ❌ |
| mistral | ✅ | ✅ | ✅ |
| huggingface | ❌ | ❌ | ❌ |
| together_ai | ❌ | ❌ | ❌ |
| openrouter | ❌ | ❌ | ❌ |
| vertex_ai | ❌ | ❌ | ❌ |
| palm | ❌ | ❌ | ❌ |
| baseten | ❌ | ❌ | ❌ |
| azure | ❌ | ❌ | ❌ |
| sagemaker | ❌ | ❌ | ❌ |
| bedrock | ❌ | ❌ | ❌ |
| vllm | ❌ | ❌ | ❌ |
| nlp_cloud | ❌ | ❌ | ❌ |
| aleph alpha | ❌ | ❌ | ❌ |
| petals | ❌ | ❌ | ❌ |
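If you need to check provider support at runtime, the matrix above can be encoded as data. This is a sketch transcribed from the table, not an API exposed by the library itself; only providers with at least one supported feature are listed, and everything else falls through to `false`.

```javascript
// Provider capability matrix, transcribed from the support table above.
const SUPPORT = {
  openai:    { completion: true, streaming: true, embedding: true },
  cohere:    { completion: true, streaming: true, embedding: false },
  anthropic: { completion: true, streaming: true, embedding: false },
  ollama:    { completion: true, streaming: true, embedding: true },
  ai21:      { completion: true, streaming: true, embedding: false },
  replicate: { completion: true, streaming: true, embedding: false },
  deepinfra: { completion: true, streaming: true, embedding: false },
  mistral:   { completion: true, streaming: true, embedding: true },
};

// Unknown providers and features report false.
function supports(provider, feature) {
  return SUPPORT[provider]?.[feature] ?? false;
}

console.log(supports('openai', 'embedding')); // true
console.log(supports('cohere', 'embedding')); // false
console.log(supports('vllm', 'completion')); // false
```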

Development

Clone the repo

git clone https://github.com/zya/litellmjs.git

Install dependencies

npm install

Run unit tests

npm t

Run E2E tests

First copy the example env file.

cp .example.env .env

Then fill in the variables with your API keys so the E2E tests can run.

OPENAI_API_KEY=<Your OpenAI API key>
....

Then run the tests with the command below.

npm run test:e2e


License

MIT

