stellar-amenities / assistants

The ⭐️ Open Source Assistants API lets you build AI assistants into your own applications, with your own models. 100% private, 75% cheaper & 23x faster than hosted Assistants. Same API/SDK. Written in Rust.

Home Page: https://bit.ly/open-assistants


assistants

⭐️ Open Source Assistants API

Build Powerful AI Assistants In-House, On Your Terms

75% Cheaper & 23x Faster Assistants. Same API/SDK.

Open in GitHub Codespaces
Join Discord


📞 Need Support? We're here for you.
🖼️ How it Works – Visual Guide
✨ Suggest a Feature
❤️‍🩹 Found a Bug? Let us know.

Quickstart

Get started in less than a minute through GitHub Codespaces:

Open in GitHub Codespaces

Or run locally:

git clone https://github.com/stellar-amenities/assistants
cd assistants
cp .env.example .env

To get started quickly, we'll use the Perplexity API. Get an API key from Perplexity, then replace MODEL_API_KEY in .env with your key.
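After editing, the relevant line in .env should look like the following (placeholder value; MODEL_API_KEY is the variable name given above, and the key itself comes from your Perplexity account):

```shell
# .env — points the Assistants API at Perplexity's hosted models
MODEL_API_KEY=<your-perplexity-api-key>
```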

Install the OpenAI SDK: npm i openai

Start the infra:

docker-compose --profile api -f docker/docker-compose.yml up -d

Run the quickstart:

node examples/quickstart.js
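For orientation, the quickstart follows the standard Assistants flow: create an assistant, create a thread, add a message, then start a run. Below is a minimal sketch of that flow (not the actual contents of examples/quickstart.js, which uses the OpenAI SDK; this version uses plain fetch, assumes the local server is OpenAI-compatible on http://localhost:3000, and the model name is illustrative):

```javascript
// Base URL of the local, OpenAI-compatible Assistants server (assumption).
const BASE_URL = "http://localhost:3000/v1";

// Build the request body for creating an assistant.
function assistantPayload(model, instructions) {
  return {
    model,         // e.g. a model served through your Perplexity key
    instructions,  // system-style guidance for the assistant
    tools: [],     // add { type: "code_interpreter" } etc. as needed
  };
}

// POST helper against the local Assistants API.
async function post(path, body) {
  const res = await fetch(`${BASE_URL}${path}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${path} failed: ${res.status}`);
  return res.json();
}

async function main() {
  // assistant -> thread -> message -> run, mirroring the Assistants API flow
  const assistant = await post(
    "/assistants",
    assistantPayload("mistral-7b", "You are a helpful assistant.")
  );
  const thread = await post("/threads", {});
  await post(`/threads/${thread.id}/messages`, {
    role: "user",
    content: "Hello!",
  });
  const run = await post(`/threads/${thread.id}/runs`, {
    assistant_id: assistant.id,
  });
  console.log("run status:", run.status);
}
```

The four-step flow (assistant, thread, message, run) is the same one the hosted OpenAI Assistants API uses, which is what makes the SDKs interchangeable.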

Why Open Source Assistants API?

  • Full Control: Own your data, your models, and your destiny.
  • No Hidden Costs: Absolutely free. Seriously, no strings attached.
  • Customizable: Tailor the AI to your specific needs and use cases.
  • Offline Capabilities: Perfect for edge cases or internet-free zones.
  • OpenAI Compatibility: Love OpenAI's API? We play nice with that too.
  • Simplicity: Easy setup, no steep learning curve.
  • Non-woke style: Get rid of OpenAI Woke/Brainwashed/PC models.
  • Unleashed code interpreter: OpenAI's code interpreter is restricted to processing data (for example, it cannot do web scraping). This one is unleashed.
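In practice, OpenAI compatibility means the only client-side change is the base URL. A small sketch of the options you would hand the official OpenAI Node SDK (the port and the "ignored locally" API key behavior are assumptions about a default local deployment):

```javascript
// Build options for the official OpenAI SDK, pointed at a self-hosted server.
function clientOptions(localUrl = "http://localhost:3000/v1") {
  return {
    baseURL: localUrl,             // swap the hosted endpoint for your own
    apiKey: "not-needed-locally",  // the SDK requires a value; a local
                                   // deployment can ignore it
  };
}

// Usage with the official SDK (after `npm i openai`):
//   import OpenAI from "openai";
//   const client = new OpenAI(clientOptions());
console.log(clientOptions().baseURL);
```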

What's Cooking? – Latest News

  • [2023/08/19] 🔥 New example: Open source LLM with code interpreter. Learn more.
  • [2023/08/12] 🔥 New example: Open source LLM with function calling. Learn more.
  • [2023/11/29] 🔥 New example: Using mistral-7b, an open source LLM. Check it out.

Key Features

  • Code Interpreter: Runs Python code in a sandboxed environment. (beta)
  • Knowledge Retrieval: Retrieves external knowledge or documents.
  • Function Calling: Defines and executes custom functions.
  • File Handling: Supports a range of file formats.
  • Multimodal: Supports audio, images, and text.
    • image + audio + text
    • audio + text
    • image + text (soon)
    • text
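To make Function Calling concrete, here is a sketch of defining and handling a custom function tool. The tool schema follows the shape the OpenAI Assistants API uses; the weather function and its stubbed result are made-up examples:

```javascript
// A custom function tool, declared with a JSON Schema for its arguments
// (schema shape as in the OpenAI Assistants API).
const weatherTool = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather for a city",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name" },
      },
      required: ["city"],
    },
  },
};

// When a run pauses for tool output, execute the requested function and
// return a string result to submit back to the run.
function handleToolCall(call) {
  if (call.function.name === "get_weather") {
    const { city } = JSON.parse(call.function.arguments);
    return JSON.stringify({ city, forecast: "sunny" }); // stubbed result
  }
  throw new Error(`unknown tool: ${call.function.name}`);
}
```

The assistant decides when to call the function; your code runs it and submits the output, so the model never executes anything itself.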

What can you build with Assistants?

Join the Movement

  • For Developers: We've got the docs, tools, and a community ready to help you build what's next.
  • For Innovators: Looking for an edge in AI? Here's where you leapfrog the competition.
  • For the Visionaries: Dreamt of a custom AI assistant? Let's make it a reality.

Deployment

Please follow this documentation.

FAQ

What's the difference with LangChain? LangChain offers fine-grained control over AI conversations, while the Assistants API simplifies the process, managing conversation history, the data/vector store, and tool switching for you.
Are you related to OpenAI? No.
I don't use the Assistants API. Can I use this? Yes, but we recommend adopting the Assistants API for a more streamlined experience, letting you focus on your product rather than infrastructure.
Does the Assistants API support audio and images? Images soon, working on it. Audio in a few weeks.

About


License: MIT License


Languages

  • Rust 98.8%
  • Makefile 0.6%
  • Shell 0.3%
  • Dockerfile 0.3%