
Raycast AI Proxy

This is a simple Raycast AI API proxy that lets you use Raycast AI without a subscription. It forwards requests from Raycast to the OpenAI API, converts the request and response formats, and streams the response back in real time.


Introduction

Supported Models

| Model Name   | Test Status  | Environment Variables                                            |
| ------------ | ------------ | ---------------------------------------------------------------- |
| openai       | Tested       | OPENAI_API_KEY                                                   |
| azure openai | Tested       | AZURE_OPENAI_API_KEY, AZURE_DEPLOYMENT_ID, OPENAI_AZURE_ENDPOINT |
| gemini       | Experimental | GOOGLE_API_KEY                                                   |
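Whichever backend you choose, export the corresponding variables from the table before starting the proxy. For the default OpenAI backend, that is just the API key:

export OPENAI_API_KEY=<your openai api key>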

AI Chat

(screenshot: AI chat)

Translate

(screenshot: translate)

How to Use

Quick Start with Docker

  1. Generate certificates
pip3 install mitmproxy
python -c "$(curl -fsSL https://raw.githubusercontent.com/yufeikang/raycast_api_proxy/main/scripts/cert_gen.py)"  --domain backend.raycast.com  --out ./cert
  2. Start the service
docker run --name raycast \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -p 443:443 \
    --dns 1.1.1.1 \
    -v $PWD/cert/:/data/cert \
    -e CERT_FILE=/data/cert/backend.raycast.com.cert.pem \
    -e CERT_KEY=/data/cert/backend.raycast.com.key.pem \
    -e LOG_LEVEL=INFO \
    -d \
    ghcr.io/yufeikang/raycast_api_proxy:main
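If the container started correctly, its logs should show the service listening, and a test request that resolves backend.raycast.com to localhost for a single call should get an HTTPS response. The exact response body depends on the proxy, so treat this only as a connectivity check:

docker logs raycast
curl -k -v --resolve backend.raycast.com:443:127.0.0.1 https://backend.raycast.com/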
  3. To use the Azure OpenAI API instead, change the OPENAI environment variables as follows

See How to switch between OpenAI and Azure OpenAI endpoints with Python

docker run --name raycast \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -e OPENAI_API_BASE=https://your-resource.openai.azure.com \
    -e OPENAI_API_VERSION=2023-05-15 \
    -e OPENAI_API_TYPE=azure \
    -e AZURE_DEPLOYMENT_ID=your-deployment-id \
    -p 443:443 \
    --dns 1.1.1.1 \
    -v $PWD/cert/:/data/cert \
    -e CERT_FILE=/data/cert/backend.raycast.com.cert.pem \
    -e CERT_KEY=/data/cert/backend.raycast.com.key.pem \
    -e LOG_LEVEL=INFO \
    -d \
    ghcr.io/yufeikang/raycast_api_proxy:main
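Since the Azure configuration is passed entirely through environment variables, a quick way to confirm the container picked them up is to list them inside the running container:

docker exec raycast env | grep -E 'OPENAI|AZURE'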

Experimental Google Gemini support

Obtain your Google API Key and export it as GOOGLE_API_KEY.

Currently, only the gemini-pro model is supported.
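For example, mirroring the OpenAI setup above:

export GOOGLE_API_KEY=<your google api key>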

# git clone this repo and cd to it
docker build -t raycast .
docker run --name raycast \
    -e GOOGLE_API_KEY=$GOOGLE_API_KEY \
    -p 443:443 \
    --dns 1.1.1.1 \
    -v $PWD/cert/:/data/cert \
    -e CERT_FILE=/data/cert/backend.raycast.com.cert.pem \
    -e CERT_KEY=/data/cert/backend.raycast.com.key.pem \
    -e LOG_LEVEL=INFO \
    -d \
    raycast:latest

Install Locally

  1. Clone this repository
  2. Use pdm install to install dependencies
  3. Set the environment variable
export OPENAI_API_KEY=<your openai api key>
  4. Use ./scripts/cert_gen.py --domain backend.raycast.com --out ./cert to generate a self-signed certificate
  5. Start the service with python ./app/main.py (a combined sketch of these steps follows)
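Put together, and assuming the repository URL used in the Docker instructions above, the local setup might look like this (binding to port 443 may require elevated privileges):

git clone https://github.com/yufeikang/raycast_api_proxy.git
cd raycast_api_proxy
pdm install
export OPENAI_API_KEY=<your openai api key>
./scripts/cert_gen.py --domain backend.raycast.com --out ./cert
python ./app/main.py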

Configuration

  1. Modify /etc/hosts to add the following lines:
127.0.0.1 backend.raycast.com
::1 backend.raycast.com

This modification points backend.raycast.com to localhost instead of the real Raycast backend. Alternatively, you can add this record in your own DNS server.
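After editing the hosts file, you can check that the name now resolves locally (the output should show 127.0.0.1 or ::1):

ping -c 1 backend.raycast.com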

  2. Trust the certificate in the system keychain

Open the CA certificate in the cert folder, add it to the system keychain, and trust it. This is necessary because the Raycast AI Proxy uses a self-signed certificate, which must be trusted for HTTPS connections to work.

Note:

On macOS with Apple Silicon, if applications hang when you manually add a CA certificate through Keychain Access, you can use the following terminal command instead (see the mitmproxy documentation):

sudo security add-trusted-cert -d -p ssl -p basic -k /Library/Keychains/System.keychain ~/.mitmproxy/mitmproxy-ca-cert.pem
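One way to confirm on macOS that the chain is now trusted is to verify the generated certificate against the keychain (path assumes the ./cert directory from earlier):

security verify-cert -c ./cert/backend.raycast.com.cert.pem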
