ausbitbank / openai-gemini

Why

The Gemini API is free, but there are many tools that work exclusively with the OpenAI API.

This project provides a personal OpenAI-compatible endpoint for free.

Serverless?

Although it runs in the cloud, it does not require server maintenance. It can be easily deployed to various providers for free (with generous limits suitable for personal use).

Tip

Running the proxy endpoint locally is also an option, though it's more appropriate for development use.

How to start

You will need a personal Google API key.

Important

Even if you are located outside of the supported regions (e.g., in Europe), it is still possible to acquire one using a VPN.

Deploy the project to one of the providers, using the instructions below. You will need to set up an account there.

If you opt for “button-deploy”, you'll be guided through the process of forking the repository first, which is necessary for continuous integration (CI).

Deploy with Vercel

  • Alternatively, deploy with the CLI: vercel deploy
  • Serve locally: vercel dev
  • Vercel Functions limitations (with Edge runtime)

Deploy to Netlify

  • Alternatively, deploy with the CLI: netlify deploy
  • Serve locally: netlify dev
  • Two different API bases are provided:
    • /v1 (e.g. /v1/chat/completions endpoint)
      Functions limits
    • /edge/v1
      Edge functions limits

Deploy to Cloudflare Workers

How to use

Note

Not all tools allow overriding the OpenAI endpoint, but many do (though the setting is sometimes deeply hidden).

Use the endpoint address wherever you can specify it. The relevant field may be labeled "OpenAI proxy". You might need to look under "Advanced settings" or similar sections, or in a config file.

For some command-line tools, you may need to set an environment variable (set on Windows; export on Unix-like shells), e.g.:
set OPENAI_BASE_URL=https://my-super-proxy.vercel.app/v1
..or:
set OPENAI_API_BASE=https://my-super-proxy.vercel.app/v1
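Once the base URL points at the proxy, any OpenAI-style client can talk to it. A minimal sketch of building such a request by hand (the URL is the placeholder from above, and the payload is the standard OpenAI chat/completions shape, not project-specific code):

```javascript
// Sketch: build an OpenAI-style chat request aimed at the proxy.
// BASE_URL is a placeholder; use your own deployment's address.
const BASE_URL =
  process.env.OPENAI_BASE_URL || "https://my-super-proxy.vercel.app/v1";

function buildChatRequest(messages, apiKey) {
  return {
    url: `${BASE_URL}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Your Google API key goes where the OpenAI key normally would.
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ model: "gpt-3.5-turbo", messages }),
    },
  };
}

// Usage (actually sending it requires a valid key and network access):
// const { url, options } = buildChatRequest(
//   [{ role: "user", content: "Hello!" }],
//   process.env.GOOGLE_API_KEY
// );
// const res = await fetch(url, options);
```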


Possible further development

  • chat/completions

    Currently, most of the parameters that are applicable to both APIs have been implemented, with the exception of function calls.

    • messages
      • content
      • role
        • system (=>user)
        • user
        • assistant
        • tool (v1beta)
      • name
      • tool_calls
    • model (value ignored, autoselect "gemini-pro", or "-vision" for "gpt-4-vision-preview" request)
    • frequency_penalty
    • logit_bias
    • logprobs
    • top_logprobs
    • max_tokens
    • n (candidateCount <8; n.b. the API currently does not accept values >1)
    • presence_penalty
    • response_format
    • seed
    • stop: string|array (stopSequences [1,5])
    • stream
    • temperature (0.0..1.0)
      • <0, >1..2
    • top_p
    • tools (v1beta)
    • tool_choice (v1beta)
    • user
  • completions

  • embeddings

  • models
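The parameter mapping the list above describes can be sketched roughly as follows. The Gemini-side names (maxOutputTokens, candidateCount, stopSequences, topP) come from the public generationConfig schema; the clamping and truncation details here are illustrative assumptions, not this project's exact code:

```javascript
// Illustrative sketch: map OpenAI chat parameters onto Gemini's
// generationConfig, per the list above. Clamping details are assumptions.
function toGenerationConfig(req) {
  const cfg = {};
  if (req.max_tokens !== undefined) cfg.maxOutputTokens = req.max_tokens;
  if (req.n !== undefined) cfg.candidateCount = req.n; // API currently accepts only 1
  if (req.temperature !== undefined) {
    // OpenAI allows 0..2; Gemini expects 0.0..1.0, so clamp.
    cfg.temperature = Math.max(0, Math.min(1, req.temperature));
  }
  if (req.top_p !== undefined) cfg.topP = req.top_p;
  if (req.stop !== undefined) {
    // stopSequences takes 1..5 strings.
    const stops = Array.isArray(req.stop) ? req.stop : [req.stop];
    cfg.stopSequences = stops.slice(0, 5);
  }
  return cfg;
}
```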

About

License:MIT License


Languages

Language:JavaScript 100.0%