mgallo / openai.ex

A community-maintained OpenAI API wrapper written in Elixir.

API key per request

feynmanliang opened this issue

Describe the feature or improvement you're requesting

I believe the API key is currently read once from the environment during configuration and then re-used globally. It would be nice to be able to set the API key per request.
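
For reference, the global configuration being described looks roughly like this. This is a sketch, not code copied from the project; the api_key / organization_key names follow the openai.ex README.

# config/config.exs -- the key is read from the environment once, when the
# application is configured, and then reused for every request
import Config

config :openai,
  api_key: System.get_env("OPENAI_API_KEY"),
  organization_key: System.get_env("OPENAI_ORGANIZATION_KEY")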

Additional context

We have a multi-tenant use case where multiple OpenAI API keys are present and certain requests must use certain keys.

Ha, I came here to request this.

I'm "maintaining" a multi-tentant fork here: https://github.com/Miserlou/openai.ex/tree/mt but I wouldn't expect anybody else to use it, but suits me in the immediate term. Would really, really prefer to use the proper repo.

So yeah, +1

Hey! That would be a very interesting feature; I will try to implement it in one of the next releases... Unfortunately I'm a bit busy these days and it may take some time. @Miserlou I'll have a look at your repo to see if I can steal something 🔥. By the way, if you already have something ready (or pseudo-ready), feel free to open a PR!

Nice! My branch is hot trash, I just needed something immediately and did the first thing that came to mind, totally breaking the API. Ideally, the API will support both use cases and not break anything for anybody. It should be pretty easy, other than the finickiness of how Elixir handles default arguments; I'm just a little busy with the thing I'm actually using the library for. But it shouldn't take @feynmanliang more than an hour, I'm sure. 😛
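
A minimal sketch of the non-breaking shape being described here, using Elixir's \\ default-argument syntax for an optional trailing config. The module below is a hypothetical stub, not the library's actual code.

defmodule OpenAIStub do
  # Hypothetical stand-in for the real client, showing only the signature idea.
  defstruct api_key: nil, organization_key: nil

  # The optional `config` argument defaults to the global configuration,
  # so existing one-argument callers keep working unchanged.
  def chat_completion(params, config \\ %__MODULE__{}) do
    {params, config}
  end
end

OpenAIStub.chat_completion(model: "gpt-3.5-turbo")
OpenAIStub.chat_completion([model: "gpt-3.5-turbo"], %OpenAIStub{api_key: "sk-per-request-key"})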

You overestimate my abilities... but I desperately need this, so let me see what I can do.

I sent #29 which POCs this on just the Completions API, but while working on this I discovered OpenAI publishes an OpenAPI spec and fell into an existential crisis.

So I didn't complete the work for the rest of the APIs. Instead, I am now playing with @aj-foster's fantastic OpenAPI SDK generator to see if we can generate this automatically.

@feynmanliang That's great news! Happy to hear your feedback and figure out ways the library can be applicable to more use-cases. If you need anything, I'm always available in discussions.

wow, awesome @feynmanliang! I'll take a look at the PR and try to extend it ASAP (next week, I think...). Great idea to generate the client directly from the API schema, it would help a lot to stay in sync with the official specification. 👀

btw really nice project @aj-foster 🥇

hey! I finally found some time to work on this request, and it's live in v0.5.0!
You can find some documentation on how to use it here: https://github.com/mgallo/openai.ex#configuration-override

I preferred to create a struct for the config that can be passed to the library's functions, so you can do:

# This returns a config struct with "test-api-key" as api_key; all the other
# config values are taken from config.exs, so you don't need to set the defaults manually.
config_override = %OpenAI.Config{api_key: "test-api-key"}

# chat_completion with overridden config
OpenAI.chat_completion(
  [
    model: "gpt-3.5-turbo",
    messages: [
      %{role: "system", content: "You are a helpful assistant."},
      %{role: "user", content: "Who won the world series in 2020?"},
      %{role: "assistant", content: "The Los Angeles Dodgers won the World Series in 2020."},
      %{role: "user", content: "Where was it played?"}
    ]
  ],
  config_override # <--- pass the overridden configuration as the last argument of the function
)
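
For the multi-tenant case that opened this issue, the same override can carry a different key per request. A sketch: the tenant wrapper below is hypothetical; only OpenAI.chat_completion/2 and %OpenAI.Config{} come from the release described above.

defmodule MyApp.TenantOpenAI do
  # Hypothetical wrapper: pick the key belonging to the tenant making the
  # request and pass it along as a per-request config override.
  def chat_completion(tenant_api_key, messages) do
    config = %OpenAI.Config{api_key: tenant_api_key}

    OpenAI.chat_completion(
      [model: "gpt-3.5-turbo", messages: messages],
      config
    )
  end
end

MyApp.TenantOpenAI.chat_completion("sk-tenant-a", [%{role: "user", content: "Hello!"}])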

@feynmanliang thanks for your PR, it was helpful in developing the final feature! I preferred to pass a complete config object in order to be more flexible and to improve the way http_options can be overridden, but your code was very helpful.
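
For example, assuming http_options is one of the struct's fields (as the comment above suggests), a single slow request could get a longer timeout without touching the global settings. A sketch:

# Per-request HTTP options via the config struct
# (recv_timeout is an HTTPoison option; the http_options field name is assumed from the discussion above).
slow_config = %OpenAI.Config{http_options: [recv_timeout: 60_000]}

OpenAI.chat_completion(
  [model: "gpt-3.5-turbo", messages: [%{role: "user", content: "Summarize this long document for me."}]],
  slow_config
)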

Let me know your feedback. 🔥

@mgallo This seems to break the default api key behavior. I just updated and my code that previously worked now crashes, complaining that I did not provide an API key.

Example:

iex(2)> OpenAI.Config.api_key()
"sk-E7LIV37888888888888888888888G7Q8N1gF4iGTal"
iex(3)> OpenAI.Config.org_key()
"org-BmAHS88888888888QqOV25j"
iex(4)> OpenAI.chat_completion(model: "gpt-3.5-turbo", temperature: 0.2, max_tokens: 2000, messages: [%{role: "user", content: "Hello. Tell me about yourself."}])
{:error,
 %{
   "error" => %{
     "code" => nil,
     "message" => "You didn't provide an API key. You need to provide your API key in an Authorization header using Bearer auth (i.e. Authorization: Bearer YOUR_KEY), or as the password field (with blank username) if you're accessing the API from your browser and are prompted for a username and password. You can obtain an API key from https://platform.openai.com/account/api-keys.",
     "param" => nil,
     "type" => "invalid_request_error"
   }
 }}

@jwilger thanks for reporting the issue, it's a silly bug with how I initialise the struct, whose defaults are evaluated at compile time 😓. If you run mix compile, it should pick up the right values from your config. I will release a fix shortly, sorry for the inconvenience.
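
For anyone else hitting this, a minimal illustration of the pitfall (not the library's actual code): expressions used as struct defaults run while the module compiles, so application config read there is frozen at compile time, whereas reading it inside a function defers the lookup to runtime.

defmodule CompileTimeConfig do
  # This Application.get_env call runs when the module compiles, so whatever
  # value exists at compile time gets baked into the struct default.
  defstruct api_key: Application.get_env(:openai, :api_key)
end

defmodule RuntimeConfig do
  defstruct api_key: nil

  # Reading the config inside a function picks up the value at runtime.
  def new do
    %__MODULE__{api_key: Application.get_env(:openai, :api_key)}
  end
end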