mgallo / openai.ex

A community-maintained OpenAI API wrapper written in Elixir.


Streaming example does not work in the shell

samm81 opened this issue · comments

hi! first, thanks for your work on this 😊

I've gotten streaming to work in an `.exs` file (as demonstrated in #36), but it doesn't seem to work in the shell (`iex -S mix`); it just hangs forever.

is there a fundamental reason that has to do with the shell, or am I just missing something?

Hi! Thank you!
Unfortunately I cannot reproduce it; I tried it in my console and it works properly (I just copied and pasted the snippet). Can you give me more information so I can try to help you?
If you are using the exact snippet, make sure you have the `OPENAI_API_KEY` environment variable set. Otherwise, you can pass only `http_options` in the stream configuration; that way it will use the `api_key` value from your `config.exs`:

OpenAI.chat_completion([
    model: "gpt-3.5-turbo",
    messages: [
      %{role: "system", content: "You are a helpful assistant."},
      %{role: "user", content: "Who won the world series in 2020?"},
      %{role: "assistant", content: "The Los Angeles Dodgers won the World Series in 2020."},
      %{role: "user", content: "Where was it played?"}
    ],
    stream: true,
  ],
  %OpenAI.Config{
    http_options: [recv_timeout: :infinity, stream_to: self(), async: :once]
  }
)
|> Stream.flat_map(fn res -> res["choices"] end)
|> Stream.each(fn choice ->
  IO.write(choice["delta"]["content"])
end)
|> Stream.run()
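For reference, a sketch of the `config.exs` entry that the `api_key` fallback reads from (assuming the standard `:openai` config keys; adjust to match your setup):

```elixir
# config/config.exs — assumed standard openai.ex configuration
import Config

config :openai,
  # read the key from the environment at compile/boot time
  api_key: System.get_env("OPENAI_API_KEY"),
  # a generous receive timeout helps with long streaming responses
  http_options: [recv_timeout: :infinity]
```

With this in place, passing a `%OpenAI.Config{}` that sets only `http_options` leaves the key resolution to the application config.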
(video attachment: iex-stream.mp4)

hmm, so I'm working with a local model, and I'm using the `.completions` function:

iex(2)> OpenAI.completions([ model: "my_model/ggml-model-q4_0.bin", prompt: "a long long time ago", max_tokens: 32, temperature: 0.3, stream: true], %OpenAI.Config{ http_options: [recv_timeout: :infinity, stream_to: self(), async: :once] }) |> Stream.flat_map(fn res -> res["choices"] end) |> Stream.each(fn choice -> IO.write(choice["delta"]["content"]) end) |> Stream.run()
:ok
iex(3)>
nil

I don't have an OpenAI API key, so I can't try the `.completions` endpoint against the official API, but I do know that the above works when I put it in an `.exs` file and run it.

ahhh, I figured it out: the `"choices"` returned by `.completions` don't have a `"delta"` key, they have a `"text"` key. This modified snippet works in the shell:
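The difference between the two streaming payload shapes can be sketched as follows (illustrative maps; the field names are taken from the snippets in this thread):

```elixir
# chat_completion stream chunks nest the text under "delta" -> "content"
chat_chunk = %{"choices" => [%{"delta" => %{"content" => "a long"}}]}

# completions stream chunks put the text directly under "text"
completion_chunk = %{"choices" => [%{"text" => "a long"}]}

# extractors matching each shape
extract_chat = fn choice -> choice["delta"]["content"] end
extract_completion = fn choice -> choice["text"] end

extract_chat.(hd(chat_chunk["choices"]))             # => "a long"
extract_completion.(hd(completion_chunk["choices"])) # => "a long"
```

So the `Stream.each` callback just needs to use the extractor that matches the endpoint being streamed.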

iex(4)> OpenAI.completions([ model: "my_model/ggml-model-q4_0.bin", prompt: "a long long time ago", max_tokens: 32, temperature: 0.3, stream: true], %OpenAI.Config{ http_options: [recv_timeout: :infinity, stream_to: self(), async: :once] }) |> Stream.flat_map(fn res -> res["choices"] end) |> Stream.each(fn choice -> IO.write(choice["text"]) end) |> Stream.run()
 | I was in the middle of a conversation with my friend | and he said "I'm sorry, but I have to go" | and I said:ok
iex(5)>
nil

thank you!