orhanerday / open-ai

OpenAI PHP SDK: the most downloaded, forked, and contributed-to PHP SDK for OpenAI GPT-3 and DALL-E, with a large supporting community, usable from Laravel, Symfony, Yii, CakePHP, or any PHP framework. It also supports ChatGPT-like streaming. (ChatGPT is supported.)

Home Page: https://orhanerday.gitbook.io/openai-php-api-1/


Chat API: first event line gets split into two for streamed response

Schoude opened this issue

Describe the bug

Using the chat API with stream set to true splits the first event line into two parts at the content key in the choices array, breaking the JSON response from OpenAI.

I want to parse the data of every event line as JSON, and this now fails because of the split JSON string I receive.
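
For illustration, the per-line parsing I'm aiming for is roughly the following (a simplified sketch; parseEventLine is just an illustrative helper, not code from the SDK or my app):

// Sketch: decode one complete "data: ..." event line into an array.
function parseEventLine(string $line): ?array
{
    $line = trim($line);

    // Skip empty lines and the non-JSON stream terminator.
    if ($line === '' || $line === 'data: [DONE]') {
        return null;
    }

    // Strip the "data: " prefix and decode the JSON payload.
    $payload = substr($line, strlen('data: '));
    $decoded = json_decode($payload, true);

    if (json_last_error() !== JSON_ERROR_NONE) {
        // The truncated first chunk ends right after "content":", so decoding fails here.
        throw new RuntimeException('Invalid event line: ' . json_last_error_msg());
    }

    return $decoded;
}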

All following responses are valid JSON, and data: [DONE] also gets sent.

Could this be a problem with the content being an empty string for the first response?

I also called the OpenAI API directly from JS, and there the first event line gets sent completely, so a bug on OpenAI's side should be out of the question.

Response from directly calling the OpenAI API endpoint https://api.openai.com/v1/chat/completions:

[screenshot of the raw streamed response]

Thanks for looking into it.

To Reproduce

This is the streamed output dumped by dump($data); in the code snippet below. As you can see, the first result is split at the content property, which contains an empty string "", with the finish_reason following in the next event line.

 ------------ --------------------------------- 
  date         Tue, 14 Nov 2023 08:33:35 +0000  
  controller   "ConversationController"         
  source       Client.php on line 188           
  file         app/Services/OpenAI/Client.php   
 ------------ --------------------------------- 

"data: {"id":"chatcmpl-8Kj9Su5BrpORXECCiNhE6MbOrbwp9","object":"chat.completion.chunk","created":1699950814,"model":"gpt-4-0613","choices":[{"index":0,"delta":{"role":"assistant","content":"

 ------------ --------------------------------- 
  date         Tue, 14 Nov 2023 08:33:35 +0000  
  controller   "ConversationController"         
  source       Client.php on line 188           
  file         app/Services/OpenAI/Client.php   
 ------------ --------------------------------- 

"""
""},"finish_reason":null}]}\n
\n
data: {"id":"chatcmpl-8Kj9Su5BrpORXECCiNhE6MbOrbwp9","object":"chat.completion.chunk","created":1699950814,"model":"gpt-4-0613","choices":[{"index":0,"delta":{"content":"Ol"},"finish_reason":null}]}\n
\n
"""

[INTERMEDIATE OUTPUT SKIPPED]

 ------------ --------------------------------- 
  date         Tue, 14 Nov 2023 08:33:51 +0000  
  controller   "ConversationController"         
  source       Client.php on line 188           
  file         app/Services/OpenAI/Client.php   
 ------------ --------------------------------- 

"""
data: {"id":"chatcmpl-8Kj9Su5BrpORXECCiNhE6MbOrbwp9","object":"chat.completion.chunk","created":1699950814,"model":"gpt-4-0613","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}\n
\n
data: [DONE]\n
\n
"""

Code snippets

// Using Laravel 10.10

$arguments = collect([
    'model' => 'gpt-4',
    'messages' => $messages,
    'temperature' => $temperature ?? self::DEFAULT_TEMPERATURE,
    'presence_penalty' => $presencePenalty ?? self::DEFAULT_PRESENCE_PENALTY,
    'frequency_penalty' => $frequencyPenalty ?? self::DEFAULT_FREQUENCY_PENALTY,
    'functions' => $functions,
    'function_call' => $function_call,
    'n' => 1,
    'stream' => true,
])
    ->filter(fn ($value) => $value !== null)
    ->toArray();

$this->client->chat($arguments, function ($curl, string $data) {
    // Each received event line directly gets dumped here. No manipulation on our side.
    dump($data);

    echo $data . "<br><br>";
    echo PHP_EOL;
    ob_flush();
    flush();
    return strlen($data);
});
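
A possible workaround I'm considering is to buffer the raw chunks across callback invocations and only handle complete events, roughly like this (a sketch assuming events are separated by a blank line; variable names are illustrative and this is not tested against the SDK):

$buffer = '';

$this->client->chat($arguments, function ($curl, string $data) use (&$buffer) {
    // Append the raw chunk; it may contain a partial event line.
    $buffer .= $data;

    // SSE events are separated by a blank line ("\n\n"); only process complete events.
    while (($pos = strpos($buffer, "\n\n")) !== false) {
        $event = substr($buffer, 0, $pos);
        $buffer = substr($buffer, $pos + 2);

        // $event now holds one complete "data: ..." line (or "data: [DONE]").
        dump($event);
    }

    return strlen($data);
});

This only works around the splitting on my side, though; it would still be good to know whether the library itself is expected to deliver complete event lines to the callback.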

OS

Windows 11

PHP version

8.2

Library version

4.9.1

Could you please confirm whether this issue has already been reported? If not, kindly visit #109 to check whether a similar issue has already been raised.