MacPaw / OpenAI

Swift community-driven package for OpenAI public API

Partial ChatResult data chunks

jcmourey opened this issue · comments

We often receive partial chunks of `ChatResult` data from an OpenAI streaming session, which causes a `DecodingError` inside this package.

Example of data received by StreamingSession.urlSession:

data: {"id":"chatcmpl-7oBwbf0NG9vi74Z4mt3XasGhhdAVl","object":"chat.completion.chunk","created":1692196669,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"function_call":{"arguments":" che"}},"finish_reason":null}]}

data: {"id":"chatcmpl-7oBwbf0NG9vi74Z4mt3XasGhhdAVl","object":"chat.completion.chunk","created":1692196669,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"function_call":{"arguments":"et"}},"finish_reason":null}]}

data: {"id":"chatcmpl-7oBwbf0NG9vi74Z4mt3XasGhhdAVl","object":"chat.completion.chunk","created":1692196669,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"function_call":{"arguments":"ah"}},"finish_reason":null}]}

data: {"id":"chatcmpl-7oBwbf0NG9vi74Z4mt3XasGhhdAVl","object":"chat.completion.chunk","created":1692196669,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"function_call":{"arguments":" on"}},"finish_reason":null}]}

data: {"id":"chatcmpl-7oBwbf0NG9vi74Z4mt3XasGhhdAVl","object":"chat.completion.chunk","created":1692196669,"model":"gpt-3.5-turbo-0613","choices":[{"index":0,"delta":{"function_call":{"arguments":" rocket"}},"finish_reason":null}]}

data: {"id":"chatcmpl-

The decoder chokes on the last chunk, which ends mid-JSON because the server-sent event was split across network reads.

I fixed it in my local copy of the OpenAI package by keeping the incomplete chunk around as leftover, prepending it to the data received in the next round. I don't know enough about the internals to say whether this is a valid fix or whether there is a better way to address it.
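The idea can be sketched roughly like this (a minimal illustration, not the actual `StreamingSession` code; the `ChunkBuffer` type and its `consume` method are hypothetical names): buffer incoming bytes, decode only complete `data: {...}` events, and carry any trailing partial event over to the next `urlSession(_:dataTask:didReceive:)` call.

```swift
import Foundation

// Hypothetical sketch of the leftover-buffering approach.
// SSE events are separated by blank lines ("\n\n"); anything after
// the last separator may be an incomplete event and must be kept.
struct ChunkBuffer {
    private var leftover = ""

    /// Returns the complete JSON payloads found so far and
    /// stores the incomplete remainder for the next call.
    mutating func consume(_ data: Data) -> [String] {
        let text = leftover + (String(data: data, encoding: .utf8) ?? "")
        var events = text.components(separatedBy: "\n\n")
        // The final component may be a partial event; hold it back.
        leftover = events.removeLast()
        return events
            .filter { $0.hasPrefix("data: ") }
            .map { String($0.dropFirst("data: ".count)) }
    }
}
```

With this in place, a truncated event such as `data: {"id":"chatcmpl-` would simply wait in `leftover` until the rest of it arrives, and only then be handed to `JSONDecoder`.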

StreamingSession.patch