FuturraGroup / OpenAI

A library that makes it easy to use ChatGPT API

Error ("invalid numeric value (NaN, or not-a-number)") when trying to stop stream

stephansann opened this issue · comments

Hello 👋🏻

First of all, thanks a lot for this nice library. It makes it very easy to get a chat up and running in a Swift application 😊

I found a problem when trying to stop a stream:
Error: this application, or a library it uses, has passed an invalid numeric value (NaN, or not-a-number) to CoreGraphics API and this value is being ignored. Please fix this problem.

This is the relevant code:

        let tmpNewMessage = AIMessage(role: .user, content: chatMessage)

        GlobalConstants.OPEN_AI.sendStreamChatCompletion(newMessage: tmpNewMessage, previousMessages: aiMessages, model: .gptV3_5(.gptTurbo), maxTokens: 2048) { aResult in
            switch aResult {
                case .success(let aStreamResult):
                    if let tmpStreamMessage = aStreamResult.message?.choices.first?.message {
                        courseOfChat += tmpStreamMessage.content
                    }
                    if (aStreamResult.isFinished) {
                        responseOngoing = false
                    }
                    //if (shouldStopResponse) {
                        aStreamResult.stream.stopStream()
                    //}
                case .failure(let anError):
                    courseOfChat = "Sorry! There was a problem: \(anError.localizedDescription)"
            }
        }

I commented out the "shouldStopResponse" if-statement to rule out any interference whatsoever.
So now "stopStream()" is called with the first stream result.

This screenshot may give a bit more insight:

Screenshot 2024-01-14 at 11 48 00

Let me know if I can deliver more details.

Thanks again and best regards
Stephan

Hello there! Thank you for contacting us about this issue.
We've figured out the source of the issue and fixed it (check release 1.8.0).

First of all: our library doesn't use any CoreGraphics API. It is just a network layer for the OpenAI API, so the CoreGraphics log you see is not related to our library. Please check your application for possible issues with CoreGraphics usage 😉

The issue was the stopStream call inside the stream completion handler, which started an infinite loop of stopStream calling itself. We've added protection against that infinite loop and fixed this issue. And thanks to you, we've also fixed some related small issues!
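
To illustrate the kind of protection described above, here is a minimal, self-contained sketch (hypothetical names only; `MiniEventStream` and its members are invented for illustration and are not OpenAIKit's actual implementation) of a re-entrancy guard: stopping the stream emits a final event, and without the guard a handler that calls `stopStream()` on every event would recurse forever.

```swift
import Foundation

// Hypothetical sketch of a guard against re-entrant stopStream calls.
final class MiniEventStream {
    private(set) var isStopped = false
    private var isStopping = false          // guard flag against re-entrant stops
    var onEvent: ((_ isFinished: Bool) -> Void)?

    func stopStream() {
        // Ignore stop requests that arrive while we are already stopping.
        guard !isStopping, !isStopped else { return }
        isStopping = true
        onEvent?(true)                      // deliver the final "finished" event
        isStopped = true
        isStopping = false
    }
}

var eventCount = 0
let stream = MiniEventStream()
stream.onEvent = { _ in
    eventCount += 1
    // A handler that (like the snippet above) calls stopStream on every event:
    stream.stopStream()                     // safely ignored thanks to the guard
}
stream.stopStream()
print(eventCount)                           // the final event is delivered exactly once
```

Without the `isStopping` flag, the inner `stopStream()` would re-enter the method and emit the final event again, looping indefinitely.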

If you want to cancel the stream yourself, you can do that now. But I'd recommend an approach like this: store the active stream in a weak variable, and when you need to perform actions on the stream, call them on that variable directly:

private weak var activeStream: AIEventStream<AIResponseModel>?

...

GlobalConstants.OPEN_AI.sendStreamChatCompletion(newMessage: tmpNewMessage, previousMessages: aiMessages, model: .gptV3_5(.gptTurbo), maxTokens: 2048) { aResult in
    switch aResult {
    case .success(let aStreamResult):
        // Keep a weak reference to the stream so it can be stopped later
        if self.activeStream != aStreamResult.stream {
            self.activeStream = aStreamResult.stream
        }

        // Your actions with stream data

    case .failure(let anError):
        courseOfChat = "Sorry! There was a problem: \(anError.localizedDescription)"
    }
}

...

@IBAction func myButtonActionThatStopsStream() {
    activeStream?.stopStream()
    activeStream = nil
}

Thank you for helping us get better!

These changes are already available in release 1.8.0.

Hi again 👋🏻

Thank you very much for picking up this problem and fixing it so fast!

I have now updated the dependency and changed my code as you suggested.
Stopping the stream no longer ends in an exception 👍🏻

Anyhow, after a "stopStream()" the next call to "sendStreamChatCompletion" seems to get stuck (the completion code is not called).
The call after that one works again.

Any idea what could be the reason?

Thanks again and best regards
Stephan

Can you please provide more info about how you call sendStreamChatCompletion and stopStream, and how they interact with your data storage and UI?
Is this happening in the example project or in your existing project?
We will check this issue and try to figure out a solution.

I now did a longer debugging session and here are the results:

First of all the relevant current code:

    private func triggerCompletion() {

        if (!courseOfChat.isEmpty) {
            courseOfChat += "\n\n"
        }

        courseOfChat += "You:\n\(chatMessage)\n\nChatbot:\n"

        let tmpNewMessage = AIMessage(role: .user, content: chatMessage)

        openAi.sendStreamChatCompletion(newMessage: tmpNewMessage, previousMessages: aiMessages, model: .gptV3_5(.gptTurbo), maxTokens: 2048) { aResult in
            switch aResult {
                case .success(let aStreamResult):
                    if activeStream != aStreamResult.stream {
                        activeStream = aStreamResult.stream
                    }
                    if let tmpStreamMessage = aStreamResult.message?.choices.first?.message {
                        courseOfChat += tmpStreamMessage.content
                    }
                    if (aStreamResult.isFinished) {
                        responseOngoing = false
                    }
                case .failure(let anError):
                    courseOfChat = "Sorry! There was a problem: \(anError.localizedDescription)"
            }
        }

        aiMessages.append(tmpNewMessage)
        chatMessage = String.EMPTY_STRING
        inputFieldFocused = true
        responseOngoing = true
    }

    private func stopResponse() {

        activeStream?.stopStream()
        activeStream = nil
        responseOngoing = false
    }

Here is an example conversation:

**
You:
Could you please write me a motivating affirmation for today?

Chatbot:
Of course! Here is a motivating affirmation for today:

"Today is

You:
Sorry?

Chatbot:

You:
Sorry?

Chatbot:
Of course, I'd be happy to help! Here is a motivating affirmation for today:
"Today is a new day, full of endless possibilities and opportunities. I am worthy and capable of achieving my [...]
**

I stopped the stream within the first answer at "Today is".
As you can see the second question does not get an answer.

So I set a breakpoint within the "case .success(let aStreamResult):" block.
Contrary to my earlier suspicion, the completion block is called, but there is no message.

Let's take a look at what happens after the second question was fired:

First stream result to the second question:

Printing description of aStreamResult:
▿ AIStreamResponse<AIResponseModel>
  ▿ stream : <_TtGC9OpenAIKit13AIEventStreamVS_15AIResponseModel_: 0x6000026882a0>
  ▿ message : Optional<AIResponseModel>
    ▿ some : AIResponseModel
      ▿ id : Optional<String>
        - some : ""
      - object : "chat.completion.chunk"
      - created : 0.0
      ▿ model : Optional<AIModelType>
        ▿ some : AIModelType
          - custom : "/Users/stephan/.cache/lm-studio/models/TheBloke/Llama-2-7B-Chat-GGUF/llama-2-7b-chat.Q5_K_M.gguf"
      ▿ choices : 1 element
        ▿ 0 : Choice
          - text : nil
          ▿ message : Optional<AIMessage>
            ▿ some : AIMessage
              - role : OpenAIKit.AIMessageRole.assistant
              - content : ""
          - index : 0
          - logprobs : nil
          ▿ finishReason : Optional<String>
            - some : "stop"
      - usage : nil
      - logprobs : nil
  ▿ data : Optional<Data>
    ▿ some : 239 bytes
      - count : 239
      ▿ pointer : 0x0000600003d21ef0
        - pointerValue : 105553180368624
  - isFinished : false
  - forceEnd : false

Second and all following stream results to the second question:

Printing description of aStreamResult:
▿ AIStreamResponse<AIResponseModel>
  ▿ stream : <_TtGC9OpenAIKit13AIEventStreamVS_15AIResponseModel_: 0x6000026882a0>
  - message : nil
  ▿ data : Optional<Data>
    ▿ some : 239 bytes
      - count : 239
      ▿ pointer : 0x0000600003d21ef0
        - pointerValue : 105553180368624
  - isFinished : false
  - forceEnd : false

With the third question things get back to normal. As you can see there will be an answer again.

The code is for my existing iOS app, but it is not on GitHub.

Hope this helps.

Thanks again for the good work 🙏🏻

Hi. Sorry for the long delay in answering, and sorry for the misunderstanding caused by the mistake in the example.

We checked: sending and deinitializing a stream in the library, with either a manual or an automatic stop, works correctly.

The problem may be here, where you assign the local variable for the current stream:

if self.activeStream != aStreamResult.stream {
	self.activeStream = aStreamResult.stream
}

After you call the stopResponse() method, the activeStream variable becomes nil. But when a stream stops, there is a final event whose Bool values aStreamResult.isFinished and aStreamResult.forceEnd describe how it was stopped, and handling that final event assigns self.activeStream again.

So the correct way to do this would be either:

  1. Assign it only while the stream is still alive:
if self.activeStream != aStreamResult.stream && !aStreamResult.isFinished && !aStreamResult.forceEnd {
	self.activeStream = aStreamResult.stream
}
  2. Or, as you described in the initial problem, stop the stream inside the closure:
if (shouldStopResponse) {
	shouldStopResponse = false /// To prevent an infinite self-stopping loop
	aStreamResult.stream.stopStream()
}
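
To see why the guard in option 1 matters, here is a self-contained sketch (illustrative names only, not OpenAIKit's actual API) of the failure mode: the final event fired when the stream stops re-assigns the `activeStream` variable right after `stopResponse()` cleared it, while the guarded handler leaves it cleared.

```swift
import Foundation

// Hypothetical stand-ins for the stream and its final event.
struct StreamEvent {
    let isFinished: Bool
    let forceEnd: Bool
}

final class FakeStream {}

var activeStream: FakeStream? = nil
let stream = FakeStream()

func handle(_ event: StreamEvent, guarded: Bool) {
    if guarded {
        // Fixed handler: only track the stream while it is still alive.
        if activeStream !== stream && !event.isFinished && !event.forceEnd {
            activeStream = stream
        }
    } else {
        // Original handler: (re)assigns even on the final event.
        if activeStream !== stream { activeStream = stream }
    }
}

// Simulate stopResponse(): clear the reference, then the final event arrives.
activeStream = nil
handle(StreamEvent(isFinished: true, forceEnd: false), guarded: false)
print(activeStream != nil)  // the unguarded handler keeps a stale reference

activeStream = nil
handle(StreamEvent(isFinished: true, forceEnd: false), guarded: true)
print(activeStream == nil)  // the guarded handler leaves it cleared
```

The stale reference kept by the unguarded handler points at an already-finished stream, which matches the "stuck" behavior described earlier in the thread.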

Sorry again for the slow answer. I started figuring out the problem as soon as the opportunity appeared.

Thank you for helping us get better!

Thank you very much for your great answer 🙏🏻

I changed the code again and now everything works fine 😊

And now that I'm back to my original approach, I don't need "self.activeStream" any more, so the assignment question doesn't arise.

Keep up the great work!