Prompt tokens when using Chats Streaming
klevr-steve opened this issue · comments
When using the Chat API with streaming, there's no way to get OpenAI's count of prompt tokens. The response tokens can be counted client-side as the chunks come in, but the prompt-token count is never reported.
Some equivalent of the completions "usage" structure would be helpful.
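In the meantime, a workaround is to estimate the prompt-token count client-side before sending the request. Below is a minimal sketch along the lines of the message-counting scheme in OpenAI's cookbook (roughly 3 tokens of framing per message plus 3 to prime the reply for gpt-3.5-turbo/gpt-4-era models); the exact overheads are model-dependent assumptions. The tokenizer is injected so a real one (e.g. tiktoken's `encoding.encode`) can be swapped in; the whitespace splitter here is only a stand-in for illustration.

```python
# Sketch: client-side estimate of prompt tokens for a chat request.
# Per-message overhead values are assumptions based on OpenAI's cookbook
# guidance for gpt-3.5-turbo/gpt-4-style models; they vary by model.

def estimate_prompt_tokens(messages, encode):
    """Estimate prompt tokens for a list of chat messages.

    `encode` is any callable mapping a string to a list of tokens,
    e.g. tiktoken.encoding_for_model("gpt-3.5-turbo").encode.
    """
    tokens = 0
    for message in messages:
        tokens += 3  # assumed per-message framing overhead
        for value in message.values():
            tokens += len(encode(value))
    return tokens + 3  # assumed overhead priming the assistant's reply


def toy_encode(text):
    # Stand-in tokenizer for demonstration only; real counts require
    # the model's actual tokenizer (e.g. tiktoken).
    return text.split()


messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
print(estimate_prompt_tokens(messages, toy_encode))
```

This won't match OpenAI's billed count exactly unless the real tokenizer and the correct per-model overheads are used, but it gives a usable approximation until streaming responses report usage directly.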