microsoft / sample-app-aoai-chatGPT

Sample code for a simple web chat experience through Azure OpenAI, including Azure OpenAI On Your Data.


Dynamic conversation context limit

oskarslapinsvismacom opened this issue

Is your feature request related to this sample app, or to an Azure service, such as Azure OpenAI or Azure AI Search?
It is related to this sample app.

Is your feature request related to a problem? Please describe.
For some implementations it would be very useful to set a limit on the conversation context, meaning only the last 5 or 10 messages, for example, are taken into account as context. This could help deal with users who keep asking unrelated questions in the same conversation.

Describe the solution you'd like
Implementing an environment variable that, if set, would include only the last X messages of the conversation as context when sending a chat completion request. A rough sketch of what this could look like follows below.
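
As a minimal illustration (not the app's current behavior), the history could be trimmed right before the chat completion request is built. The variable name `CONVERSATION_HISTORY_LIMIT` and the helper below are hypothetical:

```python
import os

# Hypothetical environment variable; not currently part of the sample app.
CONVERSATION_HISTORY_LIMIT = os.environ.get("CONVERSATION_HISTORY_LIMIT")


def limit_conversation_context(messages):
    """Keep only the last N user/assistant messages, preserving system messages.

    `messages` is the list of {"role": ..., "content": ...} dicts that would be
    sent with the chat completion request.
    """
    if not CONVERSATION_HISTORY_LIMIT:
        # Variable unset: keep the current behavior (full history).
        return messages

    limit = int(CONVERSATION_HISTORY_LIMIT)
    system_messages = [m for m in messages if m["role"] == "system"]
    other_messages = [m for m in messages if m["role"] != "system"]
    return system_messages + other_messages[-limit:]
```

The helper would be called on the message list just before the request to Azure OpenAI, so older turns are dropped while the system message is always retained.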

Is this feature specific to your use case or your organization, or would it apply broadly across other uses of this app?
It would apply to our use case in some specific scenarios, and I believe the same would apply to other organisations as well.

Describe alternatives you've considered
Adding a reminder in the system message asking users to clear the chat, but this would be awfully unreliable.

Additional context
The issue we are facing is that when users keep very long conversations spanning many different topics, at some point the chatbot can no longer find the answer in the documents and says so in its reply, which confuses users, since the topic is covered in the documents. Asking the same question again later does return an answer.
I strongly feel that a configurable variable to shorten the context in this way would help in many different scenarios, especially if you do not want to overspend on resources.