marcolardera / chatgpt-cli

Simple yet effective command line client for chatting with ChatGPT using the official API

Request: Be able to use OpenAI Assistant ID / Thread ID...

Dan1jel opened this issue · comments

If possible, it would be nice to use OpenAI with an Assistant ID / Thread ID instead. If I'm not mistaken, this way it is possible to bypass the 4k token limit? But it would also be great to have access to the different Assistants one might build up: a math teacher, "Jarvis", etc.

Hi @Dan1jel! Thank you, I'm looking into it

So I've done some coding with the Assistants API -- I think this would be a great thing to have in chatgpt-cli, but it might be too early.

The Assistants API is still in early beta, and it doesn't have a streaming option on the way back. You have to "fast poll" an endpoint to see when the job is done, which adds complexity and inefficiency to the code that would be needed. In any case, they're going to add streaming support soon.
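For context, the "fast poll" pattern described here looks roughly like this. This is a minimal sketch with a stubbed status callback standing in for the real runs endpoint -- the function names and status strings are illustrative, not the actual openai client API:

```python
import time

def poll_until_done(get_status, interval=0.5, timeout=60):
    """Poll a status callback until the run reaches a terminal state.

    get_status: callable returning a status string such as "queued",
                "in_progress", or "completed" (a stand-in for whatever
                API call retrieves the run's status).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status not in ("queued", "in_progress"):
            return status  # terminal: completed, failed, cancelled, ...
        time.sleep(interval)  # wait between polls -- the inefficiency noted above
    raise TimeoutError("run did not finish before the timeout")

# Example with a fake run that completes after a few polls:
states = iter(["queued", "in_progress", "in_progress", "completed"])
print(poll_until_done(lambda: next(states), interval=0.01))  # completed
```

Streaming support, once it lands, would replace this loop with a single long-lived response, which is why it may be worth waiting.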

I'd totally support doing this, though. Note that you can bypass the token limit by using different models, so using Assistants and bypassing the token limit are separate issues -- you don't need one to do the other.

Aah I see, thanks for looking into it :) My first idea was that, instead of a website that might reload and lose all history, a CLI version would maybe keep the history there, or be ongoing (in, for example, a tmux window). But I understand, I just hope it will be possible in the near future :)

Via the OpenAI Assistants API and Threads (https://platform.openai.com/docs/assistants/how-it-works) -- they keep track of the threads. We probably wouldn't need to keep the threads locally, because they can be fetched via the API whenever you need them. The CLI would need to keep track of which thread you're on, and there'd probably have to be command line options for "make a new thread" vs. "pick up our chat as of the latest state on thread ID XYZ". A good default behavior would be to start a new thread every time the app runs.
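A sketch of the CLI behavior proposed above: start a new thread by default, or resume an existing one by ID. The flag name and the `create_thread` callback are assumptions for illustration, not an existing chatgpt-cli feature:

```python
import argparse

def build_parser():
    """CLI options sketched from the discussion: new thread by default,
    or pick up an existing one by ID (flag name is hypothetical)."""
    p = argparse.ArgumentParser(prog="chatgpt-cli")
    p.add_argument("--thread", metavar="THREAD_ID", default=None,
                   help="resume an existing Assistants thread "
                        "instead of starting a new one")
    return p

def pick_thread(args, create_thread):
    """create_thread stands in for the API call that makes a
    server-side thread and returns its ID."""
    return args.thread if args.thread else create_thread()

# Resuming an existing thread:
args = build_parser().parse_args(["--thread", "thread_abc123"])
print(pick_thread(args, create_thread=lambda: "thread_new"))  # thread_abc123
```

Since the messages themselves live server-side, the only local state the CLI would really need is the last thread ID, if it wanted to offer a "continue where I left off" shortcut.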

One slight concern for me, though, is that it's such an early beta -- I could probably build code to do this in an afternoon, but it would suck if it breaks in 3 weeks.