xebia-functional / xef

Building applications with LLMs through composability, in Kotlin, Scala, ...

Home Page: https://xef.ai

Allow setting the token manually in conversation

carbaj03 opened this issue

OpenAI.conversation { }

When I use the conversation block, it always reads the token from the environment, but I want to set it manually.

OpenAI.conversation(here_token) { }

The problem:

@JvmField val FromEnvironment: OpenAI = OpenAI()

suspend fun <A> conversation(block: suspend Conversation.() -> A): A =
      block(conversation(LocalVectorStore(FromEnvironment.DEFAULT_EMBEDDING)))

This always ends up calling the environment because the token is set to null by default.

class OpenAI(internal var token: String? = null, internal var host: String? = null)
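
A minimal sketch of what the requested overload could look like, mirroring the existing conversation function above; the token parameter and the OpenAI(token = token) construction here are hypothetical, not part of the current API:

suspend fun <A> conversation(
    token: String,
    block: suspend Conversation.() -> A
): A {
    // Hypothetical: build the client from the caller's token instead of
    // the environment-backed FromEnvironment instance.
    val openAI = OpenAI(token = token)
    return block(conversation(LocalVectorStore(openAI.DEFAULT_EMBEDDING)))
}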

For now, the token is part of the model, not part of the Conversation. If you want to use a model with your own token, you can do something like this:

OpenAI.conversation {
    val chat = OpenAI(token = "your_token").DEFAULT_CHAT
    val response = promptMessage("What is the meaning of life?", model = chat)
    println(response)
}

Does that make sense to you?

There is still a problem: to create OpenAI.conversation { }, OpenAI is always constructed with a null token by default.
The second call, OpenAI(token = "your_token").DEFAULT_CHAT, is fine.
It would be nice if only one OpenAI() were created for the entire block. With context receivers this will be easier in the future.
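
A hypothetical sketch of that context-receiver direction, reusing only names already shown in this thread; it needs Kotlin's experimental -Xcontext-receivers compiler flag and is not the library's actual API:

context(OpenAI)
suspend fun <A> conversation(block: suspend Conversation.() -> A): A {
    // The OpenAI receiver in scope (and its token) is shared by the whole block,
    // so nothing falls back to the environment-backed FromEnvironment instance.
    return block(conversation(LocalVectorStore(DEFAULT_EMBEDDING)))
}

A caller could then bring a single client into scope for the whole conversation:

with(OpenAI(token = "your_token")) {
    conversation {
        println(promptMessage("What is the meaning of life?", model = DEFAULT_CHAT))
    }
}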

Yes, this is a known issue. FromEnvironment should be lazy... and I agree with you about the conversation block.
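
A minimal sketch of the lazy idea, based on the FromEnvironment field shown earlier; note that a delegated property cannot be @JvmField, so Java callers would need an accessor instead:

// Hypothetical sketch: defer constructing the environment-backed client
// until FromEnvironment is actually used for the first time.
val FromEnvironment: OpenAI by lazy { OpenAI() }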