xebia-functional / xef

Building applications with LLMs through composability, in Kotlin, Scala, ...

Home Page: https://xef.ai

abstract ModelType

Intex32 opened this issue

commented

Goal: generalize the ModelType class from OpenAI-specific implementations to support multiple providers.

I consider ModelType a formal description of a model and its properties, without any capabilities. This includes, e.g., name and contextLength.

My suggestions:

  • Move the file from the tokenizer module to core.
  • Remove the sealed class constraint.
  • Keep implementations of ModelType directly in the provider-specific classes that hold the instances of the available models; the instance is passed as a parameter to the respective model constructor. This means moving the existing OpenAI instances and creating new instances with the correct parameters for gpt4all and GCP (see the sketch after this list).
  • Move encodingType out of ModelType. GCP does not appear to publish information about the encoding it uses, so we have to make an API call to GCP to retrieve the token count of a given message; for OpenAI, the encoding is used to compute the token count locally.
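A hedged sketch of what the first three points could look like. The names OpenAIModels and GcpModels, the tokenCount signature, and the placeholder counter are assumptions for illustration, not the actual xef code:

```kotlin
// ModelType becomes a plain interface living in core; each provider module
// declares its own instances and passes them to its model constructors.
interface ModelType {
  val name: String
  val contextLength: Int

  // Token counting is provider-specific: computed locally for OpenAI encodings,
  // retrieved via an API call for GCP.
  suspend fun tokenCount(text: String): Int
}

object OpenAIModels {
  class OpenAIModel(
    override val name: String,
    override val contextLength: Int,
    private val countLocally: (String) -> Int // backed by the known OpenAI encoding
  ) : ModelType {
    override suspend fun tokenCount(text: String): Int = countLocally(text)
  }

  // Placeholder counter; a real instance would use the model's BPE encoding.
  val GPT_3_5_TURBO = OpenAIModel("gpt-3.5-turbo", 4097) { it.length / 4 }
}

object GcpModels {
  class GcpModel(
    override val name: String,
    override val contextLength: Int,
    private val countRemotely: suspend (String) -> Int // e.g. a token-count endpoint
  ) : ModelType {
    override suspend fun tokenCount(text: String): Int = countRemotely(text)
  }
}
```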

depends on #393

After further assessment, I think maintaining a model structure (such as ModelType) alongside the actual models with capabilities is not worth the effort. I suggest inlining the functionality of ModelType into the implementations of LLM. This is necessary because the encoding and context length are not the same across models and providers, which calls for some more hierarchy. Currently, ModelType is heavily designed around OpenAI. This idea and its consequences are the subject of exploration in this issue.
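A minimal sketch of that direction, assuming a simplified LLM interface; the real xef hierarchy is richer, and OpenAIChatModel / GcpChatModel are illustrative names only. The point is that name, context length, and token counting live directly on each provider's model class instead of in a separate ModelType:

```kotlin
interface LLM {
  val name: String
  val contextLength: Int
  suspend fun tokenCount(text: String): Int
}

class OpenAIChatModel(
  override val name: String,
  override val contextLength: Int,
  private val encode: (String) -> List<Int> // local BPE encoding for this model
) : LLM {
  override suspend fun tokenCount(text: String): Int = encode(text).size
}

class GcpChatModel(
  override val name: String,
  override val contextLength: Int,
  private val remoteTokenCount: suspend (String) -> Int // GCP token-count API call
) : LLM {
  override suspend fun tokenCount(text: String): Int = remoteTokenCount(text)
}
```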

ModelType held the following properties:

  • model name
  • context length
  • encoding type
  • magic numbers for chat models
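For reference, a rough reconstruction of that (OpenAI-oriented) shape; names and values are approximations, not the exact source:

```kotlin
enum class EncodingType { CL100K_BASE, P50K_BASE /* ... */ }

sealed class ModelType(
  val name: String,               // model name
  val encodingType: EncodingType, // tokenizer encoding, meaningful only for OpenAI
  val maxContextLength: Int,      // context length in tokens
  val tokensPerMessage: Int = 0,  // "magic numbers" used when counting chat-message tokens
  val tokensPerName: Int = 0
) {
  object GPT_3_5_TURBO :
    ModelType("gpt-3.5-turbo", EncodingType.CL100K_BASE, 4097, tokensPerMessage = 4, tokensPerName = -1)
}
```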

ModelType's encoding type was used, e.g., for summarization or for adapting a prompt to the context size.
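Adapting a prompt to the context size might look roughly like the sketch below, where encode/decode stand in for whatever tokenizer the encoding type provides; the function name and parameters are assumptions for illustration:

```kotlin
// Trim a prompt so it fits the model's context window while leaving room for the completion.
fun fitToContext(
  prompt: String,
  maxContextLength: Int,
  reservedForCompletion: Int,
  encode: (String) -> List<Int>,
  decode: (List<Int>) -> String
): String {
  val budget = maxContextLength - reservedForCompletion
  val tokens = encode(prompt)
  return if (tokens.size <= budget) prompt else decode(tokens.take(budget))
}
```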