llm-edge / hal-9100

Edge full-stack LLM platform. Written in Rust.

"action" tool (function calling on the server)

louis030195 opened this issue

#25 prelim

i.e. the same as in the ChatGPT UI.

https://platform.openai.com/docs/actions, but for the Assistants API.

@louis030195
Here is a demo of how I got actions to work with an assistant. It consists of the following steps (see the sketch after the list):

  1. Provide an OpenAPI spec.
  2. Generate function definitions from the spec (in parallel).
  3. Generate HTTP request functions from the spec (in parallel).
  4. Whenever a function from the OpenAPI spec is called, send the args to the corresponding generated HTTP request function.
  5. Return the results as the tool outputs.
  6. Done.
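
Here is a minimal sketch of steps 1-4 in Python, assuming PyYAML and requests are available; the tiny weather spec, the endpoint URL, and helper names like build_tools_from_spec and dispatch are illustrative, not taken from the gist:

import requests
import yaml

OPENAPI_SPEC = """
openapi: 3.0.0
info: {title: Weather API, version: 1.0.0}
servers:
  - url: https://api.example.com
paths:
  /weather:
    get:
      operationId: getWeather
      parameters:
        - name: city
          in: query
          schema: {type: string}
"""

def build_tools_from_spec(raw_spec):
    # Step 2: turn each operation into a function tool definition;
    # Step 3: pair it with a generated HTTP request function.
    # (Query parameters only, for brevity.)
    spec = yaml.safe_load(raw_spec)
    base_url = spec["servers"][0]["url"]
    tools, dispatch = [], {}
    for path, ops in spec["paths"].items():
        for method, op in ops.items():
            name = op["operationId"]
            params = op.get("parameters", [])
            tools.append({
                "type": "function",
                "function": {
                    "name": name,
                    "parameters": {
                        "type": "object",
                        "properties": {p["name"]: p["schema"] for p in params},
                        "required": [p["name"] for p in params],
                    },
                },
            })
            # Step 4: forward the model's arguments to the HTTP endpoint.
            def call(args, _method=method, _path=path):
                return requests.request(_method.upper(), base_url + _path, params=args).text
            dispatch[name] = call
    return tools, dispatch

tools, dispatch = build_tools_from_spec(OPENAPI_SPEC)
# e.g. dispatch["getWeather"]({"city": "Berlin"}) issues GET https://api.example.com/weather?city=Berlin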

As an action, this whole process should happen automatically in the assistant backend.

https://gist.github.com/CakeCrusher/aad0cdb695aea7eca55d31bc801a7f83

The expectation is to be able to pass a tool like so:

from openai import OpenAI

client = OpenAI()

client.beta.assistants.create(
    name="Action assistant",
    model="gpt-4-turbo-preview",  # model is required by assistants.create
    tools=[
        # ...
        {
            "type": "action",
            "openapi_spec": """
                openapi: 3.0.0
                ...
            """
        }  # this tool will execute autonomously
    ]
)
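
With the current Assistants-style API, this loop has to run on the client; the sketch below (reusing tools and dispatch from the earlier sketch, with an illustrative model name and user message) shows the polling loop that an "action" tool would instead run automatically inside the backend:

import json
import time
from openai import OpenAI

client = OpenAI()

assistant = client.beta.assistants.create(
    name="Action assistant",
    model="gpt-4-turbo-preview",  # illustrative model
    tools=tools,                  # function tools generated from the OpenAPI spec
)
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="What's the weather in Berlin?"
)
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)

while True:
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)
    if run.status == "requires_action":
        outputs = []
        for call in run.required_action.submit_tool_outputs.tool_calls:
            args = json.loads(call.function.arguments)
            # Steps 4-5: send the args to the generated HTTP request function
            # and return its result as the tool output.
            outputs.append({"tool_call_id": call.id,
                            "output": dispatch[call.function.name](args)})
        run = client.beta.threads.runs.submit_tool_outputs(
            thread_id=thread.id, run_id=run.id, tool_outputs=outputs
        )
    elif run.status in ("completed", "failed", "cancelled", "expired"):
        break
    time.sleep(1)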

For simplicity, here are the features that I think can wait until after the initial release:

Should be easy to implement after merging #62.

@CakeCrusher

Done in v0.