mislavjc / ironclad


  _                      _           _ 
 (_)                    | |         | |
  _ _ __ ___  _ __   ___| | __ _  __| |
 | | '__/ _ \| '_ \ / __| |/ _` |/ _` |
 | | | | (_) | | | | (__| | (_| | (_| |
 |_|_|  \___/|_| |_|\___|_|\__,_|\__,_|

A CLI tool that takes an OpenAPI spec and generates the files needed for function calling.

Why Ironclad?

OpenAPI schemas contain the required information, but it has to be normalized and transformed into the format that function calling expects. Ironclad takes an OpenAPI schema and generates the files needed for function calling.

Without Ironclad, you would have to write the function definitions and the function-calling logic by hand. Ironclad automates this process.

Ironclad was created while I was building a chatbot that uses the Spotify API and the OpenAI API. I had to write the function definitions and the function-calling logic by hand, and I realized I could automate the process by using the OpenAPI schema.

Files generated

The following files are generated inside the generated directory:

  • types.ts - Includes type definitions for the OpenAPI schema, generated by openapi-typescript.
  • functions.ts - Includes the function definitions used for function calling (see the sketch after this list).
  • runFunction.ts - Includes runFunction, which is responsible for executing calls to the functions defined in functions.ts.
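
For orientation, OpenAI function calling expects each function to be described by a name, a description, and a JSON Schema for its parameters. A minimal sketch of what an entry in functions.ts might look like, assuming a hypothetical getItemById operation (the real output is generated from your spec and will differ):

export const functions = [
  {
    // Name and description are typically derived from the OpenAPI operation.
    name: 'getItemById', // hypothetical operation, not part of Ironclad's output
    description: 'Fetches a single item by its id.',
    // Parameters are described as a JSON Schema object, as expected by the
    // `functions` argument of the OpenAI chat completions endpoint.
    parameters: {
      type: 'object',
      properties: {
        id: { type: 'string', description: 'Identifier of the item.' },
      },
      required: ['id'],
    },
  },
];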

Example files

Example files are located in the example directory.

Requirements

  • OpenAI API Key

How to use

  1. Call the ironclad command in the root directory.

$ ironclad

  2. The generated directory will be created with the files mentioned above.

  3. Use the functions and runFunction in your code.

import { functions } from 'generated/functions';
import { runFunction } from 'generated/runFunction';

Import the functions in your code.

import { functions } from 'generated/functions';

functions is passed as an argument to the OpenAI chat completions endpoint.

  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo-16k',
    stream: true,
    messages,
    functions: functions,
  });

Import the runFunction in your code.

import { runFunction } from 'generated/runFunction';

runFunction is used to call the functions.

const stream = OpenAIStream(response, {
  experimental_onFunctionCall: async (
    { name, arguments: args },
    createFunctionCallMessages,
  ) => {
    const result = await runFunction(name, args);

    // .... rest of the code
  },
});
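
Putting it together, the function result is usually fed back to the model via createFunctionCallMessages so it can produce a final answer. Below is a hedged sketch of a complete route handler, assuming the Vercel AI SDK ('ai') and the official openai client; the handler shape and model choice are assumptions, not Ironclad output.

import { OpenAIStream, StreamingTextResponse } from 'ai';
import OpenAI from 'openai';

import { functions } from 'generated/functions';
import { runFunction } from 'generated/runFunction';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Ask the model for a streamed completion, exposing the generated functions.
  const response = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo-16k',
    stream: true,
    messages,
    functions,
  });

  const stream = OpenAIStream(response, {
    experimental_onFunctionCall: async (
      { name, arguments: args },
      createFunctionCallMessages,
    ) => {
      // Execute the requested function, then feed its result back to the
      // model so it can continue the conversation with that information.
      const result = await runFunction(name, args);
      const newMessages = createFunctionCallMessages(result);
      return openai.chat.completions.create({
        model: 'gpt-3.5-turbo-16k',
        stream: true,
        messages: [...messages, ...newMessages],
        functions,
      });
    },
  });

  return new StreamingTextResponse(stream);
}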

How to run

Currently, the CLI tool is not published to npm. You can run the CLI tool by following the steps below:

  1. Clone the repository.

$ git clone https://github.com/mislavjc/ironclad.git

  2. Install the dependencies.

$ pnpm install

  3. Run the CLI tool.

$ npm run exec
