Viroscope / OpenAI-Complex-Skeleton

The OpenAI Complex Skeleton is a Python program that demonstrates conversation-based interaction with OpenAI's GPT-3.5 language model. Users communicate with the model by sending queries and receiving responses; the program uses the OpenAI API to generate chat-based completions, including its function-calling feature.

OpenAI Complex Skeleton

This program serves as a simple conversational assistant powered by OpenAI's GPT-3.5 language model. It allows users to have interactive conversations by sending queries and receiving responses generated by the model. The code is designed to be easy to understand for novice programmers.

Prerequisites

Before running the program, ensure that you have the following dependencies installed:

  • Python (version 3.6 or higher)
  • dotenv library (pip install python-dotenv)
  • openai library (pip install openai)

Additionally, you need to set up an OpenAI API key by creating an account on the OpenAI website. Once you have your API key, create a file named .env in the same directory as the code and add the following line, replacing YOUR_API_KEY with your actual API key:

OpenAIKey=YOUR_API_KEY

Usage

  1. Import the required libraries:
import os
from dotenv import load_dotenv
import openai
import json
  2. Load the API key from the .env file:
load_dotenv()
openai.api_key = os.getenv('OpenAIKey')
  3. Define the function descriptions:
function_descriptions = [
    {
        "name": "function_1",
        "description": "When the user asks to run function one",
        "parameters": {
            "type": "object",
            "properties": {
                "one": {
                    "type": "string",
                    "description": "User asked to run function one, with the required argument one"
                },
            },
            "required": ["one"]
        }
    },
    {
        "name": "function_2",
        "description": "When the user asks to run function two",
        "parameters": {
            "type": "object",
            "properties": {
                "one": {
                    "type": "string",
                    "description": "User asked for function two with argument one"
                },
                "two": {
                    "type": "string",
                    "description": "User asked for function two with argument two"
                },
            },
        }
    }
]
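For orientation: when the model decides to call one of these functions, the API response carries a `function_call` object whose `arguments` field is a JSON-encoded string, not a dict. A minimal sketch of decoding it (the payload below is hand-written to mirror the function_2 schema, not real API output):

```python
import json

# Hand-built example of a "function_call" payload, mirroring the
# function_2 schema above (illustration only, not real API output)
payload = {
    "name": "function_2",
    "arguments": '{"one": "foo", "two": "bar"}',  # a JSON string, not a dict
}

args = json.loads(payload["arguments"])  # decode before using the values
print(args["one"], args["two"])  # -> foo bar
```

This is why the handler in the next step parses `arguments` with `json.loads` rather than treating it as a dictionary directly.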
  4. Define the function_call function to handle specific function calls:
def function_call(ai_response):
    function_call = ai_response["choices"][0]["message"]["function_call"]
    function_name = function_call["name"]
    # "arguments" is a JSON-encoded string; parse it with json.loads
    # rather than eval, which would execute arbitrary model output as code
    arguments = json.loads(function_call["arguments"])

    if function_name == "function_1":
        one = arguments.get("one")
        print(f"Function 1 {one}")
        return

    if function_name == "function_2":
        one = arguments.get("one")
        two = arguments.get("two")
        print(f"Function 2 {one}{two}")
        return

    return
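To see the handler's control flow without calling the API, a hand-built response dict can be fed through a dispatcher of the same shape. This is a sketch for illustration: `dispatch` returns the formatted string instead of printing it, so the result is easy to inspect.

```python
import json

# Sketch of the dispatch logic above, returning strings instead of
# printing (function names mirror the schema in the earlier step)
def dispatch(ai_response):
    fc = ai_response["choices"][0]["message"]["function_call"]
    args = json.loads(fc["arguments"])
    if fc["name"] == "function_1":
        return f"Function 1 {args.get('one')}"
    if fc["name"] == "function_2":
        return f"Function 2 {args.get('one')}{args.get('two')}"
    return None

# Hand-built stand-in for an API response (illustration only)
mock = {"choices": [{"message": {"function_call": {
    "name": "function_1",
    "arguments": '{"one": "hello"}',
}}}]}

print(dispatch(mock))  # -> Function 1 hello
```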
  5. Define the feedback function to process user queries and generate responses:
def feedback(query):
    messages = [{"role": "user", "content": query}]
    
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-0613",
        messages=messages,
        functions=function_descriptions,
        function_call="auto"
    )

    while response["choices"][0]["finish_reason"] == "function_call":
        function_response = function_call(response)
        messages.append({
            "role": "function",
            "name": response["choices"][0]["message"]["function_call"]["name"],
            "content": json.dumps(function_response)
        })

        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo-0613",
            messages=messages,
            functions=function_descriptions,
            function_call="auto"
        )

    # Print only once the model returns a final text reply; while the loop
    # is still running, the message content may be None (another function call)
    print("\n" + response['choices'][0]['message']['content'].strip())
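Note that function_1 and function_2 print their output and return None, so `json.dumps(function_response)` serializes to the JSON literal "null". A small illustration of the function-result message appended on each loop iteration (shapes assumed to mirror the Chat Completions message format):

```python
import json

# The handlers above return None, so the function-result message
# carries the JSON literal "null" back to the model
function_response = None
message = {
    "role": "function",
    "name": "function_1",
    "content": json.dumps(function_response),
}
print(message["content"])  # -> null
```

If you want the model to reason about actual results, have your handlers return a value instead of only printing.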
  6. Start the conversational loop:
while True:
    user_input = input("User: ")
    feedback(user_input)
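As written, the loop only ends on Ctrl+C. One possible refactor (hypothetical, not part of the original code) wraps it in a function with an exit word, which also makes the loop testable:

```python
def chat_loop(feedback, read=input, exit_words=("quit", "exit")):
    """Hypothetical variant of the loop above: stops on an exit word."""
    while True:
        query = read("User: ")
        if query.strip().lower() in exit_words:
            break
        feedback(query)
```

Passing `read` as a parameter lets tests supply canned input in place of the real `input()`.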

Running the Program

  1. Ensure that you have fulfilled the prerequisites mentioned above.

  2. Save the code in a file with a .py extension, e.g., conversational_assistant.py.

  3. Open a terminal or command prompt and navigate to the directory where you saved the file.

  4. Run the program using the following command:

python conversational_assistant.py
  5. The program will start and prompt you to enter your queries. Type your query and press Enter to receive a response from the conversational assistant.

  6. The program will continue to prompt for queries and provide responses until you terminate it by pressing Ctrl+C or stopping the execution in your development environment.

Feel free to modify the function descriptions, add more conversational logic, or customize the program according to your needs. Enjoy interacting with your OpenAI Conversational Assistant!


License: MIT License

