This repository demonstrates a simple local client-server implementation of the OpenAI Chat Completion API, which utilizes the powerful GPT-3.5 language model to create conversational AI applications. The example consists of a Python Flask server that handles the interaction with the OpenAI API, and a Python client that communicates with the server to carry out the conversation.
The OpenAI Chat Completion API provides an easy way to integrate AI-powered conversational capabilities into applications. The API leverages the GPT-3.5 language model to understand and generate human-like text based on the input it receives.
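Under the hood, each request to the Chat Completion API is a JSON payload containing a model name and a list of role-tagged messages. A minimal sketch of building such a payload (the model name and roles follow the public API; the helper function itself is illustrative and not part of this repository):

```python
import json

def build_chat_payload(history, user_message, model="gpt-3.5-turbo"):
    """Append the user's message to the conversation history and
    build the JSON body expected by the Chat Completion API."""
    messages = history + [{"role": "user", "content": user_message}]
    return {"model": model, "messages": messages}

history = [{"role": "system", "content": "You are a helpful assistant."}]
payload = build_chat_payload(history, "Hello!")
print(json.dumps(payload, indent=2))
```

Passing the full `messages` history on every request is what gives the stateless API the appearance of an ongoing conversation.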
This project consists of two main components:
- Server (`/server`): A Flask application that processes incoming messages from the client, sends them to the OpenAI API, and returns the generated responses. It also handles the session state and conversation logging.
- Client (`/client`): A simple command-line interface (CLI) that allows users to send messages to the Flask server and receive responses from the AI.
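The server side can be pictured as a small Flask app that keeps per-user history and forwards each message to the OpenAI API. The sketch below is an assumption about the shape of the code, not a copy of it: the `/chat` route name is hypothetical, and a stub stands in for the real OpenAI call.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)
conversations = {}  # user_id -> list of {"role": ..., "content": ...} dicts

def ask_openai(messages):
    # Stub standing in for the actual Chat Completion API call.
    return "This is a placeholder reply."

@app.route("/chat", methods=["POST"])
def chat():
    """Accept a user message, append it to that user's history,
    get a reply, and record the reply before returning it."""
    data = request.get_json()
    user_id = data.get("user_id", "anonymous")
    history = conversations.setdefault(user_id, [])
    history.append({"role": "user", "content": data["message"]})
    reply = ask_openai(history)
    history.append({"role": "assistant", "content": reply})
    return jsonify({"reply": reply})

if __name__ == "__main__":
    app.run(port=5000)
```

Keeping the history server-side means the CLI client only needs to send the latest message, not the whole conversation.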
This example provides the following features:
- Interactive CLI for sending and receiving messages.
- Flask server for API requests and response handling.
- Conversation persistence on both the client and server sides.
- Customizable user identification for conversation logging.
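Client-side conversation persistence can be as simple as round-tripping the message history through a JSON file after each turn. A minimal sketch, with illustrative function and file names that are not taken from this repository:

```python
import json
import tempfile
from pathlib import Path

def load_conversation(path):
    """Return the saved message history, or an empty list if no log exists yet."""
    path = Path(path)
    if path.exists():
        return json.loads(path.read_text())
    return []

def save_conversation(messages, path):
    """Persist the full message history as pretty-printed JSON."""
    Path(path).write_text(json.dumps(messages, indent=2))

# Demo round-trip in a temporary directory:
log = Path(tempfile.mkdtemp()) / "conversation_log.json"
history = load_conversation(log)  # empty list on first run
history.append({"role": "user", "content": "Hello"})
save_conversation(history, log)
```

Tagging the log file name with a user identifier is one straightforward way to support the per-user logging mentioned above.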
To get started with this example:
- Clone the repository to your local machine.
- Install the required dependencies by running `pip install -r requirements.txt` in both the `/client` and `/server` directories.
- Create a `.env` file within the `/server` directory containing your OpenAI API key, like so: `OPENAI_API_KEY='your_api_key_here'`
- Start the server by running `python server.py` within the `/server` directory.
- In a separate terminal, start the client by running `python client.py` within the `/client` directory.
- Follow the prompts in the client to begin a conversation with the AI.
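Loading the key from `.env` is commonly done with the `python-dotenv` package, but the idea fits in a few standard-library lines. A hedged sketch (this simplified parser handles only plain `KEY=value` lines and full-line `#` comments; it is not the loader this repository uses):

```python
import os

def load_env_file(path=".env"):
    """Populate os.environ from simple KEY=value lines in a .env file."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip surrounding quotes, e.g. OPENAI_API_KEY='...'
            os.environ[key.strip()] = value.strip().strip("'\"")

# After loading, the server can read the key as a normal environment variable:
# load_env_file()
# api_key = os.environ["OPENAI_API_KEY"]
```

Keeping the key in `.env` (and out of version control) avoids hard-coding credentials in the server source.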
Alternatively, the server can be launched with a shell script (the interpreter path below targets a specific environment; adjust it for your system):

```bash
#!/bin/bash
/var/lang/python37/bin/python3 server/flask_server_chatcompletion.py
```
Contributions to this example are welcome! If you have an improvement or encounter an issue, please feel free to open an issue or submit a pull request.
This project is open-sourced under the MIT License. See the LICENSE file for more details.