This is an example Next.js template that can be used to communicate with the Deep Chat component. It includes a variety of endpoints that can either host your own service or act as a proxy for the following AI APIs: OpenAI, HuggingFace, StabilityAI and Cohere.
This project is fully set up and ready to be hosted on a platform such as Vercel.
If you are downloading the project via git clone, we advise using a shallow clone with the --depth 1 option to reduce its size:
git clone --depth 1 https://github.com/OvidijusParsiunas/deep-chat.git
Navigate to this directory and run the following command to install the dependencies:
npm install
Run the project:
npm run dev
If you want to use the proxy functions:
Local - Replace the environment variable references (process.env.*) in the route handler functions with the corresponding API key values. E.g. if you want to use the OpenAI Chat, replace process.env.OPENAI_API_KEY with your key as a string value.
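As an alternative to hardcoding the key for local development, Next.js also loads environment variables from a .env.local file in the project root (which should not be committed to version control). A minimal sketch, using a placeholder value:

```shell
# .env.local - loaded automatically by Next.js in local development
# Replace the placeholder with your actual key; do not commit this file.
OPENAI_API_KEY=your-openai-key-here
```

With this file in place, process.env.OPENAI_API_KEY is populated without editing the route handler code.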
Hosting Platform - Add the environment variables to your deployment configuration. E.g. if you want to use the OpenAI Chat, add the OPENAI_API_KEY environment variable.
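To illustrate how such a proxy function consumes the environment variable, below is a minimal sketch of an App Router route handler that forwards a chat request to OpenAI. The exact handlers in this template may differ; the endpoint path and error shape here are assumptions for illustration.

```typescript
// Hypothetical sketch of a proxy route handler (e.g. app/api/openai-chat/route.ts).
// It reads the API key from the environment and forwards the request body
// to the OpenAI Chat Completions endpoint.
export async function POST(request: Request): Promise<Response> {
  // Populated locally via .env.local or via the hosting platform's config.
  const apiKey = process.env.OPENAI_API_KEY;
  if (!apiKey) {
    // Fail clearly when the key is missing rather than sending a broken request.
    return new Response(JSON.stringify({error: 'OPENAI_API_KEY is not set'}), {
      status: 500,
      headers: {'Content-Type': 'application/json'},
    });
  }
  const body = await request.json();
  const result = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body),
  });
  // Pass the upstream response through to the client.
  return new Response(result.body, {status: result.status});
}
```

Keeping the key server-side in a route handler like this is the point of the proxy: the browser talks only to your Next.js endpoint and never sees the API key.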
If you are experiencing issues with this project or have suggestions on how to improve it, do not hesitate to create a new ticket in GitHub Issues and we will look into it as soon as possible.