Meet Bricky - a conversational bot using OpenAI 🤖
Remember Clippy? Meet Bricky!
Bricky is a conversational bot built on Retrieval-Augmented Generation (RAG), with some help from OpenAI's GPT-3 LLM.
Bricky indexes content stored in markdown files and vectorizes it with OpenAI embeddings. It then uses a few-shot ChatGPT prompt to generate an answer based on the most relevant content.
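The retrieval step described above can be sketched in plain Python. This is a toy illustration, not Bricky's actual implementation: the tiny hand-written vectors stand in for OpenAI embeddings, and the file names and prompt template are made up for the example.

```python
# Toy sketch of RAG retrieval: embed the question, find the most similar
# document chunk by cosine similarity, and inject it into the LLM prompt.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hand-written toy vectors standing in for OpenAI embedding vectors.
doc_store = {
    "install.md": [0.9, 0.1, 0.0],
    "usage.md":   [0.1, 0.8, 0.2],
}
question_vec = [0.85, 0.15, 0.05]  # embedding of the user's question

# Retrieve the chunk whose embedding is closest to the question's.
best = max(doc_store, key=lambda name: cosine(doc_store[name], question_vec))
prompt = f"Answer using this context:\n{best}\n\nQuestion: ..."
print(best)  # -> install.md
```

In the real pipeline, Haystack manages the document store and retriever, and the retrieved text (not just a file name) is placed into the few-shot prompt.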
Read more about my journey into this field and the background for creating Bricky in my blog article.
The project is inspired by the awesome HoustonAI by Astro.
Getting started 🚀
Prereqs
Provide these environment variables for the `api` container:
```
OPENAI_KEY=<YOUR OPENAI KEY GOES HERE>
DOC_DIR=<YOUR ROOT DOC DIRECTORY GOES HERE>
INDEX_NAME=<NAME OF YOUR INDEX FOR THE DOC STORE>
```
The easiest way is to create a dotenv file at `/api/.env`.
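For example, a hypothetical `/api/.env` with placeholder values (substitute your own key, path, and index name):

```
OPENAI_KEY=sk-your-key-here
DOC_DIR=/data/docs
INDEX_NAME=bricky-docs
```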
Steps
- Clone this repo!
- Make sure the document directory (`DOC_DIR`) is configured correctly
- Run docker-compose:
```shell
docker-compose up
```
You should now have two endpoints running:
- The Next.js-based frontend: Open http://localhost:3000 to meet Bricky.
- The Haystack-based API: Open http://localhost:8080/docs with your browser to see the OpenAPI documentation.
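You can also query the API directly. The sketch below assumes the API follows Haystack's conventional REST schema (a `/query` endpoint taking a `{"query": ...}` JSON body); verify the exact shape on the OpenAPI page at http://localhost:8080/docs before relying on it.

```python
# Hypothetical client sketch for the Bricky API. The endpoint path and
# payload shape are assumptions based on Haystack's REST conventions.
import json
from urllib import request

def build_query(question: str, base_url: str = "http://localhost:8080"):
    """Build the request URL and JSON body for a question."""
    url = f"{base_url}/query"
    body = json.dumps({"query": question}).encode("utf-8")
    return url, body

def ask_bricky(question: str) -> dict:
    """POST a question to the API (requires the stack to be running)."""
    url, body = build_query(question)
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)

# Usage, once `docker-compose up` is running:
# print(ask_bricky("How do I configure the document directory?"))
```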
Learn more
To learn more about Haystack and OpenAI, take a look at the following resources:
- Haystack Documentation - learn about the Haystack platform by deepset.ai.
- OpenAI docs - the OpenAI docs site.
To learn more about Next.js, take a look at the following resources:
- Next.js Documentation - learn about Next.js features and API.
- Learn Next.js - an interactive Next.js tutorial.
Powered by Haystack and OpenAI ChatGPT.
Questions or comments? Reach out to @larsbaunwall
Don't forget to ⭐ this repo!