
Question Answering with Custom Files using LLMs


DocQA πŸ€–


DocQA πŸ€– is a web application built using Streamlit πŸ”₯ and the LangChain πŸ¦œπŸ”— framework, allowing users to leverage the power of LLMs for Generative Question Answering. 🌟
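Under the hood, generative question answering over a custom file follows a retrieval-augmented pattern: split the document into chunks, index them in a vector store, retrieve the chunks most relevant to the question, and hand them to the LLM inside a prompt. The sketch below illustrates that flow with a toy bag-of-words retriever standing in for real embeddings; all function names are illustrative, not DocQA's or LangChain's actual API.

```python
from collections import Counter
import math

def chunk_text(text, size=200):
    """Split a document into fixed-size character chunks (toy splitter)."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def embed(text):
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, chunks, k=1):
    """Return the k chunks most similar to the question."""
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def build_prompt(question, context_chunks):
    """Assemble the prompt the LLM would receive."""
    context = "\n".join(context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

doc = "Streamlit builds the UI. LangChain wires the LLM to the vector store."
chunks = chunk_text(doc, size=40)
prompt = build_prompt("What builds the UI?", retrieve("What builds the UI?", chunks))
```

In the real app, the embedding and retrieval steps are handled by LangChain components; only the overall shape of the pipeline is the same.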

Read More Here πŸ‘‰ https://ai.plainenglish.io/️-langchain-streamlit-llama-bringing-conversational-ai-to-your-local-machine-a1736252b172

Installation

To run the LangChain web application locally, follow these steps:

Clone this repository πŸ”—

git clone https://github.com/afaqueumer/DocQA.git

Create a virtual environment and install the required dependencies βš™οΈ

Run ➑️ setup_env.bat 

Launch Streamlit App πŸš€

Run ➑️ run_app.bat

Usage

Once the Streamlit web application is up and running, you can perform the following steps:

  1. Upload a text file.
  2. Once the file is loaded into the vector store database, a success alert "Document is Loaded" will appear.
  3. Enter your question in the "Ask" textbox and submit it for the LLM to generate an answer.
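In code, these three steps map onto a small load-then-ask loop. The sketch below is a stdlib-only stand-in: a stub `llm` callable replaces the real model, and the class and function names are hypothetical, not DocQA's actual implementation.

```python
import tempfile

class VectorStore:
    """Toy stand-in for the vector store database built from the upload."""
    def __init__(self, chunks):
        self.chunks = chunks

    @classmethod
    def from_file(cls, path, chunk_size=200):
        # Step 1-2: read the uploaded file and split it into chunks.
        with open(path, encoding="utf-8") as fh:
            text = fh.read()
        return cls([text[i:i + chunk_size] for i in range(0, len(text), chunk_size)])

    def search(self, question):
        # Rank chunks by how many question words they share (toy retrieval).
        words = set(question.lower().split())
        return max(self.chunks, key=lambda c: len(words & set(c.lower().split())))

def ask(store, question, llm):
    """Step 3: retrieve context and hand it to the LLM."""
    context = store.search(question)
    return llm(f"Context: {context}\nQuestion: {question}")

# Demo: steps 1-3 with a stub LLM that just echoes its prompt.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False,
                                 encoding="utf-8") as f:
    f.write("Streamlit renders the interface. LangChain talks to the model.")
store = VectorStore.from_file(f.name, chunk_size=40)  # "Document is Loaded"
answer = ask(store, "What renders the interface?", llm=lambda p: p)
```

The "Document is Loaded" alert in the app corresponds to the point where `from_file` returns; everything after that is retrieval plus prompt assembly.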

Contributing

Contributions to this app are welcome! If you have any ideas, suggestions, or bug fixes, please feel free to open an issue or submit a pull request. We appreciate your contributions.

License

This project is licensed under the MIT License.

πŸŽ‰ Thank you πŸ€— Happy question answering! 🌟
