atineoSE / octoml-llm-qa

A code sample that shows how to use 🦜️🔗 LangChain, 🦙 llama_index, and a hosted LLM endpoint to do a standard chat or Q&A about a PDF document

LLM endpoint chat

Load a PDF file and ask questions about it via llama_index, LangChain, and an LLM endpoint

Instructions

  • Install the requirements
pip install -r requirements.txt -U
  • Run the chat_main.py script to chat with the hosted LLM endpoint.
python3 chat_main.py
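Hosted LLM endpoints of this kind typically accept an OpenAI-style chat request. As a rough, library-free sketch of what a script like chat_main.py has to assemble before each call (the model name and endpoint schema here are assumptions for illustration, not taken from this repo):

```python
import json

def build_chat_request(history, user_msg, model="llama-2-7b-chat"):
    """Build a JSON payload for an OpenAI-compatible chat endpoint.

    history: list of {"role": ..., "content": ...} dicts from earlier turns.
    The model name is a placeholder; a real script would read it from config.
    """
    messages = history + [{"role": "user", "content": user_msg}]
    return json.dumps({"model": model, "messages": messages})

# Each turn appends the user message to the running history before sending,
# so the endpoint sees the full conversation context.
request_body = build_chat_request(
    [{"role": "assistant", "content": "Hi, how can I help?"}],
    "Summarize the document.",
)
```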

or

  • Select a file from the menu, or replace the default file.pdf with the PDF you want to use.
  • Run the pdf_qa_main.py script to ask questions about your PDF file via llama_index, LangChain, and the hosted endpoint.
python3 pdf_qa_main.py
  • Ask any questions about the content of the PDF.
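Under the hood, llama_index-style PDF Q&A boils down to: split the document into chunks, retrieve the chunks most relevant to the question, and hand them to the LLM as context. Here is a deliberately minimal, dependency-free sketch of that retrieval step using bag-of-words overlap (all function names are hypothetical; the real repo relies on llama_index's embeddings and indexes instead):

```python
import re
from collections import Counter

def chunk(text, size=40):
    """Split text into fixed-size word chunks (real pipelines use smarter splitters)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def overlap_score(query, chunk_text):
    """Count shared word occurrences between query and chunk (stand-in for embedding similarity)."""
    q = Counter(re.findall(r"\w+", query.lower()))
    c = Counter(re.findall(r"\w+", chunk_text.lower()))
    return sum(min(q[w], c[w]) for w in q)

def top_chunk(query, chunks):
    """Retrieve the single best-matching chunk for the question."""
    return max(chunks, key=lambda ch: overlap_score(query, ch))

def build_prompt(query, context):
    """Assemble the context-stuffed prompt that would be sent to the LLM endpoint."""
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
```

In the actual scripts, the chunking and retrieval are handled by llama_index over the parsed PDF, and LangChain wires the retrieved context into the call to the hosted endpoint; this sketch only shows the shape of the flow.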


Credits:

This work was inspired by the chatPDF repo.
