grapeot / KnowledgeLLM

Enable LLMs to do Q&A based on a knowledge base.

Knowledge Ingestion for LLM

This project offers a reference implementation that combines document retrieval with a large language model like Llama, allowing users to ask questions and receive answers based on a knowledge base. You can also replace the large language model inference with OpenAI APIs for even better performance.
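At its core, the retrieval step ranks knowledge-base documents by embedding similarity to the question. The sketch below shows the idea with toy vectors and pure-Python cosine similarity; the real project computes embeddings with a learned model in embedding_retrieval.py, so the vectors and function names here are illustrative assumptions.

```python
# Minimal sketch of embedding-based retrieval. The toy 3-d vectors below
# stand in for real embedding-model outputs.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, doc_vecs, docs, top_k=2):
    """Return the top_k documents most similar to the query embedding."""
    scored = sorted(
        zip(docs, doc_vecs),
        key=lambda pair: cosine_similarity(query_vec, pair[1]),
        reverse=True,
    )
    return [doc for doc, _ in scored[:top_k]]

docs = ["llama facts", "alpaca care", "cooking tips"]
doc_vecs = [[1.0, 0.1, 0.0], [0.9, 0.2, 0.1], [0.0, 0.1, 1.0]]
query_vec = [1.0, 0.0, 0.0]
print(retrieve(query_vec, doc_vecs, docs))  # most relevant first
```

Swapping in OpenAI embeddings only changes how the vectors are produced; the ranking logic stays the same.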

Usage

  1. setup.sh provides a reference script for initializing the model.
  2. embedding_retrieval.py provides embedding based retrieval capability, and llama_model.py provides the Q&A capability based on Alpaca-LoRA.
  3. A complete sample usage can be found in embedding_retrieval.py.
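The Q&A step then stitches the retrieved passages and the user's question into a single prompt for the language model. The sketch below is a hedged illustration of that pattern: the function names `build_prompt`, `answer`, and the `generate` callable are assumptions for this example, not the actual interface of llama_model.py.

```python
# Illustrative sketch of retrieval-augmented Q&A prompt construction.
def build_prompt(question, passages):
    """Pack retrieved passages and the user question into one prompt."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {question}\nAnswer:"
    )

def answer(question, passages, generate):
    """`generate` is any callable mapping a prompt string to model text,
    e.g. local Alpaca-LoRA inference or an OpenAI chat completion."""
    return generate(build_prompt(question, passages))

# Stub generator so the sketch runs without loading a model.
echo = lambda prompt: prompt
print(answer("What is a llama?", ["Llamas are camelids."], echo))
```

Because the model backend is just a callable here, replacing Alpaca-LoRA with an OpenAI API call is a one-line change at the call site.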

Languages

Python 98.9%, Shell 1.1%