KyrieRui / BOP_LLaMA

Research on the Meta LLaMA 7B model at BOPRC


BOP LLaMA runs the LLaMA 2 7B Chat model locally, using the llama-cpp-python library for inference and Gradio for the UI.
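As a rough idea of how these pieces fit together, the sketch below loads a local GGUF copy of the chat model with llama-cpp-python and serves it through a Gradio ChatInterface. The model path, context size and generation settings are assumptions for illustration, not the repository's actual configuration.

import gradio as gr
from llama_cpp import Llama

# Path to a local GGUF copy of LLaMA 2 7B Chat (placeholder).
MODEL_PATH = "models/llama-2-7b-chat.Q4_K_M.gguf"

llm = Llama(
    model_path=MODEL_PATH,
    n_ctx=2048,             # context window
    chat_format="llama-2",  # use the LLaMA 2 chat prompt template
)

def chat(message, history):
    # Rebuild the conversation as chat messages for the model.
    messages = []
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})
    result = llm.create_chat_completion(messages=messages, max_tokens=512)
    return result["choices"][0]["message"]["content"]

gr.ChatInterface(chat).launch()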

conda create -n llm python=3.11

conda activate llm
python BOP_LLM_GradioUI.py
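
The repository's dependency list is not shown here; assuming the app only needs llama-cpp-python and Gradio (as described above), they can be installed into the llm environment before running the script:

pip install llama-cpp-python gradio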

If the Mac GPU (Apple Silicon, M1 or later) is not being used, try reinstalling llama-cpp-python with Metal enabled:

pip uninstall llama-cpp-python
CMAKE_ARGS="-DLLAMA_METAL=on" pip install llama-cpp-python==0.2.27
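
To confirm the rebuilt wheel actually offloads to the GPU, one quick check (the model path below is a placeholder) is to load the model with all layers offloaded and watch the startup log for Metal initialisation messages:

from llama_cpp import Llama

# n_gpu_layers=-1 offloads every layer to the GPU; with verbose=True the
# startup log should report the Metal backend on Apple Silicon.
llm = Llama(model_path="models/llama-2-7b-chat.Q4_K_M.gguf",
            n_gpu_layers=-1, verbose=True)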

BOP_RAG uses the LlamaCpp integration provided by langchain_community.

A demo CSV for testing the RAG setup is included in the RAG folder; a sketch of such a pipeline follows.
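
A minimal sketch of how a RAG pipeline over that CSV could be wired up with langchain_community is shown below. The model path, the demo.csv filename and the embedding model are assumptions for illustration, not names taken from the repository; it also needs faiss-cpu and sentence-transformers installed alongside langchain-community.

from langchain_community.llms import LlamaCpp
from langchain_community.document_loaders import CSVLoader
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS

# Load and index the demo CSV (path and embedding model are placeholders).
docs = CSVLoader(file_path="RAG/demo.csv").load()
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
vectorstore = FAISS.from_documents(docs, embeddings)

# Local LLaMA 2 7B Chat model served through llama.cpp.
llm = LlamaCpp(model_path="models/llama-2-7b-chat.Q4_K_M.gguf",
               n_ctx=2048, n_gpu_layers=-1)

def answer(question: str) -> str:
    # Retrieve the most relevant CSV rows and stuff them into the prompt.
    hits = vectorstore.similarity_search(question, k=3)
    context = "\n".join(doc.page_content for doc in hits)
    prompt = (f"Answer the question using only this context:\n{context}\n\n"
              f"Question: {question}\nAnswer:")
    return llm.invoke(prompt)

print(answer("What does the demo data contain?"))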




Languages

Jupyter Notebook 99.3%, Python 0.7%