A chat UI for chatting with a local, offline Llama 3 model.
- Install Ollama - https://ollama.com/
- Install Python 3
In your terminal, pull the model:

```shell
ollama pull llama3:latest
```
Then install the dependencies and start the app:

```shell
pip install -r requirements.txt
streamlit run streamlit_app_v2.py
```
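Under the hood, a Streamlit app like this one talks to the local Ollama server over its REST API (by default `http://localhost:11434`). The sketch below is an assumption about how `streamlit_app_v2.py` might do it, not its actual code: it builds a chat request against Ollama's documented `/api/chat` endpoint and returns the assistant's reply.

```python
# Minimal sketch (assumed, not the project's actual code) of calling a
# local Ollama server from Python via its /api/chat REST endpoint.
# Requires `ollama serve` running and `llama3:latest` already pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "llama3:latest"


def build_payload(history, user_message):
    """Append the new user turn and build a non-streaming request body."""
    messages = list(history) + [{"role": "user", "content": user_message}]
    return {"model": MODEL, "messages": messages, "stream": False}


def chat(history, user_message):
    """Send one chat turn to the local Ollama server, return the reply text."""
    payload = build_payload(history, user_message)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    return body["message"]["content"]


if __name__ == "__main__":
    # Only works with the Ollama server running locally.
    print(chat([], "Say hello in one sentence."))
```

In a Streamlit app, `chat()` would typically be wired to `st.chat_input`, with the running message history kept in `st.session_state`.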
Optionally, add the following alias to your `.bashrc` or `.zshrc` file:

```shell
alias llama='cd ~/llama3_local; streamlit run streamlit_app_v2.py'
```
NOTE: replace `~/llama3_local` with the path where you've saved this project.
After reloading your shell, start the app with:

```shell
llama
```