- Install Ollama from https://ollama.com/
- Clone, build, and start webllama:
```shell
git clone https://github.com/R00tendo/webllama
cd webllama/
npm i
npm run build
ollama pull dolphin-llama3  # or whatever model you want to use
npm run start
```
- Start the Ollama API with:
```shell
ollama serve
```
- Navigate to http://localhost:3000
- If you want to change the model, edit webllama.json in the project root. Default:
```json
{
  "model": "dolphin-llama3"
}
```
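As a sketch, switching models is just a matter of rewriting that one JSON file. The snippet below uses `llama3` purely as an illustrative model name; substitute any model you have pulled with `ollama pull`:

```shell
# Hypothetical example: point webllama at a different model (llama3 here).
# Pull the model first so Ollama can serve it:
#   ollama pull llama3
# Then overwrite webllama.json in the project root:
cat > webllama.json <<'EOF'
{
  "model": "llama3"
}
EOF
```

You will likely need to restart `npm run start` afterwards so the new setting is picked up.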