Apoorvgarg-creator / doc-based-llm-backend


To Run the FastAPI Backend Server Locally

  • Prerequisites
    • Ollama
    • conda
    • Python 3.10+

Steps

  1. Create the Python environment (e.g., with conda)
  2. Download Ollama (https://ollama.com)
  3. Run ollama pull llama2 to download the model
  4. Change directory to src
  5. Run the following command
uvicorn main:app --reload
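The steps above can be sketched as a single shell session. The environment name doc-llm and the requirements.txt file are assumptions for illustration, not taken from this repo:

```shell
# Create and activate the Python environment (env name "doc-llm" is hypothetical)
conda create -n doc-llm python=3.10 -y
conda activate doc-llm
pip install -r requirements.txt  # assumes a requirements.txt at the repo root

# Pull the llama2 model weights through Ollama (Ollama must already be installed)
ollama pull llama2

# Start the FastAPI server with auto-reload from the src directory
cd src
uvicorn main:app --reload
```

The --reload flag restarts the server on code changes, which is convenient for local development but should not be used in production.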

Languages

Language: Python 100.0%