PDF Bot with Ollama

A bot that accepts PDF documents and lets you ask questions about them.

The LLMs are downloaded and served via Ollama.
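Under the hood, the app talks to Ollama over its HTTP API. As a rough sketch, a non-streaming prompt goes to Ollama's documented `/api/generate` route (the helper function below is ours, not part of this repo):

```python
import json
import urllib.request

def build_generate_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming request to Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("http://localhost:11434", "orca-mini", "Summarize page 1.")
# resp = urllib.request.urlopen(req)          # requires a running Ollama server
# print(json.loads(resp.read())["response"])  # the model's answer
```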



Requirements

  • Docker (with docker-compose)
  • Python (for development only)

How to run

Create a docker-compose.yml file with the following contents:

services:

  ollama:
    image: ollama/ollama
    ports:
      - 11434:11434
    volumes:
      - ~/ollama:/root/.ollama
    networks:
      - net

  app:
    image: amithkoujalgi/pdf-bot:1.0.0
    ports:
      - 8501:8501
    environment:
      - OLLAMA_API_BASE_URL=http://ollama:11434
      - MODEL=orca-mini
    networks:
      - net

networks:
  net:

Then run:

docker-compose up

When the server is up and running, access the app at: http://localhost:8501

Note:

  • The first startup takes a while, since the specified model has to be downloaded.
  • If your hardware has no GPU and you run on CPU only, expect slow responses from the bot.
  • Per Ollama's documentation, only Nvidia GPUs are supported; others, such as AMD, are not supported yet. Read how to use a GPU with the Ollama container and docker-compose.

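If you do have an Nvidia GPU and the nvidia-container-toolkit set up, Docker Compose's documented device-reservation syntax can expose it to the ollama service; a sketch (verify against your Docker version):

```
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```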
Image on DockerHub: https://hub.docker.com/r/amithkoujalgi/pdf-bot

Demo video: PDF.Bot.Demo.mp4

Sample PDFs:

Hl-L2351DW v0522.pdf

HL-B2080DW v0522.pdf

Improvements

  • Expose model parameters such as temperature, top_k, and top_p as configurable environment variables
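One way that improvement might look (the env var names and the helper below are hypothetical, not part of the current image):

```python
import os

def load_model_params() -> dict:
    """Read hypothetical model-parameter env vars, falling back to common defaults."""
    return {
        "temperature": float(os.environ.get("TEMPERATURE", "0.8")),
        "top_k": int(os.environ.get("TOP_K", "40")),
        "top_p": float(os.environ.get("TOP_P", "0.9")),
    }

# e.g. with TEMPERATURE=0.2 set in docker-compose.yml's `environment:` section,
# load_model_params()["temperature"] would be 0.2
```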

Credits

Thanks to the incredible Ollama, Langchain and Streamlit projects.


