enso-labs / ollama

Standalone Ollama Deployment

Repository on GitHub: https://github.com/enso-labs/ollama

Prompt Engineers AI - Ollama Deployment

Welcome to the Prompt Engineers AI - Ollama Deployment repository. This document explains how to deploy the Ollama application using Docker Compose, and how to run an experimental Kubernetes deployment.

Prerequisites

Before proceeding with the deployment, ensure you have the following installed on your system:

  • Docker (for the Docker Compose deployment)
  • Docker Compose (for the Docker Compose deployment)
  • kubectl (for the Kubernetes deployment)
  • helm (for the Kubernetes deployment)
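
If you want to confirm the tooling is in place before deploying, a quick version check covers all four prerequisites. These are standard commands for each tool:

# Print each tool's version; a "command not found" error means it still needs installing
docker --version
docker compose version
kubectl version --client
helm version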

Docker: Getting Started

  1. Clone the repository to your local machine:

git clone https://github.com/enso-labs/ollama

  2. Start the Docker Compose deployment:
bash scripts/ollama.sh
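
The script's contents are not documented here, but a wrapper like scripts/ollama.sh typically just brings the Compose stack up in the background. A minimal sketch, assuming a docker-compose.yml at the repository root that exposes Ollama on its default port 11434:

# Sketch only, not the repository's actual script:
docker compose up -d        # start the Ollama service in the background
curl http://localhost:11434 # Ollama's default port; a running server replies "Ollama is running"

If the curl call returns that banner, the container is up and ready to accept API requests.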

Kubernetes: Getting Started (Under Development)

  1. Install on the cluster (see the sketch after this list):
bash scripts/deploy.sh
  2. Pull a model:
bash scripts/pull.sh
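
Since the Kubernetes path is still under development, the following is only a sketch of what the two scripts are expected to do: install the Helm chart, then pull a model inside the running Ollama pod. The chart path, release name, namespace, and model name (llama3) are assumptions, not values taken from the repository:

# Sketch only: install the chart (chart path and release/namespace names assumed)
helm install ollama ./charts/ollama --namespace ollama --create-namespace

# Sketch only: pull a model inside the running Ollama pod (llama3 is just an example)
kubectl exec -n ollama deploy/ollama -- ollama pull llama3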



Languages

  • Smarty 76.9%
  • Shell 23.1%