
ADSO Chat is an interactive, Docker-based chat app with advanced language processing. It features a robust backend, Ollama for language tasks, and a user-friendly frontend, all containerized using Docker Compose.

Key Features:

  • Three services: backend, Ollama, frontend UI
  • Advanced natural language processing
  • Easy setup and deployment
  • Customizable

ADSO Chat Application

Overview

ADSO Chat is a containerized application that leverages Docker Compose to manage three main services: a backend server, an Ollama language processing service, and a frontend user interface. This application is designed to provide an interactive chat experience with advanced language processing capabilities.

Services

  1. ADSO Chat Backend: Handles server-side logic and API requests.
  2. ADSO Chat Ollama: Provides language model processing for enhanced chat functionality.
  3. ADSO UI: Delivers the user interface for interacting with the chat application.
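
The three services above might be wired together in a docker-compose.yml along these lines. This is only a sketch: the build contexts, image name, and published port are assumptions, not the repository's actual file (the network name mynetwork and the ./data bind mount come from the Configuration section below).

```yaml
services:
  adso-chat-backend:
    build: ./backend          # assumed build context
    networks:
      - mynetwork

  adso-chat-ollama:
    image: ollama/ollama      # assumed image
    volumes:
      - ./data:/data          # persists data across container restarts
    networks:
      - mynetwork

  adso-ui:
    build: ./ui               # assumed build context
    ports:
      - "80:80"               # assumed; the Keycloak redirect URI targets localhost:80
    networks:
      - mynetwork

networks:
  mynetwork:
    driver: bridge
```

Because all three services join the same bridge network, they can reach each other by service name rather than by IP address.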

Prerequisites

  • Docker
  • Docker Compose

Quick Start

  1. Clone the repository:
    git clone <repository-url>
    cd <repository-directory>
    
  2. Build and start the services:
    docker-compose up --build -d
    
  3. Access the application: once the containers are up, open the frontend UI in your browser (the Keycloak configuration below uses http://localhost:80 as the redirect URI, which suggests the UI is served on port 80).

Usage

  • View logs:
    `docker-compose logs -f`
  • Stop services:
    `docker-compose down`
  • Rebuild and restart services:
    `docker-compose up -d --build`

Configuration

The application uses a custom bridge network named mynetwork for inter-service communication. Each service is configured with specific options for performance and reliability.

Data Persistence

The Ollama service uses a bind mount to persist data:

The local ./data directory is mounted to /data in the Ollama container.

Keycloak

  • Admin console: http://localhost:8081/admin/master/console/
  • OpenID Connect auth endpoint: http://localhost:8081/realms/adso/protocol/openid-connect/auth?client_id=adso-user&redirect_uri=http://localhost:80&response_type=code&scope=openid
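
The auth endpoint above is a standard OpenID Connect authorization-code request; as a sketch, it can be assembled from its query parameters (all values taken from the URL in this README):

```python
from urllib.parse import urlencode

KEYCLOAK_BASE = "http://localhost:8081"

# Query parameters as they appear in the auth endpoint URL above.
params = {
    "client_id": "adso-user",
    "redirect_uri": "http://localhost:80",
    "response_type": "code",
    "scope": "openid",
}

auth_url = (
    f"{KEYCLOAK_BASE}/realms/adso/protocol/openid-connect/auth?{urlencode(params)}"
)
print(auth_url)
```

Visiting this URL sends the user through Keycloak's login page; on success, Keycloak redirects back to redirect_uri with an authorization code.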

Other

Build a single service

docker-compose up --build -d adso-ui

About Ollama

What is Ollama?

Ollama is an open-source runtime for serving large language models locally. It can be integrated into various applications to provide functionalities such as text generation, translation, and conversational responses.

Usage in ADSO Chat

In the context of the ADSO Chat application, Ollama is used to process and generate responses based on the inputs it receives. This enhances the chatbot's ability to understand and generate natural language.

Configuration

Ensure that the Ollama endpoint is correctly configured in your application settings. This typically involves specifying the URL or IP address where the Ollama service is running.
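
As a minimal sketch of what that configuration enables, the backend could call Ollama over HTTP. The endpoint here is an assumption: the Ollama service's hostname on mynetwork (taken as `ollama`), Ollama's default REST port 11434, and its /api/generate endpoint; the model name is likewise illustrative.

```python
import json
import urllib.request

# Assumed endpoint: Compose service hostname on mynetwork + Ollama's
# default port and generate endpoint. Adjust to your actual settings.
OLLAMA_URL = "http://ollama:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> bytes:
    """Build the JSON body for a non-streaming generate call."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def generate(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the Ollama service and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `stream` set to False, Ollama returns a single JSON object whose `response` field holds the full generated text, which keeps the backend-side handling simple.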

Data Volume

The configuration includes a bind mount for the ./data directory to /data within the Ollama container. This is used to persist data across container restarts and can be useful for storing model files or other relevant data.

Network Configuration

Ollama is connected to the same custom bridge network (mynetwork) as the other services, allowing it to communicate seamlessly with the backend and UI components of the ADSO Chat application.

About


License: MIT


Languages

  • Java: 41.9%
  • CSS: 21.9%
  • JavaScript: 21.0%
  • Dockerfile: 8.6%
  • HTML: 6.5%