Home Page: https://hub.docker.com/r/tchiotludo/kafkahq

KafkaHQ

Kafka GUI for topics, topic data, consumer groups, schema registry and more...

preview

Features

  • General
    • Works with modern Kafka clusters (1.0+)
    • Connects to standard, SSL or SASL clusters
    • Multi-cluster support
  • Topics
    • List
    • Configurations view
    • Partitions view
    • Consumer group assignments view
    • Node leader & assignments view
    • Create a topic
    • Configure a topic
    • Delete a topic
  • Browse topic data
    • View data, offset, key, timestamp & headers
    • Automatic deserialization of Avro messages encoded with the schema registry
    • Configurations view
    • Logs view
    • Delete a record
    • Sort view
    • Filter by partition
    • Filter with a starting time
    • Filter data with a search string
  • Consumer Groups (only with Kafka internal storage, not with the old Zookeeper storage)
    • List with lag and topic assignments
    • Partitions view & lag
    • Node leader & assignments view
    • Display active and pending consumer groups
    • Delete a consumer group
    • Update consumer group offsets to start / end / timestamp
  • Schema Registry
    • List schemas
    • Create a schema
    • Update a schema
    • Delete a schema
    • View and delete individual schema version
  • Nodes
    • List
    • Configurations view
    • Logs view
    • Configure a node

Quick preview

The quick-preview docker-compose stack will start a Kafka node, a Zookeeper node and a Schema Registry, fill them with some sample data, start a consumer group and a Kafka Stream, and then start KafkaHQ.
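A minimal sketch of launching that preview stack, assuming you have cloned the repository and that it ships the quick-preview docker-compose.yml at its root:

docker-compose pull   # make sure the latest KafkaHQ image is used
docker-compose up     # starts Kafka, Zookeeper, Schema Registry, sample data and KafkaHQ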

Installation

First you need a configuration file in order to configure KafkaHQ connections to the Kafka brokers.

Docker

docker run -d \
    -p 8080:8080 \
    -v /tmp/application.yml:/app/application.yml \
    tchiotludo/kafkahq
  • With -v, the source path (/tmp/application.yml above) must be an absolute path to your configuration file (see the sketch below)
  • Go to http://localhost:8080
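If your application.yml sits in the current directory, a minimal sketch of turning it into the required absolute path (same image and port mapping as above):

docker run -d \
    -p 8080:8080 \
    -v $(pwd)/application.yml:/app/application.yml \
    tchiotludo/kafkahq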

Stand Alone

Configuration

The configuration file can by default be provided as Java properties, YAML, JSON or Groovy. Configuration file example in YAML:

kafkahq:
  server:
    # if behind a reverse proxy, path to kafkahq with trailing slash
    base-path: ""

  # default kafka properties for each client, available for admin / producer / consumer (optional)
  clients-defaults:
    consumer:
      properties:
        isolation.level: read_committed

  # list of kafka cluster available for kafkahq
  connections:
    # url friendly name for the cluster
    my-cluster-1:
      # standard kafka properties (optional)
      properties:
        bootstrap.servers: "kafka:9092"
      # schema registry url (optional)
      schema-registry: "http://schema-registry:8085"

    my-cluster-2:
      properties:
        bootstrap.servers: "kafka:9093"
        security.protocol: SSL
        ssl.truststore.location: /app/truststore.jks
        ssl.truststore.password: password
        ssl.keystore.location: /app/keystore.jks
        ssl.keystore.password: password
        ssl.key.password: password
        
  topic-data:
    # default sort order (OLDEST, NEWEST)
    sort: OLDEST
    # max record per page
    size: 50
  • kafkahq.server.base-path: if behind a reverse proxy, path to kafkahq with trailing slash
  • kafkahq.clients-defaults.{{admin|producer|consumer}}.properties: default Kafka properties applied to every admin, producer or consumer client (optional)
  • kafkahq.connections is a key/value configuration with:
    • key: must be a URL-friendly string that identifies your cluster (my-cluster-1 and my-cluster-2 in the example above)
    • properties: all the configuration options found in the Kafka consumer documentation. The most important is bootstrap.servers, a list of host:port entries for your Kafka brokers (a SASL example follows this list).
    • schema-registry: the schema registry url (optional)
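The feature list also mentions SASL clusters; a minimal sketch of such a connection using standard Kafka client properties (broker address, mechanism and credentials are placeholders to adapt):

kafkahq:
  connections:
    my-sasl-cluster:
      properties:
        bootstrap.servers: "kafka:9094"
        security.protocol: SASL_SSL
        sasl.mechanism: PLAIN
        sasl.jaas.config: org.apache.kafka.common.security.plain.PlainLoginModule required username="myuser" password="mypassword";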

Since KafkaHQ is based on Micronaut, you can customize its configuration (server port, SSL, ...) with standard Micronaut configuration. More information can be found in the Micronaut documentation.
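For example, a minimal sketch of moving KafkaHQ to another HTTP port through Micronaut's standard micronaut.server.port key, placed next to the kafkahq block in the same application.yml:

micronaut:
  server:
    # serve KafkaHQ on 8081 instead of the default 8080
    port: 8081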

The KafkaHQ Docker image supports 2 environment variables to handle configuration (a usage sketch follows the list):

  • MICRONAUT_APPLICATION_JSON: a string that contains the full configuration in JSON format
  • MICRONAUT_CONFIG_FILES: a path to a configuration file inside the container. The default path is /app/application.yml
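A minimal sketch of using MICRONAUT_CONFIG_FILES to read the configuration from a non-default location (the container path /config/application.yml is illustrative):

docker run -d \
    -p 8080:8080 \
    -v /etc/kafkahq/application.yml:/config/application.yml \
    -e MICRONAUT_CONFIG_FILES=/config/application.yml \
    tchiotludo/kafkahq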

Development Environment

A docker-compose file is provided to start a development environment. Just install docker & docker-compose, clone the repository and run docker-compose -f docker-compose-dev.yml up to start the dev server. The dev server is a Java server & webpack-dev-server with live reload (the full sequence is sketched below).
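A minimal sketch of that sequence, where <repository-url> stands for wherever you cloned KafkaHQ from:

git clone <repository-url> kafkahq
cd kafkahq
# starts the Java server and webpack-dev-server with live reload
docker-compose -f docker-compose-dev.yml up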

License

Apache 2.0 © tchiotludo
