ethaden / confluent-cloud-audit-logs-ELK


Demo for streaming Audit Log data from Confluent Cloud directly into ELK

This demo shows how to stream audit log data from Confluent Cloud directly into the ELK stack (Elasticsearch, Logstash, Kibana).

Preconditions

Create an Audit Log API key as described in the Confluent Cloud documentation. Then copy the template ".env-template" to .env and update its content accordingly.
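For reference, here is a hypothetical sketch of what the resulting .env file might contain. The variable names used below (CONFLUENT_AUDIT_LOG_BOOTSTRAP_SERVERS, CONFLUENT_AUDIT_LOG_API_KEY, CONFLUENT_AUDIT_LOG_API_SECRET) are placeholders only; use the names defined in ".env-template":

# Hypothetical sketch of .env; keep the variable names from .env-template
CONFLUENT_AUDIT_LOG_BOOTSTRAP_SERVERS=pkc-xxxxx.region.provider.confluent.cloud:9092
CONFLUENT_AUDIT_LOG_API_KEY=<your audit log API key>
CONFLUENT_AUDIT_LOG_API_SECRET=<your audit log API secret>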

Running the ELK stack

Start the containers by running the following (if you want to use Filebeat for fetching the logs):

docker compose up -d
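
To verify that everything came up, you can check the container status and follow the Filebeat logs; the service name "filebeat" is an assumption, so use the name defined in the compose file:

# Show the status of all services defined in the compose file
docker compose ps
# Follow the logs of the Filebeat service (service name assumed to be "filebeat")
docker compose logs -f filebeat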

Stopping the containers and removing all created volumes (be careful!):

docker compose down -v

Cleaning up (CAREFUL: THIS WILL DELETE ALL UNUSED VOLUMES):

docker volume prune

Usage

Access Kibana with your web browser (you need to create an exception, as the certificate is self-signed).
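
You can also check from the command line that Kibana is responding. The sketch below assumes the compose file maps Kibana to its default port 5601 and that you authenticate as the elastic user; adjust both to your setup:

# Query Kibana's status API; -k accepts the self-signed certificate
# Port 5601 and the elastic user are assumptions about this setup
curl -k -u elastic:<your password> https://localhost:5601/api/status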

On the left side, click on Observability, then open Stream. After a while, you should start to see Kafka Audit Log events (event.dataset: kafka_log.generic).

Go back to the main page and open Analytics → Discover. You should be able to see all the unpacked JSON events. Use a filter such as event.dataset : "kafka_log.generic" to show only the audit log data.
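
As a cross-check outside of Kibana, you could also count the indexed events directly in Elasticsearch. This is only a sketch: it assumes Elasticsearch is reachable on its default port 9200, that you have the password of the elastic user, and that Filebeat writes into its default filebeat-* indices; adjust all of these to the actual setup:

# Count audit log documents directly in Elasticsearch
# Host, port, credentials and index pattern are assumptions about this setup
curl -k -u elastic:<your password> \
  "https://localhost:9200/filebeat-*/_count" \
  -H 'Content-Type: application/json' \
  -d '{"query": {"term": {"event.dataset": "kafka_log.generic"}}}'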

About

License: Apache License 2.0