Serializing and deserializing messages in Protobuf and Avro depends on a Kafka Schema Registry, which the default Kafka installation does not include. One easy way to get it is to use Docker images. The original author provides a Docker Compose script here: https://github.com/codingharbour/kafka-docker-compose.
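The registry dependency comes from how these messages are framed on the wire: every serialized record is prefixed with a small header that points at its schema in the registry. Below is a minimal sketch of Confluent's wire format (one magic byte of 0, then the schema ID as a 4-byte big-endian integer, then the serialized payload); the `frame`/`unframe` helper names are mine, not part of any library:

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: every framed message starts with 0


def frame(schema_id: int, payload: bytes) -> bytes:
    """Prefix a serialized payload with the magic byte and 4-byte big-endian schema ID."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + payload


def unframe(message: bytes) -> tuple[int, bytes]:
    """Split a framed message back into (schema_id, payload)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    assert magic == MAGIC_BYTE, "not a Confluent-framed message"
    return schema_id, message[5:]
```

A consumer reads the schema ID back out of the header and fetches that schema from the registry before deserializing, which is why both producers and consumers need the registry reachable.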
The easiest way to set up Docker is to install Docker Desktop, which bundles Docker Compose. After that, run:
git clone git@github.com:codingharbour/kafka-docker-compose.git
cd single-node-avro-kafka
docker-compose up -d
Now Kafka, ZooKeeper, and Schema Registry are up and running. docker-compose ps shows the running containers:
➜ single-node-avro-kafka git:(master) docker-compose ps
NAME COMMAND SERVICE STATUS PORTS
sna-kafka "/etc/confluent/dock…" kafka running 0.0.0.0:9092->9092/tcp, :::9092->9092/tcp
sna-schema-registry "/etc/confluent/dock…" schema-registry running 0.0.0.0:8081->8081/tcp, :::8081->8081/tcp
sna-zookeeper "/etc/confluent/dock…" zookeeper running 0.0.0.0:2181->2181/tcp, :::2181->2181/tcp, 2888/tcp, 3888/tcp
We need two new topics: protobuf-topic and avro-topic. To create them:
➜ single-node-avro-kafka git:(master) docker exec -it sna-kafka /usr/bin/kafka-topics --create --topic protobuf-topic --bootstrap-server localhost:9092
Created topic protobuf-topic.
➜ single-node-avro-kafka git:(master) docker exec -it sna-kafka /usr/bin/kafka-topics --create --topic avro-topic --bootstrap-server localhost:9092
Created topic avro-topic.
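With the topics in place, a producer would serialize each record before sending. As a rough illustration of what Avro's binary encoding does under the hood, here is a simplified sketch of the Avro spec's zigzag/varint rules for ints and strings (the two-field record and its field values are hypothetical, and this is nowhere near a full encoder):

```python
def zigzag_varint(n: int) -> bytes:
    """Avro encodes int/long as zigzag followed by 7-bit variable-length bytes."""
    z = (n << 1) ^ (n >> 63)  # zigzag: maps signed ints to non-negative ints
    out = bytearray()
    while z > 0x7F:
        out.append((z & 0x7F) | 0x80)  # continuation bit set on all but the last byte
        z >>= 7
    out.append(z)
    return bytes(out)


def encode_string(s: str) -> bytes:
    """Avro string: the byte length (as a zigzag varint) followed by UTF-8 bytes."""
    data = s.encode("utf-8")
    return zigzag_varint(len(data)) + data


# An Avro record is just its fields encoded back-to-back in schema order,
# e.g. a hypothetical record of a string "name" and a long "age":
record = encode_string("ada") + zigzag_varint(7)
print(record)  # b'\x06ada\x0e'
```

Note that the encoded record carries no field names or types at all; the reader must already have the schema, which is exactly the gap the Schema Registry fills.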