The problem statement is described here: https://developers.redhat.com/blog/2020/09/28/build-a-data-streaming-pipeline-using-kafka-streams-and-quarkus/
Make sure you have Docker installed on your machine. Run the following commands to start a single-broker Kafka cluster:
$ git clone https://github.com/wurstmeister/kafka-docker.git
$ cd kafka-docker

Open docker-compose-single-broker.yml and change the following line

KAFKA_ADVERTISED_HOST_NAME: 192.168.99.100
to
KAFKA_ADVERTISED_HOST_NAME: localhost
$ docker-compose -f docker-compose-single-broker.yml up -d
We will now create the left-stream-topic, right-stream-topic, stream-stream-outerjoin, and processed-topic topics using the Kafka CLI:

$ cd <folder location where kafka cli binaries are located>
$ ./kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic left-stream-topic
$ ./kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic right-stream-topic
$ ./kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic stream-stream-outerjoin
$ ./kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic processed-topic
$ git clone https://github.com/kgshukla/data-streaming-kafka-quarkus.git
$ cd data-streaming-kafka-quarkus/quarkus-kafka-streaming
# start the application in dev mode. You could also package the application as a jar file and start it with the 'java -jar' command
$ ./mvnw compile quarkus:dev

Open a new terminal and initialize the application. This step would not be needed if the application were defined with the @ApplicationScoped annotation.
$ curl localhost:8080/startstream
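For reference, the manual /startstream call above could be avoided by starting the topology when the application boots. A hedged sketch of that pattern, assuming Quarkus's StartupEvent and a helper that builds the KafkaStreams instance (TopologyStarter and buildStreams are illustrative names, not the repository's actual code):

```java
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.event.Observes;

import io.quarkus.runtime.StartupEvent;
import org.apache.kafka.streams.KafkaStreams;

// Marking the bean @ApplicationScoped and observing StartupEvent makes
// Quarkus create it eagerly on boot, so the topology starts without a
// manual curl call.
@ApplicationScoped
public class TopologyStarter {

    void onStart(@Observes StartupEvent ev) {
        KafkaStreams streams = buildStreams(); // assumed helper wiring the topology
        streams.start();
    }

    KafkaStreams buildStreams() {
        // topology construction omitted; see the repository's streaming code
        throw new UnsupportedOperationException("sketch only");
    }
}
```

With this in place, the /startstream endpoint becomes optional.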
Open a new terminal and run the following commands
$ cd data-streaming-kafka-quarkus/quarkus-kafka-producer
$ ./mvnw compile quarkus:dev

$ cd <folder location where kafka cli binaries are located>
$ ./kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic processed-topic --property print.key=true --property print.timestamp=true
Use case 1 - Send a few records to the left-stream-topic and right-stream-topic topics
A couple of records will be missing from left-stream-topic. You will notice that the punctuator processes the records that were put into the state store.
$ curl localhost:8082/sendfewrecords
Use case 2 - Send one record to left-stream-topic only
You will notice that the punctuator processes the record that was put into the state store.
$ curl localhost:8082/sendoneleftrecord

# to see records that are present in the state store and not yet processed
$ curl localhost:8080/storedata
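The punctuator behavior in the two use cases above can be sketched in plain Java, with no Kafka dependencies. This simulation (PunctuatorSketch, the HashMap "state stores", and the record format are illustrative assumptions, not the repository's actual processor code) buffers a record from either stream until its counterpart arrives, and a periodic punctuate() flushes records whose counterpart never showed up:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Simulates a stream-stream outer join backed by a state store: records
// from one side wait in the store until the matching key arrives on the
// other side; the punctuator later emits whatever is still unmatched.
public class PunctuatorSketch {
    static Map<String, String> leftStore = new HashMap<>();
    static Map<String, String> rightStore = new HashMap<>();
    static List<String> processed = new ArrayList<>();

    static void onLeft(String key, String value) {
        String right = rightStore.remove(key);
        if (right != null) processed.add(key + ": " + value + "|" + right); // joined
        else leftStore.put(key, value); // wait for the right-side record
    }

    static void onRight(String key, String value) {
        String left = leftStore.remove(key);
        if (left != null) processed.add(key + ": " + left + "|" + value); // joined
        else rightStore.put(key, value); // wait for the left-side record
    }

    // Kafka Streams would schedule this periodically via a wall-clock
    // punctuation; here we call it by hand to flush unmatched records.
    static void punctuate() {
        leftStore.forEach((k, v) -> processed.add(k + ": " + v + "|null"));
        rightStore.forEach((k, v) -> processed.add(k + ": null|" + v));
        leftStore.clear();
        rightStore.clear();
    }

    public static void main(String[] args) {
        onLeft("1", "left-1");
        onRight("1", "right-1"); // match found -> joined immediately
        onLeft("2", "left-2");   // counterpart never arrives
        punctuate();             // flushes record 2 with a null right side
        processed.forEach(System.out::println);
    }
}
```

Record "1" is joined as soon as both sides arrive, while record "2" sits in the state store until the punctuator emits it with a null right side, which mirrors what you see on processed-topic in use case 2.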
Use case 3 - Send 100,000 records to both the left-stream-topic and right-stream-topic topics
$ curl localhost:8082/sendmanyrecords