nodefluent / kafka-streams

equivalent to kafka-streams :octopus: for nodejs :sparkles::turtle::rocket::sparkles:

Home Page: https://nodefluent.github.io/kafka-streams/


Read from beginning

aleproust opened this issue · comments

Hello,
Thanks a lot for the work you achieved with this lib.
I had a question though.
I'm able to read messages I produce in the topic, but I'm not able to read the messages that already exist in my topic (from the beginning?).

My configuration:

"kafkaConf":{

        "noptions": {
            "debug": "all",
            "compression.codec": "none",
            "offset.store.method": "none",            
            "message.send.max.retries": 10,          

            "metadata.broker.list": "localhost:9092",
            "group.id": "kafka-streams-test-native",
            "client.id": "kafka-streams-test-name-native",
            "event_cb": true,            
            "api.version.request": true,            

            "socket.keepalive.enable": true,
            "socket.blocking.max.ms": 100,

            "enable.auto.commit": false,
            "auto.commit.interval.ms": 100,

            "heartbeat.interval.ms": 250,
            "retry.backoff.ms": 250,

            "fetch.min.bytes": 100,
            "fetch.message.max.bytes": 2097152,
            "queued.min.messages": 100,

            "fetch.error.backoff.ms": 100,
            "queued.max.messages.kbytes": 50,

            "fetch.wait.max.ms": 1000,
            "queue.buffering.max.ms": 1000,

            "batch.num.messages": 10000
        },
        "toptions":{
            "auto.offset.reset":"earliest"
        }
    },

Also, what are the available options for fromOffset?

Thanks

Antoine

Hi Antoine,

To read a topic from the beginning, you either have to switch the consumer group to a new one (this, by the way, is where fromOffset is used), or you have to manually commit your consumer group's offset to the desired position in the topic (you can fetch that metadata as well).
The first option is a lot easier... just make sure auto.offset.reset is set to earliest.

Make sure you understand how Kafka works with partitions and offsets: https://stackoverflow.com/questions/38024514/understanding-kafka-topics-and-partitions

Hello Christian,
Thanks for the information. I tried changing the group.id, but no luck.
To summarize my situation, I have a kafka container that I initialize by creating a topic, and messages in this topic (Avro Format).
In my consumer application, I'm able to connect to the broker.
When I do stream.start(), no messages appear in stream.forEach(console.log) except the one I write myself after the stream starts. I was expecting to get all existing messages from the topic.

Code:

const stream: KStream = consumerService.getStream('my-kafka-topic');

stream.forEach(console.log);

stream.start().then(() => {
    console.log("stream started, as kafka consumer is ready.");
    stream.writeToStream(consumerService.getKafkaStyledMessage({ ping: "pong" },
        'my-kafka-topic'));
}, error => {
    console.log("stream failed to start: " + error);
});

Any idea?

Thanks

I would advise against using the same stream instance that reads the topic to write on the topic. Create two instances and ensure they use the proper Kafka configuration.
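A sketch of that separation, assuming the nodefluent kafka-streams factory API and a reachable broker; `kafkaConf` stands for the configuration object posted earlier in this thread, and the topic name is the one from the question. This is a wiring sketch, not a definitive implementation.

```javascript
// Sketch only: one KafkaStreams factory, but two separate stream
// instances -- one that consumes the topic, one that produces to it.
const { KafkaStreams } = require("kafka-streams");

const factory = new KafkaStreams(kafkaConf);

// Consumer side: only reads from the topic.
const consumeStream = factory.getKStream("my-kafka-topic");
consumeStream.forEach(console.log);
consumeStream.start();

// Producer side: only writes; no input topic, output set via .to().
const produceStream = factory.getKStream(null);
produceStream.to("my-kafka-topic");
produceStream.start().then(() => {
    produceStream.writeToStream(JSON.stringify({ ping: "pong" }));
});
```

Keeping the read and write paths on separate instances avoids the consumer and producer competing over one stream's lifecycle and configuration.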

The issue was on my side, thanks guys