wso2 / carbon-analytics-common

Queue input event dispatchers can exit execution, causing data blockage in the entire server

tishan89 opened this issue · comments

Description:
This issue was initially discovered on a WUM-updated DAS 3.1.0 with analytics features installed, but it should be reproducible with DAS 3.2.0 as well. In an HA setup, when a stream contains arbitrary maps, a BufferOverflowException is thrown and the QueueInputEventDispatcherWorker quits. The corresponding BlockingQueue then fills up, and once it is full, Databridge threads begin to wait on it, blocking all Databridge threads and halting all Thrift/binary traffic.

The solution is twofold.

  1. Investigate and fix the BufferOverflowException below, and make the QueueInputEventDispatcher workers stay in their dispatch loop no matter what happens.
TID: [-1234] [] [2018-06-21 09:04:09,640] ERROR {org.wso2.carbon.event.receiver.core.internal.management.AbstractInputEventDispatcher} -  Error in dispatching events:null {org.wso2.carbon.event.receiver.core.internal.management.AbstractInputEventDispatcher}
java.nio.BufferOverflowException
	at java.nio.HeapByteBuffer.put(HeapByteBuffer.java:189)
	at java.nio.ByteBuffer.put(ByteBuffer.java:859)
	at org.wso2.carbon.event.processor.manager.commons.transport.client.TCPEventPublisher.sendEvent(TCPEventPublisher.java:229)
	at org.wso2.carbon.event.processor.manager.core.internal.EventHandler.syncEvent(EventHandler.java:99)
	at org.wso2.carbon.event.processor.manager.core.internal.CarbonEventManagementService.syncEvent(CarbonEventManagementService.java:292)
	at org.wso2.carbon.event.receiver.core.internal.management.QueueInputEventDispatcher$QueueInputEventDispatcherWorker.run(QueueInputEventDispatcher.java:197)
  2. Make Databridge threads drop events and return if they have waited on the queue for a pre-configured amount of time.
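The first part of the solution, keeping the dispatcher worker alive, can be sketched roughly as follows. This is a minimal illustration, not the actual Carbon implementation: the class name `ResilientDispatcherWorker` and the `dispatch` placeholder are hypothetical stand-ins for `QueueInputEventDispatcher$QueueInputEventDispatcherWorker` and its call into `syncEvent`. The key point is catching `Throwable` inside the loop so a single bad event cannot kill the worker and leave the queue to fill up.

```java
import java.util.concurrent.BlockingQueue;

// Hypothetical sketch of a dispatcher worker that survives dispatch failures.
public class ResilientDispatcherWorker implements Runnable {
    private final BlockingQueue<Object> eventQueue;
    private volatile boolean running = true;

    public ResilientDispatcherWorker(BlockingQueue<Object> eventQueue) {
        this.eventQueue = eventQueue;
    }

    @Override
    public void run() {
        while (running) {
            try {
                Object event = eventQueue.take(); // blocks until an event is available
                dispatch(event);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // restore interrupt status and exit cleanly
                return;
            } catch (Throwable t) {
                // Log and keep looping instead of letting the worker die; a dead
                // worker leaves the queue unconsumed, which is what blocked the
                // Databridge threads in this issue.
                System.err.println("Error in dispatching events: " + t);
            }
        }
    }

    // Placeholder for the real dispatch logic (e.g. syncEvent in Carbon).
    protected void dispatch(Object event) {
    }

    public void shutdown() {
        running = false;
    }
}
```

Even with the worker made resilient, the underlying BufferOverflowException in `TCPEventPublisher.sendEvent` still needs its own fix; the loop hardening only prevents the failure from cascading.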

Please refer to "Possible issue in log analyzer data publishing".
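The second part of the solution, a bounded wait on the publishing side, could use `BlockingQueue.offer` with a timeout instead of an indefinitely blocking `put`. This is a sketch under assumptions: the class name `DroppingPublisher` and the drop counter are illustrative, not Databridge APIs.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical sketch: enqueue with a pre-configured timeout and drop on failure,
// so a stalled consumer cannot wedge every publishing thread forever.
public class DroppingPublisher {
    private final BlockingQueue<Object> queue;
    private final long timeoutMillis;
    private final AtomicLong droppedCount = new AtomicLong();

    public DroppingPublisher(BlockingQueue<Object> queue, long timeoutMillis) {
        this.queue = queue;
        this.timeoutMillis = timeoutMillis;
    }

    /** Returns true if the event was enqueued, false if it was dropped on timeout. */
    public boolean publish(Object event) throws InterruptedException {
        // offer() with a timeout returns false instead of blocking indefinitely.
        boolean accepted = queue.offer(event, timeoutMillis, TimeUnit.MILLISECONDS);
        if (!accepted) {
            // Count drops so operators can observe data loss instead of a silent stall.
            droppedCount.incrementAndGet();
        }
        return accepted;
    }

    public long getDroppedCount() {
        return droppedCount.get();
    }
}
```

Dropping is a deliberate trade-off: it accepts bounded data loss under backpressure in exchange for keeping all Thrift/binary traffic flowing, which matches point 2 of the proposed solution.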

Affected Product Version:
DAS 3.1.0, DAS 3.2.0, and the associated analytics distributions

Steps to reproduce:

  1. Get a WUM-updated APIM Analytics distribution and deploy it in HA mode
  2. Enable log analyzer analytics
  3. Send events

Theoretically, this should also be reproducible on the latest DAS with a stream that has an arbitrary map.

@AnuGayan @ksdperera Can we resolve this issue, since we have already fixed it?

@tishan89 Yes, we can close this issue.