Details
- Type: Bug
- Status: Resolved
- Priority: Medium
- Resolution: Fixed
- 29.0.6
- None
- Security Level: Default (Default Security Scheme)
- Sprint: Horizon 22 - Jun 9 - 23
- 1199
Description
When a large number of events is sent to the topics the Kafka consumer subscribes to in a short period of time (say 1000+ messages in a burst), the messages are not processed fast enough and "something" stops processing. The consumer then reconnects to Kafka and reprocesses all of the messages, only to get caught in the same loop. As a result, the server duplicates the same events indefinitely and never processes the new additions, so the backlog is never cleared.
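This pattern is consistent with a consumer that exceeds its allowed poll interval: it is evicted from the group before it can commit offsets, so on rejoin the broker redelivers the same batch, and the cycle repeats. Below is a minimal, dependency-free simulation of that failure mode; the constants (`MAX_POLL_INTERVAL`, `PER_MESSAGE_COST`, `BATCH_SIZE`) are illustrative assumptions, not the server's actual configuration, and no real Kafka client is involved.

```python
# Simulation of the redelivery loop: if processing a poll batch takes longer
# than the allowed poll interval, the consumer is evicted before it can commit,
# so the committed offset never advances and the same batch is redelivered.

MAX_POLL_INTERVAL = 5   # hypothetical "time units" allowed between polls
PER_MESSAGE_COST = 1    # time units to process one message
BATCH_SIZE = 10         # messages returned by each poll

def run(rounds):
    committed_offset = 0   # last offset acknowledged to the broker
    deliveries = []        # offsets delivered on each round
    for _ in range(rounds):
        # The broker always redelivers starting from the last committed offset.
        batch = list(range(committed_offset, committed_offset + BATCH_SIZE))
        deliveries.append(batch)
        processing_time = len(batch) * PER_MESSAGE_COST
        if processing_time > MAX_POLL_INTERVAL:
            # Poll interval exceeded: consumer is evicted from the group
            # before committing, so committed_offset does not move.
            continue
        committed_offset = batch[-1] + 1   # commit only after a timely poll
    return deliveries

# Every round redelivers the same offsets 0..9: the backlog never shrinks.
rounds = run(3)
```

In a real deployment the equivalent mitigation is usually to shrink the batch (e.g. the consumer's `max.poll.records` setting) or raise the allowed processing window (`max.poll.interval.ms`) so each poll can complete and commit before eviction.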