Fixed

Details
Assignee: Alex May
Reporter: Mark Mahacek
FD#: 1199
Sprint: None
Fix versions:
Affects versions:
Priority: Medium

PagerDuty
Created June 17, 2022 at 1:50 AM
Updated July 6, 2022 at 5:38 PM
Resolved June 22, 2022 at 12:14 PM
When a large number of events are sent to the Kafka consumer topics in a short burst (say 1000+ messages), the messages are not processed fast enough and "something" stops processing. The consumer then attempts to reconnect to Kafka and reprocesses all of the messages, only to get caught in the same loop. As a result, the server duplicates events indefinitely and never processes new additions, so the backlog is never cleared.
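The loop described above is consistent with a consumer that commits offsets only after finishing an entire batch, while the batch takes longer than the broker's poll timeout: the group rebalances before any commit lands, the partition is re-delivered from the last committed offset, and the same messages are processed again forever. The sketch below is a plain-Python simulation of that dynamic, not the actual service code; the timeout, per-message cost, and backlog size are hypothetical stand-ins chosen to illustrate the behavior.

```python
# Hypothetical simulation of the rebalance loop described in this ticket.
# A consumer that commits only after processing a whole batch, while the
# batch takes longer than the allowed poll interval, never advances its
# offset: the broker revokes the partition and re-delivers the same batch.

MAX_POLL_INTERVAL = 50   # "time units" the broker allows between polls (assumed)
PER_MESSAGE_COST = 1     # processing time per message (assumed)
BACKLOG = 1000           # burst of messages, per the ticket

def run(rounds, batch_size):
    """Return (committed_offset, duplicate_messages_processed)."""
    committed = 0
    duplicates = 0
    for _ in range(rounds):
        # After each rebalance the consumer resumes from the last commit.
        batch = list(range(committed, min(committed + batch_size, BACKLOG)))
        elapsed = len(batch) * PER_MESSAGE_COST
        if elapsed > MAX_POLL_INTERVAL:
            # Poll interval exceeded mid-batch: rebalance fires, nothing is
            # committed, and every message in the batch will be seen again.
            duplicates += len(batch)
        else:
            committed += len(batch)  # commit only after the whole batch
    return committed, duplicates

# A 1000-message batch never commits: pure duplication, no progress.
stuck_offset, dupes = run(rounds=10, batch_size=1000)

# Capping the batch size below the timeout lets offsets advance normally.
ok_offset, _ = run(rounds=20, batch_size=50)
```

If this diagnosis matches what the service is doing, the usual mitigations are to commit offsets incrementally during the batch, or to cap how much is fetched per poll (e.g. Kafka's `max.poll.records`) and/or raise `max.poll.interval.ms` so one batch can finish before a rebalance is triggered.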