Kafka Consumer stops committing offsets when overloaded

Description

When a large number of events are sent to the Kafka Consumer topics in a short period of time (say 1,000+ messages in a burst), the messages are not processed fast enough and "something" stops processing before the offsets are committed. The consumer then reconnects to Kafka and reprocesses all of the messages, only to get caught in the same loop. As a result, the server duplicates events indefinitely and never works through the new messages to clear out the backlog.
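
One common cause of this pattern is a poll() batch that takes longer than max.poll.interval.ms to process: the broker evicts the consumer from its group before the offsets are committed, and the resulting rebalance re-delivers every uncommitted message. As a minimal sketch (not necessarily the fix applied for this ticket), assuming the standard Java client and placeholder broker, group, and topic names, the consumer can cap the number of records per poll and commit only after each batch is fully processed:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class BoundedBatchConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "event-consumer");          // placeholder group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");

        // Disable auto-commit so offsets are committed only after records
        // are actually processed, rather than on a timer during a stall.
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        // Cap the batch size so one poll() can always be processed within
        // max.poll.interval.ms; otherwise the broker evicts the consumer
        // and the rebalance re-delivers everything since the last commit.
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "50");
        props.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, "300000");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events")); // placeholder topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    handle(record); // application-specific processing
                }
                // Commit only after the whole batch has been processed.
                consumer.commitSync();
            }
        }
    }

    private static void handle(ConsumerRecord<String, String> record) {
        System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
    }
}
```

With a manual commitSync() after each batch, a stall or crash mid-batch re-delivers at most one bounded batch rather than the entire backlog.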

Acceptance / Success Criteria

None

Lucidchart Diagrams

Activity

Alex May June 21, 2022 at 4:14 PM

Mark Mahacek June 17, 2022 at 1:54 AM

I was just able to replicate the issue by sending 150 messages to the Kafka topic at once. There were over 3,500 instances of the event recorded about two minutes later.
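
(For reference, a burst like the one above can be generated with a short producer loop. This is only a sketch: the broker address and the topic name "events" are placeholders, and it assumes the standard Java client.)

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class BurstProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Fire 150 messages as fast as possible to simulate the burst.
            for (int i = 0; i < 150; i++) {
                producer.send(new ProducerRecord<>("events", "msg-" + i)); // placeholder topic
            }
            producer.flush();
        }
    }
}
```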

Details

Resolution: Fixed

Created June 17, 2022 at 1:50 AM
Updated July 6, 2022 at 5:38 PM
Resolved June 22, 2022 at 12:14 PM
