Apache Kafka: consume events from different topics in a specific order

Let's say I have topicA, topicB and topicC, each topic segregated by a separate event type based on domain entities: topicA carries only eventA, topicB only eventB, and topicC only eventC. All events relate to each other by business domain, but they are produced by separate microservices and should be processed in a specific order.

The question is: how, using Apache Kafka, can I consume these events in a specific order, i.e. receive eventA, then wait for eventB, and once eventC arrives, consume all of them together?

I appreciate any feedback; any questions are welcome.

Some notes: Kafka Streams would be a good approach, but it is restricted by company policies.

Also, I've looked into the Join pattern but haven't found any reliable approach to implementing it.

Asked Oct 15 '25 by Aventes
1 Answer

There are probably many approaches to solving this problem. Here are a couple that I can suggest:

  • Introduce a correlation ID that links the events from topics A, B and C. Then use the correlation ID in the following manner:

    1. Services A, B and C produce events to their corresponding topics; related events carry the same correlation ID.

    2. Service D consumes events from all three topics. Each time it receives an event from any topic, service D either inserts the event data into a database keyed by the correlation ID, or performs the final action if all the data has been received.

    For example, when service D receives event C, it first issues a query to check whether there is a record in the database with the correlation ID from event C:

    • if there is no such record, the incoming event C is stored;
    • if a record already exists, service D checks whether event C is the last event needed to complete the set and either performs the final action or inserts event C into the database.

    And so on for each consumed event (see the first sketch after this list).

  • Chain the services that produce the events (A, B and C). For example, the chain can be formed in the following manner (see the second sketch after this list):

    1. Service A produces event to topic A

    2. Service B consumes the event from topic A and produces an event to topic B (possibly aggregating events A and B)

    3. Service C consumes the event from topic B and produces an event to topic C (possibly aggregating events A, B and C)

    4. Finally, service D consumes the event from topic C (possibly aggregated from A, B and C) and executes the required action.

    A variation of this approach (without aggregating events at each intermediate stage) would be to chain the services and listen only for the last event in the chain. When the last event is consumed, issue reads against the corresponding Kafka topics to fetch the events produced by the other services.
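
Here is a minimal sketch of the first (correlation ID) approach in Java, assuming plain kafka-clients, string-serialized events, and the correlation ID carried in the record key; all of these are assumptions of the sketch, and an in-memory map stands in for the database so the example stays self-contained:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class CorrelatingConsumer {

    // Stand-in for the database keyed by correlation ID.
    // In production this should be a durable store so state survives restarts.
    private final Map<String, Map<String, String>> pendingByCorrelationId = new ConcurrentHashMap<>();

    private static final Set<String> REQUIRED_TOPICS = Set.of("topicA", "topicB", "topicC");

    public static void main(String[] args) {
        new CorrelatingConsumer().run();
    }

    public void run() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "service-d");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(REQUIRED_TOPICS);
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    handle(record);
                }
            }
        }
    }

    private void handle(ConsumerRecord<String, String> record) {
        // The record key carries the correlation ID (an assumption of this sketch).
        String correlationId = record.key();
        Map<String, String> parts = pendingByCorrelationId
                .computeIfAbsent(correlationId, id -> new ConcurrentHashMap<>());
        parts.put(record.topic(), record.value());

        // Once events A, B and C have all arrived for this correlation ID,
        // perform the final action and clean up the stored state.
        if (parts.keySet().containsAll(REQUIRED_TOPICS)) {
            performFinalAction(correlationId, parts);
            pendingByCorrelationId.remove(correlationId);
        }
    }

    private void performFinalAction(String correlationId, Map<String, String> parts) {
        System.out.printf("All events received for %s: %s%n", correlationId, parts);
    }
}

Keying every record by the correlation ID also means related events of one business transaction always land on the same partition of their topic, which preserves their relative order within each topic.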
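
And a minimal sketch of one link of the chaining approach (service B consuming topic A and producing an aggregated event to topic B), again assuming plain kafka-clients and string payloads; the topic names and the aggregation format are purely illustrative:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class ChainedServiceB {

    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put("bootstrap.servers", "localhost:9092");
        consumerProps.put("group.id", "service-b");
        consumerProps.put("key.deserializer", StringDeserializer.class.getName());
        consumerProps.put("value.deserializer", StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put("bootstrap.servers", "localhost:9092");
        producerProps.put("key.serializer", StringSerializer.class.getName());
        producerProps.put("value.serializer", StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {

            consumer.subscribe(List.of("topicA"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Build event B and carry event A's payload along so the next
                    // link in the chain sees the aggregate (format is hypothetical).
                    String aggregated = record.value() + "|" + buildEventB(record.key());
                    producer.send(new ProducerRecord<>("topicB", record.key(), aggregated));
                }
            }
        }
    }

    private static String buildEventB(String key) {
        return "eventB-for-" + key;   // placeholder for service B's own payload
    }
}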

Answered Oct 18 '25 by Oleksii Zghurskyi

