Let's say I have topicA, topicB and topicC, each topic segregated by event type based on domain entities: topicA carries eventA only, topicB carries eventB only, and topicC carries eventC only. All events are related to each other by business domain, but they are produced by separate microservices and should be processed in a specific order.
The question is: how, using Apache Kafka, can I consume these events in a specific order — receive eventA, then wait for eventB, and only once eventC has arrived process all of them together?
I appreciate any feedback; questions are welcome.
Some notes: Kafka Streams would be a good fit, but it is restricted by company policies.
I've also looked into the Join pattern, but haven't found any reliable approaches for implementing it.
There are probably many ways to solve this problem. Here are a couple that I can suggest:
Introduce a correlation ID that links the events from topics A, B and C, then use it in the following manner:
Services A, B and C produce events to their corresponding topics; related events carry the same correlation ID.
Service D consumes events from the separate topics. Each time it receives an event from any topic, service D either inserts the event data into a database keyed by the correlation ID, or performs the required action if all related data has been received.
For example, when service D receives event C, it first issues a query to check whether the database already contains a record with event C's correlation ID:
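A minimal sketch of that check in plain JDBC, assuming a hypothetical table event_parts(correlation_id, event_a, event_b, event_c) and PostgreSQL-style upsert syntax (adjust both to your schema and database):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class EventCHandler {

    /** Called by service D for each consumed event C; returns true if the final action ran. */
    public boolean onEventC(Connection db, String correlationId, String eventCPayload) throws SQLException {
        // Check whether events A and B have already been stored for this correlation ID.
        try (PreparedStatement check = db.prepareStatement(
                "SELECT event_a, event_b FROM event_parts WHERE correlation_id = ?")) {
            check.setString(1, correlationId);
            try (ResultSet rs = check.executeQuery()) {
                if (rs.next() && rs.getString("event_a") != null && rs.getString("event_b") != null) {
                    // All related events are present: execute the business action.
                    performAction(rs.getString("event_a"), rs.getString("event_b"), eventCPayload);
                    return true;
                }
            }
        }
        // Otherwise persist event C and keep waiting for the remaining events.
        try (PreparedStatement upsert = db.prepareStatement(
                "INSERT INTO event_parts (correlation_id, event_c) VALUES (?, ?) "
                + "ON CONFLICT (correlation_id) DO UPDATE SET event_c = EXCLUDED.event_c")) {
            upsert.setString(1, correlationId);
            upsert.setString(2, eventCPayload);
            upsert.executeUpdate();
        }
        return false;
    }

    private void performAction(String a, String b, String c) {
        // Business-specific processing once A, B and C are all available.
    }
}
```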
And so on for each consumed event.
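For completeness, here is a rough sketch of how service D might subscribe to all three topics with the plain Kafka consumer API and dispatch on the topic name. The bootstrap address, the group id and the assumption that the record key carries the correlation ID are illustrative, not taken from the question:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ServiceD {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "service-d");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("topicA", "topicB", "topicC"));
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    // The record key is assumed to carry the correlation ID.
                    String correlationId = record.key();
                    switch (record.topic()) {
                        case "topicA" -> storeOrTrigger(correlationId, "A", record.value());
                        case "topicB" -> storeOrTrigger(correlationId, "B", record.value());
                        case "topicC" -> storeOrTrigger(correlationId, "C", record.value());
                    }
                }
            }
        }
    }

    private static void storeOrTrigger(String correlationId, String part, String payload) {
        // Persist the payload keyed by correlation ID and run the final action
        // once all three parts (A, B, C) have been recorded (see the database sketch above).
    }
}
```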
Chain the services that produce the events (A, B and C). For example, the chain can be formed in the following manner (a sketch of one chaining step follows the list):
Service A produces an event to topic A.
Service B consumes the event from topic A and produces an event to topic B (possibly aggregating events A and B).
Service C consumes the event from topic B and produces an event to topic C (possibly aggregating events A, B and C).
Finally, service D consumes the event from topic C (possibly aggregated from A, B and C) and executes the required action.
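A minimal sketch of one step in that chain (service B), assuming String payloads, that the correlation ID travels as the record key, and the topic names from the question:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class ServiceB {
    public static void main(String[] args) {
        Properties consumerProps = new Properties();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "service-b");
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        Properties producerProps = new Properties();
        producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(producerProps)) {
            consumer.subscribe(List.of("topicA"));
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofMillis(500))) {
                    // Aggregate event A with service B's own data, keeping the same key (correlation ID).
                    String eventB = record.value() + "|" + "dataFromServiceB";
                    producer.send(new ProducerRecord<>("topicB", record.key(), eventB));
                }
            }
        }
    }
}
```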
A variation of this approach (without aggregating events at each intermediate stage) would be to chain the services and listen only for the last event in the chain. When the last event is consumed, poll the corresponding topics to fetch the events produced by the other services.
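A rough sketch of that variation: once the last event (event C) is consumed, scan topicA and topicB for records whose key matches the correlation ID. Scanning from the beginning of each partition is for illustration only; a real implementation would bound the scan (for example with offsetsForTimes) or keep a local index.

```java
import java.time.Duration;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class RelatedEventFetcher {

    /** Fetches the payloads from topicA and topicB whose record key equals the given correlation ID. */
    public static Map<String, String> fetchRelated(String correlationId) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        Map<String, String> found = new HashMap<>();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Manually assign all partitions of the upstream topics (no consumer group needed).
            List<TopicPartition> partitions = new ArrayList<>();
            for (String topic : List.of("topicA", "topicB")) {
                consumer.partitionsFor(topic).forEach(p -> partitions.add(new TopicPartition(topic, p.partition())));
            }
            consumer.assign(partitions);
            consumer.seekToBeginning(partitions);

            // Poll until both related events are found or nothing more is returned.
            while (found.size() < 2) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                if (records.isEmpty()) break;
                for (ConsumerRecord<String, String> record : records) {
                    if (correlationId.equals(record.key())) {
                        found.put(record.topic(), record.value());
                    }
                }
            }
        }
        return found;
    }
}
```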