Axon Framework And Kafka

Hi guys,

I have lately been playing with Axon Framework, which I've found very good for spinning up event-sourced applications. I have used Kafka as the event bus and MongoDB as the event store, inspired by this project, and everything works fine.

The main problems I'm facing are:
- How do I publish to different topics? (I don't want to send all my events to one topic.)
- How do I customize the serialization of events sent to Kafka? (Right now the events are serialized in a format that will be very hard for other applications to consume.)

Thanks for your help

Best Regards

Hi YB,

I’m not doing exactly that in my code bases (I use Axon and Kafka, but for different things), but the way we manage a large number of events and topics published for consumption by other applications may interest you. (It may also be overkill for you.)

We use Avro to write the contracts (schemata) for our ‘Object changed’ events, keep them in a separate project, and publish a jar to our Nexus repository. Consuming applications (and our own publishing application) can add the schema project as a dependency and then use the Avro Maven plugin to generate classes from the schemata, if necessary. Consumers can also get the schemata from the schema registry.
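To make the contract idea concrete, here's what a minimal Avro schema (.avsc) along those lines might look like; all the names here are made up for illustration, not taken from our real schemata:

```json
{
  "type": "record",
  "name": "CustomerChanged",
  "namespace": "com.example.contracts",
  "fields": [
    {"name": "customerId", "type": "string"},
    {"name": "email", "type": ["null", "string"], "default": null}
  ]
}
```

The Avro Maven plugin can turn a file like this into a generated Java class, so publisher and consumers share one compiled contract.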

As for how to send different events to different topics: each schema has a metadata and a data section. The data section describes the structure of the object; the metadata section holds other useful information, and in particular we keep there the name of the topic the given event is published on. We have a helper class that wraps the metadata so that our Kafka publisher can easily read the topic from the schema and publish to it.
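A stripped-down sketch of that helper idea (the field names and the plain `Map` are stand-ins; the real thing wraps our Avro metadata record):

```java
import java.util.Map;

// Simplified sketch of a metadata wrapper: it exposes the target Kafka topic
// that the schema's metadata section carries. The "topic" key is an assumption
// for illustration, not an Axon or Avro convention.
public class EventMetadata {
    private final Map<String, ?> fields;

    public EventMetadata(Map<String, ?> fields) {
        this.fields = fields;
    }

    // Returns the Kafka topic this event should be published on.
    public String topic() {
        Object t = fields.get("topic");
        if (t == null) {
            throw new IllegalStateException("schema metadata has no 'topic' entry");
        }
        return t.toString();
    }

    public static void main(String[] args) {
        EventMetadata meta = new EventMetadata(Map.of("topic", "customer-changed"));
        System.out.println(meta.topic()); // prints "customer-changed"
    }
}
```

The publisher never hard-codes topic names; it just asks the wrapper, so topic routing lives in the contracts project.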


Hi Amy,

Thanks for your response.

For the serialization part, thank you; I think Avro will solve our issues.

The other thing I found (I might be mistaken) is that when Axon Framework publishes events to Kafka, it does so to one and only one topic: it uses a single publisher configured to communicate with that topic (axon.kafka.default-topic). The workaround I'm thinking of is to create new event handlers that take the events and publish them to the desired topics.
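A rough sketch of that workaround, with a small stand-in interface replacing the real Kafka producer and a made-up topic-per-event-type rule (none of this is Axon API):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the workaround: a handler that re-publishes each event to a
// per-event-type topic instead of the single axon.kafka.default-topic.
public class TopicRoutingHandler {

    // Minimal stand-in for a real Kafka producer so the sketch is self-contained.
    public interface KafkaProducerPort {
        void send(String topic, Object payload);
    }

    private final KafkaProducerPort producer;

    public TopicRoutingHandler(KafkaProducerPort producer) {
        this.producer = producer;
    }

    // In a real Axon application this method would be annotated with
    // @EventHandler and be invoked once for each published event.
    public void on(Object event) {
        producer.send(topicFor(event), event);
    }

    // One topic per event type, derived from the class name (an assumption).
    static String topicFor(Object event) {
        return event.getClass().getSimpleName().toLowerCase();
    }

    public static void main(String[] args) {
        List<String> sent = new ArrayList<>();
        TopicRoutingHandler handler = new TopicRoutingHandler((topic, payload) -> sent.add(topic));
        handler.on("some event payload"); // a String event, routed by its type
        System.out.println(sent);         // prints [string]
    }
}
```

In a real setup the stand-in would be backed by an actual Kafka producer, and the topic could come from the schema metadata instead of the class name.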

Best Regards

Event handlers that translate to Kafka Avro events are actually what we do for “milestone” events consumed by other applications that care about our aggregate (I forgot about those handlers, heh). The schemata for those, in our case, aren’t Axon events. Rather, we collect all the events into a larger “Aggregate Changed” event with the whole aggregate in the payload every time. Our publisher sends one changed event per transaction; that way consumers don’t have to do any reconstruction on their end and can take what they want from the event.
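Roughly, the shape of such a handler looks like this (all names are made up, strings stand in for real events, and a plain class stands in for the actual Axon @EventHandler wiring):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the "milestone" pattern: fine-grained events are applied to an
// in-memory snapshot, and one AggregateChanged event carrying the whole
// aggregate is emitted per transaction for downstream consumers.
public class MilestonePublisher {

    // The full aggregate state that downstream consumers receive.
    public record AggregateChanged(String aggregateId, List<String> state) {}

    private final List<String> state = new ArrayList<>();
    private final List<AggregateChanged> outbox = new ArrayList<>();

    // Apply each fine-grained event to the snapshot (in a real Axon app this
    // would be an @EventHandler method).
    public void on(String event) {
        state.add(event);
    }

    // At transaction commit, publish a single milestone event with the whole state.
    public void commit(String aggregateId) {
        outbox.add(new AggregateChanged(aggregateId, List.copyOf(state)));
    }

    public List<AggregateChanged> published() {
        return outbox;
    }

    public static void main(String[] args) {
        MilestonePublisher pub = new MilestonePublisher();
        pub.on("AddressCorrected");
        pub.on("EmailChanged");
        pub.commit("customer-42");
        System.out.println(pub.published().size()); // prints 1
    }
}
```

The point is the batching: two fine-grained events in, one consumer-facing event out, so consumers never reconstruct state themselves.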

You might consider exactly what downstream consumers require from your app; a milestone event (or a few) might be useful to you, unless they really do want/need every tiny update. That way other apps aren’t so tied to the inner workings of your Axon structure, and you would have less coordination to do if you need to change your aggregate’s events. Also consider how potential event replays will interact with consumers; you can always have the handler ignore events while replaying if you need to. In addition, using a milestone event removes the need to maintain many tiny topics and makes it easier for consumers to ingest updates: they don’t have to know every single event-name+topic combination and figure out what each one means to them.

I’m not familiar with Axon + Kafka as the event bus, so I can’t offer help there, but it does make sense that it would only publish to one topic; otherwise you wouldn’t be guaranteed event order.

Out of curiosity, what’s the event store in your situation?


Ah! I reread the first post. Mongo event store. Cool.

Thanks, Amy, for your response; that’s the road we’re going to take: one milestone aggregate event to be consumed by the different applications.
As for the event store, it’s MongoDB. We considered using Cassandra, but we want to be able to run ad-hoc queries on our event store, so we went with Mongo instead.