I’m looking for advice on the best strategy for evolving a “Big Ball of Mud” legacy application into a new, modular, event-driven architecture built on the Axon Framework. I’ve talked with my colleagues about Martin Fowler’s Strangler Pattern, and we generally agree that this seems like the right way to approach the problem. However, as I read more about the Strangler Pattern, I’m not sure my ideas for executing it are on target. With that said, I want to present our domain as simply as possible, then outline our integration/strangulation strategy, and hopefully get some feedback that gives us more confidence in our approach, or points us to alternatives and other things to think about.
So first, our domain is in the shipping industry and can be characterized as distressed package management and recovery. While “distressed” has many different meanings in the domain, the most common one is when a shipping label becomes separated from its package, or is otherwise rendered unreadable, and we are left with a package that cannot be delivered. When we apply DDD to this domain, some of the bounded contexts we recognize include: package detail data capture, inventory/warehouse management, proactive research (looking for clues in the captured details of a package), and reactive research (taking calls from customers who are looking for their lost packages).
We have identified the package detail data capture bounded context as the best place to start the evolution because it’s a fairly simple context. So far we have implemented the service back end on Axon 3 infrastructure, and we are presently working on a new Angular 6 UI.
The idea is that we will deliver the new data capture application and move all data capture agents to it at the same time. Meanwhile, all warehouse and research agents will remain in the legacy application, which will continue to use its own model for the data capture details. We will therefore need to implement an integration module in the new application that listens for domain events and updates the legacy application accordingly. In this sense, the legacy application simply becomes one of the materialized views of the new application.
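To make the integration module concrete, here is a minimal, self-contained sketch of the projection idea. All the names here (the event, the row shape, the store) are hypothetical stand-ins I made up for illustration; in the real module the handler method would carry Axon’s `@EventHandler` annotation so the framework invokes it for each published event, and the store would be the legacy database behind its existing DAO/ORM layer.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical domain event emitted by the new data capture context.
class PackageDetailsCapturedEvent {
    final String packageId;
    final String carrier;
    final String weightKg;
    PackageDetailsCapturedEvent(String packageId, String carrier, String weightKg) {
        this.packageId = packageId;
        this.carrier = carrier;
        this.weightKg = weightKg;
    }
}

// Stand-in for the legacy data model; in production this would write
// to the legacy database through its existing access layer.
class LegacyPackageStore {
    final Map<String, String> rows = new ConcurrentHashMap<>();
    void upsert(String id, String row) { rows.put(id, row); }
}

// The integration component. In the real module this method would be
// annotated with Axon's @EventHandler so the framework dispatches
// every PackageDetailsCapturedEvent on the event bus to it.
class LegacyProjection {
    private final LegacyPackageStore store;
    LegacyProjection(LegacyPackageStore store) { this.store = store; }

    // Idempotent upsert: handling the same event twice leaves the
    // legacy view in the same state, which simplifies event replay
    // and recovery after a failed delivery.
    public void on(PackageDetailsCapturedEvent event) {
        store.upsert(event.packageId, event.carrier + "|" + event.weightKg);
    }
}

public class LegacyProjectionDemo {
    public static void main(String[] args) {
        LegacyPackageStore store = new LegacyPackageStore();
        LegacyProjection projection = new LegacyProjection(store);
        projection.on(new PackageDetailsCapturedEvent("PKG-1", "ACME", "2.5"));
        projection.on(new PackageDetailsCapturedEvent("PKG-1", "ACME", "2.5")); // replay is safe
        System.out.println(store.rows.get("PKG-1")); // prints "ACME|2.5"
    }
}
```

The point of the sketch is that, because the legacy model is write-only from the new application’s perspective, the projection stays a one-way, idempotent mapping.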
At the same time we release the new data capture application, we will also need to remove all existing functionality from the legacy application that lets users modify the state of the captured package detail data. In my mind this point is critical in order to limit complexity and avoid the need for bidirectional synchronization between the two disparate data sources. However, as I’ve read about the Strangler Pattern, it seems as though I should allow for bidirectional data flow. That seems like a nightmare…
Finally, assuming that the legacy application’s data model is simply a materialized view of the new package detail data capture application, I am concerned about managing the materialization process. I’m trying to decide whether I need a saga here or simply an event listener service. It feels like I should use a saga, because I need to embrace the fact that the connection to the legacy system may not be 100% reliable, and there could be bugs in the legacy system that prevent some data from materializing there. It’s certainly possible to put exception handling in a non-saga listener service to deal with the unexpected, but it seems like I get more help from a saga.
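To show what I mean by the non-saga option, here is a plain-Java sketch of a listener that retries a flaky legacy connection a bounded number of times and then parks the event for manual follow-up. Everything here is hypothetical illustration, not real Axon API: as I understand it, Axon 3’s tracking event processors give roughly this behavior for free (the processor keeps a token and retries rather than losing the event), whereas a saga would let me model the retries, timeouts, and compensation as explicit, persisted state.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical gateway to the legacy system, which may fail transiently.
interface LegacyGateway {
    void apply(String event) throws Exception;
}

// Non-saga option: retry failed events a bounded number of times,
// then park them in a dead-letter queue for manual recovery.
class RetryingLegacyListener {
    private final LegacyGateway gateway;
    private final int maxAttempts;
    final Deque<String> deadLetters = new ArrayDeque<>();

    RetryingLegacyListener(LegacyGateway gateway, int maxAttempts) {
        this.gateway = gateway;
        this.maxAttempts = maxAttempts;
    }

    public void on(String event) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                gateway.apply(event);
                return; // materialized in the legacy system successfully
            } catch (Exception e) {
                // transient failure: loop and try again
            }
        }
        deadLetters.add(event); // give up; park for manual follow-up
    }
}

public class RetryDemo {
    public static void main(String[] args) {
        // Simulated gateway that fails twice, then succeeds.
        final int[] calls = {0};
        LegacyGateway flaky = event -> {
            if (++calls[0] < 3) throw new Exception("legacy system down");
        };
        RetryingLegacyListener listener = new RetryingLegacyListener(flaky, 5);
        listener.on("PackageDetailsCaptured(PKG-1)");
        System.out.println(listener.deadLetters.isEmpty()); // prints "true"
    }
}
```

The trade-off I see: this retry loop handles transient connectivity, but it has no durable memory of its own, while a saga would survive a restart mid-recovery and could also escalate (e.g. emit a compensating event) when the legacy system stays broken.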
I hope my questions are clear, and I greatly appreciate you taking the time to share your insights.