Ask anyone to define application modernization and you’ll hear many different answers. Here’s the common ground: Application modernization takes the existing applications and data sets that run a business and makes them more useful, productive, and attractive to the people who use them, especially customers. A better customer experience drives more business.

Some see application modernization as “putting lipstick on a pig,” but that’s not the purpose at all. Application modernization should not be about making applications appear modern; the applications should look and be modern. 

This means changing the user and machine interfaces, as well as modernizing the internal architecture, public cloud platform infrastructure, application features, and enabling technology. It also involves a move from traditional waterfall development processes to agile and devops. 

Is it a good idea to take valuable legacy applications, make them better, and move them to a public cloud? Sure. However, more and more I see developers and cloud architects approach modernization with some sort of endless checklist that goes too far and does too much, thereby missing the project’s business goals.

There is so much information out there on application modernization, including processes and methodologies, that many teams try to modernize by ticking off boxes that others say will make their legacy application truly modern. They pursue buzzy concepts such as containerization, microservices, data augmentation, internal architecture augmentation, and other things that may require major surgery. That approach can put the application at risk by introducing needless complexity, complications, and expense, just to tick a checklist box.

Here are two pragmatic issues to consider. 

First, there is a tipping point where it may make more sense to scrap the original legacy application and start anew. I’m always more willing to fix things than toss them. However, I often see cases where $2 million is spent to modernize an application when a net-new development would have cost $1 million. 

Software engineers usually understand that it’s often easier and cheaper to build something from the ground up than to refactor and recode an existing system, which must first be completely understood before it can be changed for the better. It’s rare that the team that originally developed the application is still on staff. The knowledge base is incomplete or nonexistent. The application has been modified so many times over the years that no one fully comprehends its complete scope as it exists today.

Second, those who modernize applications work through an extensive checklist of modernization tasks. In many cases, they do everything on the list regardless of actual need: containerization, microservice enablement, migration to a more modern database, portability, and mobility. These features are considered necessary simply because they’re on the list. Why do so few people question the list?

In most cases, it does not pay to force-fit everything on an application modernization checklist. Even containerization, which has many benefits, is not a fit for every application. Container-based architectures and their enabling technology carry a cost, as do microservices and even migration to the cloud.

I’m not saying those features are not solid investments when they’re based on the needs of the application. I am saying that in some cases they’re overkill and don’t add value to the overall business purpose. Once again: don’t ask whether you can, ask whether you should.
