Application modernization overview

Application modernization is the process of updating legacy applications with modern technologies, improving performance and making them adaptable to evolving business needs by infusing cloud-native principles such as DevOps and Infrastructure as Code (IaC). Application modernization starts with an assessment of the current legacy applications, data and infrastructure, followed by applying the right modernization strategy (rehost, re-platform, refactor or rebuild) to achieve the desired result.

Rebuild delivers the maximum benefit but requires the highest investment, whereas rehost moves applications and data to the cloud as-is without optimization, requiring less investment but delivering less value. Modernized applications are deployed, monitored and maintained, with ongoing iterations to keep pace with technology and business advancements. Typical benefits include increased agility, cost-effectiveness and competitiveness, while the challenges include complexity and resource demands. Many enterprises are realizing that moving to the cloud does not, by itself, deliver the desired value or agility beyond basic platform-level automation. The real problem lies in how IT is organized, which is reflected in how current applications and services are built and managed (see Conway's law). This, in turn, leads to the following challenges:

  • Duplicative or overlapping capabilities offered by multiple IT systems and components create sticky dependencies and proliferation, which impact productivity and speed to market.
  • Duplicative capabilities across applications and channels give rise to duplicative IT resources (e.g., skills and infrastructure).
  • Duplicative capabilities (including data) result in duplicated business rules and the like, which give rise to an inconsistent customer experience.
  • A lack of alignment between IT capabilities and business capabilities impacts time to market and business-IT collaboration. In addition, enterprises end up building numerous band-aids and architectural layers to support new business initiatives and innovations.

Hence, application modernization initiatives need to focus more on business value, which involves a significant element of transforming applications into components and services aligned to business capabilities. The biggest challenge here is the amount of investment needed; many CIOs/CTOs hesitate to invest because of the cost and timelines involved in realizing value. Many address this by building accelerators that can be customized for enterprise consumption and that speed up specific areas of modernization; one such example from IBM is IBM Consulting Cloud Accelerators. While attempting to drive acceleration and optimize the cost of modernization, Generative AI is becoming a critical enabler of change in how we accelerate modernization programs. We will explore key areas of acceleration, with examples, in this article.

A simplified lifecycle of application modernization programs (not meant to be exhaustive) is depicted below. Discovery focuses on understanding the legacy applications, infrastructure and data, the interactions between applications, services and data, and other aspects such as security. Planning breaks down the complex portfolio of applications into iterations to establish an iterative modernization roadmap, along with an execution plan to implement it.

Blueprint/design phase activities vary with the modernization strategy, from decomposing applications using domain-driven design to establishing a target architecture based on new technology and building executable designs. Subsequent phases are build, test and deploy to production. Let us explore the Generative AI possibilities across these lifecycle areas.

Discovery and design:

The ability to understand legacy applications with minimal SME involvement is a critical acceleration point, because SMEs are generally busy keeping the lights on, and their knowledge may be limited by how long they have supported the systems. Collectively, discovery and design is where significant time is spent during modernization, whereas development is much easier once the team has decoded the legacy application's functionality, integration aspects, logic and data complexity.

Modernization teams perform code analysis and work through several (mostly dated) documents, which is where reliance on code analysis tools becomes important. Further, for rewrite initiatives, functional capabilities need to be mapped to the legacy application context in order to perform effective domain-driven design and decomposition exercises. Generative AI becomes very handy here through its ability to correlate domain and functional capabilities to code and data, establishing a business capabilities view connected to application code and data; of course, the models need to be tuned and contextualized for a given enterprise domain model or functional capability map. The Generative AI-assisted API mapping called out in this article is a mini exemplar of this. While the above applies to application decomposition and design, event storming needs process maps, and this is where Generative AI assists in contextualizing and mapping extracts from process mining tools. Generative AI also helps generate use cases based on code insights and functional mapping. Overall, Generative AI helps de-risk modernization programs by ensuring adequate visibility into legacy applications and their dependencies.

Generative AI also helps generate target designs for a specific cloud service provider's framework by tuning the models on a set of standardized patterns (ingress/egress, application services, data services, composite patterns and so on). Likewise, there are several other Generative AI use cases, including generating target technology framework-specific code patterns for security controls. Generative AI also helps generate detailed design specifications, for example user stories, user experience wireframes, API specifications (e.g., Swagger/OpenAPI files), component relationship diagrams and component interaction diagrams.
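As an illustration only, here is a minimal sketch of how one such detail-design artifact, an OpenAPI (Swagger) specification, might be drafted from a user story and a standardized pattern. The `complete()` helper is a hypothetical stand-in for whatever enterprise-approved model endpoint is in use; it is not a specific product API.

```python
# Hypothetical sketch: drafting an OpenAPI spec from a user story.
# `complete()` is a placeholder for an enterprise-approved text-generation endpoint.

from textwrap import dedent

def complete(prompt: str) -> str:
    """Placeholder: wire this to the model endpoint of your choice."""
    raise NotImplementedError

def draft_openapi_spec(user_story: str, design_pattern: str) -> str:
    prompt = dedent(f"""
        You are an API designer. Using the standardized pattern below,
        produce an OpenAPI 3.0 YAML specification for the user story.
        Include paths, operations, request/response schemas and error codes.

        Standardized pattern:
        {design_pattern}

        User story:
        {user_story}

        Return only valid YAML.
        """).strip()
    return complete(prompt)

# Illustrative usage (uncomment once complete() is wired to a model):
# spec_yaml = draft_openapi_spec(
#     user_story="As a customer, I want to view my recent card transactions.",
#     design_pattern="REST resource per business entity; cursor-based pagination.",
# )
```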

Planning:

One of the difficult tasks in a modernization program is establishing a macro roadmap while balancing parallel efforts against sequential dependencies and identifying the co-existence scenarios to be addressed. While this is normally done as a one-time task, continuous realignment through Program Increment (PI) planning exercises that incorporate execution-level inputs is far more difficult. Generative AI comes in handy here: it can generate roadmaps by applying historical data (application-to-domain-area maps, effort and complexity factors, dependency patterns and so on) to the applications in scope of a modernization program for a given industry or domain.

The only way to address this is to make planning consumable via a suite of assets and accelerators that can handle enterprise complexity. This is where Generative AI plays a significant role, correlating application portfolio details with discovered dependencies.

Build and test:

Generating code is one of the most widely known Generative AI use cases, but it is important to be able to generate a set of related code artifacts: IaC (Terraform or CloudFormation templates), pipeline code and configurations, embedded security design points (encryption, IAM integrations and so on), application code generated from Swagger specifications or other insights from legacy code, and firewall configurations (as resource files based on the services instantiated). Generative AI helps generate each of the above through an orchestrated approach based on predefined application reference architectures built from patterns, while combining the outputs of design tools.
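As a rough sketch of what such orchestration could look like (the artifact types, prompt templates and `complete()` helper are illustrative assumptions, not a specific accelerator's interface):

```python
# Hypothetical sketch: orchestrating generation of related delivery artifacts
# (IaC, pipeline config, application scaffolding) from one reference-architecture pattern.
# `complete()` is a placeholder for an enterprise-approved model endpoint.

from typing import Dict

ARTIFACT_PROMPTS: Dict[str, str] = {
    "terraform": "Generate Terraform for: {pattern}. Include networking, IAM and encryption at rest.",
    "pipeline": "Generate a CI/CD pipeline YAML that builds, scans and deploys: {pattern}.",
    "app_scaffold": "Generate service scaffolding (controllers, DTOs) for the API described in: {pattern}.",
}

def complete(prompt: str) -> str:
    """Placeholder: wire this to the model endpoint of your choice."""
    raise NotImplementedError

def generate_artifacts(pattern_description: str) -> Dict[str, str]:
    """Return one generated artifact per type, all driven by the same pattern."""
    return {
        name: complete(template.format(pattern=pattern_description))
        for name, template in ARTIFACT_PROMPTS.items()
    }
```

The point of the orchestration is that every artifact is derived from the same pattern description, keeping the generated Terraform, pipeline and application code consistent with one reference architecture.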

Testing is another key area: Generative AI can generate the right set of test cases, test code and test data so as to optimize the test cases being executed.
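As a simple, assumption-laden illustration of one slice of this, test stubs can be derived mechanically from an OpenAPI file and then enriched (assertions, test data) by a model; file layout and the base URL below are placeholders:

```python
# Hypothetical sketch: deriving pytest stubs from an OpenAPI (Swagger) file.
# The stubs can then be enriched with assertions and test data.

import json

def test_stubs_from_openapi(spec_path: str, base_url: str = "http://localhost:8080") -> str:
    """Emit one pytest stub per path/method found in the spec."""
    with open(spec_path) as f:
        spec = json.load(f)

    lines = ["import requests", "", f'BASE_URL = "{base_url}"', ""]
    for path, ops in spec.get("paths", {}).items():
        for method in ops:
            if method.lower() not in {"get", "post", "put", "patch", "delete"}:
                continue  # skip non-operation keys such as "parameters"
            slug = path.strip("/").replace("/", "_").replace("{", "").replace("}", "")
            lines += [
                f"def test_{method.lower()}_{slug}():",
                f"    # TODO: add request payload and assertions for {method.upper()} {path}",
                f'    resp = requests.request("{method.upper()}", BASE_URL + "{path}")',
                "    assert resp.status_code < 500",
                "",
            ]
    return "\n".join(lines)
```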

Deploy:

There are several last-mile activities that typically take days to weeks depending on enterprise complexity. The ability to generate insights for security validation (from application and platform logs, design points, IaC and so on) is a key use case that helps accelerate security review and approval cycles. Generating configuration management inputs (for the CMDB) and change management inputs, based on release notes derived from the Agility-tool work items completed per release, are other key areas of Generative AI leverage.
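A minimal sketch of the mechanical part, turning completed work items into release notes and a change/CMDB payload; the field names are illustrative assumptions, since every agility tool exports differently:

```python
# Hypothetical sketch: building release notes and a change-management payload
# from work items exported from an agility tool. Field names are illustrative.

from datetime import date
from typing import Dict, List

def build_release_artifacts(work_items: List[Dict], release: str) -> Dict:
    done = [w for w in work_items if w.get("status") == "Done"]
    notes = "\n".join(f"- [{w['id']}] {w['title']}" for w in done)
    return {
        "release": release,
        "date": date.today().isoformat(),
        "summary": f"{len(done)} work items completed",
        "release_notes": notes,
        "affected_components": sorted({c for w in done for c in w.get("components", [])}),
    }
```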

While the above-mentioned use cases across modernization phases may appear to be a silver bullet, enterprise complexity will necessitate contextual orchestration of many of these Generative AI-based accelerators to realize value, and we are still far from having enterprise-contextual patterns that accelerate modernization programs out of the box. We have seen significant benefit from investing time and energy upfront (and on an ongoing basis) in customizing many of these Generative AI accelerators for certain patterns, based on their potential repeatability.

Let us now examine a couple of proven examples:

Example 1: Re-imagining API Discovery with BIAN and AI for visibility of domain mapping and identification of duplicative API services

The Problem: A large global bank has more than 30,000 APIs (both internal and external) developed over time across various domains (e.g., retail banking, wholesale banking, open banking and corporate banking). There is a strong likelihood of duplicate APIs across these domains, leading to a higher total cost of ownership for maintaining the large API portfolio and the operational challenges of dealing with API duplication and overlap. A lack of visibility and discoverability of the APIs leads API development teams to build the same or similar APIs rather than find relevant APIs for reuse. The inability to view the API portfolio from a banking industry model perspective prevents the business and IT teams from understanding which capabilities are already available and which new capabilities the bank needs.

Generative AI-based solution approach: The solution leverages a BERT-based language model, sentence transformers, a Multiple Negatives Ranking loss function and domain rules, fine-tuned with BIAN Service Landscape knowledge, to learn the bank's API portfolio and provide the ability to discover APIs with auto-mapping to BIAN. It maps each API endpoint method to level 4 of the BIAN Service Landscape hierarchy, that is, BIAN service operations.
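As a rough illustration only (the base model, data fields and hyperparameters are assumptions for the example, not the bank's actual configuration), fine-tuning a BERT-based sentence transformer with Multiple Negatives Ranking loss on matched API-description/BIAN-service-operation pairs could look like this:

```python
# Illustrative sketch using the open-source sentence-transformers library.
# Base model, example pairs and hyperparameters are assumptions.

from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses

# Positive pairs: (API endpoint/method description, matching BIAN service operation text)
pairs = [
    ("GET /accounts/{id}/balance - retrieve current account balance",
     "Current Account - Account Balance - Retrieve"),
    ("POST /payments/domestic - initiate a domestic payment order",
     "Payment Order - Payment Order Procedure - Initiate"),
    # ... thousands more mined from swagger files and the BIAN Service Landscape
]

model = SentenceTransformer("bert-base-uncased")  # any BERT-based encoder
train_data = [InputExample(texts=[api_text, bian_text]) for api_text, bian_text in pairs]
loader = DataLoader(train_data, shuffle=True, batch_size=16)

# Multiple Negatives Ranking loss treats the other in-batch pairs as negatives,
# so only positive (matched) pairs are needed for training.
loss = losses.MultipleNegativesRankingLoss(model)
model.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=100)
```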

The core functions of the solution are the ability to:

  • Ingest Swagger specifications and other API documentation and understand the APIs, endpoints, operations and associated descriptions (see the sketch after this list).
  • Ingest BIAN details and understand BIAN Service Landscape.
  • Fine-tune with matched and unmatched mappings between API endpoint methods and the BIAN Service Landscape.
  • Provide a visual representation of the mapping and matching score, with BIAN hierarchical navigation and filters for BIAN level, API category and matching score.
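A minimal sketch of the ingestion and mapping steps, reusing the fine-tuned encoder from the previous snippet (file layout, model path and the two sample BIAN operations are assumptions):

```python
# Illustrative sketch: mapping swagger endpoints to BIAN service operations
# via embedding similarity. Paths and field names are assumptions.

import json
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("path/to/fine-tuned-bian-encoder")  # assumed local model

def endpoint_texts(swagger_path: str):
    """Flatten a swagger/OpenAPI file into 'METHOD path - summary' strings."""
    with open(swagger_path) as f:
        spec = json.load(f)
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            if method.lower() in {"get", "post", "put", "patch", "delete"}:
                yield f"{method.upper()} {path} - {op.get('summary', '')}"

bian_operations = [
    "Current Account - Account Balance - Retrieve",
    "Payment Order - Payment Order Procedure - Initiate",
    # ... the full BIAN Service Landscape level-4 catalogue
]

api_texts = list(endpoint_texts("swagger.json"))
api_emb = model.encode(api_texts, convert_to_tensor=True)
bian_emb = model.encode(bian_operations, convert_to_tensor=True)

scores = util.cos_sim(api_emb, bian_emb)  # shape: (num_apis, num_bian_ops)
for i, text in enumerate(api_texts):
    best = int(scores[i].argmax())
    print(f"{text}  ->  {bian_operations[best]}  (score={float(scores[i][best]):.2f})")
```

Endpoints from different domains that map to the same BIAN service operation with high scores are the candidates for duplication review.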

The overall logical view (Open Stack based) is shown below:

User Interface for API Discovery with Industry Model:

Key Benefits: The solution helped developers easily find reusable APIs based on BIAN business domains, with multiple filter and search options to locate them. In addition, teams were able to identify the key API categories for building the right operational resilience. The next revision of the search will be natural-language based and conversational.

The ability to identify duplicative APIs based on BIAN service domains helped establish a modernization strategy that addresses and rationalizes duplicative capabilities.

This use case was realized within 6 to 8 weeks, whereas the bank would have taken a year to achieve the same result manually (as there were several thousand APIs to be discovered).

Example 2: Automated modernization of MuleSoft API to Java Spring Boot API

The Problem: While the teams were already on a journey to modernize MuleSoft APIs to Java Spring Boot, the sheer volume of APIs, the lack of documentation and the overall complexity were impacting speed.

Generative AI-based Solution Approach: The MuleSoft API to Java Spring Boot modernization was significantly automated via a Generative AI-based accelerator we built. We began by establishing a deep understanding of the APIs, components and API logic, followed by finalizing the response structures and code. We then built prompts using IBM's version of Sidekick AI to generate Spring Boot code that satisfies the MuleSoft API specifications, along with unit test cases, a design document and a user interface.

Mule API components were fed into the tool one by one using prompts, and the corresponding Spring Boot equivalents were generated and subsequently wired together, addressing any errors that cropped up. The accelerator also generated a UI for the desired channel that could be integrated with the APIs, along with unit test cases, test data and design documentation. The generated design documentation consists of sequence and class diagrams, request, response and endpoint details, error codes and architecture considerations.
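To illustrate the general shape of a per-component prompt (this is not the actual Sidekick AI interface; the `generate()` helper and prompt wording are hypothetical):

```python
# Hypothetical sketch of a per-component migration prompt. `generate()` is a
# placeholder for the code-generation model call, not a Sidekick AI API.

from textwrap import dedent

def generate(prompt: str) -> str:
    """Placeholder: wire this to your code-generation model."""
    raise NotImplementedError

def mule_flow_to_spring_boot(mule_flow_xml: str, api_spec: str) -> str:
    prompt = dedent(f"""
        Convert the following MuleSoft flow to an equivalent Java Spring Boot
        implementation (controller, service and DTO classes). The result must
        satisfy the API specification provided. Also produce JUnit test cases.

        MuleSoft flow:
        {mule_flow_xml}

        API specification:
        {api_spec}
        """).strip()
    return generate(prompt)
```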

Key Benefits: Sidekick AI augments application consultants' daily work by pairing a multi-model Generative AI technical strategy with deep domain and technology knowledge. The key benefits are as follows:

  • Generates most of the Spring Boot code and test cases, which are optimized, clean and adhere to best practices; the key is repeatability.
  • Ease of integration of APIs with channel front-end layers.
  • Easier code comprehension for developers, with enough insight to debug the code.

The accelerator PoC covered four different scenarios of code migration, unit test cases, design documentation and UI generation, and was completed in three sprints over six weeks.

Conclusion

Many CIOs/CTOs have had reservations about embarking on modernization initiatives due to the multitude of challenges called out at the beginning: the amount of SME time needed, the impact to the business from change, operating model changes across security, change management and many other organizations, and so on. While Generative AI is not a silver bullet that solves all of these problems, it helps programs through acceleration, reduction in the cost of modernization and, more significantly, de-risking by ensuring that no current functionality is missed. However, one needs to understand that it takes time and effort to bring LLM models and libraries up to enterprise environment needs, including significant security and compliance reviews and scanning. It also requires focused effort to improve the quality of the data used to tune the models. While cohesive Generative AI-driven modernization accelerators are not yet available, over time we will start to see integrated toolkits emerge that help accelerate certain, if not many, modernization patterns.
