Is Aging Legacy Software a Boat Anchor Destined to Sink Your Enterprise?

Custom-built software is everywhere, automating processes and differentiating businesses. But today’s groundbreaking applications eventually become tomorrow’s legacy software – a boat anchor that can sink any enterprise. CIOs continually tread water, maintaining current applications while embracing new technology to increase efficiency.

Enterprise systems change organically, and adding new applications doesn’t mean you can abandon existing software. Legacy systems still need to be maintained, enhanced, and integrated with new capabilities.

The problem is that many software teams have difficulty maintaining legacy software because it has become nearly impossible to understand all the intricacies of its inner workings. It spans multiple layers and technologies and encompasses tens of thousands of objects. The original developers are often no longer available, and the documentation is poor at best, offering little or no insight into how the system works. Before a software team can assess how long a simple change will take, or embark on a modernization project, they need to understand how the code is structured and how it interacts with other systems.

Some CIOs adopt the attitude, “If it isn’t broken, don’t fix it.” However, as the pace of business and cloud adoption accelerates, there comes a time when legacy applications create massive problems. It is better to be proactive and figure out how to regain the lost knowledge about how these complex software systems actually work.

The cost of legacy software maintenance

There are hidden costs to legacy software that CIOs and IT managers don’t always recognize. Maintenance costs are ongoing, and while they tend to decline as software approaches end of life, they spike when the software must be changed, enhanced, refactored for the cloud, substantially modernized, or decommissioned and replaced. Much of that cost, and the associated risk of grinding the business to a halt, lies in the time it takes to reverse-engineer the meaning of the legacy code. According to the Institute of Electrical and Electronics Engineers (IEEE), developers spend an average of 58% of their time on program comprehension.

There are also staffing costs. As software ages, you may have to pay more for the expertise to program outdated systems. For example, do you still have COBOL systems running? COBOL remains in active use: 92% of companies say COBOL workloads are strategic to their operations, and 52% expect to still be running COBOL ten years from now. To maintain legacy systems, you must hire that expertise or train your current staff in outdated technology.

Another aspect is the technical debt you accrue. Just as companies take on fiscal debt to pay for what they need now, technical debt accumulates when companies focus on new, leading-edge projects and neglect legacy applications. The staff’s skill set erodes over time, and when something breaks, your team may not have the expertise to fix it, forcing you to hire higher-priced consultants.

Keeping pace with changing software

The growing complexity of custom-built applications underlies all of these challenges. Over time, new layers, technologies, frameworks, database sources, and API calls are added on top of legacy software that shares data with new applications.

Today’s enterprise custom software systems comprise millions of lines of code, meaning their internal structures and their internal and external dependencies have become too complex to interpret. As a result, developers spend an ever-increasing amount of time analyzing code and running into dead ends, whether they are making simple changes or re-architecting applications to run in the cloud.

Developers also need to collaborate more closely with enterprise architects and with one another. Microservices, data warehouses, and software components must be mapped out, and intelligence about the inner software structures must be shared to ensure safer changes and simpler troubleshooting.

In every case, the key to success is readily available knowledge about the software code structures. Whether dealing with poorly documented legacy code or integrating applications from different sources, you need software intelligence to understand systems’ internal workings and their external interactions with other applications.

Rather than spending hours trying to reverse-engineer another programmer’s work, large companies and system integrators are adopting capabilities that automatically extract and visually map the relationships inside the code, much like physicians use an MRI to see inside the human body. This type of automated software intelligence reveals dependencies and fine details and, most importantly, provides answers in minutes rather than the days or weeks it would otherwise take to find them.
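The underlying mechanics can be illustrated with a toy example. The Python sketch below is a minimal illustration rather than a depiction of how any particular product works: it walks a source tree and extracts each module’s import dependencies into an adjacency map. Production tools perform this kind of extraction across many languages and down to individual objects, tables, and API calls; the `src` directory here is an assumed location.

```python
# Minimal sketch: statically extract module-level dependencies from a
# Python codebase and emit an adjacency map. Real software-intelligence
# tools go far deeper; this toy only walks import statements.
import ast
from pathlib import Path


def extract_dependencies(root: str) -> dict[str, set[str]]:
    """Map each module under `root` to the modules it imports."""
    deps: dict[str, set[str]] = {}
    for path in Path(root).rglob("*.py"):
        tree = ast.parse(path.read_text(encoding="utf-8"))
        imports: set[str] = set()
        for node in ast.walk(tree):
            if isinstance(node, ast.Import):
                imports.update(alias.name for alias in node.names)
            elif isinstance(node, ast.ImportFrom) and node.module:
                imports.add(node.module)
        deps[path.stem] = imports
    return deps


if __name__ == "__main__":
    # "src" is an assumed directory for this illustration.
    for module, imports in sorted(extract_dependencies("src").items()):
        print(f"{module} -> {', '.join(sorted(imports)) or '(none)'}")
```

Feeding the resulting adjacency map into a graph-visualization layer is what turns raw extraction into the kind of navigable dependency picture described above.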

Software intelligence in action

To illustrate the power of automated software intelligence, consider the case of a company that provides revenue cycle management software to physicians and hospitals. Its software helps healthcare providers with contract management, contract modeling, pricing, and reimbursement.

The company needed to migrate five legacy applications written in C#, .NET, AngularJS, and Dapper to Microsoft’s Azure cloud platform to expand its operations. These legacy applications share the same SQL database and are interconnected, with a different developer team working on each application. Before migrating to Azure, the developers needed to dig into the codebase to understand the architecture.

Using an application that automatically extracts software intelligence, the developer team was able to quickly reverse-engineer the inner workings of the applications and display their structures as a relational schematic. All the objects and their interrelationships appeared as a living map in which the team could zoom in and out and see end-to-end transaction flows, data access paths, API dependencies, and so on.

The developers could quickly familiarize themselves with the applications and their interrelationships using the architectural map. They could drill down into specific applications to analyze their inner workings and trace interactions from the UI layer through service and data-service components all the way to the SQL Server database. The map also allowed them to annotate specific elements and relationships.
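To make that kind of drill-down concrete, here is a minimal sketch of the sort of query such a map answers: given a directed graph of component relationships, list every end-to-end path from a UI component down to the database. The component names and the hand-built graph are hypothetical stand-ins; a real software-intelligence tool derives this graph automatically from the code.

```python
# Minimal sketch: query end-to-end paths through an architecture map.
# The node names below are invented for illustration only.
from collections import deque

# Directed "depends-on" edges, from the UI layer down to the database.
ARCHITECTURE: dict[str, list[str]] = {
    "ClaimsUI":        ["PricingService", "ContractService"],
    "PricingService":  ["DataAccess"],
    "ContractService": ["DataAccess"],
    "DataAccess":      ["SqlServerDB"],
    "SqlServerDB":     [],
}


def end_to_end_paths(graph: dict[str, list[str]],
                     start: str, goal: str) -> list[list[str]]:
    """Breadth-first search for every acyclic path from start to goal."""
    paths, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # skip cycles
                queue.append(path + [nxt])
    return paths


for p in end_to_end_paths(ARCHITECTURE, "ClaimsUI", "SqlServerDB"):
    print(" -> ".join(p))
```

Running the sketch prints both transaction flows from the UI to the database, which is exactly the question a developer asks before changing a shared component: what breaks downstream if this node changes?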

With a better understanding of the code structure, the developers and enterprise architects could collaborate more efficiently, providing tags and adding documentation. The living knowledge base made change requests faster and easier to handle. It also helped them avoid costly dead ends and lengthy wrong turns during the modernization journey.

The healthcare software provider cut weeks from its development time using automated software intelligence because it didn’t have to map the relationships inside the applications manually. Not only did the team migrate to Azure faster, but it also created a foundation that will simplify future enhancements and integrations.

Enterprise software is constantly evolving, and the only way to keep pace is with a deeper understanding of software interactions and a more efficient way to react to changing system relationships. Understanding those relationships is the first step. Since most developers don’t have a roadmap or a way to decipher legacy applications easily, you can expect to see more organizations relying on automated software intelligence as a starting point for cost-effective application management.

About the Author

Ernie Hu is COO of CAST. He is a renowned tech executive with experience spanning field execution, business development, strategic planning, and M&A, including assignments in the U.S., Asia-Pacific, China, and Hong Kong. He is also a published author with a bestseller on Big Data.

Alongside CAST CEO & Founder Vincent Delaroche, Hu is now scaling the emerging market category of Software Intelligence, a $1B+ phenomenon fueled by business and digital leaders’ need for greater objectivity in software decisions, faster cloud modernization, and improved security and resiliency of their software assets.

Prior to joining CAST, Hu was the General Manager of IBM Cloud for Greater China and served on IBM’s Global Strategy Team (ST) and IBM’s Growth and Transformation Team (GTT). He holds degrees in commerce and business administration from the University of Wisconsin – Madison.
