
Keeping Your Legacy Systems Out of the Retirement Home

High-tech architectures of the past can become a digital anchor as newer technology enables better, more efficient infrastructures.


Rob Zazueta is Global Director of Digital Strategy at TIBCO.

Digital transformation - almost by definition - means better digital agility. The whole point is to free your data, break it out of its silos, and make it more accessible for new analyses and applications. You've been told countless times that “there's gold in them thar' data lakes” - all you need to do is unleash it.

That's easy to say when you don't have decades of aging code on legacy infrastructure running systems that are nearing or have already passed their planned end of life.

The Curse of the Early Adopter

This challenge came into sharp focus for me during my recent trip through Scandinavia and Germany to attend the ENFO Integration Days conference and to visit with several local companies. I met with people from dozens of companies, most of them doing heavy-duty engineering - actual physical engineering, not the digital stuff we software people claim as engineering. These folks are no slouches when it comes to embracing technology and, to their credit, they embraced the digitalization movement early.

But the high-tech architectures of the past can become a digital anchor as newer technology enables better, more efficient infrastructures. Monolithic architectures relying on a single code base worked well when scalability wasn’t much of a concern. The move toward N-tier architectures, which typically distributed the load across data, application, and presentation layers, addressed early scaling issues when access spread primarily through desktop browser-based web applications. Now that just about every human in the developed world has a powerful computer running in their pocket - not to mention the rapidly increasing number of devices being connected every day - the demand continues for newer, more distributed approaches to delivering functionality and data without sacrificing performance. Many early adopters find themselves locked in a seemingly eternal battle of bending older monoliths to do their will in an increasingly agile and distributed marketplace.

None of them has given up hope, but they're realistic about their prospects of competing with up-and-comers who aren't saddled with - at least in technical terms - ancient technology. They sense that it will cost a lot of money, a lot of effort, and a lot of lost revenue as they spend time modernizing their infrastructure rather than actually digging for digital gold.

Leverage Your Legacy

To those of you feeling the same pains, I bring a message of hope: you can adapt to a changing landscape without tossing out your old investments. Your legacy systems are still in place mainly because they just work. Why should you risk replacing them with new technology that will need to go through the same maturation process - right along with the same growing pains - when their only major issue is that they're just old and, maybe, kind of slow?

The answer is, you don't. Not yet, anyway. For those systems that are not quite at end of life, where the cost of maintenance does not yet match or exceed the value they provide, you should make them feel and appear more agile by placing an API abstraction layer in front of them. API-led development and integration let you move on opportunities more quickly by exposing your existing data securely and consistently as independent, use-case decoupled APIs. Your legacy systems, more than likely, do not have built-in API capabilities that even remotely match these requirements, but you can teach old systems new tricks.
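To make that concrete, here is a minimal sketch of such a facade, using Flask purely for illustration and assuming a hypothetical legacy_erp module that wraps the existing backend; the names, fields, and framework are assumptions, not a prescription. The point is that client developers see a clean, use-case oriented resource while the legacy record format never leaves the abstraction layer.

from flask import Flask, abort, jsonify

import legacy_erp  # hypothetical wrapper around the existing backend

app = Flask(__name__)

@app.route("/customers/<customer_id>", methods=["GET"])
def get_customer(customer_id):
    # Ask the legacy system for the record in whatever form it expects...
    record = legacy_erp.lookup_customer(customer_id)
    if record is None:
        abort(404)
    # ...then translate its internal fields into the API you designed, so the
    # contract stays stable even if the backend is later replaced.
    return jsonify({
        "id": record["CUST_NO"],
        "name": record["CUST_NAME"],
        "status": record["STATUS_CD"],
    })

if __name__ == "__main__":
    app.run(port=8080)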

API Abstraction - Modernize Without Compromise

An API abstraction layer is more than just a veneer of APIs over a stodgy system. It’s a strategy that allows you to extend the life of your software and provide agility not only on the front end, where your applications and BI tools consume data, but on the back end as well, giving you the freedom to replace systems as needed with minimal impact on your clients. The three main components for a successful abstraction strategy are:

  1. An integration bus capable of transforming the data from your legacy software into APIs that adhere to your designs,
  2. A data virtualization layer that can combine and host information stored across your many data silos in real time and expose them as APIs without taxing your older systems (a brief sketch of this idea follows the list),
  3. An API management system that can help you monitor, orchestrate, and govern your APIs from a single location with little to no coding involved.
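Here is the sketch referenced in the second item: a minimal, hedged illustration of the virtualization idea, assuming two hypothetical silo connectors (crm_db and orders_db) and illustrative field names. A real data virtualization product would let you define this combined view declaratively; the hand-written version below only shows the shape of what gets exposed.

import crm_db      # hypothetical connector to the CRM silo
import orders_db   # hypothetical connector to the order-management silo

def customer_summary(customer_id: str) -> dict:
    # Pull from each silo on demand - nothing is copied into a new store.
    customer = crm_db.get_customer(customer_id)
    orders = orders_db.get_orders(customer_id=customer_id)
    # Combine the two silos into the single view the API exposes.
    return {
        "customer": {"id": customer["id"], "name": customer["name"]},
        "open_orders": [o["order_no"] for o in orders if o["status"] == "OPEN"],
    }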

It’s easier said than done, but it’s still easier than you think to expose your services in a way that lets you exploit the advantages of an API-led approach without replacing valuable backend systems. Start by identifying the low-hanging fruit - where does your most valuable data live and how much more valuable could it be if exposed in a more agile way? Which applications have you decided not to develop, or which questions are you not able to answer because the information is not easily accessible? 

Once you’ve zeroed in on the data you should transform first, it’s time to abstract it away. Before writing a single line of code or transformation logic, define and design the APIs that will help you achieve your most pressing business goals now. Use mocking and modeling tools to validate your assumptions and ensure the design is correct. Next, put your active applications and vendor-provided services behind an enterprise service bus and map their data to the APIs you designed. For the data stored throughout your organization, identify the primary keys and organize them in a data virtualization layer that can also expose them as APIs. Offload your API key generation, documentation, reporting, and other management tasks to an API management solution that allows you to better scale, secure, and support your services.
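As one hedged illustration of that design-first step, the sketch below uses the jsonschema package to check a mocked response against the designed contract before any integration code exists. The endpoint, field names, and sample values are assumptions for illustration only.

from jsonschema import validate

# The designed contract for GET /customers/{id} (illustrative fields only).
customer_schema = {
    "type": "object",
    "required": ["id", "name", "status"],
    "properties": {
        "id": {"type": "string"},
        "name": {"type": "string"},
        "status": {"type": "string", "enum": ["ACTIVE", "INACTIVE"]},
    },
}

# A mocked response you might walk through with your client developers.
mock_response = {"id": "C-1001", "name": "Acme GmbH", "status": "ACTIVE"}

validate(instance=mock_response, schema=customer_schema)  # raises if the design drifts
print("Mock response matches the designed contract.")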

As you migrate some systems to the cloud, ensure all new development adheres to the same API design - which is, in effect, a contract between you and your client developers - to reduce the impact on your existing consuming clients. Those systems that either can’t be migrated or would not be cost-effective to migrate to the cloud can continue to hum along behind the scenes, appearing as new, agile services to all but your IT operations team. Modernizing your infrastructure doesn’t mean rebuilding from scratch - it means making the most of what you have to address your challenges now and into the future. Don’t wait on this - get started today!

Opinions expressed in the article above do not necessarily reflect the opinions of Data Center Knowledge and Informa.

Industry Perspectives is a content channel at Data Center Knowledge highlighting thought leadership in the data center arena. See our guidelines and submission process for information on participating.
