Data flow and IT modernisation

This article appeared in ITProPortal (390K active monthly users) in December 2019:

Event-streaming platforms can act as a key accelerator of IT modernisation by accessing business data and events and processing them in a parallel modern architecture.

Pick up a company’s annual report, or read the strategic objectives of most organisations and chances are you will find something around creating a digital strategy to improve Customer Experience (CX) and drive operational efficiency. Exact descriptions vary, but they usually boil down to taking advantage of data, and sometimes artificial intelligence, to provide an omnichannel customer journey, while simplifying back-office operations. Business drivers include increasing revenue, reducing costs and mitigating risk. More cynically, the core driver comes from a fear of disruption, or even extinction.

And yet, in setting these strategic objectives, many organisations are impeded by the very thing required to support them: their technology infrastructure.

Most traditional companies are burdened with massive amounts of technical debt in the form of legacy mainframes and proprietary relational databases. Whilst usually reliable, stable and secure, these ageing architectures often no longer meet the demands of real-time data, or the requirement for development and operational agility.

How to tackle the next phase of growth

Like the snake that finds its growth restricted by its own skin, these organisations - across industries such as financial services, telecoms, retail, manufacturing and government - must grow a new skin before shedding the old in order to realise their next phase of growth.

Here we discuss how traditional organisations can modernise their legacy architecture - and do so in a phased manner.

Let’s take a retail bank as an example. The traditional banks are being disrupted by the newer challenger banks and financial services companies. These digital-natives are free from the burden of aged mainframes and legacy systems - they have mostly chosen to architect around event-streaming platforms. This enables them to process events, such as payments and customer interactions, in real-time. This creates a better user experience for customers, with notifications and alerts linked to their personal preferences. New functionality can also be added using microservices. The result is increased customer trust and loyalty, evidenced by the growth rates of challenger banks.

The traditional banks on the other hand have legacy core systems, such as mainframes, that can be over 30 years old. They include years of modifications, often using COBOL - a language first created in 1959. The skills to change these systems are now hard to source. Behind the scenes the bank runs separate systems for each of their services such as current accounts, savings accounts, credit cards, loans, mortgages, and insurance. Each of these systems is stitched together with complex integrations which require batch processes to move data around and process it.

To complicate things, the ‘channels’ - or ways of interacting with the bank - also vary, with online, mobile apps, call centre and branches creating many different touch points. I have spoken to banks where call centre staff need to open dozens of different applications to service a single customer. For these traditional banks with the legacy constraints, providing a single customer view is challenging enough. Providing an equivalent customer experience to the challenger banks is impossible.

Don’t panic, there are options

So, what can the traditional bank, or any large enterprise with legacy debt in need of modernising, do? How can these traditional organisations fulfil their CX and operational efficiency objectives?

One approach is to use an event-streaming platform to enable IT legacy modernisation. A key benefit here is that the transformation doesn’t require a big-bang migration or replatforming.

Event streaming can augment existing systems and manage the transformation over time, thereby reducing risk.

For example, imagine a complete renovation of your house. The ideal and simplest scenario might include moving into temporary accommodation, knocking your existing house down and rebuilding a new one from scratch. Unfortunately, this is never an option when running a business. Another option is to build a new house next door and then ‘move-in’ overnight. But just Google “core banking replatforming disasters” to see the downsides of this approach.

So, instead, the best approach is to renovate whilst living in the house. This involves migrating one piece at a time. When the snake sheds its skin, it is just the outer layer - the one in contact with the outside world - that moults. The snake actually has further layers of epidermis, each serving different functions. And so, when undergoing legacy IT modernisation, the outer layer should be peeled back and replaced whilst maintaining the critical core systems.

So, how does an event-streaming platform help achieve this?

An event-streaming platform works by capturing business data, or ‘events’, at the source and/or extracting them from core systems of record using Change Data Capture (CDC). Data, or events, can continue to flow into the traditional systems. The parallel architecture allows for real-time access to the data.
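To make the CDC idea concrete, here is a toy sketch in plain Python. It is illustrative only: a real deployment would use a CDC connector feeding an event-streaming platform, typically reading the database transaction log rather than diffing snapshots. All names and structures here are hypothetical.

```python
from queue import Queue

def diff_rows(before, after):
    """Emit change events by comparing two snapshots of a core-system table.

    A real CDC tool tails the transaction log instead of diffing, but the
    output is the same idea: a stream of insert/update/delete events.
    """
    events = []
    for key, row in after.items():
        if key not in before:
            events.append({"op": "insert", "key": key, "row": row})
        elif before[key] != row:
            events.append({"op": "update", "key": key, "row": row})
    for key in before:
        if key not in after:
            events.append({"op": "delete", "key": key})
    return events

# The legacy system keeps processing as normal...
accounts_before = {"acc-1": {"balance": 100}, "acc-2": {"balance": 50}}
accounts_after = {"acc-1": {"balance": 75}, "acc-3": {"balance": 10}}

# ...while the parallel architecture receives each change as an event.
stream = Queue()
for event in diff_rows(accounts_before, accounts_after):
    stream.put(event)

print(stream.qsize())  # one update, one insert, one delete
```

The key point is that the legacy system is untouched: the event stream is derived from changes it was already making, so new consumers can be added without modifying the mainframe.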

Based on this, the bank can now build additional microservices to improve the CX and rival the challenger banks. Real-time transaction data can be enriched with location or other data. New functionality might include push notifications, personalised loan offers, alerts, and fraud prevention. All this can be implemented whilst the legacy systems remain, and without overloading the legacy mainframes or systems.
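A downstream enrichment microservice of this kind might look like the following sketch. The lookup table, field names and fraud threshold are all invented for illustration - a real service would call a location service and a proper fraud model.

```python
# Hypothetical terminal-to-location lookup; a real service would query a
# reference-data or geolocation service instead.
TERMINAL_LOCATIONS = {"T-001": "London", "T-002": "Manchester"}

def enrich(txn):
    """Enrich a raw card-transaction event consumed from the stream."""
    enriched = dict(txn)
    enriched["location"] = TERMINAL_LOCATIONS.get(txn["terminal"], "unknown")
    # Trivial stand-in for a real fraud model: flag unusually large amounts.
    enriched["fraud_alert"] = txn["amount"] > 1000
    return enriched

# A raw event as it might arrive from the CDC-fed stream.
txn = {"account": "acc-1", "terminal": "T-001", "amount": 2500}
print(enrich(txn))
```

Because the service consumes from the event stream rather than the core system, adding it (or ten more like it) places no extra load on the mainframe.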

So, the event-streaming platform enables a gradual move from the monolith to a microservices architecture, and new use cases can be delivered along the way. A purist architect may critique the parallel (dual) architecture, but such criticism overlooks commercial reality. Whilst legacy systems offer ongoing business value, the cost of change tends to spiral as they age. Taking an approach that compartmentalises these legacy functions, extracts events and innovates to one side is often the most cost-efficient path, balancing modernisation goals with the substantial business value that legacy systems hold. The result is a kind of controlled migration that retains the best of the legacy systems, while also giving businesses the agility they need to adapt to an evolving market that necessitates modernisation.

This approach also often pays for itself by reducing traffic to the mainframe, which directly reduces costs where licensing is charged on the MIPS (millions of instructions per second) consumed. Over time, parts of the legacy infrastructure may be turned off, or shed, as components of the architecture truly become redundant.

To further simplify the IT architecture landscape, organisations can access event-streaming platforms from a fully managed cloud service. This enables them to chart a course to modernise their legacy infrastructure, with reduced risk and controlled cost.

Future proofing with IT modernisation

In summary, event-streaming platforms can act as a key accelerator of IT modernisation by accessing business data and events and processing them in a parallel modern architecture. This frees data trapped in legacy systems, enabling a phased migration to a truly digital-first architecture. Traditional businesses can shed the constraints of ageing systems that are no longer fit for purpose and move to their next phase of growth, satisfying their customer experience and operational efficiency objectives.