The magic wand for data analytics?
A version of this article appeared in Global Banking and Finance in March 2020
By Lyndon Hedderly, Director Customer Solutions at Confluent.
Conjure up the frustration, anger and feeling of helplessness you get when stuck in traffic trying to get home, to a meeting or flight, or some other pressing deadline. Driving around most cities, it quickly and depressingly becomes apparent the transport infrastructure is not optimised for the motorist. Traffic congestion isn’t just highly annoying – it’s costly. We see lost productivity, pollution and overcrowding in major cities due to inadequate transport links. In short, congestion is bad for the economy and people.
What’s this got to do with finance and regulation? Congested road traffic is analogous to data flows in an enterprise. Cities have often seen huge increases in demand for transport infrastructure, without the luxury of the long-term planning required to provision for it. For example, just remember the last time you were stuck in traffic because of a major sporting event or rush hour. Similarly, most enterprises have applications and architectures that have evolved over time. These complex, inflexible legacy architectures struggle to meet increasing demand, new digital business models, greater customer experience expectations and fast-evolving operational requirements, including regulatory requirements.
We see a ‘spaghetti mess’ in both road layouts and enterprise application integration diagrams. Suggesting alternatives is often met with objections based on cost, practicality and scale of change. With both a city and technology architecture, it’s nearly impossible to rip-and-replace the existing architecture or infrastructure. Instead, we make do with small fixes and the associated delays, gridlocks and missed deadlines.
That said, frustration often results in innovation. Lack of transport options in San Francisco famously led to Uber – now the de facto adjective for innovation. Companies want to be the Uber of banking, or the Uber of retail, the list goes on. More recently, we’ve seen burgeoning alternative modes of transport, such as electric scooters and eBikes.
The data hurdles faced by financial institutions
Before explaining event streaming, let’s take a look at how the current pain of moving data around an enterprise impacts financial services organisations. Most readers will know this story well. After the 2008 global financial crisis, regulators’ grip on banking institutions tightened with the introduction of a raft of new and necessary regulation. As an example, financial institutions are now being asked to report on the Markets in Financial Instruments Directive (MiFID II) and Markets in Financial Instruments Regulation (MiFIR), the second Payment Services Directive (PSD2) and the Common Reporting Standard (CRS), amongst many other regulations.
Each regulation often requires accessing and exposing massive amounts of data from numerous data sources and reporting on this on a regular basis. And regulations change. This has made regulatory reporting a time consuming and error-prone task. Like a city planner being asked to enable the free flow of all traffic through a congested city, many banks have found this feat quite impossible and have simply failed at the task.
These failures are real. They have cost the world’s top investment banks $43bn in fines over the past seven years, making this the single most expensive compliance issue in banking today. Processing data accurately and in real time has become such a big issue that Deutsche Bank has set aside £3 billion to cover potential fines incurred. That’s a lot of frustration, anger and helplessness.
Unpacking the 3Vs of data
So, how does event streaming help?
According to the 3Vs model, the challenges of big data management result from three properties of data: Volume, Variety and Velocity. This holds equally for data handled for regulatory compliance purposes.
Volume – with the rise of digital and mobile banking, the sheer amount of data is mind-boggling. There used to be just a few transactions in bank branches per day. Now we have a combined effect: transactions have become much easier and therefore greater in volume, and we are collecting much more data in general. Apps are now the biggest route to checking bank balances and executing transactions in Britain – being used over 2.6 million times a day. This requires additional compute resources, impacting the aging, costly and maxed-out mainframe systems – the same systems which own the processing cycles for regulatory compliance and reporting.
Variety – we now have many different types of data, from many different sources. Services are evolving rapidly, with KYC (Know Your Customer) initiatives spanning new technologies and innovative services. For example, we’ve seen some banks implement automated services to apply for a mortgage through a video link, or wristbands that can be loaded with credit, which work with contactless readers. All the time, these systems have to comply with the General Data Protection Regulation (GDPR).
Velocity – the speed at which we need to analyse data is increasing all the time. Many regulatory requirements have specific time deadlines. For example, MiFID II requires firms to timestamp trades down to the microsecond and report basic details of their trades almost immediately, so that the information can be circulated in the market. The near real-time broadcast of trade information is set to improve the transparency of pricing and offer greater insight into how prices are quoted and formed. Historically, it has taken many financial firms weeks or even months to pull this data together from multiple sources and systems.
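To make the timestamping point concrete, here is a minimal sketch of building a trade event with a microsecond-precision UTC timestamp. The field names and the instrument code are purely illustrative, not a prescribed MiFID II reporting format:

```python
from datetime import datetime, timezone

def trade_event(instrument: str, price: float, quantity: int) -> dict:
    """Build an illustrative trade record with a microsecond UTC timestamp.

    Python's datetime carries microsecond precision natively, so an
    ISO-8601 serialisation preserves the full granularity regulators ask for.
    """
    now = datetime.now(timezone.utc)
    return {
        "instrument": instrument,      # hypothetical identifier below
        "price": price,
        "quantity": quantity,
        # e.g. "2020-03-02T14:07:31.123456+00:00"
        "timestamp": now.isoformat(timespec="microseconds"),
    }

event = trade_event("XS0000000000", 101.25, 500)
print(event["timestamp"])
```

The point is simply that capturing the timestamp at the moment the event is created, rather than reconstructing it weeks later from batch extracts, is what makes near real-time reporting feasible.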
Event streaming allows financial institutions to cope with the 3Vs of data in order to remain compliant in an efficient and cost-effective way. Many organisations are now implementing event streaming as the backbone, or central nervous system, of their enterprise, partly driven by regulatory requirements.
Event streaming: The magic wand for data analytics
There are many additional benefits of working with events in an event streaming platform. Reporting, or D&A (Data & Analytics), is one of them. Among IT and digital decision-makers, more than half (54% to be precise) of UK financial services organisations struggle to draw valuable insights from the data they collect, according to a report by Claranet. And according to Forbes, an estimated 60 percent of a data scientist’s time is spent cleansing their data before being able to do any analysis.
An event streaming platform can help address these challenges. Event streaming architectures help take the human out of the process, as they’re designed for services (and microservices) to ‘talk’ to each other. Rather than the traditional three-tier, request-response style architecture which serves a UI, which in turn serves a human or creates a report, the event streaming platform is a real step forward for the software-defined business. This, of course, is a great foundation as we move into the age of Machine Learning (ML) and Artificial Intelligence (AI) – which are sure to play a part in future regulation.
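To illustrate the pattern behind this, here is a deliberately simplified, in-memory toy event log (a real deployment would use a platform such as Apache Kafka for durability and scale). Producers append events to a topic; any number of consumers read the same stream independently, without calling the producer or each other:

```python
from collections import defaultdict

class EventLog:
    """Toy append-only event log: one ordered list of events per topic.

    Illustrates the event streaming pattern only, not a production design.
    """
    def __init__(self):
        self._topics = defaultdict(list)

    def publish(self, topic, event):
        # Producers only append; they know nothing about who reads.
        self._topics[topic].append(event)

    def consume(self, topic, offset=0):
        # Consumers read from any offset, so a new reporting service
        # can replay the full history without touching producers.
        return self._topics[topic][offset:]

log = EventLog()
log.publish("payments", {"account": "A-1", "amount": 250.0})
log.publish("payments", {"account": "A-2", "amount": 99.9})

# A regulatory-reporting consumer and a fraud-detection consumer each
# read the same stream without coordinating with the producer.
report = log.consume("payments")
print(len(report))  # → 2
```

The design choice to note is the decoupling: adding a new regulatory report becomes a matter of attaching another consumer to an existing stream, rather than re-plumbing point-to-point integrations.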
Back to our city analogy, imagine a sci-fi scenario whereby a city planner could magically implement a fast transport platform to off-load all those people stuck in gridlocked cars, buses, even trains, and transport them to their chosen destination in near real-time. At the same time, the cars autonomously drive to their destination in a fast, efficient manner without any human interaction. In the enterprise, event streaming makes exactly this possible for data flow.
More and more I’m seeing business cases built for global banks to implement an event streaming platform, where the case is made on the back of risk-cost avoidance. That is, implementing an event streaming platform can help prevent fines potentially incurred by failing to meet regulatory requirements. Is this magic? No, it’s real and it has the potential to save FS organisations millions of dollars.