What can FS learn from Big Tech?
Adopt a similar approach to data architecture, for a start.
This article was published in Finextra, 21st August 2020.
Many tech stocks continue their meteoric rise, despite the worsening economic downturn. On Wednesday this week Apple became Wall Street's first $2tn company. In my last Finextra article I mentioned that Apple, along with the other big four, Microsoft, Amazon, Google and Facebook, now make up a quarter of the S&P 500. We are seeing new investment strategies (and acronyms) emerging with this tech boom, such as ANTMAN - referring to taking big bets on Amazon, Netflix, Tesla, Microsoft, Apple, and Nvidia. An investment here would have returned 76% since the global pandemic was declared on January 30th.
So, what can financial services firms, which now account for just 10% of the S&P 500, learn from the successful tech sector? One common factor across these tech companies is their leading-edge data architectures, enabling them to make use of data in real time and offer exceptional customer experiences at massive scale.
We used to refer to this as ‘big data’, but the term is falling out of use, perhaps for two reasons: first, it is now simply assumed that every company is a ‘big data’ company; and second, ‘big data’ tends to refer to technology that is now legacy, such as Hadoop.
Perhaps more accurately, we can say the above tech firms all employ event streaming. They capture streams of events (or data changes) and feed these to applications, both customer-facing and back-end. Whilst big data is about storing massive amounts of data in databases, data lakes, or data warehouses for analytics, event streaming is much more about application building. This is the data architecture financial services firms can emulate. Event streaming is especially pertinent in the current climate, where digitising channels, automating processes, and drastically cutting costs have emerged as the most important strategic imperatives of the past six months.
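To make the pattern concrete, here is a deliberately minimal sketch in Python of the core idea behind an event streaming platform: an append-only log that producers publish to and any number of consumers read from, each at their own offset. This is an illustration of the concept only, not Kafka’s actual API, and the event shapes are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class EventLog:
    """A toy in-memory append-only event log, illustrating the idea
    behind platforms such as Apache Kafka (hugely simplified)."""
    events: list = field(default_factory=list)

    def publish(self, event: dict) -> int:
        """Append an event and return its offset in the log."""
        self.events.append(event)
        return len(self.events) - 1

    def subscribe(self, from_offset: int = 0) -> list:
        """Read all events from a given offset onwards."""
        return self.events[from_offset:]

log = EventLog()
log.publish({"type": "payment", "amount": 100})
log.publish({"type": "account_opened", "customer": "C42"})

# A fraud-detection app and a reporting app can each consume the same
# stream independently, at their own pace, without copying the data.
payments = [e for e in log.subscribe() if e["type"] == "payment"]
```

The key contrast with a batch ‘big data’ store is that consumers react to each event as it arrives, rather than querying a snapshot after the fact.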
Fortunately, many financial services institutions have already made early investments in event streaming, with 9 of the top 10 US banks using Apache Kafka® for a wide variety of use cases. Typically, Kafka is first implemented to address simple data pipeline use cases, and there is often little need for a wider business case to justify these early investments. Whilst this eases enterprise adoption, it means the business case is sometimes never made at all. A consequence is that organisations adopting event streaming don’t always appreciate the full potential of the platform, in terms of the business value it can deliver.
So, what can organisations do to maximise this value? And what has Big Tech got to do with it? The answer is in Metcalfe’s law, also referred to as the network effect. This states that a communications network increases in value in proportion to the square of the number of connected users. Metcalfe’s law helps explain the growth of social networks such as Facebook and Twitter, and even the proliferation of the World Wide Web. The same law explains the increasing value of event streaming as more users (and business use cases) adopt the platform within an organisation. The more publishers, the more subscribers - and those subscribers tend to re-publish to the platform, resulting in more subscribers. This results in a virtuous circle of shared ‘event data’ across an enterprise.
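The arithmetic behind Metcalfe’s law is worth a quick sketch: with n participants on the platform there are n(n−1)/2 possible pairwise connections, so value grows roughly with the square of adoption. The figures below are purely illustrative.

```python
def metcalfe_value(n: int) -> int:
    """Number of possible pairwise connections among n participants --
    the quantity Metcalfe's law says network value scales with."""
    return n * (n - 1) // 2

# Doubling the number of publishers and subscribers roughly
# quadruples the potential connections between them:
print([metcalfe_value(n) for n in (2, 4, 8, 16)])  # → [1, 6, 28, 120]
```

This is why the tenth team to adopt a shared event streaming platform gets far more value from it than the first did: each new publisher finds an existing audience of subscribers, and vice versa.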
Like those people who buy old homes and find lost treasure in the basement or attic, investors in event streaming typically see additional unexpected value, as increasing use cases are unleashed.
Within financial services, the platform can be used to help drive regulatory use cases, such as reporting under MiFID II or PSD2. It can be used to support new customer experiences, such as mobile application notifications, or same-day account opening. It can support fraud detection and prevention, saving millions of dollars. It can augment and off-load data from legacy mainframes and replace other ageing and expensive software. The platform can even be used to support cybersecurity use cases. The list goes on.
As the event streaming platform continues to evolve, it is becoming easier for organisations to adopt and scale this technology. Event streaming platforms can now be deployed across the major cloud infrastructure providers (AWS, Microsoft Azure, and Google Cloud), so almost any organisation can deploy in its public cloud of choice. Managed cloud versions of event streaming let organisations scale capacity up and down elastically, making these platforms far easier, and therefore more cost-effective, to set up and run.
We are also seeing new features and functions emerge. Infinite storage, within the platform, is now a reality. This means an event streaming platform can ultimately become a system of record, just like a relational OLTP (online transaction processing) database. This is game-changing. It potentially escalates the strategic nature of the event streaming platform to the point where it can become the most important data platform across an enterprise.
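As a sketch of what ‘system of record’ means in practice: with infinite retention, current state need not live in a separate database at all - it can be derived on demand by replaying the event log. The illustrative Python below (invented event shapes, not tied to any particular platform) rebuilds an account balance purely from its event history.

```python
def replay_balance(events: list, account: str) -> int:
    """Derive an account's current balance by replaying the event log --
    the log itself is the system of record; state is a projection of it."""
    balance = 0
    for e in events:
        if e["account"] != account:
            continue
        if e["type"] == "credit":
            balance += e["amount"]
        elif e["type"] == "debit":
            balance -= e["amount"]
    return balance

events = [
    {"account": "A1", "type": "credit", "amount": 500},
    {"account": "A1", "type": "debit", "amount": 120},
    {"account": "B7", "type": "credit", "amount": 50},
]

print(replay_balance(events, "A1"))  # → 380
```

A useful side effect of this approach is a free audit trail: every balance is explainable as the sum of the events that produced it, which is attractive in a regulated industry.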
So, not every company needs to be a Google, or Apple, or Microsoft to deploy world-class data architectures. But I believe that in order to keep pace with the tech sector and ride out the potential recession, financial services organisations should be looking at their data architecture and doubling down on event streaming in order to deliver speed, flexibility and innovation. This is table stakes if they are to continue competing with the rising tech companies. Just as in many tech companies, the event streaming platform has the potential to truly become the central nervous system of most financial services organisations, and I’m looking forward to seeing this happen.