Delivering New, Real-time CX
This article appeared in Financial IT in June 2020
https://financialit.net/content/delivering-new-real-time-customer-experiences
An interview with Lyndon Hedderly, Director Customer Solutions at Confluent.
Can you please tell us more about Confluent and your background?
To understand Confluent, you first need to understand Apache Kafka®. For that, cast your mind back to 2007-08 when, at five years old, LinkedIn had grown to 10 million users and was introducing features such as "Who's Viewed Your Profile" and "People You May Know."
Any technologist would appreciate the complexity of delivering these highly personalised features in such a data-rich environment, at that scale. The existing middleware and messaging technologies just weren't cutting it. So, in 2008, Jay Kreps and the engineering team created a data streaming tool that was real-time, reliable and able to scale to trillions of events per day. Jay called this Kafka, and it was subsequently open sourced in 2011.
On seeing the wider adoption of Kafka, Jay, Neha Narkhede and Jun Rao left LinkedIn in 2014 to build Confluent. Their goal was to create "a fully managed Kafka service and enterprise stream processing platform" that any organisation could use as a "central nervous system" for its entire business. I joined Confluent three years later, in 2017, with a background in enterprise software and strategic consulting. I now head up our Business Value Consulting practice.
Can you tell us more about the implications of event streaming in finance - and has the current global crisis impacted that?
Along with the Silicon Valley digital natives, the financial services sector was an early adopter of Kafka. As we know, most banks today are really technology companies, and we saw these innovative technologists apply Kafka to solve problems similar to those experienced at LinkedIn.
The driving factor was that banks were under threat on three fronts: a slew of digital-native startups redefining the digital banking experience, massive operational efficiency and cost pressures, and increasingly stringent regulatory requirements. They leveraged Kafka to help them:
Create better customer experiences: unlike digital-native banks working in greenfield environments, many traditional banks were struggling to connect information about a customer across the different information silos of existing legacy systems.
Increase operational efficiency: this includes the use of advanced analytics and new levels of automation in the form of AI and ML, all in a highly resilient, redundant and scalable way, where losing data is not an option. The AI/ML automation also enabled banks to remove manual tasks, creating much faster processes.
Meet new regulatory reporting requirements, such as MiFID II, FIX or FPM, and Basel II/III.
Some of the largest banks in the world now use Kafka and Confluent for payment systems, fraud detection and risk systems, amongst many other use cases. This enables traditional banks to better compete with the challenger banks, which are often architected around event streaming from the get-go.
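To make the fraud-detection use case concrete, here is a minimal Kafka Streams sketch in Java. The topic names ("payments", "suspicious-payments") and the flat amount threshold are illustrative assumptions, not any bank's actual pipeline; real deployments typically score events with far richer models.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class FraudFilter {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "fraud-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.Double().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read every payment event as it happens (key = account ID, value = amount).
        KStream<String, Double> payments = builder.stream("payments");
        // Route payments above a simple threshold to a topic that a fraud team
        // or a downstream scoring service can consume in real time.
        payments.filter((accountId, amount) -> amount != null && amount > 10_000.0)
                .to("suspicious-payments");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The design point is that the flagged events become just another stream: alerting, case management and model retraining can all subscribe to "suspicious-payments" without touching the original payment flow.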
In terms of the trends amidst COVID-19, we're mostly seeing an unprecedented acceleration of existing themes:
Digital channels are becoming the primary (and, in some cases, sole) customer-engagement model.
Pressure on operational efficiency and cost reduction; moving to the cloud is a key sub-theme here.
Automated processes are a primary driver of productivity.
Businesses that once mapped digital strategy in one to three-year phases must now scale their initiatives in a matter of days, weeks and months. Digital has become central to every interaction.
How is Confluent providing organisations in the trading industry with seamless data analysis in real time?
Kafka and the Confluent Platform can be used to support trading environments in a number of different ways, by offering performant, highly reliable and scalable streaming infrastructure and a persistence layer for ultra-low-latency (ULL) high-frequency trading platforms.
As an example, Euronext recently developed a new event-driven trading platform, Optiq®, that processes billions of trades in the European markets and has average performance latency as low as 15 microseconds for a round-trip order. Kafka and the Confluent Platform underpin the Optiq platform by matching an order book with buyers and sellers, handling multiple billions of messages per day, so vendors and trading members can improve their trading strategies. This is all done with millisecond latencies and no lost messages.
The Confluent Platform has also enabled the Euronext team to build numerous consumer applications on the persistence layer. These include applications that interface with clearinghouses, monitor market latency, provide gateways used by analysts and regulators, replicate data for disaster recovery, and store records in a data warehouse in compliance with regulatory requirements.
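As an illustration of how several such applications can hang off the same persisted log, here is a minimal Kafka consumer sketch in Java, standing in for something like a regulatory gateway. The "orders" topic name and group ID are hypothetical, not Euronext's actual configuration; the point is that each downstream application runs as its own consumer group and can replay the full history independently.

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class RegulatoryGateway {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "regulatory-gateway");
        // Replay the topic from the beginning: the persisted log doubles as an
        // audit trail, so a new consumer can reconstruct the full history.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Forward each order event to the reporting sink. Clearing,
                    // latency monitoring and DR replication would be separate
                    // consumer groups reading the same log independently.
                    System.out.printf("order %s at offset %d%n", record.key(), record.offset());
                }
            }
        }
    }
}
```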
Data is increasingly "a golden resource" for most organisations, and this type of event streaming platform can help create revenue opportunities across business models, asset classes, products and services, whilst driving operational efficiencies, mitigating risk and aligning with regulatory requirements.
What’s next for Confluent?
We believe the event streaming platform will be the single most strategic data platform in a modern company, and our goal at Confluent is to help make this happen. Businesses are becoming increasingly defined by software. Most Silicon Valley tech companies are already 'software': think Uber, Netflix, Airbnb and so on. We're now witnessing the core business processes in more traditional organisations become codified, or 'software-defined'. We believe the event streaming platform is a much better architecture than the traditional request-response database to support this transition. It is much more suited to act as the central nervous system of the company.
Confluent’s mission is to build this event streaming platform and help companies begin to re-architect themselves around it. This hasn’t changed.
Secondly, how the event streaming platform is delivered is a major focus for us. We have seen some friction when organisations set up and operate their own Kafka clusters. For this reason, we are now laser-focussed on our cloud offering. Last year we saw our cloud revenue grow over 450 percent.
We want to help every organisation realise the benefits of cloud economics, especially given the current economic downturn and a renewed focus on operational efficiency. We have recently launched Project Metamorphosis, a significant transformation in our software, our cloud service, and our contributions to Apache Kafka. Every month for the rest of the year, we will announce a major new set of product capabilities. Event streaming in the cloud is an emerging space, and this will lay the foundation for what we plan to do over the next five years.