
Why event streaming is key to your hybrid cloud strategy

This article was published in Computing on 5th Jan 2021



Event streaming is the modern way to facilitate data exchange between on-premises infrastructure and the cloud


Today's customers are quickly growing accustomed to a different kind of service. They expect personalised recommendations for what to watch and listen to, a helping hand with personal finances, and more relevant, informative and pleasurable shopping experiences. The reaction to services that fail to meet these expectations, in whatever sector, is not just frustration, but often rejection.


Meeting these expectations requires responding to events as they happen, at scale, to the point that traditional data architectures hosted in conventional data centres are no longer viable. To deliver modern services, hybrid cloud has become a standard way of working for the majority of businesses. In fact, recent research revealed that 86 per cent of IT leaders see this as the ideal operating model for their organisations.


According to the same study, however, businesses face a number of key challenges when it comes to getting hybrid cloud right. One of the main roadblocks is facilitating data exchange between on-premises infrastructure and the cloud. Incorporating Apache Kafka - the technology behind event streaming - into your hybrid cloud strategy offers a new way to solve this problem, enabling the smooth and secure sharing of data between systems in hybrid environments.


Why choose hybrid cloud in the first place?


Hybrid cloud was born out of necessity. Systems now need to offer targeted, personalised and contextual customer experiences, while at the same time remaining fast, secure and reliable. Few consumers realise that behind the scenes, they are not interacting with a single application or platform but a complex fleet of applications, or microservices, that work independently and integrate seamlessly.


This has resulted in a rapid growth in data requirements - one so significant that self-managing applications within data centres is no longer cost-effective, nor can it scale quickly enough to cope with the new demands. Adapting to this reality requires the benefits of cloud computing and fully managed services. This means that on-premises systems now need to work alongside the cloud, creating hybrid cloud environments. Every hybrid cloud architecture is different, but the ultimate goal is fundamentally the same: to retain the value embedded in on-premises IT equipment, while leveraging the scale and agility of the cloud and fully managed services.


These environments are key to industries that rely heavily on intricate legacy systems. New applications might be built in the cloud, but they still need to connect to the edge or to historical data sitting in data centres. While organisational planning refers more and more to 'cloud first' or 'cloud only' initiatives, in many cases these aspirations are far off - or simply not possible.


The problem of data exchange


Hybrid cloud is the chosen approach for many organisations, but major issues arise when data isn't shared efficiently between different systems. This is magnified when working with the diverse, federated microservices that underpin today's responsive and scalable services.


Complex data exchange between the cloud and on-premises infrastructure is an obstacle that hinders the success of many hybrid cloud approaches. Friction between different systems can lead to serious fulfilment errors - for example, when one customer address is held on file in the cloud and a different one on-premises.


The default approach to addressing the data exchange challenge is rooted in the same principles we used with mainframe architectures: introducing direct linkages between systems that need to communicate. We might imagine this as something like a telephone system that installs a new end-to-end wire between any two buildings that might want to talk to one another - manageable for a limited number of links, but chaotic when scaled up to every office and home, since the number of possible connections grows quadratically with the number of endpoints.


Re-thinking data in the cloud


Ultimately, solving the data exchange problem requires a new approach, which is why many organisations are turning to Kafka to deliver event streaming. Instead of sending information directly to each machine that needs it, data is posted to an ongoing system of record, where any machine that needs to read it can do so in real time. Rather than telephone lines, this is radio, available at the same time to anyone who can tune in.
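As a minimal sketch of this model, consider the mismatched-address example above. Using the standard Java Kafka client (the kafka-clients library), an on-premises service could publish a customer address change once to a shared topic, rather than calling every downstream system directly. The broker address, topic name and record contents here are hypothetical, chosen purely for illustration:

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AddressUpdateProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical address of an on-premises Kafka cluster.
            props.put("bootstrap.servers", "kafka.internal.example.com:9092");
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Publish the change once to a shared topic; any number of
                // consumers, on-premises or in the cloud, can read it later.
                producer.send(new ProducerRecord<>(
                        "customer-address-updates",   // topic (hypothetical name)
                        "customer-42",                // key keeps one customer's events ordered
                        "{\"street\":\"1 New Street\",\"city\":\"London\"}"));
            }
        }
    }

The producer neither knows nor cares which systems consume the event, which is precisely what removes the point-to-point "wiring" described above.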


For hybrid cloud, this sidesteps the capacity issues involved in keeping on-premises and cloud infrastructure running in unison, minimising the bandwidth and complexity needed for data transfer. For systems running on large-scale microservices, event streaming is an approach to data designed with this new frontier in mind.


As an open architecture, Kafka can run in an organisation's data centre, at the edge, or in the cloud, wherever it is needed. As such, it integrates data from all over the organisation, no matter where it resides. Event streaming is the instant link between all of a company's footprints, from the data stored in data centres or mainframe systems to the data stored in cloud applications. It enables data to flow seamlessly and securely in hybrid cloud architectures at scale, significantly reducing complexity.
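On the reading side, any system that needs the data subscribes to the same topic and keeps its own position in the stream. A minimal counterpart to the producer sketch above, again with hypothetical broker, topic and consumer-group names, might be a cloud-hosted service that applies each address change as it arrives:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class AddressUpdateConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.internal.example.com:9092");
            // Each consumer group holds its own cursor into the log, so a
            // cloud CRM and an on-premises fulfilment system can both read
            // the full stream without interfering with one another.
            props.put("group.id", "cloud-crm");
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            // Start from the beginning of the topic on first run.
            props.put("auto.offset.reset", "earliest");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("customer-address-updates"));
                while (true) {
                    ConsumerRecords<String, String> records =
                            consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // Apply the update to this system's copy of the data.
                        System.out.printf("customer %s -> %s%n",
                                record.key(), record.value());
                    }
                }
            }
        }
    }

Because every consumer reads from the same ordered log, the cloud and on-premises copies of a customer record converge on the same value, avoiding the mismatched-address problem described earlier.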


Getting smarter is life-or-death for businesses. To stay afloat - never mind get ahead - they must do more than simply roll out an application. Hybrid cloud provides a solution for delivering advanced services, but it also presents a number of key challenges, particularly when it comes to connecting data from disparate systems. To overcome the data exchange problem, we need to rethink how data is shared - which is why every hybrid cloud strategy should include event streaming.


