A major UK telco: This multinational telecommunications services company operates in multiple countries around the world. It is a leading provider of fixed-line, broadband and mobile services in the UK, and also provides subscription TV and IT services under several different brands.
The Telco required a centralized platform for publishing and subscribing to reusable events from a single location. It wanted to integrate streams of events so that they could be used to connect upstream and downstream applications.
The operator, in partnership with Torry Harris, built a centralized, consolidated data streaming platform using Confluent Kafka as the foundation of planned and future streaming projects to support all its customer-facing units. Delivery was accelerated by a DevOps approach that enabled efficient, focused use of time and resources.
Increasingly, enterprises seek a competitive edge and lower costs by identifying and reacting to opportunities and threats in real time, gaining insights that can be acted upon automatically. The company therefore wanted to leverage event streaming, through data integration and event processing, to optimize business outcomes.
The operator had many disparate technologies in place to address these requirements in various ways, but there was no coordination via a central platform to publish and subscribe to reusable events from one location. As a result, there was no single way of streaming live data in a controlled, governed manner across the application landscape.
The operator saw a centralized, consolidated data streaming platform as the foundation of planned and future streaming projects to support all its customer-facing units. Put another way, it needed to provide orchestrated event streaming as well as providing standards, governance, and best practice frameworks as a managed service for integration.
The shift away from legacy, disparate technologies deployed over a long period, combined with the business outcomes expected from the new platform, presented a number of technical challenges, including:
- Interfacing with many source and downstream systems;
- Ensuring the new system consistently provides a high level of availability;
- Reducing overall data latency across the integration spectrum;
- Handling huge volumes of data while keeping the systems scalable; and
- Providing continuous monitoring with automatic fault discovery and recovery.
How Torry Harris helped
Torry Harris had a dedicated team of site-reliability engineers with the expertise to conduct exhaustive analyses of the client’s requirements across the organization. The engineers and subject matter experts interacted with customer teams to develop a deep understanding of its technical and business needs.
DevOps helped the Torry Harris team bridge the divide between functional and non-functional requirements by exposing it, first-hand, to their real-world manifestations, which motivated the team to deliver an extensible and resilient product. DevOps also sped up the delivery lifecycle by fostering clear ownership of deliverables. Resource utilisation remained optimal throughout the project, making the best use of the project's funds.
The overall architecture
The platform streams and processes high volumes of data and events from different sources and targets – across the organization and its hybrid cloud set-up – reducing transaction latency.
The solution was built using Confluent Kafka as a centralized common event broker (CEB), which temporarily retains all the published messages and events required by current and future projects. Publishers, wherever they are hosted, only need to publish events to the cluster of CEBs local to their application. Where applicable, the cluster then internally synchronizes the topics that hold those events with other CEB instances, so that systems and applications located elsewhere can consume them.
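Cross-cluster topic synchronization of this kind is commonly implemented with Kafka's MirrorMaker 2, which runs on the Kafka Connect framework; the source does not name the specific replication tool, so this is an illustrative assumption. A minimal sketch of a MirrorMaker 2 properties file replicating topics between two hypothetical CEB clusters (the cluster aliases, hostnames, and topic pattern below are invented for illustration) might look like:

```properties
# Hypothetical cluster aliases -- not taken from the case study
clusters = ceb-site-a, ceb-site-b
ceb-site-a.bootstrap.servers = site-a-broker1:9092,site-a-broker2:9092
ceb-site-b.bootstrap.servers = site-b-broker1:9092,site-b-broker2:9092

# Replicate matching topics published on the site-a CEB to the site-b CEB
ceb-site-a->ceb-site-b.enabled = true
ceb-site-a->ceb-site-b.topics = customer\.events\..*

# Keep replicated topics resilient to broker failure
replication.factor = 3
```

With this setup, an application publishes only to its local cluster, and MirrorMaker 2 makes the topic available to consumers attached to the remote cluster, matching the local-publish, global-consume pattern described above.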
The publishing and subscribing processes are managed predominantly in two standard ways, depending on the specific requirement:
- StreamSets Data Collector provides intelligent pipelines that interface with a vast array of systems, streaming data to and from them and performing in-flight data transformation.
- Confluent Kafka Connect acts as a simple connector process that interfaces with source and target systems when data transformation is not necessary.
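To illustrate the Kafka Connect approach, a connector is typically registered by posting a JSON configuration to the Connect REST API, with no custom transformation code. The connector name, database details, and topic prefix below are hypothetical stand-ins, not details from the case study; the connector class shown is Confluent's JDBC source connector:

```json
{
  "name": "billing-db-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://billing-db:5432/billing",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "billing.",
    "tasks.max": "2"
  }
}
```

Submitted via `POST /connectors` to a Connect worker, this streams new rows from the source table into Kafka topics prefixed `billing.`, where any subscribed application can consume them.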
The technology stack was chosen based on business requirements such as scalability, high throughput, and availability. Torry Harris was part of all major product evaluations for implementing the event-streaming platform. The solution includes open-source alternatives and automations, implemented through various modern delivery methodologies, as shown in the graphic below.