Opinion

The rise of enterprise integration

By Shawn McAllister

The real world is deeply interconnected, but businesses still aren’t – witness the global impact of the recent CrowdStrike outage, drought in the Panama Canal, and other disruptions to prime shipping routes. Only 12% of businesses have integrated data systems, and even those operate at a macro level of data integration, with an incomplete picture of events. Business processes are still in silos, subsidiaries across the globe are still in silos, and more importantly, the data is all still in silos.

Here, Shawn McAllister, Chief Technology Officer and Chief Product Officer at Solace, explains how an event-driven approach to integration can empower business leaders by enabling integration down to the micro level – allowing IoT devices, SaaS applications, legacy applications and mobile devices to seamlessly exchange events in real-time.

A large multinational enterprise typically comprises thousands of applications, with data inputs spanning hybrid/multi-cloud environments, IoT/mobile devices, and distributed operations.

These data events, by their very nature, happen in real-time – a customer places an order online; a supplier updates available inventory; a passenger scans a boarding pass; a sensor detects a sudden temperature change. These events are not synchronous, and they all trigger follow-on actions that ripple throughout different departments and operations across an enterprise.

The vast volumes of information in play, all critical to an enterprise’s day-to-day operations, are driving a dramatic shift in the way increasingly globalized business systems are integrated.

The old integration ways cannot match the needs of today’s real-time business world

Traditional methods such as integration platform as a service (iPaaS) and the enterprise service bus (ESB) are still being used to “knit” together data across this complex web of disparate systems. However, they struggle to keep pace with the demands of today's data-driven landscape. Synchronous, point-to-point solutions simply aren't built to handle the ever-growing volumes of real-time data coursing through modern enterprises to an ever-increasing number of applications. Traditional integration methods result in a complex web of connectivity that is difficult to understand, fragile, unable to deal with bursts, and generally not robust to failures and outages. This creates ever-longer application response times which, in turn, result in a poor user experience.

To simplify integration and become more real-time and more connected, organizations need to adopt an “event-driven” approach.

Event-driven data needs event-driven integration

Event-driven integration offers a new and innovative approach to connecting systems through the instantaneous sharing of real-time events.

At its core, whenever an event occurs within a system, a message is published to a central hub called an event broker. Other systems subscribe to this broker – or to a network of event brokers called an event mesh – and receive messages in real-time, reacting accordingly. As we say in the real world – they are all on the same page! This on-demand, “data as a service” approach unlocks a level of flexibility and scalability far more dynamic, responsive, and real-time than traditional methods.
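To make the pattern concrete, here is a minimal, in-process sketch of that publish/subscribe flow in Python. It is a toy stand-in for a real broker or event mesh – the broker class, topic names, and payloads are all hypothetical, purely for illustration:

```python
# Toy in-process stand-in for an event broker; topics and payloads are hypothetical.
from collections import defaultdict
from typing import Callable

class EventBroker:
    """Minimal sketch of the publish/subscribe core of event-driven integration."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Every subscriber sees the same event as soon as it is published.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
broker.subscribe("orders/created", lambda e: print("billing saw", e))
broker.subscribe("orders/created", lambda e: print("inventory saw", e))
broker.publish("orders/created", {"order_id": 42, "sku": "ABC-123"})
```

The key property is that the publisher knows nothing about its subscribers: adding a third consumer of “orders/created” requires no change to the code that publishes the event.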

The analyst community is already there

Leading analysts now recognize event-driven integration as a key component in optimizing the real-time movement of business-critical data. Gartner identified this trend in the evolution of IT systems architecture, heralding an “event native” mindset that shifts from viewing IT systems as “data (centric) custodians” to viewing them as the “nervous system” of an enterprise. More to the point – treating “data in motion”, as opposed to “data at rest”, as the real source of effective decision making.

It should be noted that event-driven integration works perfectly well with an organization’s existing iPaaS, essentially augmenting it with an event-driven platform. In fact, with iPaaS and event-driven integration working in tandem, IT teams can migrate the right information flows incrementally, allowing a phased, event-driven implementation of key business processes over time.

This is reflected by IDC, which recently noted in “IDC Market Glance - Connectivity Automation, 2Q24” (Shari Lava, Andrew Gens, June 2024): “Another trend that is helping fuel the growth of event-driven architecture (EDA) is the potential to utilize EDA in conjunction with iPaaS to split queuing, avoid bottlenecks, and manage workload and data traffic asynchronously. With an event broker layer, an organization can also ensure that they have visibility into the state of queued messages even if there is a sending error.”

Turning integration inside out

An “event-driven” approach means re-thinking and re-architecting integration.

Integration of the past 20 years has been fairly centralized, built around monolithic applications with many dependencies. Current integration approaches have placed integration components in the data path – i.e., at the core. Integration, whether real-time or batch, requires connectivity and transformation. The challenge with centralized integration approaches is that they couple connectors, transformations, mappings, and potentially transactional context into one deployable runtime, through which all data must flow in a synchronous manner. As such, the integration component can become a bottleneck and suffer from many of the same problems as monolithic applications.

Event-driven integration turns this approach inside out.

It moves integrations and connectors to the edge, with decentralized, real-time data flow and events in the middle. The result is an application and integration architecture that is more agile, scalable, robust and real-time – much like event-driven microservices.
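As a rough sketch of what “integration at the edge” looks like, the toy example below (hypothetical service and topic names) gives each service its own connector and transformation logic and leaves only event flow in the middle – no central integration runtime sits in the data path:

```python
# Sketch: connectors live at the edge; only the event flow sits in the middle.
# Service names and topics are illustrative, not from any real system.
from collections import defaultdict

subscribers = defaultdict(list)          # topic -> handlers ("the middle")

def publish(topic: str, event: dict) -> None:
    for handler in subscribers[topic]:
        handler(event)

# Each "service" owns its own connectivity and transformation logic at the
# edge and reacts to events independently -- no central integration runtime.
def inventory_service(event: dict) -> None:
    print("inventory: reserving stock for order", event["order_id"])
    publish("inventory/reserved", event)

def shipping_service(event: dict) -> None:
    print("shipping: scheduling delivery for order", event["order_id"])

subscribers["orders/created"].append(inventory_service)
subscribers["inventory/reserved"].append(shipping_service)

publish("orders/created", {"order_id": 42})
```

Adding a new service is one more subscription at the edge; nothing upstream changes.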

Today’s use cases demand an integration architecture that can handle both traffic bursts and slow or offline consumers without impairing performance; that can scale to handle growth in consumers, producers, and data volume; and that can guarantee delivery of data even to temporarily unavailable consumers. They also demand an architecture that lets you easily plug in processing components and distribution mechanisms without impairing the overall design – and even adopt new technologies you had not imagined before, such as Gen AI agents and LLMs.
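One of those guarantees – delivery to a temporarily unavailable consumer – can be illustrated with a toy model in which the broker buffers events per subscriber until the consumer reconnects. Real brokers persist these queues to disk; the class and names below are hypothetical:

```python
# Toy model of guaranteed delivery: per-subscriber queues buffer events
# while a consumer is offline and replay them when it reconnects.
from collections import defaultdict, deque

class DurableBroker:
    def __init__(self) -> None:
        self._queues: dict[str, deque] = {}              # subscriber -> buffered events
        self._topics: dict[str, set] = defaultdict(set)  # topic -> subscriber names

    def subscribe(self, topic: str, name: str) -> None:
        self._topics[topic].add(name)
        self._queues.setdefault(name, deque())

    def publish(self, topic: str, event: dict) -> None:
        # Buffer the event for every subscriber, connected or not.
        for name in self._topics[topic]:
            self._queues[name].append(event)

    def drain(self, name: str) -> list:
        """Called when a consumer (re)connects: replay everything it missed."""
        events, self._queues[name] = list(self._queues[name]), deque()
        return events

broker = DurableBroker()
broker.subscribe("payments/settled", "reporting")
broker.publish("payments/settled", {"tx": 1})   # "reporting" is offline here
broker.publish("payments/settled", {"tx": 2})
print(broker.drain("reporting"))                # reconnect -> [{'tx': 1}, {'tx': 2}]
```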

Going inside out unlocks transformative business insights

Arriving at an optimal event-driven integration implementation does not happen overnight. It’s an evolutionary process that can be measured in four transformative milestones:

Breaking down data silos

Traditional systems often create data silos, hindering accessibility and informed decision-making. Event-driven integration fosters information democratization and availability, allowing businesses to readily access the data they need, when they need it.

Dealing with unexpected bursts in traffic

The decoupling of components serves as a “shock absorber” that gracefully deals with unexpected bursts of traffic, such as sudden spikes in demand, leading to vastly increased order volumes. This decoupling also provides a more robust infrastructure that is tolerant of failures and outages and prevents cascading failures.
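A minimal sketch of this shock-absorber effect, using only Python’s standard library: the producer emits a burst of 100 orders at once, while the queue between them lets the consumer drain the backlog at its own steady pace:

```python
# Sketch of the "shock absorber": a queue between producer and consumer
# soaks up a burst without overwhelming the downstream system.
import queue
import threading
import time

buffer: queue.Queue = queue.Queue()       # the decoupling layer

def producer() -> None:
    for i in range(100):                  # sudden burst of 100 orders at once
        buffer.put({"order_id": i})

def consumer() -> None:
    processed = 0
    while processed < 100:
        order = buffer.get()              # drains at its own steady pace
        time.sleep(0.01)                  # simulate per-order processing cost
        processed += 1
    print("burst absorbed:", processed, "orders processed")

threading.Thread(target=producer).start()
worker = threading.Thread(target=consumer)
worker.start()
worker.join()
```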

Nurturing innovation at warp speed

Event-driven integration streamlines the process of integrating new applications and services, empowering businesses to innovate faster and stay ahead of the curve. If you used event-driven integration today, I guarantee it would be faster and easier to give your Gen AI applications real-time enterprise context to work from, because you would just add these new components to your existing event distribution architecture.
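As an illustration of that claim, the toy sketch below adds a hypothetical Gen AI context-builder as one more subscriber to an existing event stream – the producers are untouched, and the prompt-context logic is purely illustrative:

```python
# Sketch: a new Gen AI consumer taps existing event streams without any
# change to producers. The broker interface mirrors the toy sketches above.
from collections import defaultdict

subscribers = defaultdict(list)                 # stand-in for the event mesh

def publish(topic: str, event: dict) -> None:
    for handler in subscribers[topic]:
        handler(event)

context_window: list = []                       # rolling real-time context for an LLM

def feed_llm_context(event: dict) -> None:
    context_window.append(event)
    del context_window[:-50]                    # keep only the 50 most recent events

# The only "integration work" required: one new subscription.
subscribers["orders/created"].append(feed_llm_context)
publish("orders/created", {"order_id": 42, "total": 99.0})
print(context_window)
```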

Keeping the user’s finger on the pulse

Enabling real-time data integration provides users with a more consistent and up-to-date view of information, leading to a smoother and more efficient experience.

Event-driven integration in the real world: Organizations that have ‘thought out’ of the traditional integration box

Word is getting out, as organizations across industries such as financial services, manufacturing, retail, and more are beginning to embrace event-driven use cases.

Heineken, as part of its strategy to become the world’s most connected brewer, implemented an event-driven strategy where production line events trigger real-time inventory updates and automatic order fulfillment for distributors – at scale, across the 350 global and local beer and cider brands it sells in 190-plus countries.

Leading German grocery chain EDEKA leverages an event-driven approach to modernize its supply chain and merchandise management systems, replacing synchronous batch updates between siloed systems with real-time data sharing. Powered by a continuous event-driven flow of information, EDEKA is now supporting a better shopping experience for customers.

In fact, a recent IDC Infobrief found that 82% of survey respondents have either begun adding, or are planning to add, two to three event-driven use cases in the near future.

Rethinking integration architecture to meet today’s needs

Increasing data volumes and connectivity levels amid shifting consumption models and customer expectations are changing the way large organizations need to architect the flow of critical information across their business. Traditional integration approaches are no longer able to support businesses that have to match customer, employee, and supplier requirements in real-time.

Event-driven integration offers a path for multinational businesses to modernize their integration strategy and empower their business to be more adaptable, scalable and robust – in a landscape that is only going to continue to become more digitized and real-time.

Written by Shawn McAllister
October 2, 2024