Insights
- Operational intelligence is built on three pillars: intelligent processes, adaptive integration, and real-time, contextual intelligence.
- To create intelligent processes, existing process and integration architecture should evolve to ensure that AI agents coexist with traditional business process management systems and event processing.
- Adaptive integration means evolving traditional data plumbing into an integration platform by standardizing how agents access enterprise data while enforcing robust governance.
- For contextual intelligence, a unified real-time backbone, built on streaming analytics, supports producing, routing, and consuming events. This enables real-time experiences.
- These elements provide an architectural blueprint for organizations building their perceptive enterprise.
- Here, we provide a three-phase approach to evolve from traditional digital architecture, built on existing digital and cloud investments, to one ready for the perceptive enterprise.
In today’s time-sensitive environment, organizations need fast, intelligent decisions that act on the right data and signals in real time, within enterprise-defined guardrails.
This requires operational intelligence, which combines sensing, intelligence, and policy-based processes. Agentic AI provides this capability through systems that perceive, reason, decide, and act independently.
Operational intelligence is built on three pillars: intelligent processes, adaptive integration, and real-time, contextual intelligence.
In this chapter, we describe all three pillars, and provide a strategic roadmap for leadership to use existing investments in cloud and digital to create the perceptive enterprise integration architecture.
Intelligent processes
Enterprise processes today are built on the premise that humans are involved in initiating and executing business processes. However, the explosion of data and tools, and the need for immediate decision-making and always-on digital operations, require these processes to be rewired to adapt and respond in an agile fashion — what we term sentience.
Sentience involves collating data streams to create a digital twin of the operational environment.
The system then uses this data, often through agentic workflows, to reason under ambiguity and enable agents to monitor, decide, and act safely in live environments.
For instance, sensor data collected during flight operations can provide critical insights on replacing key components when the aircraft arrives. This information could initiate a sequence of autonomous actions, including logistics to transfer parts from the nearest hub to the aircraft’s location.
Agentic processes that align with this operational objective can sense, analyze, and respond effectively. The agent’s observe-decide-act feedback cycle operates faster than humans and gives the organization strategic agility. It also transforms AI from a series of independent tools into a central nervous system for the business, a live enterprise that evolves, learns, and responds faster in a rapidly changing environment.
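The observe-decide-act cycle described above can be sketched as code. The following is a minimal illustrative sketch of the aircraft-maintenance scenario; the wear threshold, component names, and hub code are all hypothetical, and a real system would consume live telemetry rather than in-memory lists.

```python
from dataclasses import dataclass

# Assumed normalized wear level at which a component should be replaced.
WEAR_THRESHOLD = 0.8

@dataclass
class SensorReading:
    component: str
    wear_level: float  # 0.0 (new) to 1.0 (end of life)

def observe(readings: list[SensorReading]) -> list[SensorReading]:
    """Collect in-flight telemetry relevant to maintenance."""
    return [r for r in readings if r.wear_level >= WEAR_THRESHOLD]

def decide(worn: list[SensorReading]) -> list[str]:
    """Turn observations into maintenance actions."""
    return [f"order:{r.component}" for r in worn]

def act(actions: list[str], nearest_hub: str) -> list[str]:
    """Trigger autonomous logistics, e.g. shipping parts from the nearest hub."""
    return [f"ship {a.split(':', 1)[1]} from {nearest_hub}" for a in actions]

readings = [SensorReading("hydraulic_pump", 0.91), SensorReading("fan_blade", 0.35)]
plan = act(decide(observe(readings)), nearest_hub="FRA")
print(plan)  # ['ship hydraulic_pump from FRA']
```

The loop runs continuously in practice; the point is that sensing, deciding, and acting are composed into one feedback cycle that needs no human trigger.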
To create intelligent processes, existing process and integration architecture should evolve to ensure that AI agents coexist with traditional business process management (BPM) systems, event processing, and human-in-the-loop mechanisms, avoiding the need to replace existing infrastructure. Existing business processes and systems can join the agentic ecosystem using the model context protocol (MCP), an open standard that streamlines an AI agent’s access to enterprise data and services, while leveraging existing application programming interfaces (APIs) and ensuring that MCP tools are built around particular agentic skills.
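To make the "MCP tools built around particular agentic skills" idea concrete, here is a stdlib-only sketch of the pattern MCP formalizes: an existing API call is wrapped as a named, described skill that an agent can discover and invoke. The registry, skill name, and backend lookup are invented for illustration; a real deployment would use an MCP server SDK rather than this hand-rolled registry.

```python
from typing import Callable

# Hypothetical skill registry standing in for an MCP server's tool list.
SKILLS: dict[str, dict] = {}

def skill(name: str, description: str):
    """Register a function as a discoverable agent skill."""
    def wrap(fn: Callable):
        SKILLS[name] = {"description": description, "fn": fn}
        return fn
    return wrap

@skill("get_order_status", "Look up an order's status in the ERP system.")
def get_order_status(order_id: str) -> str:
    # In practice this would call the existing REST API.
    fake_erp = {"A-100": "shipped", "A-101": "processing"}
    return fake_erp.get(order_id, "unknown")

def discover() -> dict[str, str]:
    """What an agent sees: skill names and descriptions, not raw endpoints."""
    return {name: meta["description"] for name, meta in SKILLS.items()}

print(discover())
print(SKILLS["get_order_status"]["fn"]("A-100"))  # shipped
```

The agent never sees the ERP endpoint; it reasons over skill descriptions, which is what lets existing APIs join the agentic ecosystem without being rebuilt.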
Adaptive integration
The architecture must also support an agent’s ability to handle different inputs and outputs, while coping with extended memory requirements and the context behind the data. While traditional architecture patterns were built on simple request-response workloads and prebuilt workflows, agents need to think for minutes, consult other agents, and maintain a stream of awareness and real-time decision-making throughout a complex workflow. This means evolving traditional data plumbing into an integration platform by standardizing how agents access enterprise data, coordinating between human workflows and agents, and enforcing robust governance so that agents operate reliably and responsibly.
The need to update integration architecture
In traditional architectures, the intelligence layer (routing logic, message transformation, and orchestration rules) is centralized in a routing mechanism, or enterprise service bus (ESB), with the applications that consume the data remaining relatively passive. Forcing agents to always communicate through an orchestrated, centralized bus creates a cognitive bottleneck, increases uncertainty, and introduces a single point of failure, while constraining innovation.
Also, having a single pipe negotiate all the rules of the agentic system can lead to a lack of adaptability and fluidity, with too many agents accessing the same tools or resources, while increasing security vulnerabilities.
Further, the ways in which agents receive and process information can differ significantly. This means they can’t interoperate unless the architecture onboards new protocols like MCP and agent-to-agent (A2A), which let agents securely discover, coordinate, and exchange tasks or information with other agents across systems.
The hybrid bus
Solving these challenges requires an upgraded architectural blueprint that decentralizes intelligence and turns plumbing from “dumb endpoint, smart pipe” to “smart endpoint, dumb pipe.”
A hybrid bus architecture enables this. It combines centralized messaging with direct point-to-point communication between agents. Triggers guide the flow of information. This setup works with a domain service mesh and operational data hub (ODH).
The domain service mesh is a layered architecture that manages, connects, and secures interactions between services within a specific business domain, while the ODH is a centralized hub that consolidates operational data across systems for real-time visibility and actionable insights.
Instead of the centralized orchestration inherent to the ESB, this hybrid bus and event-driven model pushes intelligence to the edge, or to the agents themselves. Here, the network isn’t responsible for understanding the data flow, just routing it reliably. All the business logic and routing decisions, along with the process state, reside within the agents, decoupling the agents from the network infrastructure.
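The smart endpoint, dumb pipe idea can be sketched in a few lines: the bus only routes events by topic and knows nothing about business rules, while the agent holds the decision logic and process state. Topic names, the stock threshold, and the agent itself are illustrative assumptions.

```python
from collections import defaultdict

class DumbBus:
    """A "dumb pipe": routes events by topic, holds no business logic."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subs[topic]:
            handler(event)

class InventoryAgent:
    """A "smart endpoint": decision logic and process state live here."""
    def __init__(self, bus):
        self.low_stock = []
        bus.subscribe("stock.updated", self.on_stock)

    def on_stock(self, event):
        if event["qty"] < 10:  # the threshold is the agent's rule, not the bus's
            self.low_stock.append(event["sku"])

bus = DumbBus()
agent = InventoryAgent(bus)
bus.publish("stock.updated", {"sku": "SKU-1", "qty": 3})
bus.publish("stock.updated", {"sku": "SKU-2", "qty": 50})
print(agent.low_stock)  # ['SKU-1']
```

Swapping the agent's rule requires no change to the bus, which is exactly the decoupling the hybrid bus aims for.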
Intent-based coordination takes this one step further, where agents are coordinated based on high-level business goals. Modern integration platform-as-a-service (iPaaS) platforms are now embedding AI tools to build this sort of real-time orchestration of applications, data, processes, and teams, making integration more agile and intelligent.
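A minimal sketch of intent-based coordination, under invented names: agents advertise capabilities, and a coordinator decomposes a high-level business intent into tasks matched to those capabilities, instead of hardwiring point-to-point flows.

```python
# Hypothetical capability registry: each agent declares what it can do.
AGENT_CAPABILITIES = {
    "pricing_agent": {"reprice"},
    "logistics_agent": {"reroute_shipment"},
    "support_agent": {"notify_customer"},
}

# Hypothetical intent catalog: high-level goal -> required capabilities, in order.
INTENTS = {
    "recover_delayed_order": ["reroute_shipment", "notify_customer"],
}

def plan(intent: str) -> list[tuple[str, str]]:
    """Map an intent to (agent, task) pairs by capability lookup."""
    assignments = []
    for task in INTENTS[intent]:
        agent = next(a for a, caps in AGENT_CAPABILITIES.items() if task in caps)
        assignments.append((agent, task))
    return assignments

print(plan("recover_delayed_order"))
# [('logistics_agent', 'reroute_shipment'), ('support_agent', 'notify_customer')]
```

Adding a new agent with a new capability changes the registry, not the coordination logic, which is what makes this style of integration more agile.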
The agent mesh, data mesh, and MCP
Agents should discover and interact with each other through the agent mesh, a network of AI agents that collaborate and coordinate autonomously. This pattern also supports the standard A2A protocol. Products built on messaging, streaming, and service mesh technologies have evolved with data mesh capabilities, a decentralized, domain-driven, data-as-a-product (DaaP) architecture that enables self-serve and federated governance.
Most integration platforms now also incorporate MCP and agentic features within iPaaS. Exposing data and API products as MCP servers gives enterprises scalable, secure, and interoperable agent skills. MCP supports long-term integration needs, reduces development effort, and supports complex workflows without repeatedly building connectivity. Agents can discover and interact with backend systems semantically through these curated skills.
Real-time, contextual intelligence
Two architectures support data ingestion and analysis for real-time, contextual intelligence:
- Kappa architecture: This relies on event streaming technologies such as Kafka and Flink for real-time analytics, while exposing curated data products and pushing them into traditional data lakes.
- Lambda architecture: This pattern has two speed layers — a batch layer for traditional big data batch processing, and a stream layer for real-time ingestion, with data consolidation into the ODH for further analytics.
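The Lambda pattern in the bullets above can be illustrated with a toy example: a batch view built from historical data, a speed view from events that arrived since the last batch run, and a serving-layer merge that consolidates both, as the ODH would. The event names and counts are invented.

```python
from collections import Counter

batch_events = ["login", "login", "purchase"]   # processed nightly (batch layer)
stream_events = ["purchase", "login"]           # arrived since last batch run (stream layer)

batch_view = Counter(batch_events)
speed_view = Counter(stream_events)

def merged_view() -> Counter:
    """Serving layer: consolidate batch and speed views for queries."""
    return batch_view + speed_view

print(merged_view())  # Counter({'login': 3, 'purchase': 2})
```

A Kappa system would instead keep only the stream path and rebuild views by replaying the log, trading the dual codebase for replay cost.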
A unified real-time backbone, or event fabric, built on streaming analytics, supports producing, routing, and consuming events. It enables scalable, cost-effective, enriched data products and real-time experiences.
The uni-stream layer for contextual intelligence
The hybrid bus architecture, DaaP, and real-time, self-directed decision-making help update the technology landscape from centralized orchestration to a live agent fabric. This is an evolution in enterprise integration that allows for seamless connectivity between AI agents and business processes and data (Figure 1).
Figure 1. Evolution of integration for the agentic-first enterprise
Source: Infosys
Event streams generate insights, which in turn trigger signals. Agents should be designed to subscribe to insight-driven signals, whether they arise from stream processing or traditional analytics, because agents themselves lack the agility to reconcile these two streams.
Infosys is working on a uni-stream layer that enables both data speed and completeness using an event-driven architecture. This layer treats a historical record from 2010 and a sensor reading from 2026 as two events on the same continuum, accessible through the same protocol.
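The uni-stream idea can be sketched as a single event envelope and a single consumption protocol shared by archival records and live sensor readings. The field names, sources, and replay logic below are illustrative assumptions, not the actual design.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Event:
    source: str          # "archive" or "sensor": provenance, not a separate protocol
    timestamp: datetime  # one time continuum for all events
    payload: dict

def consume(events):
    """One consumer, one protocol: events are ordered on the shared continuum."""
    for e in sorted(events, key=lambda e: e.timestamp):
        yield (e.timestamp.year, e.source, e.payload)

events = [
    Event("sensor", datetime(2026, 3, 1, tzinfo=timezone.utc), {"temp_c": 71}),
    Event("archive", datetime(2010, 6, 5, tzinfo=timezone.utc), {"temp_c": 64}),
]
for item in consume(events):
    print(item)
# (2010, 'archive', {'temp_c': 64})
# (2026, 'sensor', {'temp_c': 71})
```

The 2010 record and the 2026 reading flow through the same interface, so the consuming agent never has to reconcile two access paths.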
Governance and the integration control plane
Thousands of agents in the agentic enterprise will communicate via different protocols. This requires a central governance structure that provides visibility without imposing data processing bottlenecks. This is where the integration control plane (ICP) comes in.
The ICP acts as a mission control for the agentic enterprise, managing decisions such as which LLM is best for which agent, and what prompts should be stored in memory.
It also manages cost, enabling FinOps practices. For example, administrators can set budgets such as “the support agent cannot spend more than $10 per 100 business interactions” and receive alerts when an agent’s reasoning loop becomes inefficient.
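The FinOps guardrail above can be sketched as a per-agent budget tracker: spend is normalized to cost per 100 interactions and flagged when it exceeds the configured budget. The rates and the $10 figure mirror the example in the text; the class and its fields are illustrative.

```python
class AgentBudget:
    """Track an agent's spend and flag it against a per-100-interactions budget."""
    def __init__(self, budget_per_100: float):
        self.budget_per_100 = budget_per_100
        self.spend = 0.0
        self.interactions = 0

    def record(self, cost: float):
        """Record the cost of one business interaction."""
        self.spend += cost
        self.interactions += 1

    def over_budget(self) -> bool:
        """Normalize spend to cost per 100 interactions and compare."""
        if self.interactions == 0:
            return False
        per_100 = self.spend / self.interactions * 100
        return per_100 > self.budget_per_100

support = AgentBudget(budget_per_100=10.0)
for _ in range(50):
    support.record(0.15)   # $0.15 per interaction -> $15 per 100
print(support.over_budget())  # True
```

An ICP would evaluate such budgets continuously and alert when an agent's reasoning loop drifts into inefficiency.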
The cognition plane
However, the control plane alone cannot manage all the reasoning, planning, and decision-making capabilities of the updated architecture. Enterprises must separate these into a cognition plane, to ensure that intelligence is not embedded in execution paths, and so that decisions made by agents can be audited (Figure 2).
Figure 2. The cognition plane adds agentic intelligence
Source: Infosys
By providing guardrails and controlling what agents should be allowed to do, the cognition plane introduces context-aware intelligence, a key pillar of the perceptive enterprise.
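A cognition-plane guardrail can be sketched as a policy check applied to every proposed agent action, with each decision recorded for audit. The policies, action types, and the $500 cap are invented for illustration.

```python
AUDIT_LOG = []

# Hypothetical policy table: action type -> predicate deciding if it is allowed.
POLICIES = {
    "refund": lambda action: action["amount"] <= 500,  # refunds capped at $500
    "send_email": lambda action: True,                 # always allowed
}

def authorize(agent: str, action: dict) -> bool:
    """Check a proposed action against policy and log the decision for audit."""
    check = POLICIES.get(action["type"], lambda a: False)  # deny unknown actions
    allowed = check(action)
    AUDIT_LOG.append({"agent": agent, "action": action, "allowed": allowed})
    return allowed

print(authorize("support_agent", {"type": "refund", "amount": 120}))   # True
print(authorize("support_agent", {"type": "refund", "amount": 9000}))  # False
print(len(AUDIT_LOG))  # 2
```

Because authorization and audit sit outside the agents, the guardrails can evolve without touching execution paths, which is the point of separating the cognition plane.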
Integration across edge, on-premises, and cloud
Agents must live where the action is. This creates an edge-cloud continuum, where edge agents powered by small language models (SLMs) process video feeds or sensor data locally for faster response, while cloud agents, with access to massive frontier models, perform more complex data processing. The cloud agent solves the enterprise problem and then pushes the solution back to the edge agent.
In regulated industries, like finance, defense, and healthcare, data can’t leave the premises and agents are needed to process data locally. In this situation, organizations will benefit from the hybrid MCP architecture, where intelligence resides in the cloud and MCP tools remain on-premises.
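The edge-cloud split can be sketched as an escalation rule: the edge agent resolves clear-cut readings locally and escalates only ambiguous ones to the cloud agent, whose answer is returned to the edge. The thresholds and the two functions are stand-ins for an SLM and a frontier model, invented for illustration.

```python
def edge_agent(reading: float) -> str:
    """Fast local decision for clear-cut cases; escalate the ambiguous middle."""
    if reading < 0.5:
        return "normal"
    if reading > 0.9:
        return "alert"
    return cloud_agent(reading)  # ambiguous: escalate to the cloud

def cloud_agent(reading: float) -> str:
    """Heavier analysis; the result is pushed back to the edge."""
    return "alert" if reading > 0.7 else "normal"

print(edge_agent(0.2))   # normal  (resolved at the edge)
print(edge_agent(0.95))  # alert   (resolved at the edge)
print(edge_agent(0.8))   # alert   (escalated to the cloud)
```

In the hybrid MCP variant for regulated industries, the escalation call would carry only the minimum context allowed off-premises, with the MCP tools themselves staying local.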
A strategic roadmap for leadership
These elements provide an architectural blueprint for organizations building their perceptive enterprise. Organizations should build the cognition plane to ensure intelligence evolves in a controlled manner and is governed with the right level of observability and explainability.
Based on client projects we have implemented, we recommend a three-phase approach to evolve from traditional digital architecture to one ready for the perceptive enterprise:
Phase 1: Evolve the foundation
- Establish a hybrid bus pattern by adopting an event-driven, smart endpoint approach for dynamic agentic processes and integrations where runtime dynamism is needed.
- Deploy an agent mesh using Istio, Cilium, or leading integration platforms.
- Standardize on MCP, DaaP, and the cognition plane by mandating that all agents access systems and data through MCP servers, and establish the operating model for DaaP and the cognition plane.
Phase 2: Governance and pilots
- Enhance the integration control plane by implementing an agent control tower to track agent identity, token usage, and policy compliance.
- Launch pilots by identifying high-value, low-risk processes and augmenting existing business processes.
- Secure the edge by piloting edge agents for operational monitoring with a secure integration path to the cloud.
Phase 3: Scale out
- Reimagine user journeys to enable external agents to interact with the enterprise.
- Close the loop by integrating the ODH so agent actions feed back into the system to retrain and refine the organization’s collective intelligence through DaaP and MCP.
- Move to end-to-end agentic processes with multiagent workflows where swarms of specialized agents collaborate autonomously to achieve high-level business goals.
The perceptive enterprise
Traditionally, integration has been a static protocol and message translation layer. Now it must evolve to support the perceptive enterprise, receiving inputs, aiding in analysis and planning, interacting between humans and agents, and delivering outputs.
To get ahead, organizations should reuse API, event, and data analytics investments; extend governance rather than replace it; and evolve integration without architectural or experience disruption, building on existing digital and cloud investments.
In time, this approach ensures stability, trust, and incremental adoption, blending proven processes with adaptive intelligence — a blueprint for evolving from a digital enterprise to a perceptive, agentic enterprise that is secure and ready for the challenges to come.