When OpenAI publishes a report grounded in real enterprise usage, it's worth paying attention. The data doesn't just predict the future; it documents how today's enterprise networks are already being reshaped.
In The State of Enterprise AI (2025), OpenAI analyzes usage across more than one million business customers. The findings show a clear inflection point: enterprise AI usage has grown 8x year over year, while the use of advanced reasoning models has increased more than 300x. This signals a fundamental shift from simple prompts to complex, multi-step, workflow-driven AI.
AI is no longer confined to pilots or innovation teams. It's being embedded directly into everyday workflows, customer interactions, and operational systems. The report's critical insight is how AI usage is converging around specific, high-impact use cases, reshaping network requirements and raising the bar for what enterprise networks, and the IT teams that run them, are expected to deliver. Let's examine this pattern and what it reveals.
How enterprise AI use cases are reshaping the network
As enterprises adopt AI across departments and workflows, the emerging use cases are fundamentally transforming network demands, architectures, and the critical business role that networks play.
AI-powered customer support turns the network into an experience layer
AI-driven support is one of the fastest-scaling enterprise use cases. Organizations are deploying AI agents across chat, email, and real-time voice to resolve a growing share of interactions end to end.
Voice-based AI introduces continuous, latency-sensitive traffic, while backend integrations with customer relationship management (CRM), billing, and order systems generate persistent application programming interface (API)-driven flows. As AI usage scales, these interactions move from edge cases to core customer journeys.
The network becomes part of the customer experience. Inconsistent WAN performance or unstable cloud paths can degrade customer satisfaction and increase pressure on IT teams to diagnose issues across voice, cloud inference, and backend systems.
AI-assisted software development drives explosive east-west traffic
AI is now embedded across the software lifecycle: generating code, refactoring applications, testing, and debugging. This activity is expanding well beyond traditional engineering teams, generating dense, continuous east-west traffic between developers, repositories, continuous integration/continuous deployment (CI/CD) pipelines, testing environments, and cloud inference services. As reasoning-driven AI usage grows, internal dependency chains become deeper and more tightly coupled.
Networks optimized primarily for north-south traffic struggle here. AI-assisted development increases internal traffic volume, cross-domain dependencies, and troubleshooting complexity, often requiring IT teams to reason across network fabrics, cloud connectivity, and application pipelines simultaneously.
AI-driven analysis and research create bursty, cloud-heavy demand
Teams in finance, operations, and research and development (R&D) are using AI to analyze datasets, synthesize research, and extract insights, compressing work that once took weeks into hours.
These workloads are bursty and cloud-heavy, triggering large data transfers and inference requests in short windows rather than predictable patterns.
Networks must absorb sudden spikes without degradation. Congestion or throttling delays critical business decisions and increases the burden on teams already operating at capacity.
Agentic AI workflows make the network part of the execution path
One of the most significant shifts identified in OpenAI's report is the rise of agentic workflows: multi-step AI systems that retrieve data, apply logic, take action across systems, and verify outcomes. These workflows span identity services, APIs, software-as-a-service (SaaS) platforms, and cloud inference endpoints, making the network part of the execution path.
Agentic workflows introduce continuous cross-system dependencies, expand the attack surface through new machine identities, and require IT teams to troubleshoot failures spanning identity, cloud, security, and network domains. Any instability (latency spikes, dropped connections, or misrouted traffic) can break the workflow chain.
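The retrieve-reason-act-verify chain described above can be sketched in a few lines of Python. This is an illustrative toy, not any vendor's API: the step names, the `TransientNetworkError` type, and the retry logic are all assumptions chosen to show how a single unstable hop can break an otherwise healthy multi-step workflow.

```python
import time


class TransientNetworkError(Exception):
    """Stand-in for a latency spike or dropped connection mid-workflow."""


def run_agentic_workflow(steps, max_retries=2):
    """Run ordered workflow steps, sharing results through a context dict.

    Each step is a callable that receives the context accumulated so far.
    A step that keeps failing breaks the whole chain, mirroring how network
    instability anywhere in the execution path can break an agentic workflow.
    """
    context = {}
    for name, step in steps:
        for attempt in range(max_retries + 1):
            try:
                context[name] = step(context)
                break
            except TransientNetworkError:
                if attempt == max_retries:
                    raise RuntimeError(f"workflow broken at step '{name}'")
                time.sleep(0)  # back off before retrying (0s for the demo)
    return context


# Hypothetical steps; each stands in for a cross-system call (CRM, API, SaaS).
steps = [
    ("retrieve", lambda ctx: {"order_id": 42, "status": "delayed"}),
    ("reason", lambda ctx: "issue_refund"
        if ctx["retrieve"]["status"] == "delayed" else "no_action"),
    ("act", lambda ctx: {"action": ctx["reason"], "done": True}),
    ("verify", lambda ctx: ctx["act"]["done"]),
]

result = run_agentic_workflow(steps)
```

The point of the sketch is the coupling: "verify" depends on "act", which depends on "reason", which depends on "retrieve", so the workflow is only as reliable as the least stable network path among them.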
AI-driven personalization puts the network on the revenue path
Intelligent personalization engines shape how enterprises engage customers, tailoring offers, recommendations, and experiences in real time. The network is no longer just supporting revenue-generating applications; it is directly part of the revenue path.
Performance degradation translates into missed opportunities, while security gaps increase business risk. IT leaders are now expected to deliver speed and protection simultaneously.
Employee AI assistants create always-on, everywhere demand
AI assistants are becoming the front door to institutional knowledge, supporting onboarding, troubleshooting, and daily productivity across campuses, branches, and remote locations.
Sustained, always-on AI traffic compounds existing collaboration and application loads. High-density wireless, reliable WAN connectivity, and consistent security enforcement are pushed harder than ever, often without a corresponding increase in IT staff.
Embedded AI turns the network into an integration fabric
As AI is embedded directly into digital products (search, diagnostics, automation), the network becomes the integration fabric connecting users, applications, data, and inference.
Traffic patterns become continuous and unpredictable, making it harder to maintain performance, enforce segmentation, and sustain visibility across domains. The network must function as a unified integration layer spanning every domain: users, applications, data sources, and inference endpoints.
Enterprise networks-and IT teams-are struggling to scale AI
These use cases expose a growing gap. Many enterprise networks were designed for human-driven interactions, predictable traffic patterns, and manual operations. AI-driven environments introduce continuous machine-to-machine traffic, real-time performance expectations, and deeply interconnected systems.
This gap isn't just architectural; it's operational. AI increases operational complexity, expands the attack surface through new identities and integrations, and demands skills that are increasingly difficult to hire and retain. AI works in pilots but struggles at scale.
In many organizations, the technology is moving faster than the operating model required to run AI reliably at scale.
Cisco helps close the readiness gap
The architecture behind the network matters more than ever. This is the gap Cisco is filling with its AI-Ready Secure Network Architecture, built to treat the network as an execution platform for AI, connecting users, applications, data, inference, and automation with the performance, security, and visibility AI demands.
By design, it delivers:
- Infrastructure built for real-time, high-concurrency AI workloads
- Security enforced within the network fabric, not bolted on
- Deep telemetry and cross-domain intelligence (AgenticOps: autonomous operations at machine speed) that reduce operational complexity and limit the security blast radius, so smaller IT teams can operate AI-scale environments reliably
The goal isn't more complexity. It's simpler operations with greater capability.
What IT leaders should do next
OpenAI's enterprise data confirms AI is becoming foundational to enterprise operations. For IT leaders, this means reassessing not just applications and data, but the network and operating model that underpin them.
As AI embeds itself into workflows, products, and operations, the network becomes inseparable from AI success. Organizations that modernize for real-time performance, embedded security, and autonomous operations will scale AI with confidence. Those that don't will struggle to move beyond experimentation.
In the AI era, the enterprise network doesn't just support the business; it enables it.
Additional resources
What is agentic operations (AgenticOps)?
