In Search of Enterprise AI ROI
Recent industry conversations, most notably around the MIT AI study, have called into question the efficacy of AI investments. Despite $30–40B in GenAI spend and widespread trials of tools like ChatGPT and Copilot, the study finds a stark “GenAI Divide,” where only ~5% of integrated pilots generate meaningful P&L impact while ~95% deliver zero return. Importantly, this poor showing is attributed not to models or regulation, but to brittle workflows, weak contextual learning, and poor fit with day-to-day operations.
Recent data from our partner ETR confirms similar trends from a much larger sample, with somewhat better but still tepid results as shown below.

Our interpretation of these findings is that most organizations struggle with data silos and lack a coherent, unified data model that harmonizes information across departments and processes. This data challenge causes great difficulty when implementing cross-functional automation or Agentic AI projects.
We believe recent Oracle announcements demonstrate a clear understanding of this problem, and the company is putting forth the beginnings of a roadmap to address the challenge.
A new, AI native organization can start with a clean sheet of paper and develop processes that unify data from the start. The vast majority of established enterprises, however, suffer from data fragmentation and stale data, two of the most significant barriers to enterprise AI adoption.
In this research note we analyze key announcements from Oracle AI World and draw from an exclusive interview with EVP of Database Technologies at Oracle, Juan Loaiza.
Two Notable Announcements at Oracle AI World
While Oracle put forth a firehose of news at its recent event, two announcements stood out and are relevant to our enterprise AI thesis:
Oracle AI Database 26ai
In our view, 26ai is an architectural convergence that brings vectors, LLM integration, and traditional workloads into a single control plane. Oracle has maintained its unified engine, spanning relational, JSON, XML, spatial, and graph, while adding unified vector search that runs in-database. This eliminates the operational overhead of stand-alone vector databases. The path from 23ai to 26ai via the October ’25 quarterly update avoids application recertification and dramatically minimizes migration pain. We believe this design, along with improved latency and integration, will shorten time-to-value for AI implementations.
Autonomous AI Lakehouse
An important strategic move is Oracle’s embrace of Apache Iceberg as the open table format (OTF) of record. Oracle blends its relational strengths and Exadata performance with Iceberg interoperability across major clouds and Exadata Cloud@Customer. A “catalog of catalogs” sits atop Unity (Databricks), Glue (AWS), Polaris (Snowflake), and more, exposing thousands of external tables through simple SQL. By caching Iceberg data into Exadata storage, Oracle aims to deliver performance without abandoning open formats, allowing customers to mix database engines while avoiding lock-in.
We believe Oracle is focused on delivering a single database plane for transactional, analytic, and AI workloads, plus an Iceberg-first lakehouse that normalizes interoperability. For enterprises standardizing on open formats but requiring predictable performance and tight linkage to operational systems, these releases are strategically important.
Oracle’s 26ai Moment: Bringing AI to the Data – Without Breaking What Works
In our view, Oracle’s AI Database 26ai is a pragmatic, enterprise-friendly step forward. Oracle’s strong claim is that AI is being layered into the database rather than bolted on beside it or forcing disruptive rewrites. The headline isn’t “Oracle has AI” – everyone has AI. The story is that mission-critical governance, availability, and security, hardened over decades, now apply natively to AI. Oracle has added vectors, RAG, natural-language-to-SQL, and emerging agent frameworks to its core database without breaking existing application compatibility. Customers on 23ai get access to 26ai as a quarterly patch, and many of the new AI tools also run on 19c. Our expectation is that this approach reduces integration overhead, preserves trust boundaries, and accelerates time-to-value.
What 26ai Changes – and Why It Matters

We believe Oracle is executing on a concept we’ve been advocating – i.e. bring AI to where the data and controls already live. The company has infused AI across the stack (transactions, DR, security, query processing, and application tooling) while keeping the database’s core architecture intact. We see four implications of this approach, specifically:
- Seamless upgrade: 23ai to 26ai via the October 14, 2025 quarterly update; no traditional recertification.
- Continuity of controls: The same security, privacy, and HA/DR that guard SQL now guard vectors, RAG, and AI-driven access.
- Converged data model: One engine for relational, JSON, XML, spatial, graph, streaming, warehouse, and IoT, now with vectors as a first-class data type.
- Unified vector search: Combines semantic similarity with business predicates (e.g. entitlements, balances, geos, product SKUs, etc.) for relevant results that understand enterprise context.
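To make the unified-search idea concrete, here is a conceptual Python sketch (invented product data and field names, not Oracle’s implementation) in which business predicates such as entitlements and geography are applied in the same pass as the semantic similarity ranking:

```python
import math

# Toy catalog: each row carries business context plus an embedding.
PRODUCTS = [
    {"sku": "A100", "region": "EMEA", "entitled_roles": {"analyst", "admin"}, "vec": [0.9, 0.1, 0.0]},
    {"sku": "B200", "region": "AMER", "entitled_roles": {"admin"},            "vec": [0.8, 0.2, 0.1]},
    {"sku": "C300", "region": "EMEA", "entitled_roles": {"analyst"},          "vec": [0.1, 0.9, 0.2]},
]

def cosine(a, b):
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def unified_search(query_vec, role, region, k=2):
    # Entitlement and geography filters run alongside the similarity
    # ranking -- no round trip to a separate stand-alone vector store.
    candidates = [
        p for p in PRODUCTS
        if role in p["entitled_roles"] and p["region"] == region
    ]
    return sorted(candidates, key=lambda p: cosine(query_vec, p["vec"]), reverse=True)[:k]

hits = unified_search([1.0, 0.0, 0.0], role="analyst", region="EMEA")
print([p["sku"] for p in hits])
```

The point of the sketch is the shape of the query, not the mechanics: results are simultaneously semantically relevant and constrained by who is asking and what they are entitled to see.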
Upgrade Paths & Capabilities
The upgrade story is unusually clean for a forty-plus-year-old tech firm. Organizations on 23ai simply roll forward on the quarterly patch and automatically get 26ai. Organizations on 19c can adopt many of the new AI tools in place, giving them a lower-risk operating platform while they plan their eventual 26ai migration path.
On AI vectors and unified search, 19c users can access select tools today, while 23ai introduced full availability. With 26ai, Oracle integrates vectors natively and performance-tunes the experience autonomously for both OLTP and analytics, which we believe is essential for fusing semantic similarity with business logic at low latency.
Natural-language access follows a similar progression. Capabilities were limited on 19c and available on 23ai, but 26ai broadens the feature set and moves privacy and policy enforcement into the database. In our opinion, that shift reduces the risk of data leakage from app-layer code generation and tightens governance where it belongs.
RAG was nascent or unavailable in 19c and became available in 23ai. With 26ai, retrieval-augmented generation is more tightly integrated across data types and workloads, enabling unified pipelines that combine documents, images, and structured data without stitching multiple engines.
The Lakehouse path stands out as well. While 19c often relied on ecosystem reads and 23ai offered broader support, 26ai makes Autonomous AI Lakehouse with Iceberg a first-class citizen – meaning you can get open table formats with Oracle performance and caching.
Finally, multicloud moves from on-prem/Cloud@Customer (19c) to broad options (23ai) and becomes generally available across Azure, Google, and AWS in 26ai, with the promise of “Same:Same” spanning estates. We believe this consistency across fault domains is a meaningful advantage for resilience and data gravity and represents one of the most comprehensive multicloud offerings in the industry.
- Key takeaway: 26ai compresses the path to AI innovation by leveraging existing data and controls. The seamless upgrade path extends meaningful capabilities, while Oracle’s Lakehouse strategy embraces standards and preserves performance.
Note: Oracle is providing premier support for 19c through 2029 and extended support through 2032. This matters for organizations that “move at the speed of the CIO, not the speed of Nvidia,” providing time to modernize while harvesting near-term AI gains.
Enterprise-Class AI
Vectors as a native type. 26ai embeds vectors as a first-class data type alongside numbers, strings, and dates, and integrates them with Oracle’s heritage strengths in transactions, disaster recovery, security, and governance. Unified search combines vector similarity with simple filters, so recommendations, fraud detection, and document/image retrieval can account for who the user is, what they’re entitled to see, and an audit trail of the workflow.
Natural-language access with safety. The “SQL translator” lets users ask questions in English (or other languages), with the database enforcing row-level privacy, roles, and policies. In our opinion, moving trust guarantees down into the database – rather than relying on app-level code – is a very Oracle-like move to prevent leaks and privacy violations.
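The principle can be sketched in a few lines of Python (invented data and policy, not Oracle’s SQL translator): the data layer conjoins a row-level policy with whatever predicate the generated query supplies, so app-layer output cannot widen access:

```python
# Toy table with an owner column; the "database" layer below appends a
# row-level policy to every query, regardless of what the app asked for.
ORDERS = [
    {"id": 1, "owner": "alice", "amount": 120},
    {"id": 2, "owner": "bob",   "amount": 300},
    {"id": 3, "owner": "alice", "amount": 45},
]

def row_policy(user):
    # The policy lives with the data: users see only their own rows.
    return lambda row: row["owner"] == user

def execute(user, app_filter):
    # app_filter stands in for a predicate produced by NL-to-SQL code
    # generation. Because the policy is conjoined here, inside the data
    # layer, a generated query cannot bypass it.
    policy = row_policy(user)
    return [r for r in ORDERS if policy(r) and app_filter(r)]

# Even an overly broad generated filter ("amount > 0") leaks nothing:
rows = execute("alice", lambda r: r["amount"] > 0)
print([r["id"] for r in rows])
```

This is the essence of moving trust down the stack: the generated predicate is treated as untrusted input, and the enforcement point sits where the rows live.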
GenDev and guardrails. Oracle’s Open Application Spec and generator produce applications from human-readable templates while routing access through JSON-relational duality and REST, not ad-hoc SQL. Oracle has built guardrails that restrict LLM access and validate outputs.
Autonomous AI Lakehouse and Iceberg Support

The Lakehouse is evolving into an open platform, but governing open table formats often raises customer concerns around enterprise robustness and creates confusion. By embracing Apache Iceberg (open, with read/write capability), Oracle connects its SQL performance heritage and Exadata optimizations to the object-store ecosystem. Two notable items stand out:
- Catalog-of-catalogs. Rather than replacing your catalog, Oracle mounts existing ones (e.g., Unity, Glue) with simple addressing to expose thousands of tables instantaneously;
- Performance focus. The Data Lake Accelerator and Exadata caching for Iceberg aim to deliver Oracle-class performance on open formats, minimizing the “either/or” tradeoffs between relational speed and open data flexibility.
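The mounting pattern behind a catalog-of-catalogs can be sketched in Python (hypothetical catalog names and storage locations; Oracle’s actual catalog federation is far richer): a root catalog delegates lookups to mounted child catalogs by namespace prefix rather than copying their metadata:

```python
class Catalog:
    """A child catalog mapping table names to storage locations."""
    def __init__(self, name, tables):
        self.name = name
        self.tables = tables  # table name -> storage location

    def resolve(self, table):
        return self.tables.get(table)

class CatalogOfCatalogs:
    """Mounts existing catalogs under a prefix instead of replacing them."""
    def __init__(self):
        self.mounts = {}

    def mount(self, prefix, catalog):
        self.mounts[prefix] = catalog

    def resolve(self, qualified_name):
        # e.g. "glue.sales.orders" -> delegate to the mounted "glue" catalog
        prefix, table = qualified_name.split(".", 1)
        return self.mounts[prefix].resolve(table)

# Hypothetical mounts standing in for Unity, Glue, etc.
unity = Catalog("unity", {"finance.ledger": "s3://bucket-a/ledger"})
glue = Catalog("glue", {"sales.orders": "s3://bucket-b/orders"})

root = CatalogOfCatalogs()
root.mount("unity", unity)
root.mount("glue", glue)

print(root.resolve("glue.sales.orders"))  # s3://bucket-b/orders
```

Because the child catalogs remain authoritative, thousands of external tables become addressable through one namespace without a migration step.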
Oracle’s Agents Approach
The next mile is agentic AI – i.e., systems that plan, retrieve, and act – and it is gearing up for enterprise-grade capabilities. Oracle’s Select AI Agents in Autonomous Database and the Agent Factory (low/no-code) offer what the company claims is an enterprise-ready agent layer that respects identity, policy, lineage, and audit. Integrations with frameworks like LangGraph, CrewAI, and AutoGen offer optionality, while Oracle focuses on the mission-critical pieces of its stack, namely privacy, recoverability, security, and workflow safety.
Comprehensive Multicloud
Oracle’s multicloud is now a reality. The company claims ~50 data center locations across Azure, Google, and most recently AWS, plus Cloud@Customer (on-prem) and even on-prem cloud regions when required. The same Exadata, RAC, and Data Guard stack runs everywhere, which, when combined with 26ai, enables region- and cloud-aware fault-domain strategies without disrupting the operational model.
Competitive Posture
In our opinion three points are notable with respect to Oracle’s competitive posture, namely:
- Against cloud data platforms: Snowflake and Databricks excel in their respective swimlanes, but Oracle’s edge lies in its tight coupling to operational apps (Fusion, ERP, HCM, industry workloads) and its ability to bridge AI and transactions. We see transactional capability as a key platform element, enabling agents to confidently take action without relying on siloed data.
- Against hyperscalers: AWS and Google lead on raw cloud services and scale, but lack Oracle’s enterprise application estate. Microsoft has Purview – its version of catalog of catalogs – and a ubiquitous app presence, but continues to lean on Databricks for advanced ML and data plane needs.
- Differentiator: Oracle has a deep end-to-end, governed path from data to agents to operational action without stitching five vendors and three control planes.
What’s missing in our view is greater clarity on how Oracle is harmonizing data silos and creating a 4D map of the enterprise (people, places, things, and activities – see the green block below). We believe this is the next step and a “holy grail” aspiration for most tech firms, but at the same time a critical piece of the future agentic roadmap.
Our view of the emerging software stack is shown below. In a future post we’ll map Oracle’s portfolio to these key architectural elements. Suffice it to say that Oracle plays in virtually all layers and is a strong candidate to compete for the green SoI layer, which we see as an emerging high value piece of real estate in the stack.

Our premise is that the industry is shifting from software-as-a-service (SaaS) to Service-as-Software (SaSo), meaning that rather than software containing fixed workflows, we are evolving to an on-demand workflow model delivered through software. New productivity-enhancing services will emerge and impact not just the IT department or tech companies, but all departments across all firms in any industry. Our belief is that this new model will confer software-like marginal costs on the firms that “get AI right” and bring winner-take-most dynamics to industries.
What You Can Do Now
- If you’re on 23ai: Plan your quarterly patch window to pick up 26ai. Validate vector indexes, NL to SQL policies, and RAG patterns; productionize valuable features;
- If you’re on 19c: Start with AI tools compatible with 19c to build muscle memory (vector pipelines, policy templates, catalogs) while scoping your 26ai path.
- For Lakehouse teams: Standardize on Iceberg; leverage existing catalogs and test Exadata caching paths for “hot” datasets where latency matters.
- For security/governance: Move privacy and entitlement into the database wherever possible; treat app-layer code generation as untrusted unless verified.
- For agents: Begin with narrow, high-value workflows where data is harmonized and of high quality; instrument for lineage/observability, and iterate.
Our Take
In our opinion, Oracle’s AI Database 26ai is one of the more grounded attempts to operationalize AI in mission-critical settings. The seamless upgrade, converged data model, open Lakehouse stance, and database-level trust address key blockers to AI scale – i.e., poor data quality, data silos, and complexity. We believe the next 12–24 months will hinge on the following: 1) proving that unified vector and AI search works as advertised at OLTP and analytics speeds across messy, multi-catalog estates; and 2) testing and maturing agentic workflows with the observability and policy rigor enterprises require.
The data we’ve analyzed suggests that the winners in this cycle won’t simply tout bigger models; rather, they’ll leverage proprietary data and govern their agents in a way that drives new levels of productivity while maintaining enterprise-grade privacy, security, compliance, and data management.
Oracle’s bet is that the path to AI is shortest by placing intelligence inside the database. We think that’s a defensible thesis that can stand the test of time given Oracle’s track record and vast technology asset base.