
Special Breaking Analysis | Dell’s Enterprise AI Gambit Stretches From Edge to AI Factories at Planetary Scale

In our pre-event research note, we argued that Dell Technologies has wagered its future on a dual mandate: 1) racing to lead in enterprise AI and 2) refreshing its large installed base. In our view, meeting this challenge rests on Dell taking a “data-first” interpretation of artificial intelligence. Michael Dell’s 2025 keynote and the analyst roundtable that followed push that thesis from aspiration to execution. Dell’s message is clear: AI is no longer a cloud-only phenomenon. It is becoming the operational fabric of every device, store, branch office, and data center, and Dell intends to be the neutral grid that distributes this new electricity.

Michael Dell on theCUBE, DTW 2025

Data Gravity is the Linchpin that Flips the AI Map

Dell’s fundamental market driver, in our view, begins and ends with data gravity. Dell’s core assumption is that more than 75 percent of new data will be created and processed outside hyperscale regions. “AI will come to the data,” Michael Dell asserted, positioning edge locations, corporate data centers, and even AI-enhanced PCs as the center of value creation. That framing is a direct challenge to the “single cloud utility” model championed to a great extent by AWS.

Our research indicates that three forces combine to support Dell’s narrative:

| Force reshaping enterprise AI | Implication for buyers | Dell’s response | Cloud incumbents’ likely counter |
|---|---|---|---|
| Regulated & latency-sensitive data in financial services, healthcare, government, and retail | Data sets cannot move freely; inference and training must often sit on-prem or at the edge | AI Factory 2.0 racks (XE978x series) with liquid cooling, PowerScale object lakes, cloud-like consumption models | AWS Outposts / Microsoft Arc extensions, still tied to public-cloud control planes |
| TCO shock from cloud AI services as token volume explodes | CFOs demand predictable, amortizable CapEx or pay-as-you-use OpEx flexibility | Dell claims cloud costs are 60% higher than its AI solutions for steady inference | Price-cut cycles (e.g., Bedrock RAG, OpenAI) keep hyperscale TCO in flux |
| Edge intelligence for real-time decisions | Retail, manufacturing, and telco need sub-20 ms response times | Micro-factories (5–6U) per store; AI PCs with NPUs; thin-client refresh pipeline | Azure Percept, Google Vertex Edge; reference designs still emerging |

The table above underscores a strategic fork in the road: hyperscalers push centralized services outward, while Dell drags AI inward toward sovereign data.
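
To make the steady-state TCO argument concrete, the sketch below compares a pay-per-token cloud inference bill against an amortized on-prem AI factory. Every input (token price, hardware cost, power draw, operations cost) is an illustrative assumption rather than Dell or cloud list pricing; the point is the shape of the break-even math, not the specific figures.

```python
# Illustrative break-even math for steady-state inference: cloud pay-per-token
# vs. an amortized on-prem cluster. Every constant below is an assumption,
# not vendor pricing.

def cloud_monthly_cost(tokens_per_month: float, price_per_million_tokens: float) -> float:
    """Cloud cost scales roughly linearly with token volume."""
    return tokens_per_month / 1e6 * price_per_million_tokens

def onprem_monthly_cost(capex: float, amortization_months: int,
                        power_kw: float, kwh_price: float,
                        ops_per_month: float) -> float:
    """On-prem cost is mostly fixed: amortized hardware plus power plus operations."""
    amortized_hw = capex / amortization_months
    power = power_kw * 24 * 30 * kwh_price
    return amortized_hw + power + ops_per_month

if __name__ == "__main__":
    price_per_million = 2.50        # assumed blended $/1M tokens for cloud inference
    capex = 3_000_000               # assumed $ for a liquid-cooled GPU rack plus storage
    months = 36                     # assumed amortization window
    power_kw, kwh_price = 80, 0.12  # assumed rack draw (kW) and energy price ($/kWh)
    ops = 25_000                    # assumed staffing/support cost per month

    for tokens in (5e9, 50e9, 500e9):  # tokens per month
        cloud = cloud_monthly_cost(tokens, price_per_million)
        onprem = onprem_monthly_cost(capex, months, power_kw, kwh_price, ops)
        print(f"{tokens:,.0f} tokens/mo: cloud ${cloud:,.0f} vs. on-prem ${onprem:,.0f}")
```

Under these assumptions the on-prem cost stays roughly flat while the cloud bill scales with volume, which is the mechanism behind Dell's "cheaper at steady state, weaker for bursty experimentation" positioning.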

Proof Points: JPMC and Lowe’s Illustrate Promise—and Rarity

Larry Feinsmith, Head of Global Technology Strategy, Innovation and Partnerships at JPMorganChase, spoke to an exabyte-scale hybrid architecture where “constellations of models” run across private data centers, edge hubs, and the public cloud. He identified two pivotal needs:

  1. Density & Utilization. Market-risk calculations already lean on GPU clusters; generative agents will multiply demand.
  2. Data Governance. The bank treats data as a first-class, governed asset—critical in regulated finance.

Seemantini Godbole, CIO of Lowe’s, described 1,700 micro data centers. Each Dell-powered AI edge node delivers computer-vision-enhanced workflows and generative assistants directly to store associates. Crucially, Lowe’s rolled out gen AI bottom-up to hundreds of thousands of employees, avoiding the pilot purgatory that plagues many Fortune 500 peers.

We believe these accounts validate Dell’s architecture but also highlight its biggest obstacle: the lack of widespread enterprise AI adoption. Even Michael Dell admitted early adopters represent the first 10% of the S-curve. Most enterprises lack clear use-case roadmaps or in-house MLOps skills. Unless Dell can “appliance-ize” AI factories into turnkey solutions such as RAG-in-a-Box, cloud players will continue to dominate mindshare with one-click managed services.

[Read why Jamie Dimon is Sam Altman’s biggest competitor].

Inside AI Factory 2.0: Engineering for Token-Scale Economics

Dell’s hardware announcements from day one centered on the PowerEdge XE9780L/XE9785L racks: 256 NVIDIA Blackwell GPUs per rack, direct liquid cooling, and rear-door heat exchangers leaning on Dell’s 850 thermal patents. Jensen Huang’s video sit-down with Michael Dell framed a spectrum of intelligence shifting from early LLMs to reasoning agents, with each successive innovation leap growing token throughput by an order of magnitude.

Dell’s performance claims on AI Factory 2.0:

  • Tokens-per-second: Dell touts a 100× increase in tokens per second versus last year’s 1.0 release, along with 80% latency cuts. Even a conservative 25× gain would materially shrink inference cost, an expense line item that has concerned early adopters (see the cost-per-token sketch after this list).
  • Energy efficiency: Up to 60% cooling-energy savings via direct liquid cooling (DLC). Given power constraints in colocation and on-prem facilities, any energy saved on cooling translates directly into more GPUs within the same facility power envelope.
  • Networking: X800 Spectrum-X Ethernet switches close what has historically been a weak spot for Dell in networking. The shift to extreme parallel processing (EPP) has given Dell an opening to re-enter the networking space with a competitive strategy. Nonetheless, hyperscalers still tout extremely low-latency fabrics (EFA, Aries, Andromeda), an area Dell is addressing with open-source SONiC integrations and Broadcom Jericho/Tomahawk roadmaps.
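
To see why the throughput claim matters more than the GPU count, the sketch below converts a tokens-per-second multiplier into an implied cost per million tokens for a rack whose hourly cost is fixed. The baseline throughput and hourly rack cost are illustrative assumptions, not Dell benchmark figures; only the 100× and 25× multipliers come from the discussion above.

```python
# Illustrative only: how a tokens-per-second multiplier flows through to cost
# per million tokens when the rack's hourly cost is fixed. Baseline throughput
# and hourly cost are assumptions, not published Dell or NVIDIA numbers.

RACK_COST_PER_HOUR = 400.0        # assumed fully loaded $/hour for one rack
BASELINE_TOKENS_PER_SEC = 50_000  # assumed AI Factory 1.0-class throughput

def cost_per_million_tokens(throughput_multiplier: float) -> float:
    tokens_per_hour = BASELINE_TOKENS_PER_SEC * throughput_multiplier * 3600
    return RACK_COST_PER_HOUR / tokens_per_hour * 1e6

for mult in (1, 25, 100):  # baseline, conservative 25x, claimed 100x
    print(f"{mult:>3}x throughput -> ${cost_per_million_tokens(mult):.4f} per 1M tokens")
```

Even the conservative 25× case cuts the per-token cost by more than 95 percent under these assumptions, which is why throughput per rack, not raw GPU count, is the number buyers should scrutinize.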

The bottom line for us is that Dell and its partners are narrowing the raw-horsepower gap versus the super-clusters we see inside AWS, Azure, and Google Cloud. The open question is software and the full AI stack. Customers will be looking for a control plane equivalent to SageMaker, Vertex, or Azure ML; a data stack a la Snowflake and Databricks; a data harmonization layer; governance at scale; and an agentic control framework. While Dell has existing and new software partners (NVIDIA, Red Hat, VMware, Cohere), in our view it must continue to integrate AI capabilities up the stack and mask complexity, building AI solutions rather than hardware products.

Market Outlook: Lots of Headroom and Some Challenging Headwinds

Our research suggests that enterprise AI-related data center infrastructure spend will exceed $250B in 2025, growing at a 24% CAGR over a 10-year period. That’s the good news for suppliers like Dell. However, most of that spend in the early years will occur in hyperscale, neo-cloud, and service provider locations, with on-prem and edge enterprise AI deployments accounting for roughly $15B of the $250B in 2025. While Dell’s AI server business is exploding, that growth is largely due to deals at firms like xAI and CoreWeave. Edge sites, private clouds, and colo “near-cloud” zones capture the bulk of Dell’s future TAM, and driving enterprise AI adoption remains both the key opportunity and the key challenge.
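
A quick projection makes the headroom argument concrete. The sketch below compounds the $250B 2025 base at the stated 24 percent CAGR and tracks the on-prem/edge slice, which starts at roughly 6 percent ($15B of $250B). The assumption that this share gains two points per year is ours and purely illustrative, not a Dell or theCUBE Research forecast.

```python
# Compound the 2025 enterprise AI infrastructure base at the stated CAGR and
# track the on-prem/edge slice. The total and CAGR come from the article; the
# expanding on-prem/edge share is a purely illustrative assumption.

TOTAL_2025_B = 250.0               # $B, 2025 estimate cited above
CAGR = 0.24                        # 10-year growth rate cited above
ONPREM_SHARE_2025 = 15.0 / 250.0   # roughly 6% of spend today
ASSUMED_SHARE_GAIN = 0.02          # hypothetical: +2 points of share per year

for year in range(2025, 2031):
    t = year - 2025
    total = TOTAL_2025_B * (1 + CAGR) ** t
    share = ONPREM_SHARE_2025 + ASSUMED_SHARE_GAIN * t
    print(f"{year}: total ${total:,.0f}B, on-prem/edge ${total * share:,.0f}B ({share:.0%})")
```

Even modest share gains on a base compounding at 24 percent would put the on-prem/edge opportunity several times above today’s $15B by decade’s end, which is the essence of Dell’s headroom case.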

| Challenge | Why it matters | Dell’s mitigation strategy | Our assessment |
|---|---|---|---|
| Solution Selling Gap | Enterprises want business outcomes, not GPU counts | 100 vertical “reference factories”; Deloitte, Accenture, EY integrations | Needs acceleration; the 3,000-customer AI Factory figure demands scrutiny |
| Ecosystem Dependency | Heavy NVIDIA alignment; risk of continued vendor margin pressure | Diversify with AMD/Intel GPUs, Qualcomm NPU PCs; pooled licensing via APEX | Plausible, but NVIDIA still captures outsized economics |
| Cultural Adoption | AI anxiety in the workforce; change-management hurdles | Top-down imperatives and “group-of-30” enablement cascades inside Dell | Playbook is credible; must convert into advisory services for customers |
| Power & Sustainability | 1 MW racks strain facilities; ESG targets loom | DLC, asset-recovery programs, fractional GPU sizing | Net-positive story, but customers need financing and time for retrofits (see the power-budget sketch below) |
| Cloud Mindshare | AWS Bedrock, Microsoft Copilot, Google Gemini embed AI in existing workflows with strong developer affinity | TCO narrative (Dell claims 60% cheaper than cloud) and the data-gravity argument | Compelling for steady-state inference; less so for bursty experimentation; customers should do the TCO math themselves |
Market Dynamics Scan
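
To put numbers behind the Power & Sustainability row above, the sketch below splits a fixed facility power budget between IT load and cooling overhead and shows how a cooling-energy reduction (Dell claims up to 60 percent with direct liquid cooling) frees headroom for more GPUs. The facility budget, per-GPU draw, and baseline cooling overhead are illustrative assumptions.

```python
# Illustrative only: a fixed facility power budget split between IT load and
# cooling overhead. Budget, per-GPU draw, and overhead ratios are assumptions;
# the 60% cooling-energy reduction is Dell's claim for direct liquid cooling.

FACILITY_KW = 2_000   # assumed total power available to the room
GPU_KW = 1.2          # assumed per-GPU draw, including its share of host power

def gpus_supported(cooling_overhead: float) -> int:
    """cooling_overhead is cooling energy as a fraction of IT load (PUE minus 1)."""
    it_kw = FACILITY_KW / (1 + cooling_overhead)
    return int(it_kw / GPU_KW)

air_cooled = gpus_supported(0.5)           # assumed air-cooled overhead (PUE ~1.5)
liquid_cooled = gpus_supported(0.5 * 0.4)  # 60% less cooling energy (PUE ~1.2)
print(f"Air-cooled:    {air_cooled} GPUs")
print(f"Liquid-cooled: {liquid_cooled} GPUs "
      f"(+{liquid_cooled - air_cooled} within the same facility budget)")
```

The point is directional rather than precise: every kilowatt not spent on cooling becomes a kilowatt of IT load, which is why DLC figures in both the performance and the sustainability parts of Dell’s pitch.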

Strategic Takeaways

  1. Enterprise AI is Becoming a Three-Layer Market. Hyperscalers own utility AI APIs; model builders and neo-clouds (xAI, Mistral, CoreWeave) support specialist AI factories; Dell targets private enterprise AI deployments where data sovereignty, latency, or cost prohibit public-cloud dominance. Each layer is large enough to support multi-billion-dollar businesses; this is not a zero-sum game. But there will be large winners, small winners, and losers.
  2. Dell’s Differentiator Is Operational Know-How, Not Silicon. While it’s non-trivial to wire GPUs, few can integrate the supply-chain muscle, 30 years of thermal IP, global service coverage, and the 50/50 channel balance that Dell touts. That operational excellence is a key aspect of Dell’s moat and matters as enterprises grope for turnkey stacks.
  3. Software Remains the Missing Middle. Dell could productize its internal applications of AI (e.g., Next-Best-Action LLM workflows), offer unified MLOps solutions, and accelerate its vertical factory templates. As indicated, cloud providers will continue to own developer loyalty. NVIDIA’s new developer cloud is an on-ramp for Dell to reach developers.
  4. Economics Trump Ideology. Early cloud-first zealots now balk at GenAI bills. If Dell can prove its 60% TCO edge, especially when token volume hits trillions per month, it will attract on-prem AI investment and possibly even siphon workloads back from the cloud. Repatriation was more a vendor marketing narrative than a market trend, but economics can reverse course. Lest we forget, however, AWS has cost advantages and can pull pricing levers at its discretion. Don’t confuse price with cost.
  5. The Edge Is Dell’s Ace. With 1,700 AI micro-factories already live at Lowe’s, Dell demonstrates a replicable pattern for retail, manufacturing, healthcare, and telco. Hyperscalers lack physical proximity, although they are building out data centers globally. Dell’s client business and AI PCs represent a formidable edge platform for growth.

Bottom Line

In our opinion, Dell has moved from grand vision to tangible architecture faster than many skeptics, ourselves included, anticipated. AI Factory 2.0, micro-edge clusters, and AI-ready PCs create a continuum from device to planetary-scale data center that few vendors can match. The road, however, is uphill: hyperscalers still command developer mindshare, and most enterprises remain stuck in the experimentation cul-de-sac. Large-volume deals from the likes of Elon Musk’s xAI and the neo-clouds provide a bridge, but within 18 to 24 months enterprise AI adoption must be ramping, in our view, for Dell to thrive.

The next 24 months will determine whether Dell’s AI Factory grid becomes the default platform for on-prem intelligence or merely a niche alternative for the most data-sensitive workloads. Watch for three catalysts: (1) reference AI Factory wins beyond existing customers like Lowe’s and JPMC; (2) a coherent, Dell-branded MLOps and AI stack that abstracts multi-model complexity; and (3) tangible market proof that liquid-cooled racks deliver the promised 60% energy savings over cloud at scale.

If Dell hits these milestones, the company will not merely ride the AI wave, it will reshape enterprise infrastructure, competing effectively with the cloud providers for profitable workloads and cementing its place as an indispensable infrastructure partner for the enterprise AI era.
