

AI factories are fast becoming the blueprint for packaging compute, interconnects and software into production systems that churn out large-scale intelligence across data centers, PCs and the edge. What once looked like a parts list is coalescing into tightly integrated pipelines optimized for utilization, throughput and “tokens per watt.”
Here’s the real shift: The hardware-software stack is being rebuilt around fused CPU–GPU designs, high-bandwidth fabrics and portability layers that privilege developer velocity as much as raw performance. That rebalances the partner landscape and raises new questions about capital intensity, power budgets and emerging standards. Nvidia Corp.’s push into x86 territory via Intel Corp., and CUDA’s expanding gravity, ripple through hyperscalers, neoclouds and enterprise buyers.
On the latest episode of theCUBE Pod, theCUBE Research’s John Furrier (pictured, left), executive analyst, and Dave Vellante (right), chief analyst, break down the week’s biggest moves with a builder’s eye. They unpack Nvidia’s investment tie-up with Intel, debate whether CUDA’s moat widens, game out implications for Advanced Micro Devices Inc. and Arm Ltd., and connect the dots to how enterprises will actually stand up AI factories.
“I think this is such a huge win for Nvidia,” Furrier said. “The moat just got massive, because our whole thing about the moat was CUDA and they got the two-sided marketplace. This, in my opinion, takes pressure off CUDA because now they can continue to pound away at CUDA’s competitive advantage.”
The new alignment between Nvidia and Intel reframes the data-center motherboard around tightly coupled CPUs, GPUs, fabrics and software. Beyond components, the play is distribution and developer gravity, pulling CUDA deeper into enterprise stacks and even the PC category at volume. That combination extends Nvidia’s reach into markets where x86 remains entrenched, while giving Intel relevance with AI developers and a bridge to hybrid architectures.
“I have some thoughts on that,” Vellante added. “I really do think it’s a win-win-win, definitely a win for Nvidia. They’re talking about chip markets, the overall market’s bigger than that, but their whole thrust here is get the best CPU, the best GPU integration.”
The move also reorders the medal stand: Nvidia consolidates gold, Intel contends for silver on the strength of CUDA access and joint SKUs, and AMD must refine its GPU software story, according to Vellante. That dynamic matters because hyperscalers want choice, but they also want time-to-tapeout and software compatibility at scale.
“I think what’s going to happen is AMD will get access, the government will maybe force them to get access. I’m sure Nvidia would be happy with that. The more, the merrier, as long as they’re the king of the hill,” Vellante explained. “But if they don’t get access to that, the CUDA instruction set, what are they going to do? They’re trying to build out their own software. They’re going to have the open-source software. We’ll see how that all plays out, but CUDA is rapidly becoming the standard.”
Industry pulse checks corroborate the pivot. IBM Corp. rang the NYSE closing bell and highlighted a long arc of sustainability leadership, a timely storyline with Climate Week bringing a U.N.-scale audience to Manhattan. Meanwhile, Furrier and Vellante previewed the “AI Factory” series from theCUBE and NYSE Wired, signaling a deeper dive into the new data center blueprint, from NVLink fabrics to liquid cooling and rack-level standards, according to Furrier.
“We got the big AI Factory event. It’s climate week, so it’s going to be a U.N. international crowd,” he said. “Tons of big names coming in NYSE; of course, we are pumping out content [on] the future of data centers.”
That macro backdrop frames the Nvidia–Intel tie-up as more than a headline — it is the template for how compute, networking and software will be packaged and financed. As volumes shift toward joint SoCs for PCs and x86-GPU NVLink hybrids in the data center, the economics of foundry capacity, instruction-set access and software portability become board-level issues for suppliers and buyers alike, Vellante explained.
“Big win for Nvidia, big win for Intel, although they still got the foundry to deal with,” he said. “It’s clearly a boost in volume with the SOCs, but still they’ve got the foundry around their neck. Big … win for customers because now they’ve got that hybrid architecture, and I think it’s just a win for AI overall.”
At CrowdStrike Fal.Con, Chief Executive George Kurtz outlined a path to an agentic SOC and “security AGI,” bolstered by new agents, the AgentWorks builder and the tuck-in acquisitions of Onum and Pangea. A bullish analyst day signaled a V-shaped recovery and a surging ecosystem, as partners rallied around the data, governance and go-to-market momentum of CrowdStrike Holdings Inc.’s Falcon platform.
“Well, they had 8,000 people there this year,” Vellante said. “[The] ecosystem was coming apart at the seams. Everybody wants to be part of CrowdStrike because they have the flywheel going, cloud-native with AWS Marketplace, their own go-to market, now the ecosystem is exploding.”
If CUDA’s lead extends, challengers will need credible, open alternatives and workload portability that does not sacrifice performance per watt. That is possible in niches, and cost pressure could catalyze open stacks, but the near-term gravity favors CUDA-compatible routes to production. Enterprises, for their part, will push for “AI factories” sized to their power, data and governance constraints — and crucially, they will require integrated “data factories” to make model outputs reliable and auditable.
“In my view, this deal increases the probability that CUDA extends the disruption timeframe from some of the open source,” Vellante said. “I think it lowers the probability that CUDA gets disrupted. I think it’s further solidifying its standards.”
The funding, valuation and robotics threads swirling around xAI Corp., Tesla Inc. and others underscore how capital is chasing physical-digital convergence. Yet the near-term winners will be the vendors who can package compute, interconnects, software and data pipelines into repeatable blueprints that customers can deploy with confidence — from hyperscale to enterprise-scale.
“I think the market still has to understand this,” Furrier said. “This is why I love the market right now, because even if this market doesn’t grow as fast and certainly as for certain companies, the value extraction is going to be with the AI side. This is all enabling the AI. I think the enterprise is going to have not only just a resurgence, and what’s even great is that our favorite area, the data center and cloud and storage network and compute is still the hottest thing.”
Here’s who was mentioned during this week’s episode:
Thomas J. Watson Sr., former chairman and CEO of IBM
Sarah Bruning Meron, chief communications and brand officer at IBM
Bill Gates, co-founder of Microsoft
Lip-Bu Tan, CEO of Intel
Lisa Su, chair and CEO at AMD
Jensen Huang, founder and CEO of Nvidia
Abhishek Mehta, chairman and CEO of Tresata
David Flynn, CEO of Hammerspace
Sarbjeet Johal, founder and CEO of Stackpane
Steve Jobs, co-founder and former CEO and chairman of Apple
Jony Ive, former SVP of industrial design and chief design officer at Apple
Elon Musk, CEO of Tesla
George Gilbert, principal analyst at theCUBE Research
George Kurtz, CEO of CrowdStrike
Jim Cramer, American TV personality and author
Todd McKinnon, CEO of Okta
Nir Zuk, founder and CTO of Palo Alto Networks
Sam Altman, co-founder and CEO of OpenAI
Jamie Dimon, chairman and CEO of JPMorgan Chase
Pat McGovern, former chairman and founder of International Data Group (IDG)
Here’s the full episode of this week’s theCUBE Pod:
Don’t miss out on the latest episodes of “theCUBE Pod.” Join us by subscribing to our RSS feed. You can also listen to us on Apple Podcasts or on Spotify. And for those who prefer to watch, check out our YouTube playlist. Tune in now, and be part of the ongoing conversation.