UPDATED 12:35 EDT / JULY 10 2025

Michael Harris, vice chair and global head of capital markets at the NYSE, talks to theCUBE about AI infrastructure at RAISE Summit 2025.

12 signs AI infrastructure is reshaping the IPO market: theCUBE’s RAISE Summit insights

Capital markets are being reshaped by artificial intelligence, with AI infrastructure emerging as a critical pillar of how companies scale, compete and go public. What was once considered a back-end concern is now front and center in IPO roadshows and investor decks.

TheCUBE’s John Furrier talks about AI infrastructure at RAISE Summit.

The shift is about more than hype. Investors are scrutinizing how companies embed AI into their business models — not just to drive innovation, but to ensure long-term operational resilience. It’s no longer enough to show top-line growth; firms need to demonstrate that their AI capabilities can sustain profitability and differentiate them in crowded markets. That’s changing the playbook for what it means to be IPO-ready in 2025, according to theCUBE Research’s John Furrier (pictured, left).

“AI is not just a tech thing anymore; it’s every sector,” Furrier said. “You’re starting to see the business models change with AI.”

From the floor of RAISE Summit in Paris, theCUBE + NYSE Wired unpacked how AI infrastructure is transforming capital markets, with insights from the NYSE, Google Cloud, Nutanix Inc., Vultr and others. TheCUBE’s conversations revealed how tech-fueled innovation is redefining IPO readiness, investor priorities and enterprise growth strategies across every sector. (* Disclosure below.)

1. The role of AI infrastructure in IPO momentum

With public markets hovering near all-time highs, investor sentiment is running strong. The performance of recent initial public offerings has been especially impressive, drawing renewed interest from private companies ready to go public, according to Michael Harris (pictured, right), vice chair and global head of capital markets at the New York Stock Exchange. Capital is flowing, and the follow-on and convertible markets are seeing similarly robust participation from both institutional and retail investors.

AI’s influence is everywhere, reshaping business models and becoming a key driver in IPO narratives. Investors are prioritizing durability alongside growth, looking for companies that can scale sustainably, Harris noted. As European firms eye U.S. listings and regulatory momentum builds, the back half of 2025 is poised to deliver an IPO boom.

Watch the full interview from theCUBE.

2. Sovereign cloud rises as AI’s compliance backbone

Vultr’s Kevin Cochrane discusses sovereign cloud with theCUBE.

As AI adoption accelerates, enterprises are turning to sovereign cloud to meet strict regulatory demands without slowing innovation. It’s no longer optional — especially for industries such as healthcare and finance that require composable, compliant infrastructure across diverse geographic regions, according to Kevin Cochrane, chief marketing officer of Vultr, a registered trademark of The Constant Company LLC.

By embedding governance into infrastructure from the start, sovereign cloud platforms such as Vultr’s help companies avoid costly retrofits and accelerate time to production. With built-in policy controls, enterprises can deploy AI systems confidently across borders while maintaining full visibility into data, compute and networking layers.

Read more of theCUBE’s complete coverage.

3. AI success starts with the right data foundation

Enterprises can’t scale AI without first nailing their data infrastructure. From high-performance storage to widely deployed platforms, a strong foundation is critical for operational success, according to George Kurian, chief executive officer of NetApp Inc. As chips and models improve, the focus has shifted to applying these tools meaningfully across real-world business challenges.

Organizational readiness is just as crucial, Kurian noted. Enterprises need a culture that encourages experimentation and learning, supported by formal AI centers of excellence. With reusable technologies and consistent deployment patterns, companies can drive faster AI adoption while strengthening core capabilities and closing gaps across the business.

Check out theCUBE’s complete interview.

4. Making AI simple for the enterprise

Nutanix is betting big on simplifying the road to enterprise AI by delivering a full-stack, turnkey solution that hides complexity and accelerates innovation. Rajiv Ramaswami, president and chief executive officer of Nutanix, outlined how the company’s platform, paired with Nvidia Corp., gives customers the AI infrastructure needed to run real applications — fast.

Most companies won’t be training large models, but they still want results. That’s where Nutanix comes in — offering a stack that supports inference workloads across data centers, edge and cloud. By abstracting away infrastructure headaches, Nutanix is turning AI infrastructure into an enabler, not a blocker, for enterprise builders, according to Ramaswami.

Don’t miss the full interview on theCUBE.

5. Google Cloud’s full-stack push for AI scale

Google Cloud is redefining how enterprises approach AI infrastructure by building a complete stack — from custom chips to open protocols — designed for scalability, simplicity and real business value. Alison Wagonfeld, chief marketing officer of Google Cloud, emphasized how AI is now embedded into the fabric of everything from BigQuery to Workspace.

Beyond just tooling, the company is focused on ecosystem momentum and agent interoperability. With protocols such as MCP and A2A gaining traction, Google Cloud is enabling agents to talk across platforms securely and efficiently — fueling a new era of collaborative AI infrastructure and accelerating adoption across industries.

Watch theCUBE’s full sit-down.

6. Red Hat sharpens open-source edge for enterprise AI

Red Hat’s Matt Hicks talks about how inference plays a part in its AI strategy.

Red Hat Inc. is reimagining the software stack for the age of AI infrastructure, focusing on making inference cheap, efficient and scalable. Matt Hicks, president and chief executive officer of Red Hat, emphasized the need to minimize the cost per token and maximize GPU utilization so enterprises can scale reasoning and agentic workloads without financial friction.

With innovations such as the Red Hat AI Inference Server, vLLM for small models and cluster-ready llm-d projects, Red Hat is extending its trusted open-source foundation into AI. Hicks sees the real enterprise shift beginning when companies move beyond single-model experiments and confidently operationalize hybrid AI infrastructure strategies.

Don’t miss the full interview on theCUBE.

7. The future of app building is agent-powered and effortless

As AI infrastructure becomes more accessible, Heroku is focused on empowering a new generation of builders — not just traditional developers. Betty Junod, chief marketing officer of Heroku Inc., highlighted how low-friction deployment and managed scalability are removing technical barriers, making it easier to move from prototype to production at enterprise scale.

The rise of agents is also reshaping the app landscape, and Heroku is embracing open protocols such as MCP to streamline integration. By supporting agent interoperability and simplifying backend complexity, Heroku positions itself as the go-to platform for building intelligent applications that scale with both creativity and demand, according to Junod.

Hear the full story on theCUBE.

8. Why evaluation and memory are the next frontiers in AI

The race to customize and deploy AI at scale starts with one overlooked principle: defining success. Naveen Rao, vice president of artificial intelligence at Databricks Inc., believes enterprises must begin with evaluation — establishing clear goals and feedback loops — before throwing data at models. It’s a product mindset applied to model development.

Rao also sees the AI infrastructure landscape evolving quickly, driven by bottlenecks in memory and bandwidth, not just compute. He predicts that solving hallucinations and building truly intelligent systems will depend as much on model-to-model collaboration and efficient context handling as on raw power — marking a new era for infrastructure and interaction design.

Watch the full interview from theCUBE.

9. How Snowflake is reshaping data platforms for the AI era

Benoit Dageville, co-founder and president of product at Snowflake, talks to theCUBE about AI infrastructure at RAISE Summit 2025.

Snowflake Inc.’s latest innovations focus on making unstructured data a first-class citizen alongside structured and semi-structured formats, with AI becoming the new SQL for extracting insight. Benoit Dageville, co-founder and president of product at Snowflake, explained how these capabilities are transforming Snowflake into a unified platform for modern data-driven applications.

Under the hood, Snowflake is optimizing GPU utilization and pipeline execution to support more efficient, scalable AI infrastructure. Semantic views and AI agents are enabling higher-quality interactions between users and data, while governance and open catalog protocols provide a foundation for long-term enterprise architecture, according to Dageville.

Check out theCUBE’s complete interview.

10. Why domain-specific AI is accelerating infrastructure demand

As AI use cases expand across industries and regions, specialized models are moving from experimental to essential. Andrew Feldman, founder and chief executive officer of Cerebras Systems Inc., sees demand skyrocketing not only in enterprise but in healthcare, sovereign applications and consumer-facing products. Growth is being driven by real-world needs, not just hype.

With vertical models for genomics, finance, aerospace and more, the AI infrastructure challenge is scaling fast. Feldman emphasizes the need to future-proof chips and systems for a world where reasoning-based agents, high inference loads and sovereign cloud requirements are the norm. Cerebras is betting big — and building fast — to meet that future.

Don’t miss the full interview on theCUBE.

11. Building AI factories at full speed

CoreWeave’s Mike Mattacola speaks about speed-to-market and CoreWeave’s value proposition in accelerating AI infrastructure deployment for enterprises.

CoreWeave Inc. is redefining speed-to-market in the AI era by compressing infrastructure buildout timelines from years to weeks. According to Mike Mattacola, general manager, international, at CoreWeave, the company’s tight integration with Nvidia and Dell Technologies Inc. enables rapid deployment of advanced racks such as the GB200 and GB300. The company’s approach is vertically integrated, with internal teams handling every phase — from hardware delivery to software optimization — through a proprietary observability platform called Mission Control. That means customers don’t have to worry about system failures — they just get compute that works.

This philosophy has made CoreWeave a standout in the AI infrastructure race. Rather than relying on legacy data center models, the company has embraced modular expansion and strategically placed AI hubs for low-latency delivery across Europe. Its footprint now includes six regions, with more on the way, as demand continues to outpace capacity.

Mattacola believes the urgency of enterprise AI adoption, especially in Europe, means most companies will abandon on-prem builds in favor of trusted partners that can deliver infrastructure — and value — at speed.

Watch theCUBE’s full sit-down.

12. Speed, scale and the race for AI compute

Jonathan Ross, founder and chief executive officer of Groq Inc., is steering one of the fastest-moving companies in AI infrastructure today. With a focus on inference — not training — Groq is helping sovereign nations and enterprises deploy high-performance, energy-efficient compute in months instead of years. Ross credits the company’s vertical integration and token-as-a-service model for dramatically cutting cost and complexity.

As demand for tokens skyrockets, driven by agentic AI and multi-step reasoning, Groq’s value proposition is simple: faster output, lower cost and no GPU bottlenecks. From a new data center in Helsinki to sovereign deployments in Saudi Arabia and Canada, Groq is delivering what others can’t — scalable AI infrastructure at startup speed, according to Ross.

Hear the full story on theCUBE.

Here’s the complete video playlist from SiliconANGLE’s and theCUBE’s coverage of RAISE Summit:

(* Disclosure: TheCUBE’s event coverage of RAISE Summit is brought to you by Vultr. Sponsors of theCUBE’s event coverage do not have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE

A message from John Furrier, co-founder of SiliconANGLE:

Support our mission to keep content open and free by engaging with theCUBE community. Join theCUBE’s Alumni Trust Network, where technology leaders connect, share intelligence and create opportunities.

  • 15M+ viewers of theCUBE videos, powering conversations across AI, cloud, cybersecurity and more.
  • 11.4k+ theCUBE alumni — Connect with more than 11,400 tech and business leaders shaping the future through a unique trust-based network.

About SiliconANGLE Media
SiliconANGLE Media is a recognized leader in digital media innovation, uniting breakthrough technology, strategic insights and real-time audience engagement. As the parent company of SiliconANGLE, theCUBE Network, theCUBE Research, CUBE365, theCUBE AI and theCUBE SuperStudios — with flagship locations in Silicon Valley and the New York Stock Exchange — SiliconANGLE Media operates at the intersection of media, technology and AI.

Founded by tech visionaries John Furrier and Dave Vellante, SiliconANGLE Media has built a dynamic ecosystem of industry-leading digital media brands that reach 15+ million elite tech professionals. Our new proprietary theCUBE AI Video Cloud is breaking ground in audience interaction, leveraging theCUBEai.com neural network to help technology companies make data-driven decisions and stay at the forefront of industry conversations.