UPDATED 15:32 EDT / JUNE 10 2025

Stephen Watt, vice president and distinguished engineer, Office of the CTO, at Red Hat, discusses the Red Hat AI strategy during Red Hat Summit 2025.

Three insights you might have missed from theCUBE’s coverage of Red Hat Summit

As enterprises navigate the growing complexity of hybrid cloud, they’re turning to open-source solutions to stay agile and competitive. In this landscape, Red Hat AI has gained momentum as a strategy that blends automation, virtualization and open tools to simplify operations and make AI more accessible across the stack.

That momentum took center stage at Red Hat Summit, where the company spotlighted its deepening focus on AI automation, open-source infrastructure and trusted tools, such as Ansible and OpenShift Virtualization. The emphasis wasn’t on flashy promises — it was on delivering enterprise-ready platforms designed to simplify hybrid cloud operations at scale.

TheCUBE's Rebecca Knight and Rob Strechay on set at Red Hat Summit.

“One of the things I loved about Red Hat and the talk about OpenShift Virtualization was that it wasn’t just simplistic [virtual machines] that were moving over,” said Rob Strechay, managing director of theCUBE Research, during a preview video for the event. “It was really complex systems of VM.”

Strechay provided event coverage around the Red Hat AI strategy with theCUBE’s Rebecca Knight and Paul Nashawaty at Red Hat Summit on theCUBE, SiliconANGLE Media’s livestreaming studio. TheCUBE’s coverage featured on-site interviews with industry professionals to explore how Red Hat executives, engineers and partners aim to advance open-source platforms to support enterprise AI, automation and hybrid cloud at scale. (* Disclosure below.)

Here’s theCUBE’s complete pre-show coverage:

Plus, here are three key insights you may have missed from theCUBE's coverage:

Insight No. 1: The Red Hat AI strategy aims to make AI practical and transparent with open-source tools.

Across both its technical and regional initiatives, Red Hat is emphasizing not just the performance of AI infrastructure, but also its transparency, usability and long-term sustainability. The Red Hat AI strategy treats that evolution of infrastructure as essential to supporting real-world AI deployment at scale.

Enterprises often face bottlenecks in moving AI from the lab to production due to challenges with data access, system compatibility and performance. Red Hat is tackling real-world challenges by working with chipmakers to optimize AI and memory technology, and it all starts with large language models, according to Stephen Watt (pictured), vice president and distinguished engineer, Office of the CTO, at Red Hat.

Red Hat's Stephen Watt talks with theCUBE about optimizing memory technology.

“I think we had this sort of era of predictive AI, and now with generative AI, I think there’s a whole lot of … new applications … interesting new use cases in three different areas: training, fine-tuning and inference,” he said during Red Hat Summit. “Last year, we announced the InstructLab, which was democratizing fine-tuning models. With our Neural Magic acquisition, we’ve got a lot more into inference, and that’s about serving models and creating value for applications in the enterprise.”
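Serving a model for inference, in the sense Watt describes, typically means loading an open model behind an engine such as vLLM and generating completions for application requests. The short Python sketch below is illustrative only: it assumes the open-source vLLM library, and the model name, prompts and sampling settings are placeholders rather than details drawn from the interview.

# Illustrative sketch only: batched inference with the open-source vLLM library.
# The model name and prompts below are placeholder assumptions.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # small demo model; swap in any supported open model
sampling = SamplingParams(temperature=0.2, max_tokens=128)

prompts = [
    "Summarize why inference matters for enterprise AI applications.",
    "List three steps for moving a fine-tuned model into production.",
]

# generate() batches the prompts and returns one result per prompt.
for output in llm.generate(prompts, sampling):
    print(output.outputs[0].text.strip())

In production, the same engine is more commonly exposed as an OpenAI-compatible HTTP server, so existing applications can call the model without code changes.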

Here’s theCUBE’s complete interview with Stephen Watt:

Building on that foundation, Red Hat is also working to close the AI skills gap and ensure responsible data practices, including in emerging markets. Red Hat's Latin American division is addressing the region's AI skills gap through an open-source approach that promotes collaboration and shared growth.

“It provides a moment for [Latin America] to leapfrog,” said Maria Bracho, chief technology officer for Latin America at Red Hat, during Red Hat Summit. “To take on and learn new things and then be again, ahead of the curve or at least catching to the curve when it comes to understanding and having more AI skills, to apply all these new technologies that we’re shipping into results that are practical for people and change lives.”

Here’s theCUBE’s complete interview with Maria Bracho:

Insight No. 2: Red Hat and chipmakers are making AI infrastructure more flexible and affordable.

At Red Hat Summit, partnerships with Intel Corp. and Advanced Micro Devices Inc. took center stage as the company showcased how open collaboration is driving more flexible and efficient AI infrastructure for the enterprise. For Red Hat and Intel, the collaboration is focused on translating open-source code into efficient AI solutions, which includes the use of vLLM, the open-source inference and serving engine for large language models.

“What we’re working with Red Hat to do is minimize that complexity, and what does the hardware architecture and what does all the infrastructure software look like, and make that kind of seamless,” said Chris Tobias, general manager of Americas technology leadership and platform ISV account team at Intel, during Red Hat Summit. “You can just worry about, ‘Hey, what kind of application do I want to go with, and what kind of business problem do I wanna solve?’ Then, ideally, that gets you into a cost-effective solution.”

Here’s theCUBE’s complete interview with Chris Tobias and Ryan King, global head of AI and infrastructure ecosystem at Red Hat:

Meanwhile, as enterprises push for more flexible, cost-effective AI infrastructure, AMD’s long-standing partnership with Red Hat is also focused on the road ahead. The partnership aims to help enterprises move away from rigid systems by combining open platforms with versatile compute solutions, according to Phil Guido, executive vice president and chief commercial officer at AMD.

“From Linux to OpenShift to all those different open platforms … a lot of them are saying, ‘I’d like to do it on the cloud, I’d like to do it on prem, I’d like to do it on the edge,’” he said during Red Hat Summit. “Once again with Red Hat and AMD, we could provide that in every dimension.”

Here’s theCUBE’s complete interview with Phil Guido:

Insight No. 3: Stable platforms, open-source tools and a growing partner ecosystem are key to the Red Hat AI strategy.

Red Hat’s focus on strong infrastructure, open tools such as vLLM and llm-d, and trusted partners aims to help companies adopt and scale AI more easily, with less disruption and more impact. The Red Hat AI strategy includes the belief that AI should enhance existing enterprise systems rather than replace them.

Open source unlocks the world’s potential, according to Stu Miniman, senior director of market insights at Red Hat. Infrastructure should be stable, open and built to scale.

“We’re not building all the applications,” he said during the event. “We’re giving you the tools and the capabilities and freeing up your people to be able to take advantage of that more than anything else.”

The launch of Red Hat Enterprise Linux AI and OpenShift Lightspeed marks a major step in operationalizing AI on trusted infrastructure, allowing users to run AI workloads on familiar platforms without major rewrites or replatforming. Red Hat’s consistent platform approach also aims to help enterprise teams deploy AI faster and with less risk by building on systems they already know and trust.

“They’re going, ‘Oh wait, all this stuff I can just layer on top of OpenShift. I’ve been working with that for a bunch of years. I’ve got expertise in that. My application team already use it,’” said Brian Gracely, senior director of portfolio strategy at Red Hat, during Red Hat Summit. “They’re really excited that it’s not a completely new thing … it’s just going to drop in and play with what they want to deal with.”

Here’s theCUBE’s complete interview with Stu Miniman and Brian Gracely:

The Red Hat AI strategy continues to focus on making AI production-ready, with its latest open-source projects designed to help enterprises move from experimentation to scalable, real-world deployment. That includes a series of strategic updates the company recently outlined, including around vLLM and llm-d.

Red Hat's Chris Wright talks with theCUBE about the company's strategic updates.

Red Hat introduced these tools to streamline AI production across enterprises, enabling scalable, cost-effective workloads without requiring major infrastructure changes. The company sees huge potential in using AI to enhance both business performance and operational efficiency, according to Chris Wright, chief technology officer and senior vice president of global engineering at Red Hat.

“How do we sort of gracefully bring those worlds together?” he said during the event. “In technology transitions, it’s never healthy to throw everything away and start over. In general, we have to have some kind of evolutionary path. So, we’re focused on that.”

Here’s theCUBE’s complete interview with Chris Wright:

Finally, as AI use cases grow more complex, Red Hat is strengthening its partner ecosystem to ensure its platform can support scalable, real-world outcomes across infrastructure, models and services. Red Hat is expanding its AI partner ecosystem with new validation tools, allowing partners to independently test and certify their applications on OpenShift AI to meet rising demand, according to Stefanie Chiras, senior vice president of partner ecosystem success at Red Hat.

“Now they’ll move into partner validation in the ecosystem catalog,” she said during Red Hat Summit. “A partner can come in, do their own testing … and then they can say, ‘We will support it.’ It goes into the catalog and is visible for all partners and customers to see. That’s important because this rate and pace of partner involvement is going to be faster and faster.”

Here’s theCUBE’s complete interview with Stefanie Chiras:

To watch more of theCUBE’s coverage of Red Hat Summit, here’s our complete event video playlist:

(* Disclosure: TheCUBE is a paid media partner for Red Hat Summit. Neither Red Hat, the primary sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
