UPDATED 14:00 EDT / AUGUST 12 2025


Supermicro leads AI-ready data storage shift for high-performance compute

AI inference is surging to the forefront, reshaping how enterprises think about — and build — their data storage systems. As models grow in complexity and seep deeper into daily workflows, storage is under mounting pressure to deliver unprecedented speed, throughput and scalability.

Giorgio Regni of Scality, Anders Graham of Kioxia, Kevin Tubbs of WekaIO, Allen Liu of Supermicro, and John Kim of Nvidia talk with theCUBE about data storage at the Open Storage Summit.


Meeting these demands requires rethinking how storage handles both the volume and complexity of AI workloads. New approaches are emerging to enable faster data retrieval and support the growing need for AI systems to process and retain larger amounts of relevant information at once, according to John Kim (pictured, front row, right), director of storage marketing at Nvidia Corp.

“With inference, it turns out there are two things that are driving the need for storage,” Kim said. “The first is what we call RAG … retrieval augmented generation. That means the query can go into a semantic search, and it can retrieve related documents, real-time documents that could be from just a few hours ago. The other thing that drives the need for storage and AI inference is KV cache. As the queries get longer, if you have agentic AI, if you are using RAG, the context or the input sequence length gets longer, and longer, and longer, and it takes more and more time to compute this.”
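To make Kim’s two points concrete, here is a minimal, hypothetical Python sketch, not drawn from the interview, of the pressures he describes: a toy semantic search that retrieves the most relevant documents for a query, and a back-of-the-envelope estimate of how KV-cache memory grows with input sequence length. The corpus, embeddings and model dimensions are illustrative assumptions only.

```python
# Illustrative sketch only: a toy RAG retrieval step and a KV-cache size
# estimate. The corpus and model dimensions are made up for demonstration;
# production systems use real embedding models and vector stores.
import numpy as np

# --- 1. Retrieval-augmented generation (RAG): semantic search over documents ---
corpus = [
    "Q2 earnings report published three hours ago",
    "Archived 2023 product manual",
    "Incident postmortem updated this morning",
]
# Pretend these are embeddings from an embedding model (here: random vectors).
rng = np.random.default_rng(0)
doc_vecs = rng.normal(size=(len(corpus), 384))
query_vec = rng.normal(size=384)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank documents by similarity to the query and keep the top hits; the
# retrieved text is appended to the prompt, which lengthens the context.
scores = [cosine(query_vec, d) for d in doc_vecs]
top_docs = [corpus[i] for i in np.argsort(scores)[::-1][:2]]
print("Retrieved context:", top_docs)

# --- 2. KV cache: memory grows linearly with context length ---
def kv_cache_bytes(seq_len, layers=32, kv_heads=8, head_dim=128, bytes_per_elem=2):
    # Two tensors (K and V) per layer, each seq_len x kv_heads x head_dim.
    return 2 * layers * kv_heads * head_dim * bytes_per_elem * seq_len

for seq_len in (4_000, 32_000, 128_000):
    gib = kv_cache_bytes(seq_len) / 2**30
    print(f"{seq_len:>7} tokens -> ~{gib:.1f} GiB of KV cache per request")
```

The arithmetic illustrates why Kim ties KV cache to storage: as agentic and RAG-augmented contexts stretch into the hundreds of thousands of tokens, the cache for a single request can outgrow GPU memory, making fast external storage an attractive place to keep it.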

Kim spoke with theCUBE’s Rob Strechay at the Supermicro Open Storage Summit, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They were joined by Giorgio Regni (front row, left), chief technical officer of Scality Inc.; Kevin Tubbs (front row, center), field chief technology officer, HPC and AI, of WekaIO Inc.; Anders Graham (back row, left), senior director of marketing and business development at Kioxia America Inc.; and Allen Liu (back row, right), senior product manager at Super Micro Computer Inc. They discussed the latest technology trends in the evolution of data storage. (* Disclosure below.)

Data storage for AI infrastructure

Expanding AI workloads have led data storage vendors to develop new technologies to meet demand for more powerful compute, high-performance storage and low-latency networks. This requires optimized end-to-end solutions that can address market needs along the entire data path, and that starts with the data center.

“In the past 12 months, the explosive growth of AI workloads has brought huge challenges to AI infrastructure,” Liu said. “Supermicro just released a new data center building framework named the Data Center Building Block Solution, or DCBBS. It is a fully integrated, balanced, designed data center building solution. It has comprehensive elements for data center building, including rack-level and storage-level building blocks that include compute, storage, network, as well as the environment blocks like thermal management and liquid cooling.”

Storage vendors are also focused on building products that can rapidly evolve to meet the changing demands of AI workloads. For providers such as WekaIO, this means building solutions that can evolve with the entire stack.

“One of the important things about workloads is they’re unpredictable … what we knew 18 months ago is not what we know now,” Tubbs explained. “We have our product called NeuralMesh by Weka, and the key thing here is it’s storage that evolved. We built it a certain way for modern AI workloads and high-performance computing workloads, but we implemented it in a way to make sure that we could scale it with the size of the problem but also with the innovation that we’re seeing in AI.”

Solutions for SSDs and object storage

The growth of AI workloads is also driving renewed interest in solid-state drives, or SSDs. One of the technologies that has emerged as a key driver in this arena is peripheral component interconnect express, or PCIe, and Kioxia is leveraging the interface to boost SSD performance for AI processing.

“We’re seeing the need for higher throughput, particularly in some of these areas of the AI lifecycle, especially on the training and the inference side,” Graham said. “There we’re seeing a lot of the PCIe 5.0 SSDs really becoming kind of ubiquitous for the higher throughput. PCIe is really the trend there, and with Kioxia, we have our CDAP drives today and our next generation drives coming out soon.”

The evolution of storage in the AI era has led industry observers to ask: What is the future for object storage? The technology appears well-suited to AI workloads, thanks to its scalability and its capacity for handling unstructured data. Using technologies such as non-volatile memory express, or NVMe, Scality is focused on making object storage a key element in optimizing data speed and cost efficiency.

“How do you make it faster?” Regni asked. “The software stack, you remove it, and that’s why you do NVMe over Fabrics so that you go directly from the GPU to the flash. That means the way we write data on the flash drive should be optimized for that. We’re working on this and it’s part of our RING XP product. We remove the features you don’t need so you get the best performance.”
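As an illustration of the direct GPU-to-flash path Regni alludes to, the sketch below uses NVIDIA’s open-source kvikio bindings for GPUDirect Storage to read a file straight into GPU memory, bypassing a host bounce buffer. It is a generic example under the assumption that kvikio and CuPy are installed on a GDS-capable system; it is not Scality RING XP code, and the file path is hypothetical.

```python
# Hypothetical sketch: read data directly into GPU memory using GPUDirect
# Storage via the kvikio library (not Scality RING XP internals).
import cupy as cp
import kvikio

# Destination buffer lives in GPU memory.
gpu_buf = cp.empty(64 * 1024 * 1024, dtype=cp.uint8)  # 64 MiB

# CuFile opens the file through the cuFile/GDS driver so the NVMe drive can
# DMA data to the GPU without staging it in host DRAM first.
with kvikio.CuFile("/mnt/nvme/shard-000.bin", "r") as f:  # path is illustrative
    nbytes = f.read(gpu_buf)

print(f"Read {nbytes} bytes directly into GPU memory")
```

The design point is the one Regni makes: the fewer software layers and host-memory copies sit between the flash and the GPU, the closer the storage path gets to raw NVMe latency.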

Here’s a short clip from our interview, part of SiliconANGLE’s and theCUBE’s coverage of the Supermicro Open Storage Summit:

(* Disclosure: TheCUBE is a paid media partner for the Supermicro Open Storage Summit. Neither Super Micro Computer Inc., the sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
