

Artificial intelligence is fundamentally reshaping the data center landscape, driving a transition toward intelligent infrastructure capable of powering the next generation of applications and enhancing business productivity.
HPE’s Antonio Neri talks with theCUBE about AI’s impact on data centers and intelligent infrastructure.
The industry is witnessing a significant push toward creating scalable, efficient and sustainable solutions tailored to meet the intense demands of AI. This drive also prioritizes ease of deployment and faster time to value for enterprises, according to Antonio Neri (pictured, right), president and chief executive officer of Hewlett Packard Enterprise Co.
“When I think about AI, I think about it as a business productivity tool and as a way to change the world for the better,” he said. “But, ultimately, you need a lot of infrastructure.”
Neri spoke with theCUBE Research’s John Furrier (left) at the Nvidia GTC event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed AI’s impact on data centers and intelligent infrastructure, including the transition to scalable, efficient and sustainable solutions tailored to meet intensive AI workloads. (* Disclosure below.)
Different market segments have distinct infrastructure requirements. While some need large-scale AI factories with the latest cooling innovations for maximum performance, others prioritize simplicity and rapid deployment for enterprise AI applications, according to Neri.
“We talk about AI factories and building massive infrastructure, and you saw the latest technology with direct liquid cooling, [not] using fans and being more efficient,” he said. “When you come here [to this booth], you talk about enterprise AI; there it’s all about the simplicity and time to value.”
The network is also emerging as a critical component of the AI intelligent infrastructure stack, because high-speed data transfer is essential for distributed AI workloads, according to Neri. Recognizing this fact, there’s a growing trend toward deeper integration of networking capabilities into comprehensive AI solutions.
“For me, it starts with that understanding and foundation, and that’s why one of the core foundations is the network,” Neri said. “I spoke about this as a part of the Juniper acquisition — why the network is a core tenet of the stack.”
For AI to achieve widespread adoption within enterprises, the focus must be on providing user-friendly solutions that yield quick and tangible results, according to Neri. Enterprises are more concerned with the practical integration of AI into their existing operations to enhance efficiency and unlock new opportunities than with intricate technical specifications. This necessitates a collaborative approach within the technology industry to develop comprehensive and easily deployable AI solutions.
“We started with this premise: We need to make it so simple for enterprises to leverage this technology,” Neri said. “That’s why we focus on the simplicity of the experience.”
Strategic partnerships are essential for accelerating the integration of AI within enterprises, according to Neri. By combining specialized expertise in areas such as chip manufacturing and intelligent infrastructure design, companies can offer integrated solutions specifically tailored to address diverse business needs. This collaborative approach ensures that businesses can effectively leverage the power of AI without being overwhelmed by unnecessary complexity.
“They don’t care about the chips; they don’t care about everything else,” Neri said. “They care about how [they] can deploy an AI environment to develop or [run] an AI application at the speed of the business. That level of partnership requires co-engineering.”
Private cloud AI solutions are gaining traction as a viable method for enterprises to harness the benefits of AI while retaining control over their sensitive data and infrastructure. These solutions often come as integrated packages encompassing all the necessary hardware and software components, thereby simplifying the deployment process and delivering a cloud-like experience within the confines of an on-premises environment, Neri added.
“It’s because it’s solving the fundamental problem that ‘I don’t need to buy a server, a storage, a networking [and] all the software, and [then] I have to stitch it together,’” Neri said. “It comes in one SKU, and it comes with a cloud-native and AI-driven experience. Three clicks, less than 30 seconds, [and] I’m done.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the Nvidia GTC event:
(* Disclosure: TheCUBE is a paid media partner for the Nvidia GTC event. Sponsors of theCUBE’s event coverage do not have editorial control over content on theCUBE or SiliconANGLE.)