UPDATED 17:40 EDT / AUGUST 23 2024

Bob Laliberte, principal analyst at theCUBE Research, talks about efficient data centers as part of a CUBE Conversation.

AI is reshaping data center design: Analyzing Applied Digital’s systematic approach

The surge in artificial intelligence is revolutionizing everything from smartphone features to enterprise strategies, yet it brings significant challenges for efficient data centers. As AI’s power demands escalate, data centers face mounting pressure to manage increased energy loads and implement advanced cooling solutions to keep up with the pace of innovation.

“[Wes] Cummins highlighted that traditional data centers typically consume around 7.5 kilowatts per cabinet for several servers, networking and storage,” said Bob Laliberte, principal analyst at theCUBE. “However, a single Nvidia H100 server, essential for high-performance AI tasks, requires over 10 kW. This disparity highlights a fundamental shift: next-generation data centers must support higher power densities.”
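To make that density gap concrete, here is a rough back-of-the-envelope sketch based on the figures cited above. The exact per-server draw and the 40 kW next-generation rack budget are illustrative assumptions for the sake of the arithmetic, not numbers from the analysis.

```python
# Rough power-density comparison. The ~7.5 kW legacy cabinet figure and the
# ">10 kW per Nvidia H100 server" figure come from the analysis; the 10.2 kW
# per-server draw and the 40 kW "AI-ready" rack budget are illustrative assumptions.

LEGACY_CABINET_KW = 7.5    # typical traditional data center cabinet budget
H100_SERVER_KW = 10.2      # assumed draw for one H100 server (">10 kW" per the analysis)
AI_READY_RACK_KW = 40.0    # hypothetical next-generation rack power budget


def servers_per_rack(rack_budget_kw: float, server_kw: float) -> int:
    """How many servers of a given draw fit within a rack's power budget."""
    return int(rack_budget_kw // server_kw)


if __name__ == "__main__":
    print("Legacy cabinet fits:", servers_per_rack(LEGACY_CABINET_KW, H100_SERVER_KW), "H100 servers")
    print("AI-ready rack fits: ", servers_per_rack(AI_READY_RACK_KW, H100_SERVER_KW), "H100 servers")
```

Under these assumptions a legacy 7.5 kW cabinet cannot host even a single H100-class server at full draw, which is the power-density shift Laliberte points to.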

In his full analysis, Laliberte dissected the increasingly pivotal issues arising from sophisticated, power-hungry AI and supercomputing hardware, such as graphics processing units and data servers. He also highlighted several approaches to combat the problem and build efficient data centers.

Designing the efficient data centers of the future

Next-gen data centers call for a meticulous approach to design, architecture and infrastructure. First, they must be sited where ample power is available, seamlessly combining traditional and renewable sources.

“A key component of any new data center’s sustainability strategy is sourcing power from renewable sources,” Laliberte said. “Choosing renewable energy aligns with global efforts to reduce the carbon footprint of data centers and supports the transition to greener energy sources.”

Next up is cooling efficiency. Today’s AI and supercomputing workloads generate unprecedented amounts of heat, requiring ingenious cooling solutions. The power usage effectiveness, or PUE, metric measures how efficiently a facility uses power, with the ideal ratio being as close as possible to 1.

“A lower PUE indicates that more of the power consumed is going to actual computing rather than cooling and other infrastructure needs,” Laliberte explained.
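Since PUE anchors much of this discussion, here is a minimal sketch of how the ratio is computed: total facility power divided by the power delivered to IT equipment. The facility and IT power figures below are illustrative placeholders, not measurements from Applied Digital or any other operator.

```python
# Minimal PUE calculation: total facility power / IT equipment power.
# A value near 1.0 means nearly all power reaches the computing hardware;
# the sample figures below are illustrative only.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return total_facility_kw / it_equipment_kw


if __name__ == "__main__":
    print(f"Legacy facility PUE:    {pue(1500.0, 1000.0):.2f}")  # 1.50: a third of the power is overhead
    print(f"Efficient facility PUE: {pue(1100.0, 1000.0):.2f}")  # 1.10: most power goes to compute
```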

Site selection is another critical factor. The choice of physical location must consider environmental factors and drive, rather than hinder, operational efficiency, according to Laliberte.

“Northern climates not only provide natural cooling but also help in minimizing operational costs,” he wrote. “Additionally, proximity to renewable energy sources, such as wind farms, plays a critical role in reducing the environmental footprint of these facilities.”

Data center design highlights the increasing need for sustainability considerations. Offsetting the power load by investing in carbon-neutral solutions is key to balancing the overall resource strain and architecting efficient data centers, according to Laliberte.

“For example, the Applied Digital Ellendale, ND facility will be powered by wind energy, with approximately two gigawatts of wind power feeding into its substation,” he said.

Additionally, green initiatives such as heat recovery mean that excess energy can be repurposed. Transitioning to liquid cooling systems facilitates the recycling of heat for applications such as heating greenhouses or shrimp farming, offering additional benefits to local communities.

Finally, tight collaboration and integration with stakeholders will inevitably result in leaner, more efficient data centers. Applied Digital Corp., for example, works with technology providers such as Nvidia Corp., Super Micro Computer Inc. and Dell Technologies Inc. to deliver optimal and energy-efficient environments. Being an elite-tier partner with Nvidia ensures access to advanced GPU technology. Collaboration also extends to data center design and construction, leveraging expertise to optimize facilities for high-performance computing and AI workloads.

“Applied Digital’s approach to addressing the high power and cooling demands, optimizing connectivity and latency, and incorporating sustainable options where possible highlights its leadership in this space,” Laliberte said.

Read the full analysis here.

Image: SiliconANGLE/Canva
