UPDATED 16:39 EDT / JUNE 03 2024

INFRA

Microsoft details $3.2B plan to expand its data center capacity in Sweden

Microsoft Corp. will spend 33.7 billion Swedish crowns, or $3.2 billion, to expand its data center capacity in Sweden.

The company will make the investment over two years as part of an initiative detailed this morning. The move comes three years after the company inaugurated its first Azure region, or data center cluster, in Sweden. The cluster comprises three facilities located in the cities of Gävle, Sandviken and Staffanstorp.

Microsoft will use a portion of the project’s budget to expand the three facilities. As part of the effort, the company will deploy 20,000 new graphics processing units to support artificial intelligence workloads. The GPUs will be “chips like the Nvidia H100,” Brad Smith, Microsoft’s vice chair and president, reportedly stated during a press conference today.

The H100 is two generations behind Nvidia Corp.’s flagship data center GPU. The chipmaker introduced the H100 in 2022 and followed it last November with the H200, a significantly more capable version equipped with a larger memory pool. This past March, Nvidia debuted the Blackwell B200, an even faster GPU that the company says can train AI models several times more quickly.

The H100 nevertheless continues to be in high demand. Meta Platforms Inc. revealed plans in January to buy 350,000 units, about two months before the Blackwell B200 made its debut.

Not all of the 20,000 new AI accelerators that Microsoft plans to deploy at its Swedish data centers will be supplied by Nvidia, however. “You will see us increasingly diversify the chips that we have,” Smith detailed during the press conference. “We’ve been public about being very bullish on Nvidia but also AMD and ultimately some of our own chips as well.”

Microsoft has so far detailed one internally developed AI chip: the Maia 100, which debuted last November. It’s touted as one of the largest processors to use a five-nanometer manufacturing node. Data flows in and out of Maia 100 chips via a custom Ethernet-based network protocol that Microsoft says can manage 4.8 terabits of traffic per second per accelerator.

High-performance AI processors such as the Maia 100 rely on liquid cooling equipment to dissipate the heat they generate. When it debuted the chip last year, Microsoft detailed that its data centers weren’t designed to accommodate the large liquid chillers that would usually be necessary to cool Maia 100 deployments. The company resolved the challenge by developing a custom heat dissipation system.

Microsoft’s custom cooling hardware might make it feasible to install the Maia 100 in its Swedish data centers. Down the road, the company could potentially deploy additional custom chips at the facilities. Microsoft detailed an internally developed central processing unit, the Cobalt 100, at its Ignite conference last November and revealed that it’s developing a second-generation Maia accelerator.

Besides expanding its data center footprint in Sweden, Microsoft will also spend a portion of the $3.2 billion allocated to the project on local education initiatives. The goal is to provide 250,000 people with access to AI training by 2027. The announcement of the investment comes only weeks after rival Google LLC revealed a $1.1 billion plan to expand its data center campus in Finland.

Photo: Microsoft
