UPDATED 15:36 EDT / OCTOBER 13 2025

INFRA

OpenAI partners with Broadcom to deploy 10 gigawatts of AI hardware

Shares of Broadcom Inc. jumped more than 9% today after it announced a four-year infrastructure partnership with OpenAI.

The initiative will see the ChatGPT developer deploy 10 gigawatts’ worth of data center hardware over the next four years. According to OpenAI, the infrastructure will be powered by custom artificial intelligence processors co-developed with Broadcom. In a podcast released today, OpenAI President Greg Brockman said the AI provider used its own neural networks to design the chips.

“We’ve been able to get massive area reductions,” Brockman detailed. “You take components that humans have already optimized and just pour compute into it, and the model comes up with its own optimizations.”

Off-the-shelf graphics cards are geared toward a broad range of customers, which means modules that are important for some users go unused by others. Building a custom processor makes it possible to strip out those unneeded modules, freeing power and die area that can be reallocated to circuits optimized for a company's specific workloads.

OpenAI plans to deploy its custom processors in racks that will likewise be based on an in-house design. The systems will be equipped with PCIe and Ethernet connectivity hardware from Broadcom. PCIe is mainly used to link together components inside a server, while Ethernet is geared toward connecting servers with one another.

Broadcom debuted a new AI-optimized Ethernet switch, the TH6-Davisson, last Wednesday. It can process 102.4 terabits of traffic per second, which the company says is double the throughput of the nearest competitor. The laser emitters that the TH6-Davisson uses to transmit data over the network are based on a field-replaceable design meant to ease maintenance.
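
As a rough sketch of what those figures imply, the snippet below works backward from the 102.4-terabit number. The competitor figure follows from the article's "double the throughput" claim, while the per-port breakdowns are illustrative assumptions rather than a confirmed TH6-Davisson configuration.

```python
# Back-of-envelope look at the switch throughput cited above.
# Only the 102.4 Tb/s figure and the "double the nearest competitor" claim
# come from the article; the port breakdowns are illustrative assumptions.

TH6_THROUGHPUT_TBPS = 102.4
competitor_tbps = TH6_THROUGHPUT_TBPS / 2  # implied by "double the throughput"

# Two ways a 102.4 Tb/s switch chip could expose that bandwidth (assumed):
port_configs = {
    "64 ports x 1.6 Tb/s": 64 * 1.6,
    "128 ports x 800 Gb/s": 128 * 0.8,
}

print(f"Implied nearest-competitor throughput: {competitor_tbps:.1f} Tb/s")
for label, total_tbps in port_configs.items():
    print(f"{label} = {total_tbps:.1f} Tb/s aggregate")
```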

Ethernet switches are usually deployed alongside devices called pluggable transceivers, which turn electrical signals into light that can travel over fiber-optic networks and vice versa. The TH6-Davisson features built-in transceivers that remove the need for external optical modules, which lowers costs.

OpenAI didn’t specify which of Broadcom’s PCIe products it will use as part of the partnership. The chipmaker sells a line of PCIe switches called the PEX series. It also makes retimers, chips that regenerate signals to prevent errors from creeping into data as it travels across a PCIe link.

“OpenAI and Broadcom have been working together for the last 18 months,” OpenAI Chief Executive Officer Sam Altman said on the podcast today. “By being able to optimize across that entire stack, we can get huge efficiency gains and that will lead to much better performance, faster models, cheaper models.”

OpenAI and Broadcom plan to deploy the first data center racks developed through their partnership in the second half of 2026. According to the chipmaker, the remaining systems will go online through 2029. The systems’ expected power draw of 10 gigawatts roughly matches the average electricity demand of several million homes.
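
For context, a back-of-envelope conversion bears out that comparison. The average-household figure below is an assumption (roughly 10,500 kilowatt-hours per year for a U.S. home) and is not drawn from the article or the companies' statements.

```python
# Rough check of the "several million homes" comparison.
# Assumption: an average U.S. household uses about 10,500 kWh per year,
# i.e. roughly 1.2 kW of continuous draw. This figure is not from the article.

DEPLOYMENT_GW = 10
avg_home_kw = 10_500 / (365 * 24)          # ~1.2 kW average continuous demand

homes_equivalent = (DEPLOYMENT_GW * 1_000_000) / avg_home_kw   # GW -> kW
print(f"10 GW is roughly {homes_equivalent / 1e6:.1f} million homes' worth "
      f"of average electricity demand")
```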

Broadcom and OpenAI didn’t disclose the expected cost of the project. In August, Nvidia Corp. CEO Jensen Huang stated that 1 gigawatt of AI data center capacity costs $50 billion to $60 billion to build. He added that most of that sum goes to Nvidia hardware in a typical deployment, which suggests a build-out of this scale based on Broadcom silicon stands to generate billions of dollars in revenue for the chipmaker.
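
Applying Huang's per-gigawatt estimate to the 10-gigawatt target gives a sense of the sums involved. Only those two figures come from the article; Broadcom's actual share of the spending is undisclosed, so the fractions in the sketch below are purely hypothetical.

```python
# Implied project cost based on Huang's $50B-$60B per gigawatt estimate.
# Only the per-gigawatt range and the 10 GW target come from the article;
# Broadcom's share of the spend is undisclosed, so the fractions below are
# purely hypothetical.

DEPLOYMENT_GW = 10
cost_per_gw_low, cost_per_gw_high = 50e9, 60e9

total_low = DEPLOYMENT_GW * cost_per_gw_low
total_high = DEPLOYMENT_GW * cost_per_gw_high
print(f"Implied total build-out cost: ${total_low / 1e9:.0f}B to "
      f"${total_high / 1e9:.0f}B")

# Even a modest hypothetical share of that spend amounts to billions.
for share in (0.05, 0.10, 0.20):
    print(f"  {share:.0%} share: ${share * total_low / 1e9:.0f}B to "
          f"${share * total_high / 1e9:.0f}B")
```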

Photo: OpenAI
