

Nexthop AI launched today with $110 million in funding led by Lightspeed Ventures to help the world’s largest cloud companies build the next generation of artificial intelligence infrastructure.
The funding combines a seed and Series A round for the startup. Kleiner Perkins, WestBridge Capital, Battery Ventures and Emergent Ventures also participated.
Nexthop builds custom networking solutions for large-scale cloud companies and hyperscalers, delivering hardware and software that’s designed to integrate directly with existing cloud stacks. Its engineers work alongside customers’ cloud architecture engineers and system operators to tailor specialized build-outs.
John Furrier, executive analyst at theCUBE Research, spoke with Chief Executive Anshul Sadana, who explained that with AI, every cloud operator has a custom stack, leading to a classic “build versus buy” problem.
The advent of the cloud was already pushing the limits of technology. However, the hyperscale and large-scale cloud clusters required to train and deploy large language models demand massive amounts of compute. That translates into racks of graphics processing units, storage and networking architecture so customized that these companies can no longer buy off the proverbial shelf and simply fit the gear into their stacks.
“What we decided at Nexthop is to co-develop our hardware and software along with our customers so that they can buy what they would have wanted to be built,” said Sadana.
This can include networking hardware built to each customer’s specifications, a network operating system of the customer’s choice hardened by Nexthop AI, and optical and electrical interconnects sourced from the customer’s own supply chain.
Sadana said power efficiency is often at the center of the discussion. AI has dominated growth in energy demand in recent years, especially since OpenAI launched ChatGPT in November 2022. Ten years ago, a 30-megawatt facility could be considered large, according to a report from industry analyst firm McKinsey & Co. Today, 200 megawatts is normal, and hyperscaler power demands are expected to outstrip even that.
“Each hyperscaler is adding one to two gigawatts of capacity every year. Soon, each one will have 10 gigawatts of capacity or more,” explained Sadana. “If you can provide even 1% of power efficiency that’s 100 megawatts, that’s the largest data center in the world just a few years ago.”
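Sadana’s figures can be sanity-checked with simple arithmetic; the sketch below is not from the article, just the percentage math behind the quote, with capacity expressed in megawatts:

```python
# Back-of-the-envelope check of the power-efficiency claim.
hyperscaler_capacity_mw = 10_000  # 10 gigawatts of projected capacity, in megawatts
efficiency_gain = 0.01            # a 1% power-efficiency improvement

savings_mw = hyperscaler_capacity_mw * efficiency_gain
print(savings_mw)  # 100.0 -- megawatts saved, on the scale of an entire large data center
```

At 10 gigawatts of deployed capacity, even a single percentage point of efficiency recovers roughly 100 megawatts, which is why hyperscalers treat such gains as material.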
Networking also plays a critical role in data center design and in delivering the best possible outcomes for scaling AI. To provide best-in-class interoperability for customers, Nexthop works with open-source network operating systems such as SONiC, or Software for Open Networking in the Cloud, a Linux-based open-source network operating system that runs on a large number of network switches.
Nexthop said the open-source nature of software such as SONiC gives AI data centers and hyperscalers the broad customization such facilities need, letting them focus on the hardware ecosystems they want to integrate.
By partnering with a startup such as Nexthop, Sadana said, cloud AI infrastructure companies can get a leg up on rapidly changing market conditions and stay ahead of technological shifts. Because they can build out custom GPU hardware, storage and racks to their own specifications, they can get to market sooner.
Sadana said Nexthop intends to use the funds to hire more talent and build more products to keep pace with the accelerating technological innovation within the AI industry.