Graphcore raises $222M for its ultrafast AI chips
Graphcore Inc., a British startup taking on Nvidia Corp. in the artificial intelligence chip market, today said it has closed a hefty $222 million funding round at a unicorn valuation of $2.77 billion.
Bristol-based Graphcore has developed a chip for running AI models that it describes as the “most complex processor ever made.” The startup also sells AI appliances that it says can provide up to 16 exaflops of performance when linked together in a cluster. One exaflop equals a million trillion computing operations per second.
Graphcore’s flagship AI chip, the Colossus Mk2 GC200, is based on a seven-nanometer process and features some 59.4 billion transistors. That compares with the 54.2 billion in Nvidia’s newest and most powerful data center graphics card, the A100. According to Graphcore, the GC200’s transistors are organized into 1,472 processing cores, each of which has an integrated pool of memory for storing data.
The startup has equipped the chip with a number of features that it says give it an edge over competing products from the likes of Nvidia. For one, the memory circuits attached to the GC200’s cores have a combined capacity of 900 megabytes, which in some cases is enough to hold an entire AI model. Keeping the whole model on the chip removes the need to park parts of it in external memory separate from the processor, which eliminates the delays that arise when data has to travel between two components and thereby speeds up computations.
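To get a rough sense of what fits in that much on-chip memory, the back-of-the-envelope sketch below divides the 900 megabytes by the space each model parameter occupies at common precisions. The precision sizes are generic assumptions rather than Graphcore figures, and the estimate ignores activations and other working data:

```python
# Back-of-the-envelope check: how many model parameters fit in 900 MB of
# on-chip memory? Precision sizes are generic assumptions; activations and
# other working data are ignored, so real capacity is lower.
ON_CHIP_BYTES = 900 * 10**6            # the GC200's stated 900 MB, taken as 10^6 bytes

BYTES_PER_PARAM = {"float32": 4, "float16": 2}

for precision, nbytes in BYTES_PER_PARAM.items():
    max_params = ON_CHIP_BYTES // nbytes
    print(f"{precision}: roughly {max_params / 1e6:.0f} million parameters fit on-chip")

# float32: roughly 225 million parameters fit on-chip
# float16: roughly 450 million parameters fit on-chip
```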
Another tentpole feature of the GC200 is something Graphcore refers to as Stochastic Rounding. The numbers that AI models work with are typically stored as floating-point values, basic units of data that take up between 16 and 64 bits of space, and the fewer bits a value uses, the faster it can be processed. Stochastic Rounding lets the GC200 round values down to a compact 16-bit form probabilistically rather than always to the nearest number, which keeps the rounding error unbiased on average and allows models to run at the lower, faster precision with less loss of accuracy.
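For readers curious how the general technique works, here is a minimal NumPy sketch of stochastic rounding from 32-bit to 16-bit floats. It illustrates the idea only; it is not Graphcore’s hardware implementation, and the function name is our own:

```python
import numpy as np

def stochastic_round_fp16(x, rng=None):
    """Stochastically round float32 values to float16.

    Each value lands on one of its two neighboring float16 numbers, with the
    probability of rounding up proportional to how far the value sits above
    the lower neighbor. On average the rounding error cancels out, which is
    what makes computing at low precision viable.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=np.float32)

    # Nearest float16 at or below x ...
    lo = x.astype(np.float16)
    lo = np.where(lo.astype(np.float32) > x,
                  np.nextafter(lo, np.float16(-np.inf)), lo)
    # ... and the next representable float16 above it.
    hi = np.nextafter(lo, np.float16(np.inf))

    gap = hi.astype(np.float32) - lo.astype(np.float32)
    frac = np.divide(x - lo.astype(np.float32), gap,
                     out=np.zeros_like(x), where=gap > 0)
    return np.where(rng.random(x.shape) < frac, hi, lo)

# Deterministic rounding always maps 0.1 to the same float16 value (~0.09998),
# while the average of many stochastically rounded copies stays close to 0.1.
vals = np.full(100_000, 0.1, dtype=np.float32)
print(float(np.float16(0.1)))
print(stochastic_round_fp16(vals).astype(np.float64).mean())
```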
Graphcore says systems based on its chip can significantly outperform AI appliances powered by Nvidia’s A100 graphics card. In some cases, the startup claims, its silicon makes it possible to train a model more than five times faster.
The GC200’s performance likely isn’t the only factor that led Graphcore’s backers to invest $222 million in the startup. Graphcore has also demonstrated early go-to-market momentum: despite being founded only in 2016, it already boasts high-profile customers such as the University of Oxford and the Lawrence Berkeley National Laboratory. Moreover, thanks to a deal with Microsoft Corp. announced in 2019, Graphcore is among the few processor startups whose chips are available to enterprises via one of the major public clouds.
Graphcore Chief Executive Officer Nigel Toon wrote in a blog post today that the startup will have more than $440 million of cash on hand following the $222 million round. The Ontario Teachers’ Pension Plan Board was the lead investor in the round. It was joined by Fidelity International, British asset manager Schroders and existing backers.
Image: Graphcore