Light-based AI chip startup Lightmatter raises $80M round backed by GV
MIT spinout Lightmatter Inc. today announced that it has raised another $80 million in funding to finance the development and commercialization of its optical artificial intelligence chips, which use photons to perform calculations.
Viking Global Investors led the round. The firm was joined by Alphabet Inc.’s GV venture capital arm, Hewlett Packard Enterprise Co. and multiple other institutional backers.
Lightmatter’s flagship product is an optical chip called Envise (pictured) that can be used to perform AI inference, or running machine learning models that are already trained. The startup says that Envise makes it possible to run some AI workloads up to seven times more efficiently than on graphics cards.
The chip is based on optical computing technology that Lightmatter’s founders developed during their time as researchers at MIT. Envise encodes data into laser signals and passes them through tiny optical channels. Those channels manipulate the signals to perform computations. Envise can’t perform as wide a range of tasks as a standard processor, but the chip lends itself well to certain linear algebra operations, the mathematical calculations AI models use to crunch data.
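To make “linear algebra operations” concrete, here is a minimal Python sketch (using NumPy, not Lightmatter’s software) of the matrix-vector multiply at the heart of neural network inference, the kind of math an accelerator like Envise is built to speed up. All names, sizes, and values below are illustrative, not Lightmatter specifics.

```python
import numpy as np

# A neural network layer is essentially a matrix-vector multiply
# followed by a simple nonlinearity. The multiply dominates the
# cost of inference, which is why accelerators target it.
rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 3))   # trained layer weights (illustrative)
inputs = rng.standard_normal(3)         # one input sample (illustrative)

# The matrix-vector product: conceptually, this is the step a
# photonic chip would encode into laser signals.
activations = weights @ inputs

# A ReLU nonlinearity, applied conventionally afterward.
outputs = np.maximum(activations, 0.0)
print(outputs.shape)
```

On a GPU or a photonic chip, the same multiply is performed over far larger matrices and in large batches; the principle is unchanged.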
Each Envise chip combines two optical computing modules with 256 conventional cores that are responsible for orchestrating the data processing workflow. Data is stored on an onboard 500-megabyte pool of memory for quick access. For connectivity, there’s an industry-standard PCIe 4 interface that allows Envise to be linked to existing data center infrastructure in a relatively simple manner.
Lightmatter claims that, though at an early stage, its technology is already fast enough to outperform the industry’s fastest graphics cards in some cases.
The startup has run tests comparing a server packing four Envise chips with an Nvidia Corp. DGX-A100 appliance, which features the chipmaker’s top-end A100 data center graphics cards. The task was to run a version of the popular BERT machine learning model. Lightmatter claims that the Envise-powered server managed to provide three times higher inference performance than the DGX-A100 and seven times better power efficiency.
The startup will reportedly use a significant part of its new $80 million round to build an initial batch of chips for early customers. Lightmatter Chief Executive Officer Nick Harris told TechCrunch that those customers include hyperscale data center operators, but didn’t share any names.
Lightmatter’s go-to-market plan has several other components besides the Envise chip itself. The startup is working on another product, Passage, for connecting multiple processors to one another. AI applications are often distributed across many chips to speed up computations, which requires linking the chips at the hardware level. Lightmatter claims Passage can offer faster connectivity at lower cost than traditional products.
The startup is also developing a software platform called Idiom to make it easier for developers to deploy AI models on Envise chips. According to Lightmatter, Idiom provides the ability to run neural networks created in popular frameworks such as TensorFlow without major code changes. In data centers containing multiple Envise-powered servers, the software can also handle the logistics of splitting processing tasks among the systems.
Photo: Lightmatter
A message from John Furrier, co-founder of SiliconANGLE: