UPDATED 12:35 EDT / SEPTEMBER 06 2023


D-Matrix raises $110M to commercialize next-gen generative AI-processing chips

D-Matrix Corp., a startup that designs and builds high-efficiency silicon and generative artificial intelligence compute platforms for data centers, today announced it has raised $110 million in an oversubscribed Series B funding round led by Singapore-based global investment firm Temasek.

The company builds specialized chips and platforms that target the deployment of generative AI at the inference stage. This is the stage at which a trained model is used to predict and produce new information from data it has never seen before, such as when it provides assistance, advice or insights.

Generative AI large language models, such as Meta Platforms Inc.’s Llama 2 and OpenAI LP’s ChatGPT, work by being trained on large amounts of text data so that they can understand natural language and “converse” in human-like fashion. Training takes a lot of compute power up front, but it’s done only once before deployment. After an AI is deployed, inference is used to answer questions, summarize documents and more.
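To make the training-versus-inference distinction concrete, the sketch below loads a model that has already been trained and simply generates text from it. It uses the open-source Hugging Face “transformers” library and the small GPT-2 model purely as an illustration; it is not d-Matrix software.

```python
# Minimal illustration of the inference stage: a model that was trained
# elsewhere is loaded and used to generate new text. Requires the
# "transformers" library and a backend such as PyTorch.
from transformers import pipeline

# Training already happened; here we only load the frozen weights.
generator = pipeline("text-generation", model="gpt2")

# Inference: the deployed model produces new tokens for a prompt it has
# never seen before.
prompt = "Summarize why inference costs matter for generative AI:"
result = generator(prompt, max_new_tokens=60, do_sample=True)

print(result[0]["generated_text"])
```

Every such call after deployment is an inference request, which is why the per-query cost, rather than the one-time training cost, dominates at scale.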

“Generative AI will forever change how people and companies create, work, and interact with technology,” Chief Executive Sid Sheth told SiliconANGLE. “This realization was crystallized in a ‘big bang’ moment with the release of ChatGPT. While the performance of generative AI models such as ChatGPT is stunning, the costs are staggering.”

D-Matrix produces silicon with a specialized “chiplet” architecture that uses digital in-memory computing, or DIMC. DIMC places fully programmable memory directly on the chip, cutting latency for inference processing and making it faster, more efficient and cheaper.

With chiplets, multiple small dies are combined to build larger, modular and scalable integrated circuits. Together, the two techniques allow d-Matrix to produce varied platforms that can be scaled up for generative AI inference tasks with greater performance and efficiency.
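As a rough mental model of that modular scaling, the toy sketch below splits the matrix-vector multiply at the heart of LLM inference across several small compute units, each holding its own slice of the weights in local memory, and then stitches the partial results back together. It is a conceptual illustration only, not d-Matrix’s DIMC hardware or software.

```python
# Toy illustration of the chiplet idea: a large matrix-vector product is
# partitioned across several "chiplets" that each keep a shard of the
# weights locally, and the partial outputs are concatenated at the end.
import numpy as np

NUM_CHIPLETS = 4
HIDDEN = 1024

# Full weight matrix for one layer, split row-wise so each "chiplet"
# holds its own shard in local memory (the in-memory-compute analogy).
weights = np.random.randn(HIDDEN, HIDDEN)
shards = np.array_split(weights, NUM_CHIPLETS, axis=0)

def chiplet_matvec(shard: np.ndarray, activations: np.ndarray) -> np.ndarray:
    """Each chiplet multiplies only its local weight shard."""
    return shard @ activations

activations = np.random.randn(HIDDEN)

# Run the shards (conceptually in parallel) and combine the results.
partials = [chiplet_matvec(s, activations) for s in shards]
output = np.concatenate(partials)

# The combined result matches the monolithic computation.
assert np.allclose(output, weights @ activations)
print("output shape:", output.shape)
```

Scaling up then amounts to adding more shards and more compute units rather than building one ever-larger monolithic die.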

This funding announcement follows the launch of d-Matrix’s Jayhawk II chiplet last month, featuring enhanced DIMC architecture. With this new chiplet-based platform, d-Matrix says its customers can gain a 10 to 20 times increase in efficiency over graphics processing units and 10 to 20 times savings in cost.

Existing investors Playground Global, Microsoft Corp.’s M12 venture fund, Nautilus Venture Partners and Entrada Ventures also participated in the round and were joined by new investors including Industry Ventures, Ericsson Ventures, Marlan Holdings, Mirae Asset and Samsung Ventures. D-Matrix received $44 million in a funding round in April 2022 led by M12 and Korean semiconductor maker SK hynix Inc. The round brings the company’s total funding to $154 million.

“We’re entering the production phase when LLM inference total cost of ownership becomes a critical factor in how much, where, and when enterprises use advanced AI in their services and applications,” said Michael Stewart from M12.

Sheth said D-Matrix plans to invest the new funding in recruitment and in commercializing its Corsair platform, which combines the DIMC architecture with chiplets to lower the cost of inference. Planned for a 2024 launch, Corsair is a PCIe form-factor card that ships with a machine learning toolchain and production server software built largely from broadly adopted open-source components. AI models can be quickly ingested onto the cards with the “push of a button” and require no re-training.
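D-Matrix has not published its ingestion interface, but the general idea of packaging already-trained weights for an inference target without any re-training can be shown with standard open-source tooling. The sketch below exports a stand-in PyTorch model to the portable ONNX format; the model, file name and shapes are illustrative only.

```python
# Illustration of "ingesting" a trained model for inference: the frozen
# weights are exported to a portable format that an inference target can
# load directly, with no further training involved. Requires torch and onnx.
import torch
import torch.nn as nn

# A stand-in "trained" model; in practice this would be an LLM checkpoint.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))
model.eval()

example_input = torch.randn(1, 512)

# Export the frozen weights for deployment on the inference hardware.
torch.onnx.export(
    model,
    example_input,
    "model.onnx",
    input_names=["x"],
    output_names=["y"],
)
print("exported model.onnx")
```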

“The promise of generative AI is currently unattainable due to the high cost of inference, and with today’s announcement and our funding infusion, we will be able to bring a commercially viable solution to market faster than anyone else in the space,” said Sheth.

Image: d-Matrix
