AI chip startups Axelera AI and Deep Vision raise new funding
Two artificial intelligence chip startups, Netherlands-based Axelera AI B.V. and Los Altos, California-based Deep Vision Inc., today announced that they’ve raised new funding to support their respective product development efforts.
The funding rounds represent the latest sign of investors’ interest in semiconductor startups. Several other companies developing AI-optimized chips have also raised capital in recent months.
Power-efficient silicon
Axelera AI today said that it has raised $12 million in funding to bring to market a chip capable of running artificial intelligence models using a fraction of the electricity required by current hardware. The funding round was led by Bitfury Group with participation from Innovation Industries, imec.xpand and IMEC, a Belgium-based nanoelectronics research institute.
IMEC previously partnered with Axelera, in early 2020, to support its chip development efforts. That partnership is just as notable as the institute’s participation in the funding round: IMEC is one of the world’s largest nanoelectronics research centers, with a staff of about 4,000 scientists.
Axelera AI expects to roll out its first chips to customers in early 2022. The startup is designing its silicon for use in edge computing systems that need to run AI algorithms with a high degree of power efficiency. Edge computing systems, such as the computers powering self-driving cars, can draw only as much electricity as their batteries supply, which makes power efficiency a major design priority.
Axelera AI estimates that its silicon will have a “fraction of the power consumption” of current AI hardware while also costing less. The startup hasn’t yet revealed the technical details of its chips. But Axelera AI did say that the chips will use in-memory computing to run machine learning models.
In-memory computing is a term used to describe a broad range of processing methods that make creative use of a server’s memory to speed up calculations.
In databases, in-memory computing refers to the practice of speeding up queries by keeping the information being analyzed on speedy RAM instead of the slower nonvolatile storage hardware normally used for the task. In the semiconductor industry, the term describes an emerging chip architecture that is being increasingly applied to AI use cases.
Normally, the processor that carries out calculations for a neural network and the memory circuits that store the neural network’s data are implemented as two separate components. Data must constantly travel between the two components as it’s being processed, which delays calculations. With in-memory computing, the memory circuits are integrated directly into the processor or perform some computing tasks on their own. Both approaches reduce the need for data travel, which in turn shortens the associated delays and speeds up calculations.
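The savings described above come down to a simple cost model: every weight that has to travel between separate memory and processor circuits carries an energy cost far larger than the arithmetic performed on it. As a rough sketch of that tradeoff (the energy figures below are purely illustrative assumptions, not numbers from Axelera AI):

```python
# Hypothetical energy model illustrating why in-memory computing helps.
# All per-operation energy figures are illustrative assumptions.

def energy_estimate(num_weights, pj_per_mac, pj_per_weight_fetch):
    """Estimate energy (picojoules) for one pass over a layer's weights."""
    compute = num_weights * pj_per_mac          # multiply-accumulate cost
    movement = num_weights * pj_per_weight_fetch  # cost of moving each weight
    return compute + movement

WEIGHTS = 1_000_000  # weights in a small neural-network layer

# Conventional design: every weight travels from separate memory to the processor.
conventional = energy_estimate(WEIGHTS, pj_per_mac=1.0, pj_per_weight_fetch=100.0)

# In-memory design: weights stay where they are stored, so the
# per-weight transfer cost collapses.
in_memory = energy_estimate(WEIGHTS, pj_per_mac=1.0, pj_per_weight_fetch=1.0)

print(f"conventional: {conventional / 1e6:.1f} uJ")
print(f"in-memory:    {in_memory / 1e6:.1f} uJ")
```

Under these assumed figures the arithmetic itself is a rounding error next to the data movement, which is why shortening the data's travel path, rather than speeding up the math, is where in-memory designs find their gains.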
Axelera AI will target its chips at industries such as retail and manufacturing where organizations are increasingly rolling out edge computing devices with AI features. To simplify the task of running machine learning software on its chip, the startup plans to provide support for popular open-source AI frameworks.
“Our extraordinary team merges complementary expertise in software development, image processing, dataflow architecture, in-memory computing, algorithms and quantization with a proven track record of business success,” said Axelera AI co-founder and Chief Executive Officer Fabrizio del Maffeo. “We look forward to building on our extensive R&D and introducing new solutions across the globe over the next few years.”
Axelera currently has 30 employees, including more than 20 engineers and developers who previously worked at Intel Corp., Qualcomm Inc., IBM Corp. and IMEC. The startup is reportedly holding discussions with about 20 customers ahead of the planned release of its chips in early 2022.
Edge computing in focus
Deep Vision is also developing power-efficient silicon for running AI software at the edge. The startup today announced that it has closed a $35 million funding round led by Tiger Global. Exfinity Venture Partners, Silicon Motion and Western Digital Corp. contributed as well.
Deep Vision’s flagship product is a chip dubbed ARA-1. The chip can be installed in edge computing devices to run AI models that perform real-time video analytics and natural language processing.
According to Deep Vision, ARA-1 provides up to ten times more performance per watt for inference tasks than the graphics processing units commonly used to power AI models in data centers. Inference is the term for running neural networks in production after they’re trained.
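To see what a tenfold performance-per-watt advantage means in practice, consider a back-of-envelope calculation (the GPU figures below are hypothetical placeholders, not published numbers from Deep Vision or any GPU vendor):

```python
# Back-of-envelope arithmetic for a "10x performance per watt" claim.
# The baseline GPU figures are hypothetical, chosen only for illustration.

gpu_throughput = 500.0  # inferences per second (assumed)
gpu_power = 250.0       # watts (assumed)

gpu_eff = gpu_throughput / gpu_power   # inferences per second per watt
edge_eff = 10 * gpu_eff                # the claimed 10x improvement

# Power needed to match the GPU's throughput at the higher efficiency.
edge_power_needed = gpu_throughput / edge_eff

print(gpu_eff)            # 2.0 inferences/s per watt
print(edge_power_needed)  # 25.0 watts
```

In other words, matching the same inference throughput at a tenth of the power is what would let a chip move from a data center power budget into a battery-constrained edge device.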
The startup says that the chip’s efficiency allows it to be used in compact devices such as industrial sensors that often have a limited power supply. ARA-1 also lends itself to more sophisticated devices such as edge servers.
Deep Vision credits the ARA-1’s efficiency to a technology that reduces how often data has to be moved between the chip’s circuits while it’s being processed. The technology analyzes each AI model that a company deploys and automatically finds a way of running it while keeping data movement to a minimum. Because transferring information between a chip’s circuits is one of the most demanding tasks involved in running AI calculations, streamlining the process improves power efficiency.
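One simple way such movement-aware scheduling can work is to reorder independent operations so that consecutive ones reuse whatever data is already on the chip. The sketch below is a toy illustration of that general idea only; Deep Vision has not published the actual algorithm its technology uses, and all names here are hypothetical:

```python
# Toy illustration of movement-aware scheduling: reorder independent
# operations so consecutive ones reuse the same on-chip tensor, cutting
# the number of times data must be reloaded. Purely illustrative.

def count_loads(schedule):
    """Count tensor loads, assuming only the last-used tensor stays on chip."""
    loads, resident = 0, None
    for _, tensor in schedule:
        if tensor != resident:
            loads += 1
            resident = tensor
    return loads

def reuse_schedule(ops):
    """Greedy reorder: group operations that read the same tensor."""
    return sorted(ops, key=lambda op: op[1])  # stable sort by tensor name

# (operation, tensor it reads) pairs for four independent layer slices
ops = [("conv_a", "W1"), ("conv_b", "W2"), ("conv_c", "W1"), ("conv_d", "W2")]

print(count_loads(ops))                  # 4 -- naive order reloads tensors
print(count_loads(reuse_schedule(ops)))  # 2 -- grouped order loads each once
```

Halving the loads in this toy case halves the movement-related energy, which is the same lever, applied per model and automatically, that Deep Vision says drives ARA-1's efficiency.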