AIStorm raises $13.2M for its analog data-reading sensor chips
AIStorm, a maker of specialized computer processors for sensors that detect events or changes in their environments, is emerging from stealth mode with a $13.2 million early-stage round of funding.
The Series A round was led by Egis Technology Inc., a supplier of biometrics technology for handsets, gaming and advanced driver-assistance systems. Image sensor maker TowerJazz, food preparation equipment manufacturer Meyer Corp. and Linear Dimensions Semiconductor Inc., a maker of biometric authentication and digital health products, also participated in the round.
These aren't the usual investors for a technology startup, but their interest in AIStorm is no surprise given how useful the new chips could be to them.
AIStorm has built an “AI-in-Sensor” system-on-chip that enables faster processing of complex artificial intelligence problems at the very edge of the network. The chip is designed to be integrated within the sensors that are embedded into mobile devices, “internet of things” machinery and self-driving cars, processing data directly within them.
The company reckons that by processing sensor data directly at the edge of the network, it can help to reduce costs and eliminate the security risks of transmitting large amounts of raw data to the cloud to be processed.
Processing data at the edge is nothing new in itself, but the way that AIStorm does it is. Traditional AI systems usually require data to be encoded into a digital format before it can be processed, but sensory data is typically generated in analog form.
Converting this data is costly and time-consuming, and the computation itself requires extremely powerful graphics processing units. But these GPUs aren't suitable for mobile and other low-power devices because they require continuous digitization of input data, which consumes a lot of power.
AIStorm’s low-power SoCs eliminate the need to convert data into a digital format because they process information directly from the sensors while it’s still in its native analog form. The processed data can then be used to train AI and machine learning models for a wide range of tasks, the company said.
“By combining the processing chip with the sensor, we can deliver efficient and extremely fast processing of sensor data right at the edge,” said David Schie, co-founder and chief executive officer of AIStorm. “This offers two huge benefits. It reduces response latency and it eliminates the need to send data over the network to a different processor for machine learning and decision making.”
AIStorm’s technology should be able to speed up data processing and AI in dozens of different industries, so the company is targeting several at once, Schie said. They include smartphones, machine vision, wearable tech, IoT, automotive, food service, AI assistants, security, biometric devices and imaging applications.
These are all markets in which AIStorm’s investors operate, which helps explain why they’re so keen to fund its technology.
“When we presented this concept to potential customers and business partners, they indicated that it was so compelling that they wanted to incorporate it into their offerings as quickly as possible,” Schie said. “They invested in AIStorm to accelerate our time to market.”
Schie added that the company would use the funds to accelerate its go-to-market and engineering efforts in order to “bring a new type of machine learning to the edge.”