UPDATED 08:00 EDT / FEBRUARY 11 2019


AIStorm raises $13.2M for its analog data-reading sensor chips

AIStorm, a maker of specialized computer processors for sensors that detect events or changes in their environments, is emerging from stealth mode with a $13.2 million early-stage round of funding.

The Series A round was led by Egis Technology Inc., a supplier of biometrics technology for handsets, gaming and advanced driver-assistance systems. Image sensor maker TowerJazz, food preparation equipment manufacturer Meyer Corp. and Linear Dimensions Semiconductor Inc., a maker of biometric authentication and digital health products, also participated in the round.

These aren't the usual investors for a technology startup, but their interest in AIStorm is no surprise considering how useful the new chips could be to their own businesses.

AIStorm has built an “AI-in-Sensor” system-on-chip that enables faster processing of complex artificial intelligence problems at the very edge of the network. The chip is designed to be integrated within the sensors that are embedded into mobile devices, “internet of things” machinery and self-driving cars, processing data directly within them.

The company reckons that by processing sensor data directly at the edge of the network, it can help to reduce costs and eliminate the security risks of transmitting large amounts of raw data to the cloud to be processed.

Processing data at the edge is nothing new in itself, but the way AIStorm does it is. Conventional AI systems typically require data to be encoded into a digital format before it can be processed, yet sensor data is generated in analog form.

Converting that analog data is costly and time-consuming, and the resulting digital streams are usually crunched on powerful graphics processing units. Those GPUs aren't suitable for mobile and other low-powered devices, because the pipeline demands continuous digitization of the input data and consumes a lot of power in the process.

AIStorm’s low-powered SoCs eliminate the need to transform data into a digital format because they process information directly from the sensors while it’s still in its native analog form. The processed data can then be used to train AI and machine learning models for a wide range of different tasks, the company said.
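The difference between the two pipelines can be sketched as a toy model. This is purely illustrative, not AIStorm's actual design: the `adc` quantizer, the sample values and the weights are all hypothetical, and real analog computation happens in circuitry, not software.

```python
# Toy illustration (NOT AIStorm's design): a weighted sum of sensor
# readings computed two ways -- via an ADC quantization step, and
# directly on the raw analog values.

def adc(value, bits=8, full_scale=1.0):
    """Quantize an analog value to an n-bit code and back -- the
    digitization step a conventional pipeline must run continuously."""
    levels = 2 ** bits - 1
    clamped = max(0.0, min(value, full_scale))
    code = round(clamped / full_scale * levels)
    return code / levels * full_scale

def weighted_sum(samples, weights):
    """The multiply-accumulate at the heart of machine-learning inference."""
    return sum(s * w for s, w in zip(samples, weights))

samples = [0.11, 0.52, 0.83, 0.30]   # hypothetical analog sensor outputs
weights = [0.25, 0.25, 0.25, 0.25]   # hypothetical model weights

# Conventional path: digitize every sample first, then compute.
digital_path = weighted_sum([adc(s) for s in samples], weights)

# Analog-in-sensor path: compute on the native analog values directly.
analog_path = weighted_sum(samples, weights)

print(f"digitize-then-process: {digital_path:.4f}")
print(f"analog-domain:         {analog_path:.4f}")
```

The point of the sketch is simply that the digitization step is extra work on every sample and slightly perturbs the values; doing the multiply-accumulate in the analog domain skips that step entirely.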

“By combining the processing chip with the sensor, we can deliver efficient and extremely fast processing of sensor data right at the edge,” said David Schie, co-founder and chief executive officer of AIStorm. “This offers two huge benefits. It reduces response latency and it eliminates the need to send data over the network to a different processor for machine learning and decision making.”

AIStorm’s technology should be able to speed up data processing and AI in dozens of different industries, so the company is targeting several at once, Schie said. They include smartphones, machine vision, wearable tech, IoT, automotive, food service, AI assistants, security, biometric devices and imaging applications.

These are all markets in which AIStorm's investors operate, which helps explain why they're so keen to fund its technology.

“When we presented this concept to potential customers and business partners, they indicated that it was so compelling that they wanted to incorporate it into their offerings as quickly as possible,” Schie said. “They invested in AIStorm to accelerate our time to market.”

Schie added that the company would use the funds to accelerate its go-to-market and engineering efforts in order to “bring a new type of machine learning to the edge.”

Image: Beear/Pixabay
