UPDATED 14:32 EDT / DECEMBER 11 2015


Wikibon’s George Gilbert defines the new, machine-learning-based analytics pipeline for IoT

Capturing and managing the huge volumes of data generated by the Internet of Things (IoT), and deriving value from that data, requires a new data analytics architecture, writes Wikibon Big Data Analyst George Gilbert. In his latest Professional Alert, “Recipe For An IoT-Ready Analytic Pipeline,” Gilbert provides a road map for building that new pipeline, first at the IT director level and then for IT architects.

The best way to understand this emerging IoT analytic pipeline, he says, is to find elements in the traditional approach that are changing and extrapolate based on the new requirements. The cost of capturing traditional data manually has stayed roughly constant at $1 billion per terabyte for several decades. But the new IoT data is generated and captured at a marginal cost approaching zero. The new data pipeline must leverage elastic clusters of commodity hardware and software using automated management, bringing the cost of capture and management to as close to zero as possible.

The data pipeline needs to support a much higher data velocity and provide near real-time responsiveness between capturing data and driving action, while still leveraging historical data to improve the context of analytics. It needs to provide converged analytics, supporting both batch and real-time processing, and both business intelligence and machine learning, on any data type.
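One way to picture this convergence is a batch step that summarizes historical data into a baseline, which the real-time path then uses to score each new reading as it arrives. The sketch below is a minimal illustration of that pattern; the function names and sample values are assumptions for this example, not details from the report.

```python
# Hypothetical sketch of converged batch + real-time analytics:
# a batch job distills history into a baseline, and the streaming
# path scores new readings against it with near real-time latency.

def batch_baseline(history):
    """Batch step: reduce historical readings to (mean, std dev)."""
    mean = sum(history) / len(history)
    var = sum((x - mean) ** 2 for x in history) / len(history)
    return mean, var ** 0.5

def stream_score(reading, baseline):
    """Real-time step: how far a new reading sits from the baseline."""
    mean, std = baseline
    return abs(reading - mean) / std if std else 0.0

# Illustrative sensor history (e.g. temperature readings).
history = [70.1, 69.8, 70.3, 70.0, 69.9]
baseline = batch_baseline(history)

print(stream_score(70.2, baseline))   # close to baseline: low score
print(stream_score(85.0, baseline))   # far from baseline: high score
```

In a production pipeline the batch step would run over a data lake and the scoring step inside a stream processor, but the division of labor is the same: history provides context, streaming provides responsiveness.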

An example application is General Electric Co.’s Predix, a software-as-a-service (SaaS) application for predictive maintenance of industrial equipment. It analyzes continual data streams from instrumented machinery to monitor and anticipate maintenance needs for smart, connected products operated by a manufacturer’s customers.

This “messy” data is semi-structured and often originates in analog form from sensors. The structure evolves over time, requiring flexible management. The sources are highly decentralized and in some cases (such as airplanes, automobiles and train engines) in motion. The system needs edge processing capability to separate normal readings from abnormal ones that might indicate a developing issue and send only the latter over the network, which may have low bandwidth and intermittent service.
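The edge-processing step described above can be sketched roughly as follows. The normal range, and the policy of buffering anomalies while the uplink is down, are illustrative assumptions for this example, not details from the report.

```python
# Hypothetical edge filter: normal readings stay local; only abnormal
# readings are forwarded, and they queue up while the link is down.

NORMAL_RANGE = (65.0, 75.0)   # expected sensor band (assumed)
buffer = []                   # anomalies waiting out a network outage

def process(reading, link_up):
    """Decide what happens to one sensor reading at the edge."""
    lo, hi = NORMAL_RANGE
    if lo <= reading <= hi:
        return "dropped"            # normal: never leaves the edge
    buffer.append(reading)          # abnormal: queue for sending
    if link_up:
        sent, buffer[:] = list(buffer), []
        return f"sent {len(sent)}"  # flush the backlog over the uplink
    return "buffered"               # hold until service returns

print(process(70.0, link_up=True))    # normal reading stays local
print(process(90.0, link_up=False))   # anomaly buffered offline
print(process(91.0, link_up=True))    # backlog flushed when link returns
```

Filtering at the edge like this keeps traffic over a low-bandwidth, intermittent link limited to the readings that might actually indicate a developing issue.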

The full alert discusses the new architecture in more detail.

Image via jeferrb
