The excessive cost of moving data means the vast majority of Internet of Things use cases will need to adopt a hybrid cloud solution where data is processed at the edge.
That’s according to a new report from David Floyer, chief technology officer at Wikibon, the research analyst group owned by the same company as SiliconANGLE.
Floyer first pitched his argument for an alternative architecture for IoT deployments last year. Edge computing is used to capture and perform initial analysis of raw data from local sites, passing along only data that is immediately significant to the network and storing the rest locally.
Since his original report, AT&T Inc. has announced a deal with Amazon Web Services that will see the two companies partner on a new IoT networking solution that relies on cellular networks to transmit data back to the cloud. But while this newer solution is more cost-effective than the “cloud only” model that sees all data transmitted to the cloud, Floyer’s updated report, “The Vital Role of Edge Computing for IoT: 2016 Update,” shows that a hybrid IoT model where most computing is done at the edge still offers much greater value and reliability.
In his updated research, Floyer presents a case study of a wind farm site equipped with 100 sensors and two video streams. The study compares the total cost of managing and processing data on three different architectures: cloud-only processing with a dedicated network, AT&T’s new cellular network with hardware and cloud processing, and edge-and-cloud processing with a dedicated network.
In the first scenario, Floyer calculates that the cost of transmitting the data, plus cloud and equipment costs, works out to $254,552 a year over the three-year study period, while the total cost of the cellular-based network comes to $113,884 a year.
By contrast, an edge-computing approach would cost about a third of AT&T’s cellular-based solution, at just $37,628 per year, Floyer’s calculations reveal.
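The annual figures above come straight from the report; a quick sketch makes the comparison concrete. The labels and grouping below are illustrative, not from Floyer’s study:

```python
# Annual costs per architecture, as reported in Floyer's case study.
annual_costs = {
    "cloud-only (dedicated network)": 254_552,
    "cellular (AT&T/AWS)": 113_884,
    "edge-and-cloud (dedicated network)": 37_628,
}

edge = annual_costs["edge-and-cloud (dedicated network)"]
for name, cost in annual_costs.items():
    # Express the edge option as a fraction of each alternative.
    print(f"{name}: ${cost:,}/yr; edge is {edge / cost:.0%} of this cost")
```

Running this confirms the article’s framing: the edge approach is roughly a third of the cellular option’s cost and about 15 percent of the cloud-only option’s.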
Edge computing provides other advantages too, including greater availability, reliability and performance, Floyer’s study shows. Moreover, the report takes into account the impact new technology trends will have on the costs of communicating data and concludes that these costs will dominate and get comparatively bigger in the future, further strengthening the case for edge computing.
There will still be some scenarios where non-edge architectures are required, where the value of having the data at the center is greater than at the edge. One example is the supercomputing requirements of weather models, where all the data has to be available in a single, very large compute environment. Another is electrical power networks, which require specific IoT data to be transmitted to a central location, where it can be processed and monitored to minimize the risk of blackouts across the whole network.
However, in the vast majority of IoT use cases, non-edge computing architectures will become economically unfeasible. “Wikibon projects that 95% of IoT data will live and die at the Edge, and this will grow to 99% over the next decade,” Floyer writes. “A strategy to move all IoT data to the ‘cloud’ will be long-term economic suicide for most enterprises.”