How companies are taking aim at ever-changing datasets


As big data finds its way into more business processes, the chatter between connected devices is generating its own developments and studies, though the highly automated nature of those exchanges produces a high volume of noise that must be filtered out before analysis.

In the work of human-guided analysis, “there’s a balance between how much really is special and how much you can find patterns. And that’s really where you get into much more interesting things,” said Katharine Matsumoto (pictured, right), data scientist at Eero Inc.

Matsumoto and Ravi Dharnikota (pictured, left), chief enterprise architect at SnapLogic Inc., sat down to speak with Jeff Frick (@JeffFrick) and George Gilbert (@ggilbert41), co-hosts of theCUBE, SiliconANGLE Media’s mobile live streaming studio, at BigData SV 2017 in San Jose, CA. (*Disclosure below.)

The discussion addressed issues of changing connectivity standards, the move toward streaming and the importance of teaching systems to understand data differentiation, among other topics.

Finding flexible data management solutions

San Francisco startup Eero is driven to improve home connectivity performance and ease of use, and it chose SnapLogic as its partner for handling data interpretation, management and other facets of its business. The need for something flexible enough to handle future changes in application programming interfaces, interface protocols, Software as a Service offerings and similar aspects was what drew Eero to SnapLogic, Matsumoto explained.

As part of its services, Eero provides a Wi-Fi meshing service powered by small devices installed throughout a customer’s home, allowing for fuller coverage and minimizing the usual issues, such as dead zones or slow buffering in remote areas of the house.

Eero’s devices also transmit data about that coverage back to the business, which analyzes it to improve its services. It’s in this part of the connection that much of the company’s data translation is done, including examinations of how different sorts of traffic, such as streaming video and app-based content, behave in different contexts of the network, Matsumoto explained.

Changing times

“The dataset itself is changing,” Dharnikota noted, as he addressed ways in which companies are having to adapt to those fundamental changes, not just in volume and management of data, but in the way the data is approached and understood.

“You’re seeing a lot of the document data models that are being offered by the SaaS services, so that the old, sort of ETL companies that were built before all of this social/mobile sort of stuff came around, was all row and column oriented. So how do you deal with the more document-oriented … sort of stuff? We built the platform to be able to handle that kind of data,” Dharnikota said.
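Dharnikota doesn’t detail how SnapLogic bridges the two worlds, but the general idea of mapping document-oriented records onto a row-and-column shape can be sketched as follows. This is a minimal Python illustration; the record and its field names are hypothetical, not from the interview:

```python
import json

def flatten(doc, prefix=""):
    """Recursively flatten a nested document into flat column/value pairs,
    joining nested keys with dots so each leaf becomes one column."""
    row = {}
    for key, value in doc.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=col + "."))
        else:
            row[col] = value
    return row

# A document-style record, as a SaaS API might return it (hypothetical data).
record = json.loads('{"user": {"id": 7, "plan": "pro"}, "active": true}')
print(flatten(record))
# {'user.id': 7, 'user.plan': 'pro', 'active': True}
```

The output row can then feed a conventional row/column pipeline, which is one common way older ETL-style tooling accommodates document data models.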

And the changes extend to how data is shared within a single business’s infrastructure, as streaming looms larger in the minds of businesses, not just as a way of watching videos, but as a way of transmitting apps and functionality at near-real-time speeds.

“But batch also has its place,” Dharnikota said, with that recognition informing the design of SnapLogic’s system to handle both as needed, keeping the company ready to continue adapting as the future arrives.
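The interview doesn’t describe how SnapLogic’s platform unifies the two modes, but one common design that supports both is to process records incrementally, so a bounded batch and an unbounded stream share the same code path. A minimal Python sketch, with hypothetical record shapes:

```python
from typing import Iterable, Iterator

def process(records: Iterable[dict]) -> Iterator[dict]:
    """Process records one at a time: the same pipeline serves both
    a finite batch (a list) and an unbounded stream (a generator)."""
    for record in records:
        yield {**record, "processed": True}

# Batch mode: a finite list, consumed all at once.
batch = [{"id": 1}, {"id": 2}]
print(list(process(batch)))
# [{'id': 1, 'processed': True}, {'id': 2, 'processed': True}]

# Streaming mode: a generator yielding records as they arrive.
def stream() -> Iterator[dict]:
    yield {"id": 3}

print(next(process(stream())))
# {'id': 3, 'processed': True}
```

Because `process` only ever touches one record at a time, it never needs to know whether its input terminates, which is the property that lets batch and streaming coexist in one pipeline.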

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of BigData SV 2017. (*Disclosure: Some segments on SiliconANGLE Media’s theCUBE are sponsored. Sponsors have no editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE