UPDATED 18:00 EDT / OCTOBER 10 2017

BIG DATA

Big data stewardship, mobility build a foundation for machine learning

When it comes to business, machine learning is a powerful and disruptive technology, thanks to its ability to teach itself from vast amounts of “internet of things” data sourced from employee and manufacturing devices, such as laptops, equipment sensors and shipment trackers. But this type of cognitive computing couldn’t happen without tools from emerging open-source efforts, such as the Hadoop data management platform.

“What we’re finding is Hadoop and the big data space are uniquely positioned to solve these problems, both from quality control and process and management. You can get better uptime, better quality,” said Gus Horn (pictured), global consulting engineer of big data analytics and chief technology officer ambassador at NetApp Inc.

Horn spoke with John Furrier (@furrier) and Jim Kobielus (@jameskobielus), co-hosts of theCUBE, SiliconANGLE’s mobile livestreaming studio, during the recent BigData NYC event in New York. Horn talked about data stewardship, machine learning and the role of cloud computing in these emerging tech trends. (* Disclosure below.)

Value from timely and historical data

As a data steward, NetApp manages data in a way that lets companies put it to use for machine learning. Proper management, including the architecture underneath, is critical for agility in the big data environment, Horn stated. The technology is evolving, and businesses must keep up.

Meanwhile, the cloud is taking on an integral role in business. As a data steward, NetApp guards data, but it also recognizes that data is a perishable commodity whose greatest value is in the moment. Through NetApp products, companies can move data to where it needs to be, when it needs to be there, whether that’s in the cloud or on-premises.

Historical data is also important, because it acts as a memory of sorts for machine learning, providing both a training foundation and context. “You’ll have to transform that data, because there’s a lot of noise, and the noise isn’t that important. It’s those anomalies within the stream of data that you need to capture and use as your training data,” Horn said.
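
Horn’s point suggests a simple pipeline: establish a baseline from the raw stream, discard readings that match it, and keep the outliers as training examples. Below is a minimal Python sketch of that idea using a rolling z-score over simulated sensor data; the window size, threshold and synthetic stream are illustrative assumptions, not NetApp’s implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated equipment-sensor stream: mostly steady readings (the "noise"),
# with a few injected spikes standing in for the anomalies Horn describes.
stream = rng.normal(loc=100.0, scale=2.0, size=1_000)
stream[[120, 487, 902]] += 25.0  # hypothetical fault events

def extract_anomalies(values, window=50, z_threshold=4.0):
    """Return indices of readings whose rolling z-score exceeds the threshold."""
    hits = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(values[i] - mu) / sigma > z_threshold:
            hits.append(i)
    return hits

# Keep only the anomalous readings as candidate training data;
# the routine noise is filtered out.
anomaly_idx = extract_anomalies(stream)
training_data = stream[anomaly_idx]
print(f"Kept {len(training_data)} of {len(stream)} readings as training data")
```

In practice the anomalous windows, not just single readings, would be labeled and fed to a model, but the filtering step is the same: the bulk of the stream is baseline, and only the deviations carry training signal.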

Moving to a machine learning approach also means a change in technology for many businesses. “I think you have to embrace the cloud, and that’s one of the key attributes that NetApp brings to the table,” Horn said. The cloud is vital for scale and agility, but NetApp also seeks to make it easy for companies to move their data. The company sees data mobility as part of its stewardship role.

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of BigData NYC 2017. (* Disclosure: NetApp Inc. sponsored this segment of theCUBE. Neither NetApp nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
