Breakneck-speed data processing required in IoT era
While data processing speed is of the utmost importance to businesses today, veterans of tech remember a time when implementation took so long it warranted a coffee break. As the amount of accessible data has grown, so have efforts to cut processing times from 10 minutes or more down to fractions of a second.
Tech veteran Manfred Buchmann (pictured, left), vice president of system engineering, EMEA, at NetApp Deutschland GmbH, a subsidiary of digital transformation company NetApp Inc., has worked in the industry long enough to remember such lengthy processes, and he is currently working to eliminate delays in data processing.
The availability of information through the "internet of things" has only made processing speed a higher priority. NetApp has been working with specialists like Mark Carlton (pictured, right), independent information technology consultant and NetApp A-Team member, to leverage partner platforms and SolidFire — a maker of all-flash storage systems acquired by NetApp in 2016 — to help customers keep pace with tech's new frontier.
Buchmann and Carlton spoke with Rebecca Knight (@knightrm) and Peter Burris (@plburris), co-hosts of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, during the NetApp Insight event in Berlin, Germany. They discussed the changing state of data processing across industries and how NetApp is working with customers to streamline the transition. (* Disclosure below.)
“Data’s sprawling, and it’s spreading so fast. … I think that’s one of the biggest challenges — knowing what data you have, how to use it, and how to get the most out of the data in the right place,” Carlton said.
Scale, flexibility open doors
The core technology NetApp provides its clients seamlessly connects an on-premises data center with the public cloud across different protocols, enabling secure data storage, migration and contextualization. And the data management suite NetApp has been developing since its acquisition of SolidFire has expanded customers' options for faster operations and application deployment, according to Carlton.
“They’ve got scale, they’ve got flexibility. … It’s really starting to open those doors,” Carlton said. This integration is key to NetApp’s Data Fabric strategy of expanding its ability to seamlessly connect across public cloud and on-premises platforms.
For businesses seeking a competitive solution that best fits their needs, Buchmann advised first determining what is lacking in internal processes.
“It’s not only … the speed of the system, where you apply the data, but … what you are doing with your data on the support side. … We may give them some kind of guidance, but … the decision is something the customer needs to make … because they have the knowledge on the implementation side,” Buchmann concluded.
Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of NetApp Insight Berlin. (* Disclosure: TheCUBE is a paid media partner for the NetApp Insight Berlin event. Neither NetApp Inc., the event sponsor, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)
Photo: SiliconANGLE