UPDATED 16:37 EST / SEPTEMBER 05 2018

INFRA

HPE themes at VMworld 2018 focus on protecting data — especially big data

Coming off a stronger-than-expected quarterly earnings report, Hewlett Packard Enterprise Co.’s Patrick Osborne (pictured), vice president and general manager of big data and secondary storage, spoke with Dave Vellante (@dvellante) and David Floyer (@dfloyer), co-hosts of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, during the VMworld conference in Las Vegas.

They discussed the relevance of data protection in the era of digital transformation, why it’s such a hot topic in the VMware Inc. ecosystem, and what HPE is doing in big data. (* Disclosure below).

Data protection: A topic of great interest

Data protection is on fire, with incumbents protecting their installed bases and upstarts with gobs of venture capital money trying to get enterprises to adopt their products. Why is data protection so hot? There are several reasons, according to Osborne, including “this new style of IT applied to secondary storage. We saw that with primary storage the last few years — the move toward simplification. With multicloud, the move to all-flash, and low-latency workloads, organizations want to spend less time doing the heavy lifting of infrastructure. There are a lot of things disrupting secondary storage. People want to do it in different ways; they want to be able to simplify data protection. And because data is growing so fast in general, organizations want to spend their time making data work for them, not doing tasks that don’t add value.”

HPE and VMware have a partnership that spans almost two decades, and HPE is dedicated to providing infrastructure to support VMware’s products, according to Osborne.

“We want to make sure that for our customers who are choosing VMware, our infrastructure is the first choice — servers, networking, storage. And we want to put as much context into the VMware management plane to make that very simple for them to use and stand up,” Osborne said.

Beyond total cost of ownership

The dominant conversation in data protection has historically been around backup and costs, according to Osborne. While costs are still vital to IT pros, TCO is no longer the sole consideration. That’s because, Osborne says, data protection is like insurance: “Who likes to pay their life insurance premium? Because at the end of the day, I’m not going to derive any utility from that payment. So now the data protection discussion is moving more toward ROI. We have things like the Hybrid Flash Array from Nimble, for example. It allows you to put your workloads and data to work to derive business value. We have a great cloud service, called HPE Cloud Volumes, that we use for our customers to be able to do intelligent DR as a service and apply cloud compute to your data. So there’s a lot going on in this space outside of your traditional move data from point A to point B for backup. Now the focus is on making data work and getting more business value from investments.”

HPE’s big data play

The conversation turned to the topic of big data. While the term perhaps isn’t as widely used today as it was four or five years ago, so-called big data has gone mainstream. HPE customers are indicative of the broader market, and getting value from data is a big theme in the company’s customer base, Osborne said. Vellante pointed out that Hadoop used to be the big hot topic, and now the conversation has evolved to artificial intelligence, machine learning, and machine-driven, software-based intelligent systems.

Osborne responded by taking the discussion directly to the edge. The internet of things is a big focus area for HPE. According to Osborne, “At HPE, we’re definitely focused on the whole edge-to-core analytics story. We have a great story, and you can see it in the numbers from Q3. The edge business, Edgeline servers, Aruba — these areas are driving a lot of growth in the company, because that’s where a lot of the data is being created. And then customers are moving data back into the core data centers.”

He sees a relationship among the edge, IoT and big data, and said HPE sees a number of customers using these tools and practices as part of their digital transformation strategies. Osborne views data, data protection, and IoT/the edge as intertwined and very much on the minds of every organization — including HPE itself.

Workload profiles are changing as well, according to Osborne. Specifically, he said, “They’re moving from batch-oriented to fast data in the form of streaming analytics. And then incorporating concepts of AI and ML to provide better service or a better experience for their customers. And we’re doing that with, for example, InfoSight. So we have a great product lineup for this space with Nimble and 3PAR, and we also provide a service on top of that, which is a SaaS-based service. It has predictive analytics and machine learning. And we’re able to do that by using big data analytics and helping customers get more value from their data.”

Embedding predictive analytics into infrastructure

Nimble was an acquisition HPE made last year, which brought the InfoSight technology to HPE. The company has been leveraging InfoSight across its other platforms. The play, according to Osborne, is to provide those predictive analytics and make recommendations that optimize the infrastructure in terms of performance, availability and cost. The technology uses machine learning, and because it’s delivered as a SaaS service, Osborne said, it scales across the entire portfolio.
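HPE doesn’t publish InfoSight’s models, but a minimal sketch can illustrate the general idea behind predictive analytics on infrastructure telemetry: fit a trend to utilization data and recommend action before a problem hits. The telemetry values, the 90-percent threshold and the choice of a simple linear model below are illustrative assumptions, not HPE’s implementation.

```python
# Illustrative sketch only -- not HPE InfoSight's actual models.
# Fits a simple trend to hypothetical capacity telemetry and forecasts
# when an array would cross a utilization threshold.
import numpy as np
from sklearn.linear_model import LinearRegression

days = np.arange(30).reshape(-1, 1)  # day index (feature)
# Hypothetical telemetry: percent of array capacity used, sampled daily.
capacity_used = 40 + 0.9 * days.ravel() + np.random.normal(0, 1.5, 30)

model = LinearRegression().fit(days, capacity_used)
growth_per_day = model.coef_[0]

# Forecast how many days remain until a 90% utilization threshold.
current = model.predict([[29]])[0]
days_left = (90 - current) / growth_per_day if growth_per_day > 0 else float("inf")
print(f"Projected days until 90% capacity: {days_left:.0f}")
```

A production system would obviously use far richer telemetry and models across a whole install base; the point is simply that forecasts, not after-the-fact alerts, drive the recommendations.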

Osborne said HPE has tens of thousands of users accessing the service on a daily basis. He explained that HPE itself has moved from an ERP-like data warehouse system with batch analytics toward a more real-time approach built on Elasticsearch, Kafka and other modern techniques, “so it’s really helped us unlock a lot of value for our customers.”
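The shift Osborne describes, from nightly batch loads into a warehouse toward streaming pipelines built on Kafka and Elasticsearch, can be sketched in a few lines. The topic, index and host names here are hypothetical, and the snippet is a generic illustration of the pattern rather than anything specific to HPE’s environment.

```python
# Minimal sketch of a streaming pipeline: consume telemetry events from
# Kafka and index them into Elasticsearch for near-real-time analytics.
# Topic, index and host names are hypothetical.
import json

from kafka import KafkaConsumer          # pip install kafka-python
from elasticsearch import Elasticsearch  # pip install elasticsearch

consumer = KafkaConsumer(
    "device-telemetry",                        # hypothetical topic name
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
es = Elasticsearch("http://localhost:9200")

for message in consumer:
    event = message.value
    # Index each event as it arrives, instead of waiting for a nightly batch load.
    es.index(index="telemetry", document=event)
```

The contrast with the old approach is the loop itself: events are queryable seconds after they are produced, rather than after the next scheduled warehouse load.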

What about cloud?

According to Osborne, “It’s a hybrid world, so our customers are going to expect, from us, as a portfolio vendor, the ability to provide an automated solution, on-premises, as automated as what you’d get in the cloud.”

HPE is moving toward a sourcing experience the company calls GreenLake, a flexible capacity model. The idea is that customers can buy everything from HPE, from a solution perspective, in a pay-as-you-go elastic model where they can flex their costs up and down. In terms of who HPE targets as the customer, Osborne said, “Obviously we’ve been focused on the infrastructure persona, and often we’re talking to the DevOps folks, the cloud engineers, and anyone providing infrastructure services to their clients, whether it’s compute, networking or storage … the idea is we want our customers to easily plug into all these frameworks, whether it’s Ansible, Chef, or whatever automation services the ecosystem delivers.”

Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the VMworld conference. (* Disclosure: Hewlett Packard Enterprise Co. sponsored this segment, with additional broadcast sponsorship from VMware Inc. HPE, VMware, and other sponsors do not have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
