UPDATED 09:34 EDT / JUNE 05 2014

What data is “ripe for deletion”? | #IBMEdge

Thanks to Marc Andreessen, the tech industry has known since 2011 that “software is eating the world.” Steve Wojtowecz, VP of Storage and Network Management on the software side of IBM, visited theCUBE to discuss just how much software can benefit compute, network, and especially storage, helping businesses cut their costs drastically.

 

The world is going software-defined

 

Wojtowecz laid out the core of software-defined compute, storage, and network: the abstraction layer. This layer acts as an intermediary between the hardware and the applications and users above it. It allows IBM clients to “mix and match hardware,” to “write the application to a single interface and take advantage of all those underlying physical resources and create an abstraction.” The trick, he says, is to “do it in a heterogeneous way.” This means the tech provider must “abstract all of the physical characteristics and the uniqueness of those physical devices.”

For Wojtowecz, if the application “has to write a specific API to access a specific resource,” then “that’s not a level of abstraction.” At IBM, he says, they offer a “software-defined environment and it encapsulates computational resources, servers, networks, and storage.”
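
The idea reads more concretely as code. Below is a minimal, hypothetical Python sketch (not IBM code; every class and function name is invented) of an application writing to a single virtual interface while heterogeneous devices sit behind it:

```python
from abc import ABC, abstractmethod


class StorageDevice(ABC):
    """One vendor-specific device; each subclass hides its own quirks and APIs."""

    @abstractmethod
    def write_block(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def read_block(self, key: str) -> bytes: ...


class VendorAArray(StorageDevice):
    def __init__(self):
        self._blocks = {}

    def write_block(self, key, data):
        self._blocks[key] = data      # stands in for vendor A's proprietary call

    def read_block(self, key):
        return self._blocks[key]


class VendorBFiler(StorageDevice):
    def __init__(self):
        self._files = {}

    def write_block(self, key, data):
        self._files[key] = data       # stands in for vendor B's proprietary call

    def read_block(self, key):
        return self._files[key]


class VirtualStoragePool:
    """The single interface the application writes to; devices can be mixed and matched."""

    def __init__(self, devices):
        self.devices = devices

    def _pick(self, key):
        return self.devices[hash(key) % len(self.devices)]   # placement policy lives here

    def put(self, key, data):
        self._pick(key).write_block(key, data)

    def get(self, key):
        return self._pick(key).read_block(key)


pool = VirtualStoragePool([VendorAArray(), VendorBFiler()])
pool.put("orders/2014-06-05", b"invoice data")
print(pool.get("orders/2014-06-05"))
```

The application only ever calls `put` and `get`; which vendor’s hardware actually holds the bytes is the abstraction layer’s concern.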

  • The key to storage is virtualization

Shifting his focus to storage specifically, Wojtowecz explained how adding virtualization allows companies to “include management, include built-in analytics, include the ability to protect the data that is being created by those applications, and then have the application physically interact with […] that software interface.” With these features, Wojtowecz continued, “you can move data around between systems, you can deprecate systems, you can pull a system off for maintenance, you can add systems, you can do whatever you want underlying it with zero downtime from the application because that’s that software layer’s job.”

Below the software layer, though, there’s a variety of “turmoil that could be happening underneath.” Like “filling up volumes,” Wojtowecz mentioned, or “hardware going off a lease or being deprecated, or being full, or Tier 1 vs. Tier 2 vs. flash, being able to move workloads around from hot data to cold to long term to on-prem to off-prem.” Even with that agitation below, Wojtowecz says, “applications continue to run, that’s what it’s all about: 100 percent availability of the things accessing those physical layers.”
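
A hypothetical sketch (invented names, not an IBM product) illustrates that zero-downtime idea: the virtualization layer remaps a volume from one tier to another while the application keeps reading through the same handle.

```python
class Tier:
    """A pool of physical capacity at one service level (e.g. flash vs. spinning disk)."""

    def __init__(self, name):
        self.name = name
        self.volumes = {}


class VirtualizationLayer:
    """Applications read through this layer; it can remap volumes between tiers at will."""

    def __init__(self, tiers):
        self.tiers = {t.name: t for t in tiers}
        self.volume_map = {}                     # volume id -> tier name

    def create_volume(self, vol_id, data, tier_name):
        self.tiers[tier_name].volumes[vol_id] = data
        self.volume_map[vol_id] = tier_name

    def read(self, vol_id):
        # The application never sees which tier is actually serving the request.
        return self.tiers[self.volume_map[vol_id]].volumes[vol_id]

    def migrate(self, vol_id, target_tier):
        # Copy first, then flip the mapping: reads keep succeeding throughout.
        source = self.volume_map[vol_id]
        data = self.tiers[source].volumes[vol_id]
        self.tiers[target_tier].volumes[vol_id] = data
        self.volume_map[vol_id] = target_tier
        del self.tiers[source].volumes[vol_id]


layer = VirtualizationLayer([Tier("tier1-flash"), Tier("tier2-disk")])
layer.create_volume("db-logs", b"hot data", "tier1-flash")
layer.migrate("db-logs", "tier2-disk")           # churn below the software layer...
print(layer.read("db-logs"))                     # ...access from above is unchanged
```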

  • Why built-in management is essential

Responding to Stu Miniman’s question about managing server virtualization, Wojtowecz said, “there are things from a management perspective where you might have to go to that specific device to be able to do it.” But, he explained, there are tools “that dive deep into the characteristics of IBM and non-IBM.”

These tools “understand the difference between block and file, understand Tier 1 vs. Tier 2, understand the policies of the workloads that are coming in and where they need to be.”

Indeed, Wojtowecz said that without “built-in management with analytics and the ability to automate,” the technology of virtualization would either take “a lot of manual horsepower or extreme knowledge of lessons learned or best practices” to manage efficiently.
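
As a rough illustration of what such policy-aware management might look like (a hypothetical example, not an IBM tool), a placement rule can map a workload’s characteristics to block or file and to a tier:

```python
def place_workload(io_type: str, latency_sla_ms: float) -> dict:
    """Map a workload's characteristics to a storage class (toy rules for illustration)."""
    protocol = "block" if io_type in ("database", "vm") else "file"
    tier = "Tier 1" if latency_sla_ms < 5 else "Tier 2"
    return {"protocol": protocol, "tier": tier}


print(place_workload("database", latency_sla_ms=2))   # {'protocol': 'block', 'tier': 'Tier 1'}
print(place_workload("backup", latency_sla_ms=50))    # {'protocol': 'file', 'tier': 'Tier 2'}
```

Encoding rules like these is what lets the automation replace the “manual horsepower” Wojtowecz mentions.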

  • “Data is the next world currency”

Citing IBM CEO Ginni Rometty’s prediction, Wojtowecz explained that the company that can help clients manage and protect their data will be “a very trusted advisor and confidant.” IBM, Wojtowecz said, wants to be that company: “Data is the new currency of the IT industry and we want to make sure that we continue to have our control over where it goes, how it goes, how it’s managed, and be able to do analytics and protection of it.”

  • Efficiency is key to delivering operational solutions 

Replying to Miniman’s question about how far IBM can go with the operational solutions they offer, Wojtowecz said, “On the storage side, we’ve got many clients out there that are managing multiple petabytes with one admin.” Beyond implementing a virtual storage system with “virtualization capabilities in addition to the management,” Wojtowecz added, “the trick is exposing characteristics and analytics to those admins to make them very, very efficient. And then providing a single user interface, a single console to those devices to be able to do what you want to do and take advantage of automation.”

  • Analytics help compensate for human error

In reference to automation, Dave Vellante asked Wojtowecz, “Are we going to be able to place the right data on the right device for the most optimal cost savings?”

“If we’re not there,” Wojtowecz said, “then we’re really close.” Today, he explained, built-in analytics can observe usage patterns and extrapolate a certain number of characteristics. Based on those characteristics, the system can recommend a course of action, whether it’s moving data from Tier 1 to Tier 2 or moving from an on-prem cloud environment to a hybrid approach. Wojtowecz also cited IBM’s Watson, explaining that analytics engines use that same type of learning to make recommendations “based on world events or the typical usage of this type of data” about the type of applications that clients should use, or the type of database on which they should rely.
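
A toy heuristic along those lines might look like the following (an assumption for illustration, not IBM’s analytics engine): observe access characteristics, then recommend a placement.

```python
from datetime import datetime, timedelta


def recommend_placement(last_access: datetime, reads_last_30_days: int) -> str:
    """Suggest a placement from simple usage characteristics (toy thresholds)."""
    age = datetime.now() - last_access
    if reads_last_30_days > 1000 and age < timedelta(days=1):
        return "Tier 1 (flash)"            # hot: keep on the fastest tier
    if age < timedelta(days=90):
        return "Tier 2 (disk)"             # warm: cheaper capacity tier
    return "off-prem archive"              # cold: long-term, lowest cost


# Data untouched for 200 days and barely read gets pushed toward the archive.
print(recommend_placement(datetime.now() - timedelta(days=200), reads_last_30_days=2))
```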

 

Clients need guidance to understand what data is “ripe for deletion”

 

Despite the savings potential, Wojtowecz explained, “a paper was just published last week by ITG that says less than 20 percent of all storage resources are virtualized.” Wojtowecz attributed this hesitancy to businesses’ unease when it comes to getting rid of data: “they don’t want to throw anything away because we all know the minute you delete something is the minute you need it.”

It is the responsibility of tech companies, Wojtowecz said, “to help customers get control of the infrastructure that’s storing all these things.” It’s a matter of helping clients reach the point where they’re comfortable getting rid of data because they understand “the risk to keep it will outweigh the benefit of anything [they] might do with it in the future.”

  • Virtualization on storage has as much, or more, benefit than the computational side

One of the benefits of virtualization, analytics, and automation, Wojtowecz mentioned, is “if you take one terabyte of data on Tier 1 and […] move it to Tier 2 based on policies within that virtualization layer, you can actually save thirteen million dollars over five years.” And, “11 million bucks can be saved by simply improving the capacity from an average of 30 percent right now to an average of 70 or 80 percent with a virtualized environment.” The cost savings for virtualization done right can be tremendous.
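
The utilization claim can be sanity-checked with back-of-envelope arithmetic: if usable demand stays fixed and average utilization rises from roughly 30 percent to 75 percent, far less raw capacity has to be purchased. The capacity and cost figures below are illustrative assumptions, not numbers from the interview or the ITG paper.

```python
# Hypothetical inputs: what a client needs and what raw capacity costs.
usable_tb_needed = 1000                      # usable capacity the applications require
cost_per_raw_tb = 500                        # assumed fully loaded cost per raw TB (USD)

raw_tb_at_30_pct = usable_tb_needed / 0.30   # raw capacity bought at 30% utilization
raw_tb_at_75_pct = usable_tb_needed / 0.75   # raw capacity needed at 75% utilization

savings = (raw_tb_at_30_pct - raw_tb_at_75_pct) * cost_per_raw_tb
print(f"Raw TB avoided: {raw_tb_at_30_pct - raw_tb_at_75_pct:,.0f}")
print(f"Cost avoided:   ${savings:,.0f}")
```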

The hurdle tech companies must help “data centers, customers, line of business, applications, [and] databases” overcome, he said, is getting “comfortable with virtualization on storage,” and realizing that it has “as much, or more, benefit than the computational side on VMware or z/VM.”

