UPDATED 10:21 EDT / DECEMBER 06 2012

Data Availability Delivering Business Results: Enabling Faster Decisions and Understanding Disparate Relationships

Data has always been important in business. However, the last few decades have seen an explosion in data unlike anything ever previously encountered. Not only do businesses have vastly more information to deal with, but they need that data to be available non-stop, at any time.

Enterprises are now re-thinking not only their approach to data storage, but also how that storage fits in with the rest of their computing environment. Not long ago, it was acceptable to view computing, networking and data storage as separate entities that only occasionally needed to interact with each other. Today, the need to exchange information between systems is a fundamental force in businesses. Systems have to be designed and built from the ground up to be inter-linked layers in a single stack, all working together as part of an efficient, agile data and computing infrastructure.

So why exactly are data demands growing at an exponential rate? New high-performance applications and systems are creating opportunities for critical insights that lead directly to business action. At the same time, the explosive growth of social networks, sensors and mobile devices is generating massive amounts of data. Industries and professions across the board – entertainment, law, telecommunications, and research and academic institutions – are benefitting from this opportunity.

Media: Hundreds of millions of people worldwide now use mobile phones to take pictures and videos. Popular Internet social media and photo-hosting services – Facebook, Picasa, Flickr and others – are believed to hold more than 100 billion photographs, a number that grows every day. Last year, as part of an exhibit, an art gallery in Amsterdam printed a copy of every photograph uploaded to Flickr in a single day. The result was a waist-high lake of prints that spilled through all of the gallery's several large rooms.

Research: At the other end of the size spectrum is the Large Hadron Collider, the particle accelerator on the Swiss-French border with a circumference of 17 miles. Each month the LHC produces more than two petabytes of data (2,000 terabytes) that must be stored and analyzed, making this one experiment the largest data processing system in the world.

Law: Even businesses that have nothing to do with photography or physics are finding themselves challenged by the demands of data’s new reality. Westlaw, the legal research service of Thomson Reuters, stores more than five billion legal records, some of them dating back to the 1700s. Lawyers around the world count on being able to access this information whenever they need to, meaning that continuous uptime is a major Westlaw priority.

Service Provider: In fact, 24/7 uptime is becoming the rule for many businesses, such as Internet service providers. PeakColo, a cloud services company headquartered in Denver, guarantees its customers 100% uptime as part of a contractual service level agreement. That means PeakColo can no longer afford even a few minutes of scheduled downtime, be it to upgrade some software or to switch over to a new bank of servers.
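
To put such guarantees in perspective, here is a quick back-of-the-envelope calculation (an illustrative sketch, not drawn from PeakColo's figures) of how much annual downtime each common availability level permits. A contractual 100% leaves no window at all, even for planned maintenance.

```python
# Back-of-the-envelope: annual downtime allowed at common availability levels.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for availability in (0.999, 0.9999, 0.99999, 1.0):
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.3%} uptime -> {downtime:8.1f} minutes of downtime/year")

# 99.900% uptime ->    525.6 minutes of downtime/year
# 99.990% uptime ->     52.6 minutes of downtime/year
# 99.999% uptime ->      5.3 minutes of downtime/year
# 100.000% uptime ->      0.0 minutes of downtime/year
```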

Operating an IT system in this new world means rethinking traditional computer architectures. Specifically, enterprises need the ability to move data instantly and seamlessly within their computing infrastructure. That's quite a change from past practice, in which the data for a given application was siloed on a single machine, such as one server. If that machine went down, so did the application. And if the application was sufficiently mission-critical, the business often ground to a halt as well. If there is a single word that best captures the new approach to data, it's "agility."

Agility means changes in both hardware and software, with flexibility and interoperability becoming key elements. For example, diverse storage hardware products get pooled together, with robust data management software allowing users to “see” the combined storage as a single resource that can be divided up as needed by the demands of the business, even on the fly. Components can be swapped in and out, meaning that additional storage capabilities can be added without first having to take the entire IT system down. This new approach is to data storage what virtualization is to servers.
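
To make the pooling idea concrete, the short sketch below models a pool that aggregates devices of different sizes and provisions volumes from the combined capacity. The class and method names are hypothetical illustrations of the concept, not any vendor's actual API.

```python
# Illustrative sketch of storage pooling: several heterogeneous devices are
# presented as one logical capacity pool from which volumes are provisioned.
# All names here are hypothetical, for explanation only.

class StoragePool:
    def __init__(self, devices):
        # devices: mapping of device name -> capacity in GB
        self.devices = dict(devices)
        self.allocated = {}  # volume name -> size in GB

    @property
    def total_gb(self):
        return sum(self.devices.values())

    @property
    def free_gb(self):
        return self.total_gb - sum(self.allocated.values())

    def add_device(self, name, capacity_gb):
        # New hardware can join the pool without taking the system down.
        self.devices[name] = capacity_gb

    def provision(self, volume, size_gb):
        # Users see only the combined capacity, not the individual devices.
        if size_gb > self.free_gb:
            raise ValueError("insufficient free capacity in pool")
        self.allocated[volume] = size_gb

pool = StoragePool({"array-a": 2000, "array-b": 4000})
pool.provision("crm-db", 1500)
pool.add_device("array-c", 8000)   # capacity grows on the fly
print(pool.free_gb)                # 12500 GB still available
```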

A New Approach

Hardware complexities, including different models of systems running side by side, are hidden from users by a smart, flexible storage operating system. If one part of the system goes down due to failure or scheduled maintenance, there is enough data redundancy and resiliency built into the system that the business can continue operating without interruption. This is true at every level of the hardware-software storage "stack." Users have long been able to swap out raw disks without incurring downtime. Now, though, they have the same ability with disk controllers, which operate at a much higher level.
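
The redundancy idea can be sketched in a few lines: if every write is mirrored to two controllers, a read can fail over transparently when one controller is taken offline. The names here are hypothetical, and real systems use far more sophisticated replication, but the principle is the same.

```python
# Illustrative sketch of redundancy: every write is mirrored to two
# controllers, so a read can fail over transparently if one is offline.
# All names are hypothetical, for explanation only.

class Controller:
    def __init__(self, name):
        self.name = name
        self.online = True
        self.blocks = {}

    def write(self, key, data):
        self.blocks[key] = data

    def read(self, key):
        if not self.online:
            raise IOError(f"{self.name} is offline")
        return self.blocks[key]

class MirroredStore:
    def __init__(self, primary, secondary):
        self.replicas = [primary, secondary]

    def write(self, key, data):
        for replica in self.replicas:   # synchronous mirroring
            replica.write(key, data)

    def read(self, key):
        for replica in self.replicas:   # transparent failover
            try:
                return replica.read(key)
            except IOError:
                continue
        raise IOError("all replicas offline")

store = MirroredStore(Controller("ctrl-a"), Controller("ctrl-b"))
store.write("block-42", b"payload")
store.replicas[0].online = False        # take ctrl-a down for maintenance
print(store.read("block-42"))           # still served, from ctrl-b
```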

Seamless data mobility features allow volumes and LUNs to be relocated from one system to another without application disruption during maintenance and tech-refresh events. Storage controllers not only perform the rudimentary tasks of reading and writing data; they also increase storage efficiency by compressing data before writing it out, and by deduplicating it so that identical blocks are stored only once. In a nutshell, every part of the storage process has been designed to support a shared virtual infrastructure, and must therefore be engineered to keep running through failures.
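
Those two efficiency techniques, inline compression and block deduplication, can be illustrated with a minimal sketch. The assumptions here: content hashes identify duplicate blocks, and zlib stands in for whatever compression engine a real controller uses; the names are hypothetical.

```python
# Illustrative sketch of inline compression and block deduplication:
# blocks are identified by a content hash, so identical blocks are
# compressed and stored only once. All names are hypothetical.
import hashlib
import zlib

class BlockStore:
    def __init__(self):
        self.blocks = {}      # content hash -> compressed bytes
        self.volumes = {}     # (volume, offset) -> content hash

    def write(self, volume, offset, data):
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blocks:          # dedup: store new content only
            self.blocks[digest] = zlib.compress(data)
        self.volumes[(volume, offset)] = digest

    def read(self, volume, offset):
        return zlib.decompress(self.blocks[self.volumes[(volume, offset)]])

store = BlockStore()
store.write("vol1", 0, b"same block contents")
store.write("vol1", 1, b"same block contents")  # duplicate: no new storage
print(len(store.blocks))                        # 1 unique block kept
print(store.read("vol1", 1))                    # b'same block contents'
```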

In years past, an IT system that was immune to downtime seemed unattainable. Now it is not only possible but mandatory. We live in a data-rich world, and we expect that data to be available all the time. Organizations need near-instant business analysis to uncover the opportunities waiting in the data within their storage systems, even as the data explosion and shifting markets create an ever-changing set of pressures. Every business today is data-dependent. The only viable option for today's enterprise is rethinking data storage in the context of a re-imagined IT landscape, one that treats non-stop data availability as a matter of a business's life or death. Because in many ways, it is.

Author: Jason Blosil, Sr. Product Marketing Manager at NetApp
Jason Blosil has over 16 years of Finance and Marketing experience. Jason currently works for NetApp focused on SAN and NAS solutions and is the chair of the SNIA Ethernet Storage Forum. Prior to NetApp, he spent 7 years in product management and marketing at Adaptec. He has a degree in Electrical and Computer Engineering and an MBA from Brigham Young University.
