UPDATED 17:53 EDT / JANUARY 26 2012

Why Networks Matter in Big Data and Cloud


[Guest post from Raj Kanaya, CEO of Infineta.]

We live in an age in which information has become a key resource. Collectively, we have spent the last two decades digitizing and storing vast volumes of data, now estimated to total 1.8 zettabytes (1.8 trillion gigabytes), a 9x jump in just the past five years. Data is a beast that feeds on itself, growing stronger and multiplying as it does so. The only ingredient it needs is connectivity, and that’s where networks come in.
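
As a rough sanity check on what those numbers imply (a back-of-the-envelope sketch, not a figure from the original estimate), the unit conversion and the implied annual growth rate work out as follows:

    # Back-of-the-envelope arithmetic (illustrative only).
    total_zettabytes = 1.8
    bytes_per_zettabyte = 10**21
    bytes_per_gigabyte = 10**9

    # 1.8 ZB in gigabytes: 1.8e21 / 1e9 = 1.8e12, i.e. 1.8 trillion GB.
    total_gigabytes = total_zettabytes * bytes_per_zettabyte / bytes_per_gigabyte

    # A 9x increase over 5 years implies a compound annual growth rate of
    # 9 ** (1/5) - 1, or roughly 55% per year.
    cagr = 9 ** (1 / 5) - 1

    print(f"{total_gigabytes:.2e} GB")    # ~1.80e+12 GB
    print(f"Implied CAGR: {cagr:.0%}")    # ~55%

In other words, sustaining that trajectory means the world’s stored data grows by more than half again every year.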

Networks provide the mobility that makes it possible to organize, share, and transform data into the kinds of information that make businesses efficient and competitive. Businesses depend on networks to function at optimal levels:

  • The financial services sector relies on high-speed networks for timely access to market data to make better financial decisions.
  • Manufacturers leverage networks to receive just-in-time orders from customers to optimize inventory management.
  • Medical professionals rely on networks to ensure that a given prescription isn’t contraindicated for a patient.

Business processes have been revamped over the last decade to rely heavily on employees’ seamless access to and exchange of information. Without immediate access to information repositories and to each other, employees wouldn’t be able to make sound decisions for the business, leading to significant competitive disadvantages. If a financial services company can access critical information minutes or even seconds before its competition, it can convert that advantage into significant revenue by making buying or selling decisions ahead of the market.

Continued reliance on local and remote information is forcing businesses to invest in fast and reliable networking equipment to deliver connectivity between information silos and employees. Businesses are constantly looking for ways to speed up connectivity, leading to equipment refreshes and bandwidth upgrades.

But this is only the beginning.

The Cloud Connection

The advent of cloud computing over the last few years has enabled many businesses to streamline their application delivery workflows and stretch budget dollars, whether by outsourcing IT services such as email and document repositories to software-as-a-service (SaaS) providers or by exploring models in which computing and storage workloads burst to offsite locations (Amazon, IBM, Rackspace, etc.) during periods of high utilization. Such strategies are built around highly available and robust internal and external networks that enable uninterrupted access to all services in the cloud.

Because the cloud can deliver an abstracted view of all available computing, storage, and application resources, local and remote networks can start to function as a unified resource. But this requires all connectivity, whether local or off-premises, to scale up so that the cloud does not impede application performance.

Big Data needs Big Networks

With data volumes growing year over year across multiple silos, businesses are exploring ways to analyze this data and derive information that will improve the bottom line. The need to analyze business and social data has given rise to “Big Data” companies (such as Cloudera, Hortonworks, and Mu Sigma) that are building tools to store and analyze data and produce actionable information faster and at lower cost. Businesses are using Big Data tools to:

  • Update inventories and product placement based on nightly analysis of customer behavior.
  • Increase the odds of success when prospecting for oil/gas by analyzing more field data, faster.
  • Speed up complex computations to cut time-to-market for new life-saving drugs.

Big Data enables the creation of information from data sources that were previously overlooked. In this regard, Big Data is even more ravenous than the data storage and analysis paradigms of the last 5-10 years. Businesses leveraging Big Data will experience exponential growth in storage requirements; when all data has potential value, none will be discarded.

Supporting the promise of Big Data requires high-capacity networks to provide reliable, low-latency connectivity between scale-out storage silos and high-performance compute clusters. The speeds at which companies need to move data around to carry out Big Data computations will dwarf the network performance requirements of the last 5-10 years. It is axiomatic that Big Data will drive the need for major network overhauls, both internally and externally.
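
To make that claim concrete, here is a rough transfer-time sketch (the dataset sizes and link speeds below are illustrative assumptions, not figures from this article): the time needed simply to move a dataset between a storage silo and a compute cluster is its size divided by the usable link bandwidth.

    # Illustrative transfer-time arithmetic: time = data volume / link bandwidth.
    # Assumes the full line rate is usable (no protocol or WAN overhead).

    def transfer_hours(terabytes: float, gigabits_per_second: float) -> float:
        bits = terabytes * 1e12 * 8               # terabytes -> bits
        seconds = bits / (gigabits_per_second * 1e9)
        return seconds / 3600

    for tb in (100, 1000):                        # 100 TB and 1 PB datasets
        for gbps in (1, 10, 100):                 # common link speeds
            print(f"{tb:>5} TB over {gbps:>3} Gbps: "
                  f"{transfer_hours(tb, gbps):7.1f} hours")

Even under these generous assumptions, a 100 TB dataset takes more than nine days to move over a 1 Gbps link and nearly a full day over 10 Gbps, which illustrates why the network, not just the storage or compute tier, becomes the limiting factor.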

