What’s the Impact of Big Data on IT Infrastructure?

Big Data has been a big boost to business in recent months, presenting a competitive advantage for those with the resources to work through the information at hand. But as promising as Big Data frameworks are, adopting them can require significant infrastructure changes within a business’ IT department. Stu Miniman, co-author of Wikibon’s recently released Big Data Vendor Revenue and Market Forecast, discusses how analytics technology is disrupting the way enterprises look at infrastructure.

Perhaps the most widely used Big Data framework, Hadoop, was created with the ambitious goal of breaking traditional enterprise vendors’ hold on the data center. The reason: legacy infrastructure costs too much, and analyzing Big Data requires a lot of infrastructure.

“If you look at the price per gigabyte, and the massive amount of features built into traditional IT infrastructure – the people that want to build Hadoop clusters and other big data architectures want to break that,” Miniman said.

Storage vendors recognize the need for a change, and many have adopted new approaches to address the demands of their customers. Some, like Hewlett-Packard, are offering dense disk systems with embedded compute, while vendors such as EMC are taking the scale-out NAS route. Public cloud providers, meanwhile, are luring in customers with scalability and near-term cost efficiency.

Enterprises have a lot of options when it comes to storage, and the same is true in the network.

Miniman details how networking vendors are catching up with the times: Cisco is producing Big Data-aware switches with high buffers and low latency, while Juniper Networks is incorporating the same features into its general-purpose devices. Dell, IBM, and HP are also innovating in this space, both to gain market share directly and to make their integrated solutions more attractive.


The final topic Miniman tackles is the convergence of cloud and Big Data, and how the major cloud providers are making it happen.

Amazon is leading the charge with Elastic MapReduce and Redshift, an on-demand data warehousing service. Miniman names Rackspace as another leader in this space, and notes that Google and Microsoft are advancing just as fast with their own hosted analytics offerings. See Miniman’s full segment below:

Maria Deutscher

Maria Deutscher is a staff writer for SiliconANGLE covering all things enterprise and fresh. Her work takes her from the bowels of the corporate network up to the great free ranges of the open-source ecosystem and back on a daily basis, with the occasional pit stop in the world of end-users. She is especially passionate about cloud computing and data analytics, although she also has a soft spot for stories that diverge from the beaten track to provide a more unique perspective on the complexities of the industry.

