IBM Study: Optimize Your Data Center, Spend Half Your IT Budget on Innovation
It seems pretty simple: A better, more efficient data center means a better, more efficient platform for modern IT services like cloud, mobile and big data, which translates to more productivity and more time to innovate. But according to an IDC study commissioned by IBM itself, only 21 percent of the 300 “global IT leaders” surveyed are operating their data centers at peak levels of efficiency.
How do you measure “maximum efficiency,” anyway? In his blog entry summarizing the survey, IBM VP of Global Site and Facilities Services Steve Sams says that the survey calculated metrics for “facility operations and management, systems, storage, network, applications, business drivers, budget and governance.”
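Sams doesn’t publish the scoring formula, but the general shape of a maturity index like this is easy to imagine: rate each dimension, roll the ratings up into a composite, and bucket the result. Here’s a minimal sketch in Python, assuming a hypothetical 1-to-5 rating scale with invented cutoffs; only the dimension names come from Sams’ post.

```python
# Hypothetical scoring for an IDC-style data center maturity index.
# Dimensions are from Sams' post; the scale and cutoffs are invented.
DIMENSIONS = [
    "facility operations and management", "systems", "storage",
    "network", "applications", "business drivers", "budget", "governance",
]

def maturity_bucket(ratings: dict) -> str:
    """ratings: dimension -> score on a hypothetical 1-5 scale."""
    composite = sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)
    if composite >= 4.0:
        return "peak efficiency"     # the top 21 percent
    if composite >= 2.5:
        return "somewhat efficient"  # the middle 62 percent
    return "basic"                   # the bottom 17 percent

example = {d: 3.0 for d in DIMENSIONS}
print(maturity_bucket(example))      # -> "somewhat efficient"
```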
So apart from that top 21 percent, IBM says that 62 percent of these IT leaders are only “somewhat” efficient. A further 17 percent are operating their data centers at the most basic level.
IBM’s strongest evidence that it’s worth the effort to optimize a data center is that the top tier has one thing in common: those organizations were able to reinvest around 50 percent of their IT budgets in innovating and providing new services. It’s always a good idea to take these vendor-sponsored survey results with a grain of salt, but it does make sense that a smoothly run data center operating at or near peak capacity will cut down on administration and purchasing costs.
IBM’s Sams holds up the NFL, a customer of Big Blue’s server solutions, as the prime example of what the company is talking about. IBM helped the NFL go from five percent efficiency on its old systems to 90 percent by moving to a virtual server environment: data processing efficiency went up, and storage provisioning resources went down by half. And IBM says the time has come for organizations in the lower tiers to choose between that kind of data center efficiency and an expensive status quo.
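That five-to-90-percent jump sounds dramatic, but the arithmetic behind server consolidation makes it plausible. Here’s a quick back-of-the-envelope calculation with a hypothetical fleet size; only the five and 90 percent endpoints come from the article.

```python
# Rough consolidation math behind a 5% -> 90% utilization jump.
# The fleet size is an illustrative assumption, not an NFL figure.
physical_servers = 18     # hypothetical legacy fleet
avg_utilization = 0.05    # 5% utilization per box, per the article
target_utilization = 0.90 # 90% on the virtualized hosts

total_work = physical_servers * avg_utilization  # 0.9 "servers" of real work
hosts_needed = total_work / target_utilization   # = 1.0 virtualized host

print(f"{physical_servers} hosts at {avg_utilization:.0%} collapse onto "
      f"{hosts_needed:.0f} virtualized host(s) at {target_utilization:.0%}")
```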
There are four things a CIO looking to improve data center efficiency should look into, Sams writes:
- Have a plan that aligns with business goals and keep it current.
- Optimize your current server, storage, network and facilities assets to maximize capacity and availability.
- Design for flexibility to support changing business needs.
- Use automation tools to improve service levels and availability. Less manual intervention means faster response times (a rough sketch of this idea follows the list).
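To make that fourth point concrete, here’s a minimal sketch of the kind of automation Sams is gesturing at: a capacity check that flags a volume before a human would notice. The threshold and the alert path are hypothetical stand-ins for whatever monitoring stack an organization actually runs.

```python
# Minimal automated capacity check: poll disk usage and flag hot volumes.
# The 85% threshold is arbitrary; a real setup would page an operator
# or trigger provisioning instead of printing.
import shutil

DISK_ALERT_THRESHOLD = 0.85  # flag volumes above 85% used (hypothetical)

def check_volume(path: str) -> None:
    usage = shutil.disk_usage(path)
    used_fraction = usage.used / usage.total
    if used_fraction > DISK_ALERT_THRESHOLD:
        print(f"ALERT: {path} at {used_fraction:.0%} capacity")
    else:
        print(f"OK: {path} at {used_fraction:.0%} capacity")

if __name__ == "__main__":
    check_volume("/")
```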
Those points seem fairly broad, but that’s likely because Sams recognizes that different organizations and IT service providers have different architectural requirements for their infrastructure. All the same, IBM says that it’s followed its own advice, saving $1.25 billion between 2006 and 2011 by virtualizing 65 percent of its servers and implementing automated storage tiering that’s enabled the company to grow its data by 25 percent while keeping costs flat.
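IBM doesn’t describe its tiering policy, but the core idea behind automated storage tiering is simple: data that hasn’t been touched in a while gets demoted to cheaper storage automatically. A toy Python version, where the paths, the 90-day cutoff and the demotion mechanism are all illustrative assumptions:

```python
# Toy automated storage tiering: demote files unread for N days to a
# cheaper tier. Paths and cutoff are illustrative, not IBM's policy.
import os
import shutil
import time

HOT_TIER = "/mnt/hot"    # hypothetical fast (expensive) storage
COLD_TIER = "/mnt/cold"  # hypothetical slow (cheap) storage
MAX_IDLE_DAYS = 90       # demote data unread for 90 days (arbitrary)

def demote_cold_files() -> None:
    cutoff = time.time() - MAX_IDLE_DAYS * 86400
    for name in os.listdir(HOT_TIER):
        path = os.path.join(HOT_TIER, name)
        # getatime is the last-access time; assumes the filesystem tracks atime.
        if os.path.isfile(path) and os.path.getatime(path) < cutoff:
            shutil.move(path, os.path.join(COLD_TIER, name))
            print(f"demoted {name} to cold tier")
```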
Services Angle
Again, it’s absolutely critical to take these results as interesting but not necessarily reflective of reality: not everyone is going to suddenly find 50 percent of their IT budget freed up for innovation. But I’m a big believer in the transformative power of virtualization and cloud, and IBM’s central premise, that an optimized data center is one that’s better positioned to add business value, is a worthwhile one.
It’s not a new idea: Both the Facebook-helmed Open Compute Project (OCP), which pushes for open standards in scalable computing architectures, and the Intel-led Open Data Center Alliance (ODCA), which is all about finding more efficient usage models that meet customer needs, advocate this kind of data center optimization. In fact, when the two announced a partnership back in September 2011, they set out to develop actionable game plans for data center operators.
Legacy IT organizations won’t be able to keep up as social, mobile and web applications (that much-discussed Social Enterprise model), delivered from modern, flexible data centers, enable higher levels of business agility. That problem is only going to be exacerbated when big data goes from “buzzword” to “business reality” over the next year or two – anyone still on the old data center model is simply going to be left behind as that top 21 percent implements all kinds of new technology.