
The future of the data center: The walls come tumbling down

The developers at credit reporting giant Experian plc write code without any idea whether it will ultimately run inside the company’s data centers, in a rented space known as a colocation center, in the cloud — or all three. And that suits Chief Information Officer Barry Libenson just fine.

Libenson, a veteran CIO who joined Experian three years ago, has been on a campaign to make the boundaries between the company’s captive infrastructure and its global network of service providers as permeable as possible. Two years ago, Experian began using containers, the lightweight software environments that allow applications to run unchanged across many platforms. It also standardized on the OpenShift container management platform developed by Red Hat Inc.

Most of all, Libenson mandates that applications contain no components native to a particular cloud provider, and he’s more than willing to accept the typical 10 to 15 percent penalty on developer productivity that a standardized platform exacts. “I’ll trade that any day for the flexibility it gives us,” he said.

Experian’s Libenson: “This is an acceleration play.” (Photo: SiliconANGLE)

Experian is on the leading edge of a shift in thinking that has triggered the largest redefinition of enterprise data centers in their 50-year history. Thanks to a confluence of new technology developments and a renewed focus by enterprises on information technology as a competitive weapon, CIOs are ditching captive infrastructure in favor of an array of more diverse and flexible options.

They include public cloud services, software as a service, colocation providers and new approaches to application development that take advantage of “serverless” functions delivered on demand. They’re also overhauling their own data centers to look more like public clouds and even inviting vendors to install and manage equipment on their own premises.

The result is that the evolving data centers of tomorrow will look little like their raised-floor, climate-controlled predecessors. No one is expecting captive data centers to go away, but CIOs finally see the opportunity to escape the cost and complexity of managing all their equipment, a skill few would call a core competency. “CIOs don’t want to be in the business of building data centers,” said Dheeraj Pandey, chief executive of Nutanix Inc., the hyperconverged software vendor that is trying to blur the lines between private and public infrastructure.

Experian’s Libenson expects that in five years, the company will have no more than 20 percent of its equipment on its own premises, down from 70 percent when he joined. The rest will be divided between cloud services and colocation providers. “This isn’t a cost play; it’s an acceleration play,” he said. “This will enable us to speed up our delivery to customers and give us the ultimate flexibility over our infrastructure.”

Many of the changes that are enabling a rethinking of the data center will be on display starting Monday at VMware Inc.’s VMworld conference in Las Vegas, one of the premier annual conferences for data center technology. Dozens of companies will announce new technologies and services aimed at helping enterprises thread their way through the rapidly evolving and intertwined worlds of the data center and the cloud.

Data centers here to stay

U.S. Department of Agriculture Secretary Orville Freeman checks out an IBM 360 mainframe in this 1966 photo. (Photo: Wikimedia Commons)

It wasn’t long ago that people vigorously debated the question of whether cloud computing would rise up and subsume on-premises infrastructure. No one is arguing the point anymore. In the same way that venture capitalist and former columnist Stewart Alsop missed with his 1991 prediction that the last mainframe would be unplugged on March 15, 1996, traditional data centers are going to chug along for a long time.

The unpredictable nature of legacy applications demands it. In short, if the software isn’t broken, then don’t fix it. “I can’t tell you how many times I’ve seen applications that no one wants to shut down because they don’t know if they’ll start back up,” quipped Libenson.

In fact, there’s evidence that on-premises equipment sales are actually growing. Intel Corp. reported that its sales into enterprise data centers grew 6 percent in the first half of 2018 compared with the year-earlier period, following three years of decline. “We’ve seen a really healthy enterprise uptick this year because we’re hitting this point where applications have gravity on-premise,” said Raejeanne Skillern, vice president of Intel’s data center group.

Sales into enterprise data centers “really started to accelerate in the middle of last year and has continued to accelerate,” said Matt Baker, senior vice president of strategy and planning at Dell Technologies Inc. “Rather than there being a pullback in data center spending, there’s actually a blooming of deployment options on-prem and off-prem.”

Intel’s Skillern: “Healthy enterprise uptick this year” (Photo: SiliconANGLE)

The booming U.S. economy is certainly one factor. So is new legislation such as the General Data Protection Regulation in Europe, which imposes requirements on where personal data can be stored and processed. There’s also a sense that much of the low-hanging fruit has been picked. Applications that were easy to migrate to the cloud have already gone there, and organizations are wrestling with the trade-offs of migrating older and more delicate code to another platform.

There’s a vigorous debate about how much life the classic data center has in it. Gartner Inc.’s David Cappuccio stirred the pot last spring when he predicted that 80 percent of enterprises will shut down their traditional data centers by 2025, up from 10 percent today.

“It used to be that about 70 percent of the calls I got were about how to fix data centers. Now it’s about 10 percent,” said Cappuccio, who is managing vice president and chief of infrastructure research at the analyst firm. “The facilities team doesn’t know how to manage it. The capital cost is way too high and organizations aren’t in the business of running data centers.”

One veteran of three major cloud migrations thinks Cappuccio’s forecast is right on the money. Colin Bodell oversaw the shutdown of six data centers at Time Inc. between 2014 and 2016 as the company moved 95 percent of its infrastructure to the Amazon cloud. The only workload that was kept on-premises was latency-sensitive streaming video content that was too demanding for the public network. Bodell, who’s now chief technology officer at Groupon Inc., said the same economics apply to most companies. “Everything else will go to cloud because of cost and scalability.”

However, most experts who were contacted thought Gartner’s forecast is too extreme. Uptime Institute LLC sees enterprises owning 70 percent of all data center capacity in 2020, down from 76 percent last year. “Certainly data centers have a bright future,” said Andy Lawrence, executive director of Uptime Institute Research.

Even though cloud infrastructure growth looks to be strong for the foreseeable future, organizations are realizing that just because a workload can run in the cloud doesn’t mean it should. IDC recently reported that 80 percent of cloud adopters have moved one or more workloads back on-premises from the public cloud or plan to do so.

“We see the pendulum swinging backward,” said Jon Toor, chief marketing officer at Cloudian Inc., which makes an object storage appliance that runs both on-premises and in the cloud. “Cloud is no longer a default.”

Businesses also continue to be disturbed by “shadow IT” behaviors enabled by public cloud vendors, said Charles King, principal analyst at Pund-IT Inc.

A more likely scenario than large-scale data center abandonment is that existing infrastructure becomes more specialized to handle the minority of workloads that require local control. Research firm Wikibon, a sister company of SiliconANGLE, predicts that facilities won’t be shut down so much as “repurposed to run core low-cost, extreme-scale private-cloud workloads on hyperconverged infrastructure,” said James Kobielus, Wikibon’s lead analyst for data science, deep learning and application development. Examples include log analysis, archiving, data lakes, governance and high-volume transactional applications.

In the long run, the question of who owns infrastructure matters less than how it’s used. As we enter the age of the “multicloud” and edge computing, CIOs are more concerned with eradicating the boundaries that inhibit the smooth migration of workloads among whatever resources are needed, based on factors such as demand, immediacy and customer proximity. Whether they do so in the short term is less important than having the right mindset.

“I talk to my fair share of CIOs who have set a goal of being out of on-prem in the next five years,” said Don Boulia, general manager of cloud development at IBM Corp. “I don’t think it’s going to happen, but it drives behavior.”

Standard platforms should help eradicate the boundaries that make managing infrastructure such a chore, and there’s evidence that both providers and their customers are moving more rapidly than ever on that front. Software containers, which burst upon the scene five years ago, have taken enterprise IT by storm. More than two-thirds of enterprises are expected to adopt them, according to a 2017 study by Cloud Foundry Inc. Kubernetes, an open source container orchestration platform, has gone mainstream even faster, with 71 percent adoption in just three years, according to 451 Research LLC.

The same factors that took open source software from bit player 20 years ago to an enterprise standard today will drive the evolution of the data center, said Urs Hölzle, a Google Fellow and senior vice president of technical infrastructure at Google LLC. “The lasting value will be in the software stack that is more uniform and makes you more productive and secure,” he said.

The so-called LAMP stack (Linux, Apache, MySQL, PHP) went mainstream in the enterprise because it provided a set of software components that everyone could agree upon, Hölzle said. That made life easier for CIOs and also moved innovation up the stack into higher-value areas of differentiation.

The same dynamic has yet to happen in the cloud, though. “Today, if you’re on-premises and using two different clouds, you have three different ecosystems to deal with,” he said. As standards come together in such areas as security, load balancing and automation, cloud providers will rapidly adopt them, and distinctions between cloud services — and data centers — will fade away.

“Five years from now, all services will run on top of the same stack,” Hölzle said. “You won’t have to think about whether your code will run on-prem or in the cloud.”

Out of the real estate business

Gartner’s Cappuccio: “80 percent of enterprise data centers will shut down.” (Photo: Gartner)

There’s nothing new about enterprises trying to get out of the data center business. In the 1970s and 1980s, many signed deals with outsourcing giants like Electronic Data Systems Inc., which purchased their IT assets, hired staff and sold processing back to them as a service. In the 1990s, client/server architecture was supposed to revolutionize data centers by replacing mainframes with networks of small, localized processors.

For a variety of reasons, neither approach proved successful, at least at replacing mainframes or data centers. But now, a confluence of trends is enabling organizations for the first time to completely rethink the way they compute. The driving factors have nothing to do with cost. In fact, many experts say cloud infrastructure-as-a-service is more expensive in the long run than on-premises computing.

The more important motivation: making organizations more agile and responsive to an increasingly dynamic business climate and growing customer choice. They realize that customers’ online expectations are now defined by the personalization and sub-second responsiveness of web-scale giants such as Amazon.com Inc. and Google. They want answers now, or they’re quickly gone.

That means businesses no longer have the luxury of processing every transaction through a central resource. Data must move closer to where it’s needed. “If I have a logical infrastructure that can change quickly, suddenly IT becomes an enabler rather than a detriment,” Cappuccio said, referring to IT’s reputation as a cost center.

The quest to transform digitally is turning the traditional notion of the data center on its head. Instead of infrastructure dictating the location of data, data increasingly dictates the location of infrastructure. “People want to work on the data where it lives,” said Boulia.

This transformation isn’t just about moving everything to the cloud, although worldwide spending on public and private clouds now represents nearly half of total IT infrastructure spending, according to International Data Corp. Rather, what has captured the imagination of IT and business executives is the concept the cloud introduced: moving data and workloads fluidly across infrastructure.

A confluence of recent technology and business innovations is making that goal more attainable.

Data center automation

Virtualization kicked off the transformation of the data center more than a decade ago by moving infrastructure management from hardware into software. That set the stage for automating many tasks that once required screwdrivers and wire-strippers. Today, IT managers can choose everything from open-source tools such as Puppet and Chef to full-scale suites from the largest software companies to handle once-onerous tasks such as provisioning servers in remote offices. Mordor Intelligence LLP estimates that the data center automation market will top $16 billion in 2019.

Once organizations automate their on-premises operations, extending their reach to the cloud is straightforward. “Automation is critical to making all these pieces work together,” said Chris Gardner, a senior analyst at Forrester Research Inc.
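
To make the pattern concrete, here is a minimal sketch, in Python, of the idempotent provisioning such tools perform at scale: converge a host toward a desired state, and make re-runs harmless. The hostname and package are hypothetical, and key-based SSH access to a Debian-style host is assumed.

```python
# A minimal sketch of idempotent provisioning, the core pattern behind
# tools like Puppet and Chef. The host and package are illustrative;
# key-based SSH access to a Debian-style server is assumed.
import subprocess

def ssh(host, command):
    """Run a command on a remote host over SSH."""
    return subprocess.run(["ssh", host, command],
                          capture_output=True, text=True)

def ensure_package(host, package):
    """Install a package only if it is missing; a second run is a no-op."""
    if ssh(host, f"dpkg -s {package}").returncode == 0:
        return "already present"
    ssh(host, f"sudo apt-get install -y {package}")
    return "installed"

print(ensure_package("web01.example.com", "nginx"))  # hypothetical host
```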

Containers

These miniature virtual machines effectively eliminate underlying infrastructure as an issue in application development and deployment, enabling workloads to be shifted easily to any platform that supports containers, which is all of them. Containers are expected to be in production in over 60 percent of IT environments by this time next year, according to a study by Diamanti Inc. “Containers are huge and they’re going to live everywhere,” said Forrester’s Gardner.
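
To illustrate that portability, here is a hedged sketch using the Docker SDK for Python: the identical public image runs unchanged whether the daemon sits on a laptop, an on-premises server or a cloud virtual machine. A local Docker daemon and the `docker` package (`pip install docker`) are assumed.

```python
# Run the same container image, unchanged, on any Docker host.
# Assumes a local Docker daemon and `pip install docker`.
import docker

client = docker.from_env()      # connect to whatever daemon is local
container = client.containers.run(
    "nginx:latest",             # public image; no host-specific build step
    detach=True,
    ports={"80/tcp": 8080},     # map container port 80 to host port 8080
)
print(container.short_id, container.status)
```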

The surprisingly rapid adoption of Kubernetes has accelerated the shift by removing much of the manual configuration that used to be required to oversee large container deployments. Kubernetes has enabled organizations to scale their deployments to hundreds of containers on a single server, and may even boost the fortunes of mainframes. “IBM’s Rockhopper Linux mainframes can reportedly run 2 million containers,” Gardner said.
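
A rough sketch of what that automation looks like with the official Kubernetes Python client: scaling a containerized workload becomes a single API call rather than dozens of hand-configured installs. The deployment name and namespace are illustrative, and a configured kubeconfig plus `pip install kubernetes` are assumed.

```python
# Scale a containerized workload with one API call instead of per-host
# configuration. The "web" deployment in the "default" namespace is
# illustrative; a valid kubeconfig is assumed.
from kubernetes import client, config

config.load_kube_config()       # read the local kubeconfig
apps = client.AppsV1Api()
apps.patch_namespaced_deployment_scale(
    name="web",
    namespace="default",
    body={"spec": {"replicas": 100}},
)
```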

Multiclouds

Enterprises are expanding their cloud investments to multiple providers as well as their own private clouds, with the goal of being able to seamlessly shift workloads among a network of cloud resources depending on factors such as cost, performance and availability. “Most enterprises have five to eight different types of clouds and we expect that to continue,” said Intel’s Skillern.
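
One way to picture that kind of decision is as a simple placement policy. The sketch below is purely illustrative (the targets, prices and latencies are invented), but it captures the idea of routing a workload to whichever eligible cloud is cheapest within a latency bound.

```python
# An illustrative multicloud placement policy: choose the cheapest target
# that is up and meets a latency bound. All figures below are invented.
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    cost_per_hour: float
    latency_ms: float
    available: bool

def place(targets, max_latency_ms):
    eligible = [t for t in targets
                if t.available and t.latency_ms <= max_latency_ms]
    if not eligible:
        raise RuntimeError("no eligible target")
    return min(eligible, key=lambda t: t.cost_per_hour)

targets = [
    Target("public-cloud-a", 0.096, 40.0, True),
    Target("public-cloud-b", 0.089, 55.0, True),
    Target("on-prem", 0.060, 5.0, True),
]
print(place(targets, max_latency_ms=50.0).name)  # -> on-prem
```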

Established providers and a host of startups are building tools to make multicloud environments as simple to manage as captive data centers, as well as automating the migration of workloads between public and private clouds. They’ll encounter some opposition from cloud giants who have no incentive to play nicely with others, but Gartner’s Cappuccio believes resistance is futile. “The customer’s going to win,” he said.

Standardization will push cloud providers to innovate higher up the stack, as Google and Amazon have done with their respective BigQuery and Redshift data warehouses. “There’s plenty of room to differentiate, but not in the areas where everybody is doing the same thing,” said Google’s Hölzle.

Hyperconverged infrastructure

Dell EMC’s VxRail is one of many contestants in the hyperconverged infrastructure market. (Photo: Dell EMC)

Suppliers such as Nutanix, Dell Technologies and Hewlett Packard Enterprise Co. sell software stacks that work the same way both in the cloud and on-premises. With rapid installation, easy expansion and a single point of management for compute, storage and network resources — thus “hyperconverged” — HCI appeals to CIOs’ interest in minimizing complexity.

The market grew nearly 70 percent in the most recent quarter tracked by IDC. Less heterogeneity means fewer people are needed to manage infrastructure. “Machines are better than humans at managing machines,” said Nutanix’s Pandey.

On-premises public cloud extensions

Cloud vendors understand that significant processing loads will remain in enterprise data centers for a long time and are beginning to follow the path Microsoft Corp. carved with Azure Stack, an on-premises mirror of its public cloud services. Google’s new Cloud Services Platform, announced last month, includes an on-premises version of its Kubernetes Engine orchestration service. Amazon, which had been a staunch opponent of getting into the local infrastructure business, finally gave in last month with the announcement of Snowball Edge, a modified version of its data migration appliance that now includes cloudlike processing capabilities.

“Most major [cloud service providers] and systems companies have introduced products and services that give customers real choices regarding where to process data,” Wikibon analysts Ralph Finos, Peter Burris and David Floyer wrote in the firm’s most recent True Private Cloud forecast.

Colocation

Once little more than specialized real estate management firms, colocation providers have grown into diversified cloud service businesses. In addition to their traditional data-center-for-rent services, many now offer connections to a variety of high-speed carriers, cloud migration services and private cloud hosting options.

Colos also typically operate facilities in small cities, where their proximity to customers and business partners makes them attractive options for latency-sensitive applications. “If I need to get closer to my customer, I don’t want to put everything in a central data center,” said Gartner’s Cappuccio. “Local hubs do that better.”

vXchnge Operating LLC operates 14 data centers in mostly midsized cities, with a 99.99999 percent uptime guarantee. Big enterprises can’t provide that kind of availability in small markets, said Ernest Sampera, the company’s chief marketing officer. “Why do customers choose us? Location, location, location,” he said. “The bigger guys aren’t in Portland.”
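
It’s worth pausing on what seven nines actually allows. A quick back-of-the-envelope calculation shows the guarantee leaves room for only about three seconds of downtime per year:

```python
# Downtime permitted by a 99.99999 percent ("seven nines") uptime guarantee.
availability = 0.9999999
seconds_per_year = 365.25 * 24 * 3600
print(f"{(1 - availability) * seconds_per_year:.1f} seconds/year")  # ~3.2
```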

Stratistics Market Research Consulting Pvt Ltd. expects the global colocation market will grow to $76.3 billion by 2022 at a compound annual growth rate of 17 percent.

Composable infrastructure

Cloud infrastructure’s pay-by-the-drink pricing model appeals to IT executives, and a growing number of vendors are now obliging with composable infrastructure: fully managed on-premises equipment licensed on a usage basis. Examples include HPE’s Synergy and a recent evolution called GreenLake, Oracle Corp.’s Cloud at Customer, VMware’s Cloud Foundation and IBM’s SoftLayer.

The jury is out on whether composable infrastructure is a phenomenon or a fad. An HPE executive recently called it the future of on-premises data center technology and Cappuccio said he has seen “a tremendous amount of interest” from CIOs. But IBM, which is a player in the market, doesn’t see much demand, according to Boulia. “When you get on-prem, there’s a set of controls that are very hard to manage with a third-party provider,” he said. Still, it’s another option.

Serverless computing

The ultimate cloud application is one that doesn’t actually live anywhere. That’s the idea behind serverless computing, a relatively new approach to software development that assembles applications out of snippets of code on the fly.

Serverless blurs infrastructure boundaries by combining transactions from public and private sources in a manner that’s transparent to the user. Using application programming interfaces and microservices, developers can tap into a wide variety of public and private services and stitch them together into a fabric that effectively lives in the cloud. Although the approach is still in its infancy, many market watchers predict explosive growth.

“I’m seeing a lot of enterprises build [new] applications with microservices and then going back to their [existing] applications and asking, ‘Why am I managing this? It doesn’t scale,’” said Forrester’s Gardner.
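
A minimal example of the model, written in the style of an AWS Lambda handler in Python: the developer supplies only a function, and the platform invokes it on demand, with no server, operating system or container to manage. The event’s `name` field is hypothetical.

```python
# A minimal serverless function in the AWS Lambda handler style. The
# platform invokes handler() on demand; there is nothing to provision.
# The "name" field in the incoming event is hypothetical.
import json

def handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```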

One cloud

(Chart: Wikibon Worldwide IT Spending for All Technology Segments, $Billion, 2018-2025)

Taken together, the data center of the future will not be so much distributed as atomized, with processing spread across a fabric of owned and rented sources and shifted around as needs demand. “There’s an argument that what we’re seeing is not attrition of enterprise computing but rather granularization of it,” said Uptime Institute’s Lawrence.

The overarching model will be one that was born in the cloud, with infrastructure fully virtualized and managed by software. While cloud computing won’t displace the traditional data center, it will transform it fundamentally.

Wikibon calls the new model True Private Cloud, an on-premises computing environment built in the image of the public cloud. “All data won’t end up in one public cloud, but rather a multitude of distributed locations, each architected to compute local data according to the needs of local tasks,” Wikibon wrote. “The long-term industry trend will not be to move all data to the public cloud, but to move the cloud experience to the data.”

The research firm estimates that the true private cloud market grew 55 percent in 2017 to $20.3 billion — outstripping the growth rate of the public infrastructure-as-a-service sector — and that growth will continue at a nearly 30 percent annual clip for the next decade.

In a True Private Cloud, data center functions “will be distributed in a vast virtualized network that will have increasingly complex boundaries in terms of multi-cloud domains, cloud-to-edge domains, application domains and the like,” said Wikibon’s Kobielus. Federation frameworks will be created to manage things like access control, workload placement, data governance and other control factors.

All this will call for changes in the way IT organizations are managed. Running captive infrastructure demanded professionals with highly specialized skills in areas such as network management and systems administration. In the future, those functions will all be virtualized.

The most prized people will be those who can manage multiple contractual relationships and maintain the health of a fabric of virtual resources. Gartner calls them “versatilists.” “It used to be that the guys who looked across it all were a mile wide and an inch deep. They had no value,” Cappuccio said. “Today those are the people we’re looking for.”

Edge computing

And then there’s the edge, an unknown quantity that is likely to shake everything up again. With billions of intelligent devices and sensors going online in the coming years, much of the data organizations generate will originate outside the data center. Cloud-based approaches to processing all this information won’t work, because it’s simply impractical to send every drop of data over a pipe to the giant virtual mainframe.

Edge computing will demand that more decisions be made at the place the data is generated. “For example, decisions about whether to keep a damaged windmill turning have to be made in microseconds; waiting even a few seconds may cause great damage,” said Wikibon Chief Technology Officer David Floyer. “You can’t move that data to the cloud. You have to process it, cut it down by 99 percent and only push data after it’s been processed locally.”
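
A sketch of that local-reduction pattern: buffer raw readings at the edge, then ship one compact summary upstream instead of every sample. The `read_sensor` and `push_upstream` callables are hypothetical stand-ins for real device and network I/O.

```python
# Edge-side data reduction: summarize a window of raw readings locally and
# push a single record upstream instead of thousands. `read_sensor` and
# `push_upstream` are hypothetical stand-ins for device and network I/O.
import statistics
import time

def summarize(samples):
    """Reduce a window of raw readings to a handful of summary fields."""
    return {
        "ts": time.time(),
        "count": len(samples),
        "mean": statistics.mean(samples),
        "min": min(samples),
        "max": max(samples),
    }

def run(read_sensor, push_upstream, window=1000):
    buffer = []
    while True:
        buffer.append(read_sensor())
        if len(buffer) >= window:
            push_upstream(summarize(buffer))   # one record per 1,000 samples
            buffer.clear()
```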

Wikibon’s Floyer: “Much of the innovation will be at the edge.” (Photo: SiliconANGLE)

New technologies such as autonomous vehicles will demand new data processing architectures, said Arun Shenoy, head of sales at colocation provider Server Farm Realty LLC. “You’ll have networks of processors that are in moving vehicles along with static traffic control systems and emergency services,” he said. “A lot of that data will have a life of a few seconds, but some will have a life of many years. We have to consider how to manage a highly distributed edge environment.”

More intelligence will need to move to the edge, but central processors will be needed to store and analyze historical data, as well as to spot patterns. Colocation providers are licking their chops at the opportunity, and cloud companies are already putting edge strategies into place, such as AWS’s Greengrass, Google’s Cloud IoT Edge and Microsoft’s Azure IoT Edge. “The public cloud powerhouses have a significant first-mover advantage,” said Wikibon’s Kobielus.

With 99 percent of data “stopping at the edge,” in Floyer’s view, off-the-shelf processors will become more specialized and will need to be integrated into the IT fabric. As an example, he cites the facial recognition features that Apple Inc. built into the iPhone X. “Apple had to do a lot of work to process 3-D images in volume,” he said. “It would be stupid of you to try to do that yourself.”

Rather than pushing intelligence to the edge, enterprises will buy off-the-shelf software and sensors and integrate the data streams into their environment. Nobody knows what that will look like at this point, but plenty of vendors will have their say. “My personal view is that much of the innovation that takes place over the next 10 years will be at the edge,” Floyer said.

So in the long run, the data center of the future won’t look like a building so much as a landscape, which will give everyone a chance to move up the innovation curve. The cloud has finally moved IT priorities to where they belong: managing data, not machines.
