Jason Mendenhall, executive vice president for cloud at Switch Communications, the colocation provider that operates the “ginormous” SuperNap data center in Las Vegas, says that Amazon Web Services is not growing at the expense of the enterprise market – quite the opposite. During this week’s Wikibon Peer Incite event, he discussed how the public cloud giant is disrupting the industry by pressuring the competition to differentiate.
Mendenhall tells Wikibon’s Dave Vellante that “there is not going to be one cloud to rule them all”. Amazon is gaining a lot of traction in the enterprise, but many analysts believe that one size doesn’t fit all when it comes to IT infrastructure. According to Mendenhall, Amazon’s competitors are trying to attack verticals such as healthcare and financial services where customers have special compliance or security requirements that Amazon may not address, but the private cloud – and some competing providers – might.
He goes on to explain how organizations typically approach AWS:
“Here’s what we’ve seen: they [enterprise users] go to the public cloud just to experiment, they take their team and go figure it out. The team goes in there, and they’ll spin up some stuff in Amazon, maybe try a few workloads, try dev and QA, and maybe try to test some scaling capabilities, they’ll look at that. Then there will be some project, something that is driving this initiative…and it will do one of two things to them: it either pushes them into a private cloud deployment, [or] it takes them to the public cloud to try something new.”
Mendenhall says that in most cases, enterprises take the former route. They may keep dev and QA on Amazon, but they’ll deploy their mission-critical apps in a private environment.
Later in the discussion, SiliconAngle founding CEO John Furrier phones in and provides his own take on the subject:
“CIOs that I talked to are saying the same thing: you know what, I have NetApp here, I have EMC, I have all this stuff [and] I got to make it work. Now I might throw it away over time, but right now I’m not gonna just trash everything and move to a pure cloud environment. It’s important to understand that the value side is really relevant in the enterprise for those types of workloads.”
Big Data is one example of a workload that Amazon does not yet address well enough. Mendenhall points out that transferring a two-petabyte dataset, which he says is “not unheard of” in the SuperNap facility, takes 11 days over a standard 10-gigabit fiber connection. Moving data out of AWS is not only time-consuming but also expensive, and then there’s the fact that users have no way of confirming that their information no longer resides on the company’s servers.
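As a rough back-of-the-envelope sketch of why bulk egress takes so long, the transfer time at a given link rate is just dataset size divided by throughput. The helper below is illustrative, not from the article; the exact day count depends on assumptions the source doesn’t spell out (decimal vs. binary petabytes, protocol overhead, whether multiple links run in parallel), so real-world figures will differ from the raw line-rate estimate.

```python
def days_to_transfer(dataset_bytes: float, link_bits_per_second: float) -> float:
    """Estimate transfer time in days at raw line rate (no protocol overhead)."""
    seconds = dataset_bytes * 8 / link_bits_per_second  # convert bytes to bits
    return seconds / 86_400  # seconds per day

# Two decimal petabytes over a single 10 Gbps link, at full line rate:
two_petabytes = 2e15        # bytes (assuming decimal PB)
ten_gigabit = 10e9          # bits per second
print(f"{days_to_transfer(two_petabytes, ten_gigabit):.1f} days")
```

Any real transfer would land above this idealized number once TCP overhead and link contention are factored in, which is the point Mendenhall is making about moving Big Data workloads in and out of a public cloud.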
For additional insights from Mendenhall and the Wikibon crew, fire up the full Peer Incite video (above). Be sure to check out Bert Latamore’s angle on the role that the public cloud plays in the transformation of the data center.
photo credit: lrargerich via photopin cc
By Maria Deutscher