Out on the edge: The new cloud battleground isn’t in the cloud at all
In the process of building a global e-commerce empire that can deliver goods overnight to much of the developed world, Amazon.com Inc. has developed expertise in distributed computing, robotics and factory automation.
Google LLC’s quest to deliver subsecond responses to search queries has driven it to build a presence on more than 90 of the world’s 240 giant internet exchange points and more than 100 interconnection facilities around the world.
And years of selling to enterprise information technology organizations has made Microsoft Corp. and Oracle Corp. trusted partners in corporate data centers worldwide.
All those forces are coming together now as the cloud service arms of those companies vie for pole position in the market for edge computing services, one that’s estimated to grow to nearly $16 billion by 2025.
As Amazon Web Services Inc. launches into the virtual version of its annual re:Invent conference this week, it will be against the backdrop of a cloud landscape that increasingly looks little like the centralized model that has defined it for nearly 15 years. And it’s a trend that even AWS, long a proponent of moving everything to the cloud, has latched onto with a number of announcements this week.
Edge computing is a distributed architecture that places much of the intelligence at the far reaches of the network where data is collected. Processing is mostly done in real time and only a small amount of data typically traverses the network to a central cloud. Edge computing is essential to unlocking the promise of latency-sensitive applications such as orchestrating self-driving vehicles and choreographing the actions of hundreds of robots on a warehouse floor — use cases that don’t lend themselves to coordination from data centers 1,000 miles away.
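The data-reduction side of that architecture can be illustrated with a short sketch. This is not any vendor’s API — the sensor, window size and summary fields are all illustrative assumptions — but it shows the core pattern: raw readings are processed where they are collected, and only a compact summary ever traverses the network to a central cloud.

```python
# Illustrative edge-node sketch: sample a sensor at high frequency locally,
# but send only a small summary record upstream. The sensor and summary
# shape are hypothetical stand-ins, not a real device or cloud API.
import random
import statistics

def read_sensor():
    """Stand-in for a real high-frequency sensor read (e.g., temperature)."""
    return 20.0 + random.random()

def summarize_window(samples):
    """Reduce a window of raw readings to a few summary statistics."""
    return {
        "count": len(samples),
        "mean": statistics.fmean(samples),
        "min": min(samples),
        "max": max(samples),
    }

def edge_loop(window_size=1000):
    """Collect window_size raw samples locally, then emit one summary."""
    samples = [read_sensor() for _ in range(window_size)]
    # In a real deployment the summary would be posted to a central cloud
    # endpoint; here we just return it. A thousand readings become one
    # small record, so almost no raw data crosses the network.
    return summarize_window(samples)
```

The ratio is the point: the network carries one record per window rather than every reading, which is what makes latency-sensitive, bandwidth-constrained edge deployments tractable.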
Seeking to cash in on the phenomenon, the dominant cloud service providers have been quietly but steadily spreading their infrastructure across a mesh of regional, local and customer-owned data centers in a bid to expand their reach – and account control. By taking on the heavy lifting of building complex edge networks, cloud providers are betting that they can win a bigger share of their customers’ information technology budgets and fortify themselves against competitors. Although that invites concerns that customers will be locked in, many experts say the tradeoff is worth it.
A top trend
Distributed cloud, a model that incorporates physical location as part of delivered services, has acquired enough momentum that Gartner Inc. this year defined it as one of its top 10 technology trends for 2021. The research firm defines distributed cloud as a consistent control plane across a multitude of “substations” that encompass on-premises, localized, carrier-based and traditional central clouds. The goal is to give customers the benefit of cloud-like consistency and scalability while relieving the structural latency limitations of the centralized model.
Distributed cloud is also seen as a more manageable alternative to hybrid clouds, which Gartner says have proved to be more difficult to build and manage than many organizations expected. “Building and supporting hybrid combinations is hard,” Gartner researchers wrote. “Further, [users] often fail to realize that operating and maintaining the private and public cloud parts of a hybrid scenario separately can undermine or even break many of the cloud computing value propositions” because the customer is responsible for part of the operation but doesn’t have the benefit of the public cloud provider’s resources or skills.
The race to the edge is also being driven by an array of technology factors. As the population of smart endpoint devices dubbed the “internet of things” explodes over the next few years, organizations are mobilizing to tap into the operational efficiencies and insights they can gain from those data sources. More than 30% of the data generated in 2025 will be in real time, according to International Data Corp. That will require that computing power move closer to the sources of data to reduce the amount that must traverse the network.
At the same time, a trio of new wireless networking technologies — 5G, Wi-Fi 6 and Citizens Broadband Radio Service — promise to improve both network capacity and speed significantly. “IoT systems are going to move to 5G so they can transmit an almost unlimited amount of data to smaller, low-power systems,” said David Linthicum, chief cloud strategy officer at Deloitte Consulting LLP. “That’s a game-changer.”
Software containers, which are portable environments that include an application and all its dependent software, have gone mainstream, making it possible for applications to be deployed, patched and updated anywhere on a network, even in devices as small as a Raspberry Pi. “I want to deploy the same thing on my thermostat that I run on Wavelength or in an EC2 instance,” said Bill Vass, vice president of engineering at AWS, referring to the Elastic Compute Cloud component of his company’s cloud platform.
Too hot to handle?
But building out the distributed infrastructure to support the intelligent edge is a task few IT organizations will want to tackle alone. “We’ve done a lot of research on rising complexity and edge computing is going to magnify that,” Linthicum said. “My biggest fear is that many of these things won’t be operationalized because [IT organizations] lack the skills.”
Many IT executives who still remember the pain of client/server migrations in the 1990s are tapping their cloud providers for help. The trend prompted Forrester Research Inc. earlier this year to increase its estimate of public cloud growth in 2020 to 33% from 28%. “Firms are leaning further into digital and their best option is public cloud,” said Brian Hopkins, a Forrester principal analyst.
The big cloud providers are all pursuing similar paths to the edge, anchored by the on-premises versions of their cloud infrastructure that have started rolling out this year. AWS’ Outposts, which was built for use within customer data centers, is also the foundation for AWS Local Zones and AWS Wavelength, which are miniature versions of the cloud giant’s technology stack that live in small, local data centers and telecommunications carriers’ point-of-presence facilities. The company says the experience it gained building out its retail e-commerce business lends itself perfectly to edge computing.
“We already have more IoT devices connected to the cloud than any other cloud provider by a large margin. We have to do that for ourselves,” said AWS’ Vass. Customers can employ such Amazon inventions as AWS Greengrass for IoT devices, AWS Snowball for storage and AWS Robomaker for development of robotic devices using Lambda serverless functions “on a POP, in a Local Zone and in the cloud, manage it all centrally and do decentralized execution,” he said.
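The “write once, run at any tier” pattern Vass describes rests on Lambda’s uniform entry point. The sketch below uses the standard Python Lambda handler signature; the event shape and the robot-telemetry logic are illustrative assumptions, not an AWS-documented schema. The same function artifact could in principle be deployed to a standard region, a Local Zone or a Wavelength zone without code changes.

```python
# Minimal AWS Lambda-style handler in Python. The (event, context) entry
# point and the {statusCode, body} response shape follow Lambda convention;
# the robot-telemetry event fields are hypothetical, for illustration only.
import json

def handler(event, context):
    """Standard Lambda entry point: (event, context) -> response dict."""
    # e.g., a warehouse robot posts its position; we acknowledge receipt.
    robot_id = event.get("robot_id", "unknown")
    x, y = event.get("position", (0, 0))
    return {
        "statusCode": 200,
        "body": json.dumps({"robot_id": robot_id, "received": [x, y]}),
    }
```

Because the handler has no environment-specific code, the placement decision — central region for analytics, Wavelength zone for millisecond-scale robot coordination — becomes a deployment choice rather than a rewrite.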
Microsoft’s Azure cloud edge strategy uses a similar approach. Edge Zones, which the company rolled out early this year, are essentially scaled-down Azure data centers located within miles of a customer. Microsoft is stressing its long relationships with enterprise IT organizations and knowledge of on-premises data centers as a strength, said Shawn Hakl, a former Verizon executive who left the company to oversee Microsoft’s 5G strategy early this year.
Google points to its global cloud presence comprising 24 regions, 73 zones and 144 network edge locations as an example of its reach. Google also has a strategy to modernize carriers’ infrastructure through use of its cloud platform as well as an oft-stated commitment to enabling multicloud interoperability. On Nov. 30, it announced a collaboration with Intel Corp. to simplify enterprises’ ability to deploy cloudlike business models using their existing on-premises hardware.
“One company alone won’t be responsible for the edge’s success,” said Amol Phadke, Google’s managing director of telecom industry solutions. “It’s going to take global collaboration and an ecosystem that spans clouds, CSPs and other strategic software vendors.”
A millisecond away
Providers are pushing to be “within a millisecond of their customers,” or just short of 200 miles of physical distance, according to David Tennenhouse, chief research officer at VMware Inc. “You’re seeing the cloud folks moving toward the edge, 5G and containers in the telcos and then IoT all coming together at once.”
Public cloud providers have some singular advantages at the edge because of the robustness of the networks they’ve built to support their core businesses. “One thing many people don’t realize is the degree to which the large cloud players are large [wide-area network] players,” said Tennenhouse. “They are not just cloud service providers but also network service providers.”
Case in point, said AWS’ Vass: the 200 points of presence that Amazon has built to support its retail business. “We can also run processing like Lambda there,” he said. “We have a network latency map all over the world and are partnering with telcos to push out to the 5G hubs.”
That requires a different physical architecture than the one that has characterized public cloud thus far. Light travels 186 miles in a millisecond, meaning that achieving response times of five milliseconds or less demands that compute power be located within a few miles. Few customers will want to build the complex infrastructure to support that kind of processing, so they will rely on a single, scalable set of services and application programming interfaces. “A lot of the gear will be replaced by something that’s based on public cloud infrastructure,” said Craig Lowery, a research director in Gartner’s technology and service provider group.
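The distance budget implied by a latency target can be sketched with back-of-the-envelope arithmetic. This assumes signals in fiber travel at roughly two-thirds the vacuum speed of light and ignores processing, queuing and routing overhead, so real-world budgets are tighter still — which is exactly why providers want facilities within a few miles of users.

```python
# Back-of-the-envelope: how far away can the compute be for a given
# round-trip latency budget, if the entire budget were spent on
# propagation? Assumes ~2/3 of vacuum light speed in fiber; processing
# and routing overhead (ignored here) shrink the real budget further.
LIGHT_MILES_PER_MS = 186.0   # vacuum speed of light, as cited above
FIBER_FRACTION = 2.0 / 3.0   # typical refractive-index penalty in fiber

def max_one_way_miles(round_trip_budget_ms):
    """Greatest one-way distance reachable within a round-trip budget."""
    one_way_ms = round_trip_budget_ms / 2.0
    return one_way_ms * LIGHT_MILES_PER_MS * FIBER_FRACTION
```

Even a full 1-millisecond round-trip budget buys only about 62 miles of fiber distance before any processing happens, and a 5-millisecond target leaves roughly 310 miles at best — numbers that rule out serving latency-critical workloads from a distant central region.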
Plenty of contenders
But that doesn’t mean edge computing is the public cloud providers’ game to lose. For one thing, edge is not a monolith, said Forrester’s Hopkins. He defines four distinct types of edge computing. The customer premises-driven operations edge has garnered the most attention, but there are also use cases in which colocation firms, content delivery networks and carriers have the advantage.
Colocation providers, for example, have long offered the ability for customers within their physical POPs to exchange data at wireline speed with each other without incurring egress fees, a practice known as peering. Content delivery network vendors such as Akamai Technologies Inc. and Cloudflare Inc. specialize in providing load-balancing, caching and security that optimizes information delivery to customers. “They’ll stand up instances of your application code all over the world,” Hopkins said. “You don’t have to assign it to a particular AWS region.”
Telecom firms have been modernizing their infrastructure with cloudlike capabilities for the past several years and could attempt to muscle in on the big infrastructure providers, perhaps in partnership with other challengers. “The CDNs are going to the telecom guys and saying their platform can be deployed on a telecom carrier’s infrastructure without having to go to AWS,” Hopkins said. “All these companies are realizing that cloud agility is no longer limited to the public cloud providers. The battle is for the last mile and right now the telcos own that.”
So far there has been little indication that carriers want to compete with cloud giants. “Our focus is on how we can light up Google’s thousands of existing edge nodes already in communications service provider locations and really leverage that footprint,” said Google’s Phadke.
Microsoft’s Hakl, who worked for Verizon for 20 years, agreed. “I see these relationships as far more cooperative than competitive,” he said.
Although no cloud infrastructure company will reveal how many on-premises cloud instances it has shipped, some observers say the initial reception has been tepid. The promise was to deliver the equivalent of a full public cloud experience in the data center, but “while customers are very excited at the idea they’re a bit disappointed when they experience the product because it doesn’t match the public cloud experience today,” said Gartner’s Lowery.
In part that’s because the on-premises infrastructure isn’t a full duplicate of the massive cloud resource but rather a subset that must provision services selectively from a cloud region. “It may be on-prem, but it still needs to talk to the mothership every few hours,” Lowery said.
Deloitte’s Linthicum said his sense is that IT organizations are adopting on-premises cloud stacks mainly for migration purposes so far. “People are still trying to figure out how it all comes together,” he said.
AWS’ Vass begged to differ. “Based on the number of units I see going out the door, the perception that the reception is muted is untrue,” he said.
Some experts also advise that the top-down perspective that cloud infrastructure providers bring to the table could limit their effectiveness in distributing intelligence optimally around the network. “The problem with cloud vendors’ strategy is that they look at the edge as an extension of public cloud and not as a separate market,” said Forrester’s Hopkins. Their objective “is still to bring back to the cloud as much as possible because that’s where they make their money.”
“For a traditional cloud provider, the inclination is to bring everything to the immense data center,” said VMware’s Tennenhouse. “At the very high level they’ll struggle somewhat.”
Some customers may also be wary about giving over their entire distributed infrastructure to a single company at the risk of being held captive to that vendor, but a certain amount of lock-in is inevitable if customers want to standardize their environments. “You have to lock in somewhere,” said Lowery. “It’s a matter of choosing where to get locked in.”
“The move to the edge doesn’t change the dynamics of vendor lock-in,” noted Brian Gracely, senior director of product strategy at Red Hat Inc. “They are the same as they were when people chose to use a cloud-specific service.”
And customers who want interoperability will have a bounty of choice, said Tennenhouse. “For the large enterprises it will be more about choosing different providers for different strengths,” he said. “We’re past the point where enterprises are going to choose only one player for everything.”
Regardless of the outcome, it’s increasingly clear that the battle for the edge will define the future of the cloud.