
Three insights you might have missed from AWS re:Invent

The AWS re:Invent gathering in Las Vegas last week provided several newsworthy announcements, including the cloud giant’s entry into the world of AI foundation models with the release of Amazon Nova, along with major enhancements for Amazon Bedrock, Amazon SageMaker and Q Developer.

Yet beyond the publicity the announcements generated was a clear statement from Amazon Web Services Inc. that it intends to leverage its position at the top of the cloud ecosystem to build the infrastructure supporting tech stack modernization for enterprise AI. It is a transition that will require the right tools to succeed, according to Andy Jassy (pictured), chief executive of Amazon.com Inc., in an exclusive interview with theCUBE, SiliconANGLE Media’s livestreaming studio.

“You have to build the right set of primitives, and that’s what we’ve been doing with SageMaker and Bedrock,” Jassy told theCUBE. “You have to have your infrastructure modernized for AI.”

Jassy and a number of industry executives and analysts offered their thoughts on tech stack modernization and the latest AWS announcements during interviews with theCUBE at the event.

Here’s theCUBE’s complete video interview with Andy Jassy:

Here are three key insights you might have missed during the event:

1. AWS believes that for enterprise AI to succeed, tech stack modernization matters.

The announcements from AWS at re:Invent pointed toward an interest in building the technology surrounding AI, a process one company executive described as “scaling up” versus “scaling out” in support of the workload. Models may be important, but if the infrastructure that runs them hasn’t been modernized, the results will fall short.

Andy Jassy, CEO of Amazon, spoke with theCUBE about tech stack modernization during AWS re:Invent.

“It’s not just the model. I think people very often trick themselves with a good model that they think they’re there, but they’re really only about 70% of the way there,” said Jassy. “In applications, they don’t really work well for users if there’s 30% error rates or wonkiness. So, the UI really matters. The fluency, the messaging really matters; the latency really matters and the cost efficiency really matters.”

Amazon’s process of tech stack modernization included last week’s announcement of new prompt caching tools for Amazon Bedrock, designed to keep AI processing costs in check, along with recent Q Developer enhancements aimed at making machine learning more accessible to nontechnical users.
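As a rough illustration of how a developer might take advantage of prompt caching, here is a minimal, hedged sketch using the Bedrock Converse API through boto3. The cachePoint content block reflects the prompt caching preview described at the event, and the model ID, prompt and field details are illustrative assumptions rather than a definitive implementation.

```python
# Hedged sketch: marking a reusable prompt prefix for caching with the Amazon
# Bedrock Converse API via boto3. The cachePoint block follows the prompt
# caching preview described at re:Invent; exact field names, supported models
# and pricing behavior may differ, so treat this as illustrative only.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# A long, stable system prompt is the part worth caching across requests.
system_prompt = [
    {"text": "You are a support assistant. Answer using the product manual below.\n..."},
    {"cachePoint": {"type": "default"}},  # assumption: everything before this marker is cached
]

def ask(question: str) -> str:
    response = bedrock.converse(
        modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",  # illustrative model ID
        system=system_prompt,
        messages=[{"role": "user", "content": [{"text": question}]}],
        inferenceConfig={"maxTokens": 512, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

# Repeated calls reuse the cached prefix, which is where the cost savings come from.
print(ask("How do I reset the device?"))
```

The appeal of the pattern is that the heavy, repeated portion of the prompt is billed at a reduced cached rate on later calls, which matters most for chat and retrieval workloads that resend the same context on every request.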

One analyst took note of the company’s more unified approach to tech stack modernization. “I’ve been covering Amazon for years, and one of the things I’ve been noticing is the growth of building these platforms and having these tech stacks to help with these modernization efforts,” said Paul Nashawaty, analyst at theCUBE Research, during a conversation at the event. “Amazon is delivering. They’re putting together these unified packages that don’t offer this bag of bits anymore, and they’re offering a single way of delivering … this tech stack.”

The company’s approach toward stack modernization is also being reflected among members of the AWS customer base. Lori Beer, global chief information officer of JPMorgan, described how her organization has been focused on retooling for AI in the cloud during an appearance on theCUBE.

“We’ve been working really hard on continuing to strengthen resiliency in the cloud, security in the cloud, through our great partnership with AWS,” she said. “We have a hybrid approach, so we run massive scale inside our data centers. We’re critical infrastructure, but we also leverage the innovation happening in the public cloud. Our continued prioritization is around continuously modernizing.”

This process of modernization has shaped an AWS strategy to leverage its position at the top of the cloud ecosystem and build the key infrastructure elements in support of enterprise AI. It is an approach that could lead to a transformational synergy between the cloud and the rapidly expanding influence of artificial intelligence, as noted by Jerry Chen, general partner at Greylock Partners LLC.

“The past 12, 13 years you’ve been living this cloud-mobile transition,” said Chen, in an interview during the event. “Phase one was just move the stuff you had in your data center to the cloud. Phase two is let’s rewrite things in a cloud-native way. Amazon’s really pioneered what cloud-native means. Now, we’re seeing what AI-native means. I think we’re seeing Amazon try to reinvent themselves to say, ‘We’re the AI-native cloud.’”

Here’s theCUBE’s complete video interview with Jerry Chen:

2. Inference will be a core building block for AWS going forward.

There was significant messaging from AWS at re:Invent around inferencing, the process in which a trained AI model applies what it has learned to new inputs to generate predictions or responses. In an exclusive interview prior to the event, AWS chief executive Matt Garman described how inferencing would become a core element of AWS services.

Matt Garman discusses how inferencing will become a core element of AWS services.

“Inference is the next core building block,” Garman said. “If you think about inference as part of every application that you go build, it’s not just a separate generative AI application and then all my other applications — it’s just an integral part, just like you would think about databases.”
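To make that framing concrete, here is a minimal, hedged sketch of what treating inference as just another application building block might look like, using the Bedrock runtime Converse API through boto3; the model ID, prompt and surrounding application details are illustrative assumptions, not a prescribed AWS pattern.

```python
# Hedged sketch: inference treated as just another application dependency,
# called like a database or cache through the Bedrock runtime Converse API.
# The model ID, prompt and surrounding application details are assumptions.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def summarize(text: str) -> str:
    """Call a foundation model the way an application would call any other backend service."""
    response = bedrock.converse(
        modelId="amazon.nova-lite-v1:0",  # illustrative; any Bedrock text model ID could go here
        messages=[{"role": "user", "content": [{"text": f"Summarize in two sentences:\n{text}"}]}],
        inferenceConfig={"maxTokens": 256},
    )
    return response["output"]["message"]["content"][0]["text"]

# Elsewhere in the application, the model call is just one more step in the request path.
ticket_summary = summarize("Customer reported intermittent sync failures after the latest update ...")
```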

Many of the recent enhancements for Amazon Bedrock were designed to support this approach. At re:Invent, AWS unveiled significant updates to Amazon Bedrock, its managed service providing access to high-performing foundation models through a unified API. Salesforce Inc., for one, has built Bedrock inferencing capabilities into its fully cloud-native Heroku platform, and the company has focused on enhancements that enable developers who want to build and operate AI applications in the cloud.

“We replatformed, but the best thing is the user experience is exactly that simple clean experience that people are used to,” said Betty Junod, Heroku chief marketing officer at Salesforce, in an interview with theCUBE. “But they’re getting things like Graviton performance, they’re getting EKS, ECR, Global Accelerator. They’re getting managed inference powered by Bedrock … same experience they love with just more horsepower and more tools around it.”

One factor driving the AWS approach to inferencing is its importance to customers looking to capitalize on enterprise interest in the technology. Nutanix Inc., for instance, is building inferencing tools that will help businesses achieve consistency across AI workloads.

“One of the key components of that generative AI application development lifecycle, whether you do agents or retrieval augmented generation, is inference,” according to Debojyoti Dutta, vice president of engineering at Nutanix, in a conversation with theCUBE. “We are focused in Nutanix Enterprise AI on how to do inference really well for the enterprise. We simplify the entire lifecycle of inference for our customer. A customer can go and choose any model from Hugging Face or from the Nvidia catalog and then deploy the model very easily with a couple of button clicks.”

Here’s theCUBE’s complete video interview with Matt Garman:

3. AI adoption is driving an increased reliance on hybrid computing.

Many of the conversations at re:Invent centered around the need for flexibility in the tech stack. Prioritization of interoperability and performance is leading key industry players, such as VMware by Broadcom Inc., to build for a hybrid computing model.

AWS’ Steven Jones and Broadcom’s Ahmar Mohammad talk to theCUBE about hybrid cloud solutions.

“At VMware by Broadcom, the strategy is very simple. We are a private cloud company, but we also believe in the model of you have customer choice, you can run that,” said Ahmar Mohammad, vice president of partners, managed services and solutions GTM, VCF Division, at Broadcom, during a conversation with theCUBE. “Allowing customers that flexibility to run that same stack on-prem or in the public cloud or combination of both in a hybrid environment, that’s exactly what we are enabling now.”

The rise of AI agents, pieces of intelligent software that perform specific tasks, will also lead to greater reliance on hybrid application development. Large language models will need to draw from a wide range of data sources, which will change the nature of how applications are constructed.

“There’s no magic in computing; there’s logic there,” said Sarbjeet Johal, technology analyst and member of theCUBE Collective, during an interview at re:Invent. “There are zeros and ones, and they get flipped based upon what we tell computers to do. In this case, large language models are telling them based upon our prompts. The fact is that when we view the next-generation applications, they will be hybrid applications. We will use the old constructs, data coming from the actual databases where it sits there right now and then also the generative AI agents will be called in.”
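A minimal sketch of the hybrid pattern Johal describes might look like the following, where structured facts stay in an existing relational database and a generative model is called in only to reason over them; the schema, query, model ID and prompt are all illustrative assumptions rather than anything prescribed by AWS or the speaker.

```python
# Hedged sketch of the hybrid pattern: structured facts stay in an existing
# relational database, and a generative model is called in only to reason over
# them. The schema, query, model ID and prompt are illustrative assumptions.
import sqlite3
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def open_orders_briefing(customer_id: int) -> str:
    # Step 1: the "old construct" -- read the facts from the database where they already live.
    conn = sqlite3.connect("orders.db")
    rows = conn.execute(
        "SELECT order_id, status, updated_at FROM orders "
        "WHERE customer_id = ? AND status != 'closed'",
        (customer_id,),
    ).fetchall()
    conn.close()

    # Step 2: the generative side -- hand those facts to a model for a natural-language briefing.
    facts = "\n".join(f"order {o}: {s}, updated {u}" for o, s, u in rows)
    response = bedrock.converse(
        modelId="amazon.nova-lite-v1:0",  # illustrative model ID
        messages=[{
            "role": "user",
            "content": [{"text": f"Write a short status briefing for these open orders:\n{facts}"}],
        }],
        inferenceConfig={"maxTokens": 300},
    )
    return response["output"]["message"]["content"][0]["text"]
```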

The hybrid landscape is also receiving particular attention as AI-driven cloud innovation creates a need to simplify complex migrations and optimize data management. This has been a priority for AWS, according to Mai-Lan Tomsen Bukovec, vice president of technology at the cloud giant, who spoke with theCUBE.

“In my area, we are all about changing how people think about solving problems, and we’re doing that in two ways … in data and in migration of Windows applications, VMware applications and even mainframes,” she said.

For companies such as SAP SE, solving problems means opening new avenues for customers to use software-as-a-service solutions in a hybrid world. This includes SAP’s ERP offering, which has moved to the cloud as the potential of AI solutions provides new vectors of growth.

“It is a transformational effort,” said Jan Gilg, president and chief product officer of cloud ERP at SAP SE, in an appearance on theCUBE. “For a long time, ERP has been looked at as very technical and upgrades have been looked at as very technical. Even moving to the cloud … instead of running it on my own hardware, I now run it on a hyperscaler. How can I actually extend the standard software to bring out the secret sauce? That is where there’s so much possibility nowadays with the technology available based on AI.”

Here’s theCUBE’s complete video interview with Mai-Lan Tomsen Bukovec:

To watch more of theCUBE’s coverage of AWS re:Invent, here’s our complete event video playlist:

Photo: Robert Hof/SiliconANGLE
