UPDATED 11:16 EST / SEPTEMBER 19 2024

Three insights you might have missed from VMware Explore

VMware's blueprint for AI infrastructure positioning, streamlined subscription and licensing models, and the path forward within the Broadcom umbrella.

Coming into 2024, the enterprise technology space buzzed with speculation about VMware LLC’s future following its acquisition by Broadcom Inc. Analysts and experts mused on how Broadcom would shape the direction of VMware’s many product lines.

At the recent VMware Explore 2024 event, many of those questions were addressed, with the key themes of streamlining products, integrating across ecosystems and positioning for artificial intelligence in focus.

John Furrier and Dave Vellante discuss VMware and infrastructure.

“What will happen with the ecosystem? Are people going to switch?” asked John Furrier, executive analyst at theCUBE Research, in an analysis segment. “What’s the deal with the licenses? Is VCF ready, is it truly a great platform? Can it sustain another run of durable value creation in the IT world? I’m telling you right now, that’s all everyone’s talking about.”

During the event, theCUBE Research analysts spoke with industry insiders, experts and professionals for emerging data virtualization and AI platform insights. TheCUBE’s exclusive interviews, now available for streaming on demand, explore VMware’s blueprint for AI infrastructure positioning, streamlined subscription and licensing models and the path forward within the Broadcom umbrella — among other key topics. (* Disclosure below.)

Here is a roundup of three things you might have missed during theCUBE’s coverage of VMware Explore:

1. Analyst insights reveal changing AI infrastructure dynamics.

There’s a new cloud transformation underway. Gen AI is catalyzing it and forcing a rethink of how businesses operate and compete. But following the acquisition, several rough edges needed to be smoothed over at VMware as it balanced emerging platform challenges with fulfilling obligations to existing customers.

John Furrier, Rob Strechay and Dave Vellante give their opening analysis.

“While the market’s shifting to industry solutions and gen AI is changing the game, there’s a risk there that increases the risk factor,” Furrier said in the event’s opening analysis segment. “There’s a huge amount of power dynamics in the market right now where cloud is going at certain next generation … that means if the infrastructure isn’t performant, then you are going to have a serious problem.”

Companies demand performant, scalable and efficient gen AI infrastructure to power the new wave of operational efficiency, analytics and service delivery. In turn, this demand has placed strong pressure on platform providers such as VMware to deliver transformative pathways from legacy systems to cutting-edge solutions. Doing so will require new industry partnerships, according to Dave Vellante, chief analyst at theCUBE Research.

“The old legacy stuff is declining faster than the new stuff is growing,” he said. “But IBM is excellent. With watsonx for data and governance and the whole rebirth of Watson, they have very good data chops, and I would partner with them. To me, that’s the strongest play.”

VMware has embarked on a full-on cloud shift to work hand-in-hand with its various product divisions. However, this move isn’t without its challenges; a fine balance must be struck between the cloud’s immense growth and innovation potential and the maintenance of the very ecosystems that underpin enterprise IT.

“I think everybody was focused on how you manage VMware,” said Rob Strechay, principal analyst at theCUBE Research. “I think now it becomes how do you manage VMware in a cloud sense and all of the different pieces that come? Maybe you are using the private AI stack; maybe you’re not. Maybe you’re using some of the Microsoft OpenAI stuff instead on top of [Azure VMware Solution], and I think that becomes the opportunity for the channel.”

Here’s theCUBE’s complete opening analysis segment:

2. Broadcom focuses on IT economics, AI security and its Edge Compute Stack.

Given VMware’s overt focus on delivering AI infrastructure solutions, a crucial channel for bringing those solutions to enterprises is the edge. As such, the company unveiled a slew of additions to the Edge Compute Stack as it bolsters its overall endpoint strategy.

“You have this cookie, and what we’re doing now is we are telling people, ‘Take a bite out of the whole cookie,’” said Sanjay Uppal (pictured), senior vice president and general manager for the software-defined edge division at Broadcom, during an event interview with theCUBE. “Deploy Edge Compute Stack, you have Velo underneath, and then your Telco Cloud Platform. You have all the intelligence to optimize these generative AI workloads. Edge AI is coming; the big tsunami is here.”

In delivering AI infrastructure, VMware is making a big bet on ushering in the future of gen AI enterprise security and threat detection. The strategy is especially pertinent as companies prioritize efforts to counteract malicious attackers that already have gen AI in their arsenal.

“Along comes generative AI, and with these large language models, suddenly we can model a much bigger model over there, and we can look at all these behavioral attacks,” said Umesh Mahajan, vice president and general manager for application networking and security at Broadcom, during an interview with theCUBE.

Broadcom’s Chris Wolf and Ram Velaga on network performance.

Many analysts have opined that Broadcom and VMware make for a compelling hardware/software partnership. Accordingly, a shared focus for both parties is driving the network performance needed to power GPUs and servers for gen AI infrastructure.

“When you start thinking about AI, it’s all about distributed computing,” said Ram Velaga, senior vice president and general manager for the Core Switching Group at Broadcom, during an event interview with theCUBE. “When you’re doing distributed computing, it’s a lot of GPUs and you have to connect them together with the network. That’s what we do. We look at where AI is going into the enterprise, and we believe it should be built on Ethernet.”

VMware Explore also examined the shift to private and hybrid cloud infrastructures as companies assess the overall economics of their IT operations. With this shift, companies are balancing the usefulness of public cloud for high-velocity scenarios with the data accessibility and cost savings of private cloud. More importantly, Broadcom is delivering the networking bandwidth to handle both scenarios, according to Velaga.

“Networking bandwidth has been scaling quite a bit,” he said. “If you just look at our history of chips that we’ve been coming out with in the last few years, we went from 12.8 terabit chips to 25 terabit chips to 50 terabit chips. Most people can predict when we will have a 100-terabit device coming up. So, the network bandwidth is keeping up … it’s ready to take this challenge on.”
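To put those chip generations in context, here is a quick back-of-the-envelope sketch, not drawn from the interview, of how aggregate switch-silicon bandwidth translates into port counts for a GPU fabric. The 25.6 and 51.2 Tbps values below are the commonly cited full bandwidths of the generations Velaga rounds to in the quote above, and the 400 GbE and 800 GbE port speeds are illustrative assumptions.

```c
/*
 * Illustrative only: rough math on how aggregate switch-chip bandwidth
 * maps to front-panel port counts. The Tbps figures are assumed values
 * for the generations referenced above; the port speeds are assumptions
 * for illustration, not details from the interview.
 */
#include <stdio.h>

int main(void) {
    const double chip_tbps[] = {12.8, 25.6, 51.2};
    const double port_gbps[] = {400.0, 800.0};

    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 2; j++) {
            /* 1 Tbps = 1000 Gbps, so ports = chip bandwidth / port speed */
            double ports = chip_tbps[i] * 1000.0 / port_gbps[j];
            printf("%.1f Tbps chip -> %3.0f ports of %.0f GbE\n",
                   chip_tbps[i], ports, port_gbps[j]);
        }
    }
    return 0;
}
```

By this math, a 51.2 Tbps device fans out to 64 ports of 800 GbE from a single chip, the kind of radix that lets an Ethernet fabric connect large GPU clusters with fewer switch tiers.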

Here’s theCUBE’s complete interview with Sanjay Uppal:

3. VMware simplifies cloud and AI infrastructure with scalable solutions.

VMware is already executing on its cloud and gen AI infrastructure mission. The company is evolving VMware Cloud Foundation’s role in redefining enterprise storage and compute, with the removal of the vSAN requirement standing out as a notable statement. These changes are simplifying the integration process for partners such as NetApp Inc. and unlocking enhanced scalability and cost efficiency, according to Keith Norbie, head of worldwide partner solutions and sales acceleration at NetApp.

“One of the issues of vSAN and VMFS, which is their file system, is the scalability. It sort of flattens out after 120 nodes, where we can scale to thousands,” he explained. “We are actually engineering and leading as an engineering design partner of VMware, creating vVols version two that actually allows for that scalability. That also allows for the automatic tiering that we have brought to the hyperscalers. We can now bring that to our on-premises customers.”

Another use case is the collaboration between Samsung Electronics Co. Ltd. and VMware on Compute Express Link, or CXL, solutions for server memory expansion. In tackling both the hardware and software sides of the AI infrastructure conundrum, VMware lent its expertise as Samsung unveiled an expansion of its CXL memory module portfolio at a conference in Silicon Valley.

Samsung’s David McIntyre and VMware’s Arvind Jagannath discuss CXL solutions.

“We realized the potential of CXL very early,” said Arvind Jagannath, senior product line manager at VMware by Broadcom. “We have a lot of expertise with memory in general because we were one of the leads in virtual machine migration, tracking pages and memory, etc. CXL has a lot of interesting features, such as giving the CPU the ability to run instructions on the device directly. In terms of applications, we don’t want them seeing any performance loss, so CXL definitely achieves that for us.”

Introduced initially by Intel Corp. in 2019, the technology was conceived as an open interface for high-speed communications. With gen AI placing new demands on the enterprise, CXL is delivering the required performance boost, according to David McIntyre, director of product planning and business enablement — device solutions — at Samsung Semiconductor, a subsidiary of Samsung Electronics Co. Ltd.

“Servers can handle up to maybe five terabytes of dynamic random access memory, which is a huge amount of memory, but then it stops there,” he said. “CXL provides memory expansion and, eventually, memory pooling. It basically frees up these constraints so that now you can expand out with additional devices or even systems that are CXL enabled. It’s living and thriving, and it’s starting to find its way into actual production deployments.”
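For context on how that expansion works in practice, CXL memory expanders typically appear to the operating system as additional capacity behind a CPU-less NUMA node, which software can then target for allocations. The following is a minimal, hypothetical sketch of that idea on a Linux host using libnuma; the node ID and buffer size are assumptions, and this is not VMware’s or Samsung’s code.

```c
/*
 * Minimal sketch, assuming a Linux host where a CXL memory expander is
 * exposed as a CPU-less NUMA node (node 1 is an assumed ID; check with
 * `numactl --hardware`). Requires libnuma (link with -lnuma). This only
 * illustrates the memory-expansion idea described above.
 */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA support is not available on this system\n");
        return 1;
    }

    const int cxl_node = 1;           /* assumed node ID for the expander */
    const size_t size = 1UL << 30;    /* 1 GiB test buffer */

    /* Ask the kernel to satisfy this allocation from the chosen node. */
    void *buf = numa_alloc_onnode(size, cxl_node);
    if (buf == NULL) {
        fprintf(stderr, "allocation on node %d failed\n", cxl_node);
        return 1;
    }

    memset(buf, 0, size);             /* touch pages so they are actually placed */
    printf("Placed a 1 GiB buffer on NUMA node %d\n", cxl_node);

    numa_free(buf, size);
    return 0;
}
```

In a virtualized stack such as the one Jagannath describes, the hypervisor would handle this placement on behalf of the guest, which is why he stresses that applications should see no performance loss.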

Here’s theCUBE’s complete interview with Keith Norbie and NetApp’s Jonsi Stefansson:

Watch theCUBE’s full VMware Explore 2024 coverage here:

(* Disclosure: TheCUBE is a paid media partner for the VMware Explore event. Neither VMware, the sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
