UPDATED 17:15 EST / MAY 21 2024


Bigger is better: Running AI on the cloud appears to be a winning strategy

Generative AI relies on data, and the larger the model and the more data it can access, the greater the advantage an enterprise will have in the coming decade.

As enterprises look forward, businesses that leverage extensive datasets and robust AI models will be better positioned to innovate and adapt in an increasingly competitive market. This trend underscores the critical importance of strategic data management and AI integration in modern business operations, according to Holger Mueller (pictured), vice president and principal analyst at Constellation Research Inc.


Holger Mueller (right), VP and principal analyst at Constellation Research, talks about why bigger AI is better than smaller, localized AI with theCUBE’s John Furrier (left).

“In my view, bigger is better when it comes to gen AI … bigger models have more awareness, are more human, will be better in business,” he said. “Bigger models need more data. So, data gravity, which we’ve been talking about for a long time, is getting really, really again tangible, and that means it’s going to move to certain clouds. Companies want to be cross-cloud, but if I want to do AI and have to do a large language model, I need to get the data together.”

Mueller spoke with theCUBE Research executive analyst John Furrier at IBM Think, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the battle between cloud and on-premises data centers and the future of enterprise AI. (* Disclosure below.)

Cloud offers more data, more flexibility for enterprise software

The history of enterprise technology has been a tug-of-war between cloud and on-prem architecture, and AI is increasing that tension. Large models require large amounts of data, so having a solid foundation model architecture and fast connectivity is critical, according to Mueller.

“The infinite compute platform, where you can move workloads seamlessly from the public cloud to on-premises, it has worked really well for some vendors,” he said. “Enters with a boom, gen AI, and all of a sudden it’s all about elasticity again. The balance is always tipping back and forward. It’s totally moving in towards the public cloud right now.”

Mueller draws a distinction between big AI and little AI, with the former drawing on far greater amounts of data. If enterprises are to survive in the AI era, he believes they need to focus on big AI in the cloud.

“The understanding of the real world, with multimodal models, which is the big innovation of 2024, that models can understand more things is the key aspect. This is why the larger is better,” Mueller said.

Cloud has long offered greater flexibility, and anyone focusing on finite compute structures is already falling behind. Small AI models can also be valuable, but they need to be connected to a larger network, according to Mueller.

“What is the key quality of the cloud? It’s elasticity,” he said. “Elasticity from a technical perspective, from up and down and, more importantly, from a commercial perspective, you use more, you pay more. You use less, you pay less. This is why the cloud wins.”

Moving to the cloud is also a necessity because it gives models access to a wider range of data and brings them closer to Nvidia’s DGX system, which currently dominates the hardware for AI workloads. Mueller gives an example of a bank trying to detect fraud, where the information delays of an on-prem center could lose them millions.

“Keep the prize in mind, which is larger [AI] is better,” he said. “Anybody who’s on the infinite side, who’s on the cloud side, who has elasticity, architecturally, commercially, who can run algorithms … easy to use by business people, we will see the same democratization of AI that we saw for coding.”

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE Research’s coverage of IBM Think:

(* Disclosure: TheCUBE is a paid media partner for IBM Think. Neither IBM, the sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
