UPDATED 15:56 EDT / JULY 11 2024

On the AI Insights and Innovation Pod: Is the mainframe a viable option for generative AI?

Is the mainframe a viable option for generative AI?

The history of the mainframe computer goes back to the 1930s, yet this powerful system remains just as relevant as ever with the dawn of mainframe AI.

Mainframes, large computers designed to process complex calculations and manage vast amounts of data, are commonly used by major enterprises in industries such as finance, healthcare and government to perform tasks involving intensive data processing and large-scale computation. As the computing industry grew, mainframes became key resources for powering the tech world, and they are in the conversation again with the rise of AI.

“You have to look at the different tools and technologies that are out there, including the mainframe, in trying to figure out which ones are going to be the right platform for the different components of a generative AI architecture,” said David Linthicum, research analyst for theCUBE, SiliconANGLE Media’s livestreaming studio. “There is no hard and fast definition about how to use this stuff. The architecture that you’re going to derive, leveraging mainframes or not, is going to be largely dependent on your requirements.”

The “AI Insights and Innovation” podcast is the go-to source for the latest news, trends and insights in artificial intelligence, including generative AI. In this edition, theCUBE’s Linthicum discusses the role of the mainframe in evolving generative AI systems.

Finding use for mainframe AI in training and computation

One functional use for mainframes in generative AI is as servers for training data, according to Linthicum.

“Mainframes have a function within generative AI because in many instances they’re where the training data is going to be stored,” he said. “There’s lots of interfaces, lots of middleware, lots of storage systems where you could access mainframe data using any number of ways. Object-based databases and structured databases and relational databases, all those sorts of things are within the realm of possibility.”
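As a rough illustration of that storage role, the sketch below pulls a batch of training documents from a relational database assumed to live on the mainframe (here, Db2 for z/OS reached over ODBC). The hostname, credentials, table and column names are placeholders, not details from the discussion.

```python
# Minimal sketch: pull a batch of training documents from a relational database
# assumed to live on the mainframe (Db2 for z/OS reached over ODBC).
# Hostname, credentials, table and column names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={IBM DB2 ODBC DRIVER};"
    "HOSTNAME=mainframe.example.com;PORT=446;PROTOCOL=TCPIP;"
    "DATABASE=TRAINDB;UID=train_user;PWD=********;"
)
cursor = conn.cursor()
cursor.execute(
    "SELECT DOC_ID, DOC_TEXT FROM TRAINING.DOCUMENTS FETCH FIRST 1000 ROWS ONLY"
)

# Each row becomes one record in the corpus handed to the training pipeline.
corpus = [{"id": row[0], "text": row[1]} for row in cursor.fetchall()]
conn.close()
```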

In addition to storing vital training information, mainframes can offer the computational power necessary to perform many of the complex functions required of generative AI.

“If we need to do manipulation of the data, we need to do complex calculations … the mainframe would be a very good candidate for running those sorts of things to support a generative AI system,” Linthicum said.
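To make that point concrete, here is an illustrative sketch of pushing a heavy aggregation down to the mainframe database so the computation runs where the data lives and only the summarized result leaves the system. The connection details, table and column names are hypothetical.

```python
# Illustrative sketch: push a heavy aggregation down to the mainframe database so
# the computation runs where the data lives; only the summary rows are returned.
# Connection details, table and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={IBM DB2 ODBC DRIVER};"
    "HOSTNAME=mainframe.example.com;PORT=446;PROTOCOL=TCPIP;"
    "DATABASE=PRODDB;UID=ai_batch;PWD=********;"
)
cursor = conn.cursor()
cursor.execute(
    """
    SELECT ACCOUNT_TYPE, COUNT(*) AS TXN_COUNT, AVG(TXN_AMOUNT) AS AVG_AMOUNT
    FROM PROD.TRANSACTIONS
    GROUP BY ACCOUNT_TYPE
    """
)

# Only the aggregated result set leaves the mainframe for the generative AI pipeline.
summary = [tuple(row) for row in cursor.fetchall()]
conn.close()
```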

Another important factor when considering the role of a mainframe in AI infrastructure is that training and inferencing workloads often run on different platforms. This requires the ability to integrate different types of data and processing capabilities across systems, something the mainframe can facilitate.

“The accessibility of the mainframe is never going to be a problem,” Linthicum said. “It has a seamless integration system. We’re able to do so through common API systems that are very easy to implement, so keep that in mind.”
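The kind of API-based integration Linthicum describes might look something like the sketch below, which assumes a REST gateway (for example, IBM z/OS Connect) has been placed in front of a mainframe application; the URL, path, token and response fields are hypothetical.

```python
# Minimal sketch: fetch a record through a REST API assumed to sit in front of
# the mainframe (for example, a gateway such as IBM z/OS Connect).
# The URL, path, token and response fields are hypothetical.
import requests

resp = requests.get(
    "https://mainframe-gateway.example.com/banking/customers/12345",
    headers={"Authorization": "Bearer <token>"},
    timeout=10,
)
resp.raise_for_status()
record = resp.json()

# The record can now feed a retrieval or prompting step running on another
# platform, such as a cloud-hosted model endpoint.
prompt_context = f"Customer profile: {record}"
```

The point of the pattern is that the calling platform, whether a cloud-hosted inference endpoint or an on-premises pipeline, never has to speak mainframe-native protocols directly.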

This cross-platform functionality will become even more critical as generative AI is more widely adopted, and the mainframe is positioned to be a key element in this model.

“Look at all the platforms in your portfolio and their ability to do what they do best,” Linthicum said. “There is not a reason to eliminate the mainframes. There’s a reason to look at their viable participation in this particular architecture. The ability to create an application instance like a generative AI system across many different platforms is going to be your path to success.”

Here is the complete discussion from David Linthicum, part of the “AI Insights and Innovation” podcast series on theCUBE:

Image: cybrain/Canva
