UPDATED 17:21 EDT / NOVEMBER 29 2023

AI

The next leap in AI: Tailoring language models for domain-specific applications

With OpenAI's ChatGPT having created a consumerization moment for the technology, artificial intelligence remains top of mind for enterprises looking to extract more value from it.

As a result, large language models fit into the picture because they can be tailored to industry-specific needs and preferences, according to Muddu Sudhakar (pictured), co-founder and chief executive officer of Aisera Inc.

“What I see as the next value will be on LLMs,” Sudhakar said. “It’s not the foundation model where the value will be. It’ll be more on the domain specific. What do you call LLMs, which are domain specific, which are small? I think the word that I heard Microsoft call it is SLM, which is small language models, which are very domain-specific. We’ll have one for IT, one for HR, one for finance, but one for each enterprise.”
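The idea can be pictured as a thin routing layer sitting in front of a set of domain-tuned small models. The sketch below is only an illustration of that pattern, assuming a simple keyword router; the model names and the generate() stub are hypothetical placeholders, not Aisera's or Microsoft's implementation.

```python
# Illustrative only: one small language model per function, with a thin router
# choosing among them. Model names and generate() are hypothetical stand-ins.

DOMAIN_MODELS = {
    "it": "acme-slm-it-7b",           # hypothetical IT-tuned small model
    "hr": "acme-slm-hr-7b",           # hypothetical HR-tuned small model
    "finance": "acme-slm-finance-7b", # hypothetical finance-tuned small model
}

def classify_domain(prompt: str) -> str:
    """Naive keyword routing; a production system would use a trained classifier."""
    keywords = {
        "it": ["password", "vpn", "laptop"],
        "hr": ["payroll", "leave", "benefits"],
        "finance": ["invoice", "expense", "budget"],
    }
    lowered = prompt.lower()
    for domain, words in keywords.items():
        if any(word in lowered for word in words):
            return domain
    return "it"  # fallback domain

def generate(model: str, prompt: str) -> str:
    """Stub standing in for whatever inference call the chosen model exposes."""
    return f"[{model}] response to: {prompt}"

def answer(prompt: str) -> str:
    # Route the request to the small language model tuned for its domain.
    return generate(DOMAIN_MODELS[classify_domain(prompt)], prompt)

print(answer("How do I reset my VPN password?"))
```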

Sudhakar spoke with theCUBE industry analyst John Furrier at the “Supercloud 5: The Battle for AI Supremacy” event, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed why LLMs will be at the epicenter of value creation in enterprises.

LLMs will take the platform-as-a-service route

Since LLMs will sit between applications and the foundation model layer, they will follow a platform-as-a-service approach, according to Sudhakar. Securing them will also be vital, because they will embody an enterprise's intellectual property.

“LLMs will become the PaaS,” he stated. “The application will be where you create the embedded AI, so your chatbots, your universal bot, your AI Copilot, your GPTs will be the application that are created. You have the middle layer, which is LLMs, you have the foundation layer, which will be your impression for AI. You don’t want to put it outside, it’s your IP … also how do you make sure that it’s socially responsible?”
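One way to read the layering Sudhakar describes is as three nested components: the embedded-AI application on top, a domain LLM acting as the PaaS middle layer, and a general foundation model underneath, with the enterprise's context kept inside the middle layer. The class and method names in this sketch are hypothetical stand-ins, not any vendor's API.

```python
# Illustrative three-layer sketch: application -> domain LLM (PaaS) -> foundation model.

class FoundationModel:
    """Bottom layer: a general-purpose model, e.g. one served by an external provider."""
    def complete(self, prompt: str) -> str:
        return f"[foundation] {prompt}"

class DomainLLM:
    """Middle layer: wraps the foundation model with enterprise-owned context.
    The domain data stays inside this layer -- the 'IP' Sudhakar refers to."""
    def __init__(self, foundation: FoundationModel, domain_context: str):
        self.foundation = foundation
        self.domain_context = domain_context  # never leaves the enterprise boundary

    def answer(self, question: str) -> str:
        prompt = f"{self.domain_context}\n\nQuestion: {question}"
        return self.foundation.complete(prompt)

class Copilot:
    """Top layer: the embedded-AI application (chatbot, copilot, custom GPT)."""
    def __init__(self, llm: DomainLLM):
        self.llm = llm

    def ask(self, question: str) -> str:
        return self.llm.answer(question)

# The application only ever talks to the middle layer.
copilot = Copilot(DomainLLM(FoundationModel(), domain_context="IT runbook excerpts"))
print(copilot.ask("How do I reset my VPN token?"))
```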

As chips get smarter about optimizing performance from a cost and energy perspective, graphics processing units will need to catch up with the algorithms. As a result, the next generation of GPUs will be needed to make LLMs faster, and this is where Aisera fits in, according to Sudhakar.

“With the latest gen AI, and with AI Copilot and ChatGPT, I would see companies like Nvidia really going to work with vendors like us or partners like us to really drive the next generation of GPUs,” he said. “At the end, it’s important that the next GPUs are really designed for your function’s LLMs. I would imagine one day there’ll be an Nvidia GPU for theCUBE LLM.”

Since generative AI affects both consumers and enterprises, it changes the data management game. As a result, the fact that the data belongs to businesses and consumers should not be forgotten, Sudhakar pointed out.

“Both of the foundation model layers, you need the data with the knowledge graph, LLM also need the data,” he said. “Now this will be domain-specific data. Your data should belong to you. I should give an LLM to you, but I can’t take your training data and apply to somebody without your permission. I think that’s the other thing that you’ll see play out very nicely.”
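The permission point can be illustrated with a toy tenant model in which each enterprise's data grounds its own LLM and is pooled for broader training only when that enterprise has opted in. The Tenant structure and fine_tune_corpus() helper below are hypothetical illustrations, not a description of Aisera's actual data handling.

```python
# Toy sketch: a tenant's domain data is only pooled for shared training if opted in.

from dataclasses import dataclass, field

@dataclass
class Tenant:
    name: str
    knowledge_graph: dict = field(default_factory=dict)  # enterprise-owned data
    allow_cross_tenant_training: bool = False             # explicit opt-in

def fine_tune_corpus(tenants: list[Tenant]) -> dict:
    """Collect training data only from tenants that granted permission."""
    return {
        t.name: t.knowledge_graph
        for t in tenants
        if t.allow_cross_tenant_training
    }

acme = Tenant("acme", {"faq": "..."}, allow_cross_tenant_training=False)
globex = Tenant("globex", {"policies": "..."}, allow_cross_tenant_training=True)

# Only globex's data is eligible for shared training; acme's stays with acme.
print(fine_tune_corpus([acme, globex]))  # {'globex': {'policies': '...'}}
```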

With AI following a trajectory similar to the web's, the open-versus-closed debate lingers on. Nevertheless, the best of both worlds will be needed to get the most value, according to Sudhakar.

“I think open versus closed is going to be a debate that’ll continue,” he stated. “When the dust settles, there’ll be a few open-source models, and similarly a few closed like Microsoft Edge. You need both of them to do each uplevel. Each will have a value, but that is still at the foundation model.”

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the “Supercloud 5: The Battle for AI Supremacy” event:

Photo: SiliconANGLE
