Lenovo enhances AI democratization through purpose-built infrastructure
With artificial intelligence taking the world by storm thanks to innovations such as ChatGPT, enterprises are looking at the bigger picture of how they can take advantage of this cutting-edge technology.
To address these new paradigms in the AI industry, Lenovo Group Ltd. is taking a hands-on approach with its AI Innovators program, which offers AI-ready infrastructure to accelerate adoption, according to Robert Daigle (pictured), director and global AI business leader at Lenovo.
“What we’re trying to do is democratize artificial intelligence for industries and companies of all sizes,” Daigle said. “We’re doing that by leaning into the startup community through our Lenovo AI Innovators program, bringing full-stack solutions for industries. We have our AI Discover labs. You have to have purpose-built AI infrastructure, and Lenovo is now the third largest infrastructure provider for AI infrastructure today.”
Daigle spoke with theCUBE industry analysts John Furrier and Dave Vellante at VMware Explore 2023, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed how Lenovo is crafting a name for itself in the AI sector. (* Disclosure below.)
Software is vital in AI
When it comes to AI, software is fundamental because it acts as the engine, according to Daigle. Widely available open-source tools are one of the main options, especially when dealing with large language models.
“I mean, software is critical,” he stated. “When you’re thinking about artificial intelligence, one of the things that we’ve seen is you have really two options out there. You have open-source software tools that are widely available.”
With training versus inference a hot topic in the AI sector, Lenovo is focusing on the deployment of inference models, given the instrumental role they will play in the adoption of generative AI. This approach is advantageous because enterprises do not have to train their own bespoke foundation models, according to Daigle.
“Inference is going to be about four times the market size of training, long tail,” he stated. “It’s a huge market. The really great thing about it is that the inference models that we’re seeing run really well on PCIe accelerators. You can take GPUs and common general-purpose systems and get started with inference today.”
When it comes to building generative AI, Lenovo addresses privacy concerns and legal compliance for optimal results, according to Daigle. Infrastructure performance is also top of mind for the company.
“I’ve got over 17 generative AI projects ongoing internally at Lenovo that we’re overseeing,” he stated. “One of the things that we did is we launched a responsible AI committee to review all of the things that we’re doing that touch artificial intelligence. The compute matters more now than ever with artificial intelligence.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of VMware Explore 2023:
(* Disclosure: This is an unsponsored editorial segment. However, theCUBE is a paid media partner for VMware Explore 2023. VMware Inc. and other sponsors of theCUBE’s event coverage do not have editorial control over content on theCUBE or SiliconANGLE.)
Photo: SiliconANGLE