UPDATED 10:00 EDT / MAY 07 2024


Red Hat taps into tech industry ecosystem to accelerate AI development

Enterprise software company Red Hat Inc. has long been seen as a champion of the open-source movement, and its collaborative spirit is now being extended to generative artificial intelligence.

At its annual Red Hat Summit user conference today, the company detailed how it has been collaborating with numerous technology industry partners to deliver new generative AI use cases. In particular, it's working with those partners to give customers access to the essential infrastructure required to run and scale AI workloads.

Expanded AI infrastructure

Among the numerous partnerships and collaborations it announced today, Red Hat said it has been working with Run:ai Inc., a provider of AI optimization and orchestration tools, to bring enhanced, AI-powered resource allocation capabilities to the Red Hat OpenShift AI platform.

The idea behind the partnership is to help AI developers get more out of their compute resources at lower cost. Run:ai specializes in graphics processing unit resource optimization, and its capabilities will be especially useful to anyone using OpenShift AI to develop generative AI models. GPUs are essential for AI training and inference, yet they're costly and in high demand, so the partnership with Run:ai can help teams make fuller use of the accelerators they have and scale their AI workloads more efficiently, the company said.
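As a rough illustration of what GPU-aware scheduling looks like at the workload level, the sketch below uses the Kubernetes Python client to submit a training pod that requests a GPU on an OpenShift cluster. The namespace, container image and scheduler name are assumptions for illustration only; Red Hat and Run:ai haven't published code-level details, and the capabilities coming to OpenShift AI may surface quite differently.

```python
# Illustrative sketch: submitting a GPU-backed training pod with the official
# Kubernetes Python client. Names, image and scheduler are assumptions, not the
# actual Run:ai/OpenShift AI integration.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside the cluster

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(
        name="llm-finetune-job",           # hypothetical workload name
        namespace="data-science-project",  # hypothetical OpenShift AI project
    ),
    spec=client.V1PodSpec(
        scheduler_name="runai-scheduler",  # assumed scheduler name for illustration
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="quay.io/example/llm-trainer:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}  # request one GPU; the scheduler decides placement
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="data-science-project", body=pod)
```

An orchestration layer like Run:ai's sits above requests of this kind, applying quotas, queueing and fair-share policies so that scarce GPUs stay busy across teams rather than sitting idle inside individual projects.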

In addition to Run:ai, Red Hat is also working with Intel Corp. to bring the chipmaker's most advanced AI processors to its customers. The company said it's making Intel's Gaudi AI accelerators, as well as its Xeon, Core Ultra and Core central processing units and Arc graphics processing units, available to users directly within the Red Hat OpenShift AI platform.

The integration will make it simple for OpenShift AI platform users to develop AI models and deploy them on Intel's infrastructure in any environment, from the edge to the cloud, the company said. Users will be able to access Intel's comprehensive AI portfolio, including the Gaudi processors, to power diverse AI use cases such as generative AI training, fine-tuning, retrieval-augmented generation, inference and “confidential AI.” Red Hat's support for Intel extends to edge AI deployments, where customers will be able to run AI models locally in resource-constrained environments on Intel's Core Ultra processors and Arc GPUs.
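As a hedged example of what local execution on Intel client hardware can look like, the snippet below runs a model through Intel's OpenVINO runtime, which can target Intel CPUs, integrated or Arc GPUs, and NPUs. The article doesn't name a specific toolkit for these edge deployments, so treat OpenVINO, the model path and the device string as illustrative assumptions.

```python
# Minimal local-inference sketch using Intel's OpenVINO runtime (an assumption;
# the announcement does not name the toolkit used for Intel edge deployments).
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("model.xml")         # placeholder path to an OpenVINO IR model
compiled = core.compile_model(model, "GPU")  # "GPU" targets an Intel Arc/integrated GPU; "CPU" or "NPU" also work where available

dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # example image-shaped input
result = compiled(dummy_input)               # inference runs entirely on the local device
print(result[compiled.output(0)].shape)
```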

A similar partnership with Advanced Micro Devices Inc. extends the range of silicon options available to OpenShift AI users. The company said it’s working with AMD to bring that company’s AMD GPU Operators to Red Hat OpenShift AI, giving customers access to a range of AMD GPU resources.

More flexible AI development

On the AI development side, Red Hat said it's now offering OpenShift AI users access to Stability AI Ltd.'s popular Stable Diffusion models, which specialize in image and video generation. The partnership will make it easier for OpenShift AI developers to unlock the potential of some of the industry's most capable generative models, Red Hat said. Its platform provides the data, model training and fine-tuning capabilities needed to customize Stability AI's suite of models for tasks such as image, video, audio, code and language generation.
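By way of illustration, Stability AI publishes several of its Stable Diffusion checkpoints openly, and a data scientist could load one for experimentation roughly as follows. The model ID and the use of Hugging Face's diffusers library are assumptions here; the announcement doesn't spell out how the models will be packaged or served inside OpenShift AI.

```python
# Illustrative only: loading a publicly available Stable Diffusion checkpoint with
# Hugging Face's diffusers library. How Stability AI's models are packaged in
# OpenShift AI is not specified in the announcement.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # a public Stability AI image-generation model
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a GPU-backed workbench or serving runtime

image = pipe("a watercolor illustration of a data center at sunrise").images[0]
image.save("sample.png")
```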

Meanwhile, Red Hat is partnering with Elastic N.V. to simplify retrieval-augmented generation, or RAG, for customers. RAG is an essential technique for enhancing generative AI models such as ChatGPT by enabling them to tap into a company's proprietary data to expand their knowledge. The partnership provides OpenShift AI users with a native vector database, which is essential for making unstructured data available to generative AI models.

RAG will enable users to ground models in their private data without making any modifications to the models themselves, while maintaining full privacy of that proprietary information. In other words, Red Hat and Elastic are paving the way for companies to create more advanced and capable AI applications that can leverage their most sensitive private data.
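To make the pattern concrete, the sketch below shows the retrieval half of a RAG pipeline against an Elasticsearch index that already stores document chunks and their embeddings. The index name, field names, embed() function and generate() call are hypothetical placeholders, not the actual Red Hat or Elastic integration.

```python
# Retrieval sketch for a RAG pipeline, assuming an Elasticsearch index that stores
# document chunks alongside dense-vector embeddings. Index and field names, the
# embed() function and the generate() call are placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://elastic.example.internal:9200", api_key="...")  # placeholder endpoint

def retrieve_context(question_embedding, k=4):
    """Fetch the k most similar private document chunks via Elasticsearch kNN search."""
    response = es.search(
        index="company-docs",            # hypothetical index of proprietary documents
        knn={
            "field": "embedding",        # dense_vector field holding the chunk embeddings
            "query_vector": question_embedding,
            "k": k,
            "num_candidates": 50,
        },
    )
    return [hit["_source"]["text"] for hit in response["hits"]["hits"]]

def answer(question, embed, generate):
    """Stuff retrieved chunks into the prompt; the underlying model is never retrained."""
    context = "\n\n".join(retrieve_context(embed(question)))
    prompt = (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return generate(prompt)  # call whichever LLM is served on OpenShift AI
```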

In addition, Red Hat is working with Nvidia Corp. to help users optimize AI model inference using the company's NIM microservices. In the coming weeks, customers will be able to use Red Hat OpenShift AI with Nvidia NIM inference microservices, part of the Nvidia AI Enterprise software platform, to accelerate the delivery of generative AI applications.
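In practice, NIM packages a model behind a standard HTTP endpoint, so calling one from an application looks roughly like the sketch below. NIM microservices generally expose an OpenAI-compatible API; the service URL and model name here are placeholders for whatever happens to be deployed on a given cluster.

```python
# Sketch of calling a NIM-style inference endpoint from an application. The URL
# and model name are placeholders, not a specific Red Hat or Nvidia deployment.
import requests

NIM_URL = "http://nim-llm.example.svc.cluster.local:8000/v1/chat/completions"  # placeholder in-cluster service URL

payload = {
    "model": "meta/llama3-8b-instruct",  # example model ID; depends on the NIM image deployed
    "messages": [{"role": "user", "content": "Summarize today's Red Hat Summit AI announcements."}],
    "max_tokens": 200,
}

resp = requests.post(NIM_URL, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```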

Red Hat emerges as a key partner in AI development

Red Hat was also keen to show off its progress on the AI front so far. Although it’s still building up its AI development capabilities, it has already become an essential partner for various organizations, including the U.S. National Nuclear Security Administration. The company explained that the NNSA is using Red Hat Enterprise Linux as the operating system of its revamped exascale supercomputer, called El Capitan, which is located at the Lawrence Livermore National Laboratory, as well as its existing supercomputer platforms.

RHEL will serve as the foundation of El Capitan's software stack, providing a crucial link between the supercomputer and the various AI and machine learning components it relies on. El Capitan will be used to help ensure the safety, security and reliability of the U.S. nuclear weapons stockpile, while supporting research in areas such as climate science, computational biology, materials discovery and high-energy-density physics, Red Hat said. By standardizing its supercomputer platforms on RHEL, the NNSA will gain enhanced operational efficiency and flexibility, the company promised.

Another partner is Uruguay’s Agency for Electronic Government and Information and Knowledge Society, or AGESIC, which is responsible for directing the Uruguayan government’s digital services and strategy. AGESIC is using OpenShift AI to integrate a range of AI services into Uruguayan government agency platforms. In particular, it uses OpenShift AI to create and develop its AI models, manage their lifecycles and enhance security.

According to AGESIC, its decision to adopt OpenShift AI has helped various government agencies to streamline the development of key AI applications that automate the services they provide to citizens. For instance, it built a new ticketing platform that’s able to resolve the vast majority of tickets without human intervention, in less than 1% of the time it would take to do so manually.

Finally, Red Hat said it's partnering with AI Sweden, the country's national center for applied AI, which brings together more than 120 partners across the public, private and academic sectors to explore how AI can help improve people's lives.

AI Sweden provides researchers with a vendor-agnostic sandbox where teams from various organizations can collaborate on projects. By using the Red Hat OpenShift AI platform, it offers teams a consistent AI development environment that supports data scientists, engineers and developers alike. Among its capabilities, it enables teams to experiment, host models for integration into applications, and deliver applications across the entire AI lifecycle.

Through this collaboration, Red Hat said it and AI Sweden are playing a key role in facilitating the cross-sector transfer of AI knowledge, helping a vast number of organizations to get started on their AI journeys and accelerate adoption of the technology.

Image: SiliconANGLE/Microsoft Designer
