UPDATED 00:00 EDT / MAY 21 2024


IBM pivots to focus on code generation with open-source Granite generative AI models

IBM Corp. today made clear its intention to stay at the forefront of generative artificial intelligence development, announcing a host of updates to its year-old watsonx platform, including a new family of open-source Granite models.

It also launched an initiative called InstructLab, which aims to accelerate the contributions of the open-source community to generative AI development.

Open source is going to be a key element of IBM’s generative AI strategy, said Chief Executive Arvind Krishna. “We want to use the power of open source to do with AI what was successfully done with Linux and OpenShift,” he said in a keynote address at the company’s annual Think conference.

Not surprisingly, given its traditional enterprise bent, IBM is focusing especially on businesses rather than the consumers many AI companies are trying to appeal to, though Mohamad Ali, senior vice president and chief operating officer of IBM Consulting, told reporters in a briefing that the company has been working with “all these AI companies.”

And those businesses need help. Some 42% of clients have started to put their AI pilots into production, he said, and 40% are “stuck in the sandbox.” He added that IBM is working on more than 300 AI projects with clients.

The open-source Granite models announced today also hint at another aspect of IBM’s generative AI strategy, which is to focus on algorithms that generate software code. According to IBM, they excel at coding tasks, demonstrating both efficiency and an ability to generate high-quality code that’s superior to many alternative large language models.

Granite LLMs excel in coding tasks

IBM said the Granite LLMs range in size from 3 billion to 34 billion parameters and will be made available in both base and instruction-following variants. They’re designed to perform a range of tasks, including code generation, fixing bugs, explaining and documenting code, application modernization, maintaining repositories and more. The company said the core models were trained on code from an astonishing 116 programming languages, which it credits for their “state-of-the-art” performance compared with other LLMs across coding tasks.

The company shared a number of details regarding the Granite models’ coding performance, saying they were more efficient across a number of industry benchmarks, including HumanEvalPack, HumanEvalPlus and GSM8K. The first two assess tasks such as code synthesis, explanation, editing, fixing and translation in popular programming languages such as C++, Go, Java, JavaScript, Python and Rust, while GSM8K measures mathematical reasoning.

Indeed, the company is so confident in Granite’s capabilities that it used the 20 billion-parameter base code model to train the watsonx Code Assistant for specialized domains. The same model also powers the watsonx Code Assistant for Z, an AI assistant designed to help rewrite mainframe applications written in COBOL.

Additionally, IBM said the 20 billion-parameter Granite base code model has been fine-tuned to generate Structured Query Language queries from natural language commands, helping users without SQL knowledge obtain insights from their databases.
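IBM hasn’t published the interface for this fine-tuned model, but the general text-to-SQL pattern is straightforward: combine the database schema and the user’s question into a prompt and ask the model for a single SQL statement. Here’s a minimal sketch of that flow; the model call is stubbed for illustration, and names such as `generate_sql` are hypothetical rather than IBM APIs.

```python
# Hypothetical sketch of a natural-language-to-SQL flow like the one IBM
# describes for its fine-tuned Granite base code model. The model call is
# stubbed; a real deployment would call the hosted model instead.

def build_prompt(question: str, schema: str) -> str:
    """Combine the database schema and the user's question into one prompt."""
    return (
        "Given the following table definitions:\n"
        f"{schema}\n"
        f"Write a single SQL query that answers: {question}\nSQL:"
    )

def stub_model(prompt: str) -> str:
    """Stand-in for the LLM; returns a canned query for this demo."""
    return "SELECT region, SUM(amount) FROM sales GROUP BY region;"

def generate_sql(question: str, schema: str, model=stub_model) -> str:
    """Ask the (stubbed) model for a SQL query answering the question."""
    return model(build_prompt(question, schema)).strip()

schema = "CREATE TABLE sales (region TEXT, amount REAL);"
query = generate_sql("What are total sales per region?", schema)
print(query)  # -> SELECT region, SUM(amount) FROM sales GROUP BY region;
```

In a real deployment, `stub_model` would be replaced by a call to the hosted model, and the returned query would be validated before being run against a production database.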

Gartner Inc. analyst Arun Chandrasekaran said IBM’s focus on generative AI code generation is not surprising, because many of its core customers are enterprise chief information officers and information technology leaders, who are particularly interested in generative AI’s potential in terms of IT modernization.

“IBM perhaps foresees an opportunity to help those customers with modernization,” he said. “While the move toward open source is important as it provides clients with transparency and customizability, whether IBM can monetize that opportunity is yet to be seen.”

Krishna said another big advantage of open-sourcing the Granite family is that the company will be able to leverage the assistance of a much wider community of developers, customers and other experts to build on their initial strengths and make them even more capable.

“Open means more eyes on code, more minds on problems and more hands on solutions,” Krishna said as he explained the company’s strategy. “For any technology to gain velocity and become ubiquitous, you’ve got to balance three things: competition, innovation and safety, and open source is a great way to achieve all three.”

According to Holger Mueller, an analyst with Constellation Research Inc., IBM’s decision to go all-in on the open source approach with the Granite models will have a major impact on the wider trend of AI code generation. “With all of the coding experience and exposure IBM possesses, it has created some of the best coding LLMs so far, and you can see from its partner momentum that they’re going to be extremely popular,” he said.

InstructLab to accelerate open AI development

In line with the open-source strategy, IBM said it’s teaming up with its subsidiary Red Hat Inc. on a new initiative called InstructLab, a methodology for the continuous development of base generative AI models. It encourages the community to contribute constant, incremental improvements to the performance, efficiency and safety of AI models over time.

InstructLab provides tools and guides for open source developers to customize the Granite LLMs and other models for specific business domains, enabling them to use their own data to increase performance.
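As a concrete illustration, community contributions to InstructLab take the form of taxonomy files: a contributor adds a small YAML file containing a handful of seed question-and-answer examples for a new skill or knowledge area, which the project’s pipeline then expands into synthetic training data. The sketch below shows roughly what such an entry looks like; the field names follow the project’s early public conventions and may have changed, so treat it as illustrative only.

```yaml
# Illustrative InstructLab taxonomy entry (field names approximate).
version: 2
task_description: Convert plain-English date phrases to ISO 8601 format.
created_by: example-contributor
seed_examples:
  - question: What is March 5, 2024 in ISO 8601 format?
    answer: "2024-03-05"
  - question: Write the first day of 2025 in ISO 8601 format.
    answer: "2025-01-01"
```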

Meanwhile, IBM and Red Hat announced Red Hat Enterprise Linux AI, a new offering that will leverage these open-source AI contributions as an enterprise-ready version of InstructLab. It provides access to the full suite of Granite models and the Red Hat Enterprise Linux platform to support the deployment of AI across hybrid cloud and on-premises information technology infrastructures, the company said.

New watsonx assistants

In another update to watsonx, the company detailed a new class of watsonx assistants, as well as new tools in watsonx Orchestrate to help customers build their own.

Some of the new assistants include the watsonx Code Assistant for Enterprise Java Applications, which will become available in October, and the watsonx Assistant for Z, which will launch in June and transform how users interact with IBM’s Z mainframes. Also in June, watsonx Code Assistant for Z will be updated with new features that allow it to understand and document applications in natural language.

Meanwhile, the watsonx platform will provide access to new Nvidia Corp. graphics processing units, specifically the L4 and L40S processors, which will support the Red Hat Enterprise Linux platform and OpenShift AI. Moreover, IBM said it will introduce “deployable architectures” in watsonx to help teams accelerate the time it takes to deploy generative AI models in a secure and compliant way.

Also new are the IBM Data Hub, Data Gate for watsonx and various updates to the existing watsonx.data platform, which are all slated to arrive in June. These will help to augment how companies can observe, govern and optimize their datasets for AI, IBM said.

“The fuel for AI is data,” Kareem Yusuf, senior vice president of product management and growth at IBM Software, said in the press briefing.

Watsonx automation, integrations and third-party models

Finally, IBM announced a new set of automation capabilities to enable AI-powered predictive automation of IT environments, and a host of third-party integrations and models that can now be accessed in watsonx.

The IT automation tools will be enhanced by the infrastructure-as-code capabilities of HashiCorp Inc., which IBM is acquiring for a reported $6.4 billion. Central to these automation efforts will be a new offering called IBM Concert, which acts as the “nerve center” of an organization’s IT operations, delivering AI-powered insights from various applications and infrastructure platforms for performance optimization and troubleshooting.

“The release of IBM Concert will be a major step forward for customers running IBM’s and other systems,” Mueller said.

As for the third-party integrations and models, they include Amazon SageMaker and watsonx.governance to enhance AI governance on the Amazon Web Services cloud, and the availability of the watsonx platform on Microsoft Azure. The watsonx platform will also provide access to new models such as Meta Platforms Inc.’s Llama 3 and Mistral’s Mistral Large model, while the IBM Granite models will be available in Salesforce Inc.’s customer relationship management platform to power that company’s Einstein assistants.

IBM didn’t specifically mention other AI leaders such as Google LLC and Anthropic PBC, but Ali said IBM is working with them as well.

With reporting from Robert Hof

Image: SiliconANGLE/Microsoft Designer
