UPDATED 15:17 EDT / JULY 12 2022


Researchers open-source neural network with 176B parameters

A group of researchers today released Bloom, an advanced natural language processing model that features 176 billion parameters.

The researchers have made the code for Bloom available under an open-source license.

The project began last year as a collaboration between Hugging Face Inc., an artificial intelligence startup that recently raised $100 million from investors, and two supercomputing organizations in France. Hugging Face and its partners formed a research group called BigScience to lead the development of Bloom. More than 1,000 researchers from more than 70 countries participated in the effort.

Bloom supports 46 natural languages and 13 programming languages, BigScience researchers wrote in a blog post today. The AI can answer questions, summarize text, extract snippets of information from documents and perform a variety of other tasks. Bloom’s versatility stems in part from its 176 billion parameters.
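Tasks like these can be exercised locally through the open-source transformers library. The snippet below is a minimal sketch rather than an official example: it assumes the smaller bigscience/bloom-560m checkpoint from the same model family is available on the Hugging Face hub, and the prompt is purely illustrative.

```python
from transformers import pipeline

# Load a small checkpoint from the Bloom family; the full 176B-parameter
# model requires multi-GPU hardware to run.
generator = pipeline("text-generation", model="bigscience/bloom-560m")

# Ask a question by framing it as a text-completion prompt.
prompt = "Question: How many languages does Bloom support?\nAnswer:"
result = generator(prompt, max_new_tokens=30, do_sample=False)
print(result[0]["generated_text"])
```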

Parameters are the numerical values a neural network learns during training and uses to process its input. As a general rule, the more parameters a model includes, the more advanced the tasks it can perform. With 176 billion parameters, Bloom is one of the most sophisticated natural language processing models in the world.
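One way to check that figure is to instantiate the model’s architecture without loading its weights and count the entries of every weight tensor. The sketch below assumes the transformers and accelerate Python libraries and the bigscience/bloom model id on the Hugging Face hub.

```python
from accelerate import init_empty_weights
from transformers import AutoConfig, AutoModelForCausalLM

# Build the network structure without allocating real weights, since the
# full checkpoint would need hundreds of gigabytes of memory.
config = AutoConfig.from_pretrained("bigscience/bloom")
with init_empty_weights():
    model = AutoModelForCausalLM.from_config(config)

# Sum the element counts of all weight tensors in the model.
total = sum(p.numel() for p in model.parameters())
print(f"Bloom has roughly {total / 1e9:.0f} billion parameters")
```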

Bloom features more parameters than GPT-3, the advanced 175-billion-parameter neural network that OpenAI LLC detailed in 2020. Like Bloom, GPT-3 is optimized for natural language processing use cases. It’s also capable of performing other tasks such as generating software code.

BigScience researchers trained Bloom using the Jean Zay supercomputer near Paris. The supercomputer, which includes AI-optimized graphics cards from Nvidia Corp., has a top speed of more than 28 petaflops. One petaflop equals a quadrillion calculations per second.

“This is the culmination of a year of work involving over 1000 researchers from 70+ countries and 250+ institutions, leading to a final run of 117 days (March 11 – July 6) training,” BigScience researchers detailed today. The development effort was supported by “a compute grant worth an estimated €3M from French research agencies CNRS and GENCI,” they elaborated.

Alongside the code for Bloom, the BigScience research group open-sourced some of the technical data that was produced during the development process. Developers can run Bloom on their own hardware or access a hosted version of the AI through an application programming interface provided by BigScience.
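For the hosted route, a request can be sent over plain HTTP. The following is a minimal sketch, assuming the Hugging Face Inference API endpoint for bigscience/bloom and a placeholder access token; the article does not specify the exact service or authentication details.

```python
import requests

# Hosted endpoint for the Bloom model (assumed to be the Hugging Face
# Inference API); <YOUR_API_TOKEN> is a placeholder.
API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom"
HEADERS = {"Authorization": "Bearer <YOUR_API_TOKEN>"}

def generate(prompt: str) -> str:
    """Send a prompt to the hosted model and return the generated text."""
    response = requests.post(API_URL, headers=HEADERS, json={"inputs": prompt})
    response.raise_for_status()
    return response.json()[0]["generated_text"]

print(generate("Bloom is a large language model that"))
```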

In the future, the research group plans to develop a new version of Bloom with even more advanced capabilities. BigScience intends to add support for more languages and optimize the AI to make it easier to run on a company’s own infrastructure. BigScience will also develop additional AI systems with more complex architectures than Bloom. 

Photo: Unsplash
