Google upgrades Bard with technology from its cutting-edge PaLM language model
Google LLC has enhanced its Bard chatbot’s capabilities using technology from PaLM, an advanced language model that it debuted last year.
Google and Alphabet Inc. Chief Executive Officer Sundar Pichai detailed the update in a New York Times interview published today. PaLM will “bring more capabilities; be it in reasoning, coding, it can answer maths questions better,” Pichai said. Jack Krawczyk, the product manager for Bard at Google, added in a tweet that the update has already been released.
The new version of Bard is described as being more adept at solving math problems. The chatbot can answer “multistep” text prompts more reliably as well, Krawczyk stated. Further down the line, Google also expects improvements in Bard’s ability to generate software code.
PaLM, the language model that the search giant used to enhance Bard, was first detailed by its researchers last year. The model features 540 billion parameters, the internal values that a neural network learns during training and uses to process data. In general, the more parameters a model has, the broader the range of tasks it can handle.
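As a rough illustration of what "parameters" means, consider a toy fully connected network (not PaLM's actual architecture, which is far larger and transformer-based): the parameter count is simply the total number of learned weights and biases.

```python
# Toy illustration: counting parameters in dense (fully connected) layers.
# A dense layer mapping n_in inputs to n_out outputs learns a weight matrix
# of shape (n_in, n_out) plus one bias value per output.
def dense_layer_params(n_in: int, n_out: int) -> int:
    return n_in * n_out + n_out

# A tiny hypothetical 3-layer network: 1024 -> 4096 -> 4096 -> 1024
layers = [(1024, 4096), (4096, 4096), (4096, 1024)]
total = sum(dense_layer_params(n_in, n_out) for n_in, n_out in layers)
print(f"{total:,} parameters")  # ~25 million, versus PaLM's 540 billion
```

Even this toy network has about 25 million parameters; PaLM's 540 billion is roughly 20,000 times larger.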
The PaLM model demonstrated impressive performance in a series of internal evaluations carried out by Google. During one test that involved 28 natural language processing tasks, it achieved a higher score than OpenAI LP’s GPT-3 model. It also set new records in two math and coding benchmarks.
Google trained PaLM on two TPU v4 Pods hosted in its public cloud. Each TPU v4 Pod includes 4,096 chips optimized specifically to run AI workloads. Combined, those chips can provide up to 1.1 exaflops of performance, which equals 1.1 million trillion calculations per second.
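Those figures can be sanity-checked with back-of-envelope arithmetic, assuming (as the article states) that the 1.1 exaflops applies to both pods combined:

```python
# Back-of-envelope check of the hardware figures cited above.
pods = 2
chips_per_pod = 4096
total_chips = pods * chips_per_pod          # 8,192 chips across both pods

total_flops = 1.1e18                        # 1.1 exaflops = 1.1 quintillion ops/sec
per_chip_tflops = total_flops / total_chips / 1e12

print(f"{total_chips} chips, ~{per_chip_tflops:.0f} teraflops per chip")
```

That works out to roughly 134 teraflops of peak performance per TPU v4 chip.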
During the development of PaLM, Google managed the AI training process using an internally developed software system called Pathways. The system distributes the computations involved in training an AI model across multiple chips to speed up the workflow. While training PaLM, Pathways achieved 57.8% utilization of the underlying chips' peak processing performance, which Google says set a new industry record.
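To put that utilization figure in perspective, here is a hypothetical sketch, assuming the 57.8% is measured against the two pods' combined 1.1 exaflops of peak performance:

```python
# Hypothetical sketch: sustained throughput implied by 57.8% utilization
# of the two pods' combined 1.1 exaflops of peak performance.
peak_exaflops = 1.1
utilization = 0.578

sustained_exaflops = peak_exaflops * utilization
print(f"~{sustained_exaflops * 1000:.0f} petaflops sustained")  # ~636 petaflops
```

In other words, even with record-setting utilization, more than 40% of the theoretical peak goes unused, which is typical of large-scale distributed training.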
The original version of Bard that Google introduced last month was based on an AI model called LaMDA. Google first detailed LaMDA last January, three months before it debuted PaLM. LaMDA supported up to 137 billion parameters at the time of its introduction, compared with PaLM's 540 billion.
“We clearly have more capable models,” Pichai told the Times in reference to LaMDA. “To me, it was important to not put [out] a more capable model before we can fully make sure we can handle it well.”
Image: Google