Meta partners with Microsoft to distribute next-gen LLaMA 2 AI model for commercial use
Meta Platforms Inc. today announced that it has partnered with Microsoft Corp. to distribute its latest artificial intelligence models for commercial use, allowing businesses to build their own AI-enabled apps and tools.
The new family of large language models, called LLaMA 2, is the next generation of Meta’s generative AI research. These models can understand human language and respond conversationally, as well as produce research, answer questions, write poetry and generate computer code.
Meta has a long history of releasing its AI models open-source, and the entire LLaMA 2 family is available today for free. The family includes multiple pretrained and fine-tuned models with 7 billion, 13 billion and 70 billion parameters. According to Meta, LLaMA 2 was trained with over 40% more data than the original model and has improvements in its architecture, and it was fine-tuned with more than 1 million human annotations to improve safety and quality.
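For developers who want to experiment with the release, the checkpoints are also distributed through the Hugging Face hub. The following is a minimal sketch of prompting the smallest fine-tuned chat model with the Hugging Face transformers library; the meta-llama/Llama-2-7b-chat-hf checkpoint name and the gated-access step are assumptions about how the weights are distributed, not details from today's announcement.

# Minimal sketch: prompting a LLaMA 2 chat model via Hugging Face transformers.
# Assumes the meta-llama/Llama-2-7b-chat-hf checkpoint (access requires accepting
# Meta's license on the hub) and the accelerate package for device placement.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"  # 7B fine-tuned chat variant
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Write a short poem about open-source AI."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))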
“Open source drives innovation because it enables many more developers to build with new technology. It also improves safety and security because when software is open, more people can scrutinize it to identify and fix potential issues,” Meta Chief Executive Mark Zuckerberg said in a Facebook post related to the announcement.
With Microsoft as a preferred partner for the distribution of LLaMA 2, the models will be available to developers using Microsoft Azure, who can build with them using cloud-native tools for training, fine-tuning and deployment. Developers on Azure can also use the platform’s built-in safety features, such as Azure AI Content Safety, which can be combined with Meta’s own filters to monitor content and ensure a less harmful experience.
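As a rough illustration of what that combination might look like, the sketch below screens a model response with the Azure AI Content Safety service before returning it. It assumes the azure-ai-contentsafety Python SDK and a provisioned Content Safety resource; the endpoint, key and severity threshold are placeholders, and the exact response fields may vary by SDK version.

# Rough sketch (assumed SDK usage): screening model output with Azure AI Content Safety.
import os
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

# Placeholder environment variables for a provisioned Content Safety resource.
client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)

def is_safe(text: str, max_severity: int = 2) -> bool:
    """Return True if no analyzed harm category exceeds the chosen severity threshold."""
    result = client.analyze_text(AnalyzeTextOptions(text=text))
    return all((item.severity or 0) <= max_severity for item in result.categories_analysis)

response_text = "...model output from a LLaMA 2 deployment..."
print(response_text if is_safe(response_text) else "[response withheld by safety filter]")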
Microsoft also said the models have been optimized to run on Windows, meaning developers will be able to run them locally to build generative AI-powered applications.
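As a sketch of what local use could look like, the example below runs a small quantized LLaMA 2 checkpoint through the open-source llama-cpp-python bindings. This is a community approach rather than the specific Windows tooling Microsoft described, and the model file path is a placeholder for a quantized checkpoint downloaded separately.

# Sketch of local, on-device inference with a quantized LLaMA 2 model using
# llama-cpp-python (a community project, not Microsoft's announced tooling).
from llama_cpp import Llama

# Placeholder path to a locally downloaded quantized checkpoint.
llm = Llama(model_path="./llama-2-7b-chat.Q4_K_M.gguf", n_ctx=2048)

result = llm(
    "Q: What can small on-device language models be used for? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())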
This news comes shortly after a report that Meta was planning to release an AI model for commercial use, as the previous version of LLaMA was licensed only for research purposes. By stepping into the commercial market, Meta seeks to change the balance of power among big players such as Microsoft-backed OpenAI LP, which is behind the popular ChatGPT and GPT-4, and Google LLC, which builds Google Bard.
Because the original LLaMA’s weights leaked broadly in March of this year, it has already become the basis for numerous other open-source models. An open release also means LLaMA 2 can serve as the foundation for a multitude of projects and will be easier to adopt, and its commercial license now gives developers easier access. “I believe it would unlock more progress if the ecosystem were more open, which is why we’re open-sourcing LLaMA 2,” Zuckerberg added.
The fact that OpenAI’s and Google’s AI models are so-called black boxes, or “closed source,” prompted a Google engineer to comment in May that the two companies have “no moat” in the industry. According to the earlier report, sources said Meta plans to keep the models free but may in the future offer a paid service for training and fine-tuning them for enterprise customers.
Microsoft isn’t the only company Meta is partnering with on LLaMA 2 support. Qualcomm Inc. announced today that it will enable the new model on its chips for smartphones and personal computers starting in 2024. Although LLMs generally require large server farms to run, Qualcomm’s chips include dedicated tensor-processing hardware that provides the type of computing power needed for on-device AI processing.
That will allow the smaller LLaMA 2 models to run easily on smartphones and PCs equipped with those chips. Running the AI on the phone or PC itself, rather than offloading the work to a cloud compute farm, also means users have fewer privacy concerns and companies building AI-enabled applications face lower costs.