Hugging Face and AWS expand cloud partnership
Hugging Face Inc., the operator of a popular platform for hosting machine learning models, is partnering with Amazon Web Services Inc. to streamline artificial intelligence development projects.
The companies announced the partnership today. It expands upon an existing collaboration that Hugging Face and AWS launched in early 2021.
“Generative AI has the potential to transform entire industries, but its cost and the required expertise put the technology out of reach for all but a select few companies,” said AWS Chief Executive Officer Adam Selipsky. “Hugging Face and AWS are making it easier for customers to access popular machine learning models to create their own generative AI applications with the highest performance and lowest costs.”
New York-based Hugging Face is backed by more than $160 million in funding. It operates a platform similar to GitHub that developers use to host open-source AI models, as well as related technical assets such as training datasets. The platform stores the code for more than 100,000 neural networks.
As part of the expanded partnership announced today, Hugging Face will use AWS as its preferred public cloud. Additionally, the startup is rolling out a new integration with Amazon SageMaker, AWS’ machine learning platform. SageMaker includes more than a half-dozen cloud services that developers can use to build, train and deploy AI models.
The newly announced integration will enable developers to deploy neural networks hosted by Hugging Face on SageMaker in a few clicks. After an AI model is uploaded to SageMaker, it can be trained using cloud instances powered by AWS Trainium chips. The chips are specifically optimized for AI training tasks.
Neural networks deployed from Hugging Face to AWS also work with other types of cloud instances, including those powered by the AWS Inferentia accelerator series. Inferentia accelerators are chips optimized to perform inference, or the task of running AI models in production after the training phase is complete.
“The future of AI is here, but it’s not evenly distributed,” said Hugging Face CEO Clement Delangue. “Amazon SageMaker and AWS-designed chips will enable our team and the larger machine learning community to convert the latest research into openly reproducible models that anyone can build on.”
The new integration will complement the Hugging Face AWS Deep Learning Containers that the companies already offer to developers as part of their partnership. The containers make AI models from Hugging Face available in a prepackaged format that is easier to deploy in public cloud environments.
Swami Sivasubramanian, vice president of database, analytics and machine learning at AWS, spoke with SiliconANGLE Executive Editor John Furrier about AWS’ plans to leverage the relationship with Hugging Face. “With this relationship, we’ll be able to democratize AI for a broad range of developers,” he said in the interview on theCUBE, SiliconANGLE’s video studio.