UPDATED 08:00 EDT / JANUARY 18 2024

DigitalOcean debuts Nvidia H100 GPU instances to expand access to advanced computing for AI workloads

The developer-focused cloud computing infrastructure company DigitalOcean Holdings Inc. said today it’s making Nvidia Corp.’s most advanced H100 tensor core graphics processing units available to small and medium-sized businesses via its Paperspace platform.

The move will enable SMBs and startups to access the critical infrastructure required to develop next-generation applications powered by artificial intelligence, workloads that demand extremely powerful computing hardware.

DigitalOcean explained that the industry has seen extremely high demand for Nvidia’s H100 GPUs. The strong demand has led to high costs for GPU-powered instances on competing cloud platforms such as Amazon Web Services, Google Cloud and Microsoft Azure, making these resources inaccessible to smaller companies. That’s what DigitalOcean is attempting to change.

The company points out that Nvidia’s GPUs are unique in their ability to drastically reduce the time it takes to train new artificial intelligence models and to boost model inference performance. As such, they’re the most sought-after resource for companies looking to perform large-scale AI model training.
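For readers less familiar with what that acceleration looks like in practice, the sketch below shows the basic pattern of placing a model and a batch of data on a CUDA device such as an H100 for a single training step. It assumes PyTorch as the framework (our choice for illustration, not something DigitalOcean has specified), and the model and data are toy placeholders.

    # Minimal sketch, assuming PyTorch and a visible CUDA device such as an H100;
    # the model and data here are toy placeholders.
    import torch
    import torch.nn as nn

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = nn.Linear(1024, 10).to(device)            # toy classifier placed on the GPU
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(64, 1024, device=device)          # synthetic input batch
    y = torch.randint(0, 10, (64,), device=device)    # synthetic labels

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)                       # forward pass runs on the GPU
    loss.backward()                                   # backward pass is GPU-accelerated too
    optimizer.step()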

To serve SMBs and startups constrained by limited resources and budgets, DigitalOcean said, its H100 GPUs can be accessed by customers on-demand or reserved for a fixed term. As a result, smaller businesses gain a more cost-effective consumption model for running their AI workloads.

That’s in contrast to other cloud providers, which have optimized their GPU offerings to serve the largest enterprises, said DigitalOcean’s general manager of AI and machine learning, Kanishka Roychoudhury. “As we look ahead, DigitalOcean remains committed to providing solutions that offer superior performance and keep costs in check, lowering the barriers of entry for smaller businesses looking to leverage AI to improve profitability and efficiency,” he said.

The company said customers will be able to access its GPUs as individual machines or as clusters, meaning it can offer the versatility required for use cases ranging from multinode model training to complex inference workloads. In addition, it said it has made “significant improvements” to its networking infrastructure, and can now offer interconnect speeds in excess of 3.2 terabits per second, ensuring optimal performance for customers that need to connect multiple instances.
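As a rough illustration of what multinode training involves, the sketch below uses PyTorch’s DistributedDataParallel. The launch command, node counts and addresses are illustrative assumptions rather than anything specific to DigitalOcean’s platform; the gradient synchronization traffic between nodes is exactly the kind of workload that benefits from high-bandwidth interconnects.

    # Minimal multinode sketch, assuming PyTorch with NCCL and a torchrun launch
    # on each node, e.g.:
    #   torchrun --nnodes=2 --nproc_per_node=8 \
    #            --rdzv_backend=c10d --rdzv_endpoint=<head-node>:29500 train.py
    # Node counts and addresses are illustrative, not DigitalOcean-specific.
    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    dist.init_process_group(backend="nccl")           # NCCL rides on the GPU interconnect
    local_rank = int(os.environ["LOCAL_RANK"])        # set by torchrun
    torch.cuda.set_device(local_rank)

    model = torch.nn.Linear(1024, 10).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])       # gradients synced across all ranks

    # ... standard training loop goes here; the gradient all-reduce traffic between
    # nodes is what benefits from high-bandwidth links between instances.

    dist.destroy_process_group()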

DigitalOcean is a competitor to Amazon Web Services Inc. and Microsoft Corp. in the public cloud infrastructure market. However, rather than take on those giants directly, it has carved a niche for itself serving small businesses with its “developer cloud” that makes it easy for small teams of developers to create modern applications.

With the DigitalOcean App Platform, developers can deploy application code in production with a few clicks, in line with the company’s stated aim of keeping cloud computing simple. Its pitch is that it takes care of the cloud infrastructure and deployment side of things, so developers can maintain a focus on their code.

Customers have been eagerly awaiting this offering ever since the company acquired Brooklyn-based Paperspace Co. in a $111 million deal last July. Paperspace operates a public cloud infrastructure platform with a narrow focus: it’s designed specifically for AI projects. It provides access to multiple kinds of GPUs from Nvidia, in addition to software development tools that help developers get started building and deploying neural networks.

Alongside the GPUs, DigitalOcean’s customers will also be able to take advantage of Paperspace’s Gradient Deployments feature, which makes it easier to deploy and scale modern AI applications. It provides a simpler and more flexible container registry experience, as well as enhanced security for deployment endpoints with superior access controls, the company said.
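In practice, querying a model served behind a deployment endpoint typically comes down to an authenticated HTTP request, as in the sketch below. The URL, token and request schema are hypothetical placeholders rather than a documented Gradient API.

    # Hedged sketch: querying a model behind a deployment endpoint. The URL, token
    # and request schema are hypothetical placeholders, not a documented Gradient API.
    import requests

    ENDPOINT = "https://example-deployment.paperspacegradient.com/predict"  # hypothetical

    resp = requests.post(
        ENDPOINT,
        json={"inputs": "Summarize the latest release notes."},
        headers={"Authorization": "Bearer <API_TOKEN>"},  # placeholder credential
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())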

Image: DigitalOcean
