UPDATED 20:18 EDT / JUNE 27 2019

CLOUD

Google debuts Deep Learning Containers in beta

Google LLC this week announced the beta availability of a new cloud service that provides environments optimized for deploying and testing applications powered by deep learning, a subset of artificial intelligence that tries to mimic the way the human brain tackles problems.

The service, called Deep Learning Containers, can be run both in the cloud and on-premises. It consists of numerous performance-optimized Docker containers that come packaged with the tools necessary to run deep learning algorithms.

Those tools include preconfigured Jupyter Notebooks, interactive environments for working with and sharing code, equations, visualizations and text, as well as Google Kubernetes Engine clusters, which are used to orchestrate deployments of multiple containers.
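
As a rough sketch of how that looks in practice, one of the prebuilt images could be pulled and started locally with the Docker SDK for Python along the lines below; the image name and the port the bundled Jupyter server listens on are illustrative assumptions rather than confirmed details.

    import docker

    client = docker.from_env()

    # Illustrative image name; the actual registry path and tags may differ.
    image = "gcr.io/deeplearning-platform-release/tf-gpu"

    # Pull the prebuilt Deep Learning Container image.
    client.images.pull(image)

    # Run it in the background, mapping the assumed Jupyter port to localhost.
    container = client.containers.run(
        image,
        detach=True,
        ports={"8080/tcp": 8080},
    )
    print("Container", container.short_id, "running; Jupyter assumed at http://localhost:8080")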

The service also provides machine learning acceleration with Nvidia Corp.’s graphics processing units and Intel Corp.’s central processing units. Nvidia’s CUDA, cuDNN and NCCL libraries are included as well.
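
Inside a GPU-enabled container, a quick sanity check along the following lines would confirm that the GPU and those libraries are visible; this sketch assumes a PyTorch image, and the calls simply report what the framework itself detects.

    import torch

    # Confirm a CUDA-capable GPU is visible to the framework.
    print("CUDA available:", torch.cuda.is_available())

    if torch.cuda.is_available():
        print("GPU:", torch.cuda.get_device_name(0))
        # Versions of the bundled Nvidia libraries as reported by PyTorch.
        print("CUDA:", torch.version.cuda)
        print("cuDNN:", torch.backends.cudnn.version())
        print("NCCL:", torch.cuda.nccl.version())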

In a blog post Wednesday, Google software engineer Mike Cheng explained that Deep Learning Containers are designed to provide all of the dependencies needed to get applications up and running as quickly as possible. The service also integrates with various Google Cloud services, including BigQuery for analytics, Cloud Dataproc for Apache Hadoop and Apache Spark, and Cloud Dataflow for batch and stream processing with Apache Beam.
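
As an example of what that integration enables, a notebook running inside one of the containers could pull training data straight out of BigQuery with a few lines of Python, assuming the Google Cloud client libraries are present and credentials are configured; the project ID below is a placeholder and the query uses a public sample dataset.

    from google.cloud import bigquery

    # Placeholder project ID; authentication is assumed to be set up in the environment.
    client = bigquery.Client(project="my-gcp-project")

    sql = """
        SELECT weight_pounds, mother_age, gestation_weeks
        FROM `bigquery-public-data.samples.natality`
        LIMIT 1000
    """

    # Run the query and load the results into a pandas DataFrame for model training.
    df = client.query(sql).to_dataframe()
    print(df.head())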

The service supports all of the major deep learning frameworks, including PyTorch and TensorFlow, Cheng said.
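
In a TensorFlow image, for instance, a minimal training run should work without any additional installation; the toy data and model below are purely illustrative.

    import numpy as np
    import tensorflow as tf

    # Toy data standing in for a real training set.
    x = np.random.rand(256, 10).astype("float32")
    y = np.random.randint(0, 2, size=(256, 1))

    # A tiny binary classifier built with the preinstalled framework.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(x, y, epochs=2, batch_size=32)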

Besides running Deep Learning Containers on-premises, users have the option to host them on Google’s Compute Engine and Kubernetes Engine services, or on the Google AI Platform, which was introduced in April as a specialized cloud service for building, testing and deploying AI models.
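
To give a sense of the Kubernetes Engine route, deploying one of the images to an existing cluster could be scripted with the official Kubernetes Python client roughly as follows; the image name, labels and namespace are assumptions made for illustration.

    from kubernetes import client, config

    # Assumes kubectl is already authenticated against a Kubernetes Engine cluster.
    config.load_kube_config()

    container = client.V1Container(
        name="dl-container",
        image="gcr.io/deeplearning-platform-release/tf-gpu",  # illustrative image
        ports=[client.V1ContainerPort(container_port=8080)],
    )

    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="dl-container"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "dl-container"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "dl-container"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )

    # Create the deployment in the default namespace.
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)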

Some might argue that Google is late to the game with today’s offering: rival Amazon Web Services Inc. has already launched its own Deep Learning Containers service in general availability, which suggests it may be more mature than Google’s. Meanwhile, Microsoft Corp. has offered its Azure Machine Learning Workspaces service for some time.

But Constellation Research Inc. analyst Holger Mueller told SiliconANGLE that’s not the case, since Amazon’s offering is focused more on running applications that have already been built, whereas Google’s is aimed at the developers building them. As for Microsoft’s Machine Learning Workspaces, Mueller said it doesn’t offer the same kind of standardization.

Mueller added that with Deep Learning Containers, Google is making machine learning environments easier for developers to set up and faster to access, which sets it apart from its rivals’ offerings.

“This will help CxOs to add the machine learning components they need to power their next-generation applications,” Mueller said.

 

Image: Google
