UPDATED 13:34 EDT / MAY 11 2016

NEWS

Amazon says its new deep learning library is 2x faster than Google’s

Though Amazon.com Inc. doesn’t make a habit of sharing its internally developed software with the outside world, the number of fellow web giants that have open-sourced their deep learning technology in recent months has apparently prompted a change of heart. The company quietly joined the fray yesterday with the release of a C++ library for building neural networks that could make the task significantly faster for data scientists.

Dubbed DSSTNE, the framework owes its speed in large part to the parallelization mechanism Amazon built under the hood to handle distributed processing. Most alternatives execute deep learning models by running a separate copy of the code on each available GPU and synchronizing the activity through some sort of orchestration mechanism. Others assign each major element of the algorithm to a different chip, which is somewhat more efficient but still doesn’t make the most of the available hardware. DSSTNE implements an improved variation of the latter approach: it works out how many calculations a given processor can handle and distributes the load accordingly.
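DSSTNE’s actual scheduler is internal to the library, but the capacity-aware idea described above can be sketched in a few lines. This is an illustrative toy only, not Amazon’s code: it sizes each GPU’s share of the work in proportion to its measured capacity rather than giving every device an identical copy of the whole job.

```python
def distribute_load(total_ops, gpu_capacities):
    """Split `total_ops` units of work across GPUs in proportion to each
    GPU's capacity (e.g. measured throughput in operations per second).

    Toy sketch of capacity-proportional scheduling; DSSTNE's real
    mechanism is internal and not shown in the article.
    """
    total_capacity = sum(gpu_capacities)
    # Integer share for each GPU, proportional to its capacity.
    shares = [total_ops * c // total_capacity for c in gpu_capacities]
    # Hand any rounding remainder to the fastest GPU so every
    # operation is assigned to some device.
    shares[gpu_capacities.index(max(gpu_capacities))] += total_ops - sum(shares)
    return shares

# A GPU three times as fast as its neighbor gets three times the work:
# distribute_load(1000, [3, 1]) yields [750, 250].
```

The point of the sketch is the contrast with naive data parallelism, where each GPU would receive the same workload regardless of its speed and the slowest device would set the pace.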

The performance improvement is especially pronounced when the framework is used to analyze so-called sparse datasets, in which most of the values are missing. Amazon optimized DSSTNE for handling partial information to speed the creation of recommendation engines, which typically don’t have access to all the data they’re programmed to weigh. The product suggestion feature on the retail giant’s website, for instance, can’t take a visitor’s buying history into account if they’re not logged into their account. Search applications can exploit the framework as well to deal with the semantic gaps that often appear in user queries.
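To see why sparsity matters for recommendation workloads, consider a shopper whose purchase history touches only a handful of products out of a catalog of millions. A hypothetical scoring routine (the item names and weights below are invented for illustration, not from DSSTNE) can store and score only the nonzero entries, skipping all the work a dense representation would waste on missing values:

```python
def sparse_score(history, item_weights):
    """Score a recommendation for one user, touching only the products
    the user actually interacted with (the nonzero entries).

    `history` maps product ID -> interaction count; `item_weights` maps
    product ID -> learned relevance weight. Both are hypothetical
    structures used for illustration.
    """
    return sum(count * item_weights.get(product, 0.0)
               for product, count in history.items())

# Out of a catalog of millions of items, this user has bought two things,
# so the loop above runs just two iterations instead of millions.
history = {"book-123": 2, "lamp-456": 1}
weights = {"book-123": 0.8, "lamp-456": 0.3, "desk-789": 0.5}
score = sparse_score(history, weights)  # 2*0.8 + 1*0.3 = 1.9
```

The same principle, applied inside the neural network’s matrix operations rather than a simple dot product, is where a sparse-optimized engine gains its edge over frameworks that treat every input as dense.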

Amazon also plans to add support for image and speech recognition algorithms down the road in an effort to broaden DSSTNE’s appeal even further. The library already poses a threat to existing deep learning engines: The retail giant claims it ran 2.1 times faster than Alphabet Inc.’s popular TensorFlow system in an internal benchmark test.

Image via blickpixel 
