UPDATED 11:00 EDT / DECEMBER 10 2020

CLOUD

Samsung adopts Google Cloud’s TPU chips to train its Bixby voice assistant

Google LLC today disclosed that Samsung Electronics Co. Ltd. has been using its Cloud Tensor Processing Units, specialized artificial intelligence chips available via its cloud platform, to power the voice assistant features it offers users.

Samsung is one of the world’s largest mobile device makers. It provides a voice assistant under the Bixby brand that competes with services such as Apple Inc.’s Siri and runs on more than 160 million devices worldwide.

TPUs, in turn, are a series of application-specific integrated circuits, or specialized chips, developed by Google for the sole purpose of running AI models. The Alphabet Inc. subsidiary originally developed the TPU to power the machine learning components of its own online services. In 2018, Google made the chips available to the public via Google Cloud, positioning them as a way for enterprises to increase the speed and reduce the cost of their machine learning projects.

Samsung’s motivation for adopting Cloud TPUs was to cut AI training times. To process voice commands it receives from users, the company’s Bixby assistant turns them into text using an automatic speech recognition engine. Samsung engineers recently upgraded the engine to a newer, more modern speech recognition model and encountered a challenge: The new model required more sample data to train, which in turn increased the duration of training sessions.

Google says it helped the phone maker address the issue. After Samsung started using TPUs in Google Cloud, a training session that would previously have taken 180 hours to complete on the company’s on-premises AI infrastructure was reduced to just 10 hours. That’s using chips based on the third-generation TPU design, the latest version currently available in Google Cloud, which provides 420 teraflops of performance. One teraflop equals a trillion calculations per second.
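For a sense of scale, the figures Google cites work out as follows. This is illustrative arithmetic only, using the numbers reported above:

```python
# Figures reported by Google for Samsung's training workload.
on_prem_hours = 180   # on-premises training session
cloud_tpu_hours = 10  # same session on Cloud TPUs

speedup = on_prem_hours / cloud_tpu_hours
print(f"Speedup: {speedup:.0f}x")  # an 18x reduction in training time

# A third-generation Cloud TPU provides 420 teraflops;
# one teraflop is a trillion calculations per second.
teraflops = 420
ops_per_second = teraflops * 1_000_000_000_000
print(f"Peak throughput: {ops_per_second:,} operations per second")
```

In other words, the cited numbers amount to an 18-fold reduction in training time.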

Google says the faster training times made it easier for Samsung’s engineers to implement the new speech recognition model in Bixby. The new model is one-tenth the size of the original, has a 4.7% lower word error rate and, most important, runs 11 times faster.

It’s notable that though the third-generation TPU used by Samsung is the fastest version of the chip available in Google Cloud, it’s not the fastest version overall. Google in July previewed a fourth-generation TPU described as having 2.7 times higher average performance. Given that rival Amazon Web Services Inc. is actively expanding the selection of AI chips available on its platform, it’s quite possible Google will eventually make its fourth-generation AI chips available to Google Cloud customers, as it did with the previous iterations.

Image: Google
