UPDATED 23:19 EDT / MARCH 28 2018

EMERGING TECH

Microsoft’s Bing search engine uses FPGA chips to provide more intelligent answers

Microsoft Corp. has posted an update on how its Project Brainwave deep learning acceleration platform is being used to power the Bing search engine.

Project Brainwave is an extension of Microsoft’s development work on field-programmable gate arrays, which are chips used in data centers that can be reprogrammed on the fly for different computing tasks.

The project was first announced in August, when Microsoft said its aim was to enable “real-time AI” by turning FPGAs into hardware microservices that speed up deep learning workloads. Deep learning is a type of artificial intelligence technology that aims to emulate aspects of how the brain learns, enabling tasks such as image and speech recognition without explicit programming.
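
For readers unfamiliar with the idea of learning without explicit programming, the short Python sketch below trains a toy two-layer neural network to reproduce the XOR function purely from example data. It is a generic illustration of the technique only and has no connection to Microsoft’s Brainwave models or Bing’s code.

# Toy example: a tiny two-layer neural network that learns the XOR function
# from examples rather than from hand-written rules. Illustrative only;
# unrelated to Project Brainwave or Bing.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR labels

W1 = rng.normal(size=(2, 4))  # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))  # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain batch gradient descent: the network's behavior emerges from the data.
for _ in range(10000):
    h = sigmoid(X @ W1 + b1)              # hidden activations
    out = sigmoid(h @ W2 + b2)            # network predictions
    d_out = (out - y) * out * (1 - out)   # output-layer error signal
    d_h = (d_out @ W2.T) * h * (1 - h)    # backpropagated hidden error
    W2 -= h.T @ d_out
    b2 -= d_out.sum(axis=0, keepdims=True)
    W1 -= X.T @ d_h
    b1 -= d_h.sum(axis=0, keepdims=True)

# Predictions after training should be close to [0, 1, 1, 0].
print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))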

Microsoft uses Intel Corp.’s Arria and Stratix 10 FPGAs to power a system of deep neural networks that can produce intelligent search results in milliseconds, the company said Monday. Intel acquired the Arria FPGAs in 2015 when it bought their designer Altera Corp. for $16.7 billion. It made the Stratix 10 FPGAs itself by combining its 14-nanometer manufacturing process with its HyperFlex fabric architecture.

These FPGAs are now buried deep inside the servers that process Bing search queries. As a result, Bing can now gather information from multiple sources and present “intelligent answers” to users’ search queries.

Example of Bing’s new “intelligent answers” feature
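
To make the “intelligent answers” idea concrete, here is a deliberately simplified Python sketch of pulling candidate passages from several sources and keeping the best-scoring one. The word-overlap scoring and the example sources are hypothetical and bear no relation to Bing’s actual ranking models.

# Hypothetical illustration of the "multiple sources" idea: score candidate
# passages pulled from several documents and keep the highest-scoring one.
# The word-overlap score below is a toy measure, not Bing's ranking model.
def score(query, passage):
    query_words = set(query.lower().split())
    passage_words = set(passage.lower().split())
    return len(query_words & passage_words) / max(len(query_words), 1)

def best_answer(query, passages_by_source):
    candidates = [(score(query, p), src, p) for src, p in passages_by_source]
    return max(candidates)  # highest-scoring passage wins

# Hypothetical sources and passages, for illustration only.
sources = [
    ("site-a.example", "An FPGA is a chip that can be reprogrammed after manufacturing."),
    ("site-b.example", "A search engine indexes documents from across the web."),
]
print(best_answer("what is an fpga chip", sources))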

Project Brainwave also powers a lookup function for jargon, technical terms and other uncommon words, allowing Bing to recognize and highlight them and to display a definition when users hover the cursor over the word.
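
As an illustration only, a server-side pass for that kind of term lookup might resemble the Python sketch below, which scans an answer for glossary terms and returns the positions and definitions a front end could surface on hover. The glossary entries and function name are hypothetical, not Bing’s implementation.

# Hypothetical sketch of a term-lookup pass: scan an answer for uncommon
# terms in a glossary and attach definitions that a front end could show
# on hover. Glossary contents and names are illustrative only.
import re

GLOSSARY = {
    "fpga": "A chip whose logic can be reprogrammed after manufacturing.",
    "microservice": "A small, independently deployable unit of functionality.",
}

def annotate_terms(answer_text):
    """Return (term, start, end, definition) tuples for glossary hits."""
    annotations = []
    for term, definition in GLOSSARY.items():
        for match in re.finditer(rf"\b{re.escape(term)}\b", answer_text, re.IGNORECASE):
            annotations.append((match.group(0), match.start(), match.end(), definition))
    return annotations

print(annotate_terms("Brainwave turns an FPGA into a hardware microservice."))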

Microsoft is also working on using the FPGAs to power a new Bing feature that provides multiple answers to how-to questions. The company said this will be useful when a searcher’s query isn’t specific enough to produce a single conclusive answer, and it expects to enable the feature in the coming weeks.

“Intel’s FPGA chips allow Bing to quickly read and analyze billions of documents across the entire web and provide the best answer to your question in less than a fraction of a second,” Microsoft said in a blog post.

Even so, the FPGAs Microsoft is using could soon be surpassed by new adaptive compute acceleration platform chips from Intel’s rival Xilinx Inc. That company, which specializes in building FPGAs, announced a new proprietary chip earlier this month that it claims delivers a 20-times performance boost over its previous best-performing FPGA, the Virtex VU9P.

Images: Microsoft
