Google Search gets hum-to-search and AI query upgrades
Google LLC has announced a set of artificial intelligence upgrades to its search engine that will let users find more types of information and improve the accuracy of returned results.
Company executives detailed the enhancements at the company’s Search On virtual event on Thursday.
In cases when users want to find a particular song but can’t remember its name, they can now hum, whistle or sing parts of it and Google will try to identify the tune. That’s made possible by machine learning models that convert the audio into an abstract representation consisting of a series of numbers. This series is then compared to a database of number sequences generated from popular songs.
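The matching idea can be illustrated with a toy sketch. This is not Google's actual model: the "embeddings" below are hand-made number sequences standing in for the fingerprints a real machine learning model would derive from audio, and the song titles are invented.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length number sequences."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(query_embedding, song_database):
    """Return the song whose stored sequence is closest to the query."""
    return max(
        song_database,
        key=lambda title: cosine_similarity(query_embedding, song_database[title]),
    )

# Toy database: each popular song reduced to a short number sequence.
song_database = {
    "Song A": [0.9, 0.1, 0.3, 0.7],
    "Song B": [0.2, 0.8, 0.6, 0.1],
}

# A hummed query converted (hypothetically) to the same representation.
hummed = [0.85, 0.15, 0.35, 0.65]
print(best_match(hummed, song_database))  # prints "Song A"
```

In practice the comparison runs over millions of songs, so a real system would use an approximate nearest-neighbor index rather than a linear scan, but the principle of comparing number sequences is the same.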
Looking for videos will become easier as well. Google’s algorithms can now identify key moments in clips indexed by its search engine, tag them and incorporate them into search results. For instance, if a user is looking for information about a step in a recipe, Google could not only surface a relevant culinary video but also flag the specific part of the clip during which the given step is discussed.
“We’ve started testing this technology this year, and by the end of 2020 we expect that 10 percent of searches on Google will use this new technology,” detailed Google Head of Search Prabhakar Raghavan.
Beyond making it easier to browse media content, Google wants to simplify the task of looking up statistics such as economic growth and smartphone adoption. The search giant has been working on a repository of statistics called the Data Commons Project since 2018 that includes billions of data points from sources such as the U.S. Census Bureau. Now, Google will start surfacing statistics from the repository in response to user queries.
Google also announced more general-purpose enhancements at Search On that focus on boosting the search experience as a whole rather than specific types of queries. Google’s built-in spellchecker, which detects when a query might be misspelled and suggests alternative wording, is receiving what the company describes as the biggest improvement in the past five years. Another upgrade will enable the search engine to return the specific passage from a web page that it deems to be most relevant to a user’s request.
Separately, the company is implementing new neural networks optimized to identify query subtopics. If, for instance, a user searches the term “smartphone accessories,” these neural networks would generate a list of subtopics that might include wireless chargers, headphones and phone cases. Google plans to incorporate a more diverse mix of subtopics into search results with the help of the technology to increase the chance of users finding what they’re looking for.
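One simple way to "incorporate a more diverse mix of subtopics" is to interleave results drawn from each subtopic so no single one dominates the top of the page. The round-robin strategy and the labels below are assumptions for illustration; Google has not published its ranking details.

```python
from itertools import chain, zip_longest

def interleave_by_subtopic(results_by_subtopic):
    """Round-robin across per-subtopic result lists, dropping gaps
    left by shorter lists, so each subtopic surfaces early."""
    rounds = zip_longest(*results_by_subtopic.values())
    return [r for r in chain.from_iterable(rounds) if r is not None]

# Hypothetical per-subtopic results for the query "smartphone accessories".
results_by_subtopic = {
    "wireless chargers": ["charger-1", "charger-2"],
    "headphones": ["headphones-1"],
    "phone cases": ["case-1", "case-2"],
}

print(interleave_by_subtopic(results_by_subtopic))
# ['charger-1', 'headphones-1', 'case-1', 'charger-2', 'case-2']
```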
Google executives detailed a number of other product enhancements at the event as well, including improvements to business listings and the Google Lens app. Lens lets users aim their phone camera at an object to receive more information about it.