Is Google manipulating search results in favor of Hillary Clinton?
Search giant Google, Inc. stands accused of manipulating its search results in favor of presumptive Democratic presidential nominee Hillary Clinton, according to a new report published Thursday.
SourceFed, a YouTube channel with 1.7 million subscribers and no obvious political affiliation (it primarily publishes a mix of news, pop culture and explanatory videos), makes the claim. The channel demonstrates that when a user types “Hillary Clinton” and then starts to type another word into the search box, Google’s autocomplete suggestions, which are meant to reflect the popularity of search terms, do not show the most commonly searched terms and instead appear to scrub negative suggestions in favor of positive ones.
One example given is when a user types “Hillary Clinton crim” into the search bar: instead of being recommended a search such as “Hillary Clinton crimes,” Google suggests “Hillary Clinton crime bill 1994,” despite Google’s own trends tool showing that “Hillary Clinton crimes” is far and away the more popular search term.
In stark contrast, typing “Hillary Clinton crim” into Bing and Yahoo results in the top suggestions being “Hillary Clinton criminal” and “Hillary Clinton criminal investigation,” respectively.
Another example given is when “Hillary Clinton ind” is entered: instead of suggesting “Hillary Clinton indictment,” Google brings up “Hillary Clinton Indiana,” “Hillary Clinton independents” and “Hillary Clinton India.”
However, as Recode reports, Google avoids associating people with criminality in its suggestions, even those, such as Bernie Madoff, who were in fact convicted of crimes.
Google denied the claim, telling MarketWatch in a statement that the company was not manipulating any form of search results in favor of a specific presidential candidate.
“Autocomplete predictions are produced based on a number of factors including the popularity of search terms. Our systems are periodically updated to improve search, and our users’ search activity varies, so the terms that appear in autocomplete may change over time … Additionally, our systems automatically filter a small set of offensive or inappropriate content from autocomplete predictions.”
Google says in its statement that “predictions are produced based on a number of factors including the popularity of search terms,” and yet the video clearly shows a wide gulf between the popularity of the terms being shown and those not being shown. Hidden within the statement, however, may be Google’s real explanation for the behavior: the claim that “our systems automatically filter a small set of offensive or inappropriate content from autocomplete predictions” would suggest that Google perhaps considers negative Hillary Clinton content offensive or inappropriate.
Image credit: Redsilverj/YouTube/CC by 2.0