UPDATED 14:50 EDT / NOVEMBER 04 2014

Photo: Facebook CEO Mark Zuckerberg

Facebook manipulating Election Day voter turnout?

Facebook Inc. isn’t neutral. The world’s largest social network is not a disinterested platform; it has opinions and desires. The discussions that take place between users can be thought of as a marketplace of ideas, but the venue is biased, and it takes sides. Here on Election Day, should Americans be concerned about Facebook’s ability, and its motive, to influence the outcome of elections?

Facebook is allowed to experiment on you. Check the terms of service: you’ve already agreed to it. You’re a social network lab rat, and you don’t even realize it. The company has demonstrated that it’s willing to mess with your head for little more than curiosity’s sake. When it came to light that Facebook had altered users’ news feeds to display more negative posts, to test whether those users would then post more negative items themselves, there was a public outcry. Facebook promised to be more transparent in the future, but it indicated that the experiments would continue.

If Facebook’s research team found conclusive evidence that increasing the number of negative posts kept users on Facebook significantly longer, is there any question that it would make permanent changes to its news feed algorithm to take advantage of that fact? Is it ethical to adjust news feed algorithms to achieve the company’s financial goals? What about its political goals?
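To see how little it would take, here is a minimal sketch in Python, assuming a toy ranking function and invented posts and numbers; none of the names here reflect Facebook’s actual systems. The point is that a single tunable weight is enough to change what surfaces at the top of a feed.

```python
# Hypothetical sketch: how one ranking weight could quietly shift feed composition.
# Invented names and numbers; this does not reflect Facebook's real ranking code.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    base_score: float   # engagement prediction from the ordinary ranking model
    sentiment: float    # -1.0 (very negative) to 1.0 (very positive)

def rank_feed(posts, negativity_boost=0.0):
    """Order posts by score; a nonzero negativity_boost favors negative posts."""
    def score(p):
        # One tunable constant is all it takes to change what users see first.
        return p.base_score + negativity_boost * max(0.0, -p.sentiment)
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("Puppies at the park", base_score=0.80, sentiment=0.9),
    Post("Everything is falling apart", base_score=0.75, sentiment=-0.8),
]

print([p.text for p in rank_feed(posts)])                        # neutral ordering
print([p.text for p in rank_feed(posts, negativity_boost=0.2)])  # negative post now ranks first
```

With the boost at zero the upbeat post wins; nudge the constant to 0.2 and the negative post jumps to the top, with no visible change anywhere else.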

Facebook’s political goals


We know some, but not necessarily all, of Facebook’s political goals. In 2013, Facebook CEO Mark Zuckerberg launched the 501(c)(4) lobbying group FWD.us to push for comprehensive immigration reform. Earlier this year, Facebook weighed in on the same-sex marriage debate by introducing a set of LGBT-themed stickers to signal its support for marriage equality. With Facebook professing these political preferences, is it safe to assume that the social network has preferences about election results as well? Can its news feed algorithm be tweaked to benefit specific candidates or parties that share its views?

For the last six years, Facebook has employed what it calls its “voter megaphone” to encourage citizens to vote on Election Day. It’s a high-profile graphic displayed in users’ news feeds that prompts them, in various ways, to share the fact that they voted. It’s estimated that several million additional votes have been cast because of this social encouragement. Facebook’s research team has tested groups of users, looking for the most effective voter turnout formula. It tried alternate phrasings, such as “I’m a voter” and “I voted,” to see which moved more users to share. It pushed more political news stories to some user groups to see whether that put them in the mood to perform their civic duties. By all accounts, the effort is working.
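An experiment like that is straightforward to express. The sketch below is a hypothetical reconstruction in Python, with invented share rates and a simulated user pool standing in for real click logs; it illustrates only the shape of such an A/B test, not Facebook’s actual tooling.

```python
# Hedged sketch of the kind of A/B test described above: two button phrasings,
# users randomly but deterministically assigned, share rates compared.
# All numbers are invented.
import random

VARIANTS = ["I'm a Voter", "I Voted"]

def assign_variant(user_id: int) -> str:
    # Deterministic split so a given user always sees the same phrasing.
    return VARIANTS[user_id % len(VARIANTS)]

def run_experiment(num_users: int = 100_000) -> dict:
    # Invented "true" share rates the experiment is trying to detect.
    true_share_rate = {"I'm a Voter": 0.22, "I Voted": 0.20}
    shares = {v: 0 for v in VARIANTS}
    counts = {v: 0 for v in VARIANTS}
    for user_id in range(num_users):
        variant = assign_variant(user_id)
        counts[variant] += 1
        # Simulated user behavior; a real test would log actual clicks.
        if random.random() < true_share_rate[variant]:
            shares[variant] += 1
    return {v: shares[v] / counts[v] for v in VARIANTS}

print(run_experiment())  # e.g. {"I'm a Voter": ~0.22, "I Voted": ~0.20}
```

Whichever phrasing posts the higher share rate becomes the default for everyone, which is exactly why the choice of who gets shown the winning variant matters.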

Facebook knows how to increase voter turnout. Unfortunately, there’s no way to determine exactly what it’s doing with that knowledge. With the vast information cataloged in its knowledge graph, Facebook knows your political persuasion even if you’ve never explicitly stated it. Based on the articles you like and the comments you make, Facebook can piece together who you are and what you stand for. Facebook isn’t neutral. Are news articles that argue against its political goals purposely underrepresented in users’ news feeds? Is it trying to persuade undecided users with more articles that support its pet causes? Facebook is biased. Is it using its most effective voter turnout methods on users who share its political goals? Is it using ineffective turnout methods, or perhaps none at all, on voters who are in opposition? All we know is that it possesses the power and the motive to do so.
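Inferring a leaning from likes requires nothing exotic. Here is a deliberately crude Python sketch, with made-up page names and scores, of how liked pages could be averaged into a left–right estimate; any real model would draw on vastly more signals.

```python
# Hedged sketch: turn a user's liked pages into a rough political-leaning score.
# Page names and scores are invented for illustration.
LEANING_BY_PAGE = {
    "Progressive Policy Daily": -1.0,   # negative = left-leaning
    "Centrist Review": 0.0,
    "Conservative Dispatch": 1.0,       # positive = right-leaning
}

def estimate_leaning(liked_pages):
    """Average the leanings of recognized pages; None if nothing matches."""
    scores = [LEANING_BY_PAGE[p] for p in liked_pages if p in LEANING_BY_PAGE]
    return sum(scores) / len(scores) if scores else None

user_likes = ["Conservative Dispatch", "Centrist Review", "Local Hiking Club"]
print(estimate_leaning(user_likes))  # 0.5 -> leans right on this toy scale
```

Once every user carries a score like this, deciding who receives the most effective turnout prompt is a one-line filter.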

Whether this article will survive Facebook’s Election Day algorithm remains to be seen. Will this falling tree make a sound if it’s relegated to an empty forest? Facebook can silence this article, and anything else it deems opposition. Without transparency, we’ll never know whether it does.

