UPDATED 22:02 EST / JULY 27 2023

POLICY

Study: Facebook might not be as politically polarizing as we think

A series of studies published today suggests that Facebook’s and Instagram’s algorithms do not explicitly divide people politically, and that tweaking those algorithms won’t change someone’s political attitudes.

The four studies, two of which were published in the journals Nature and Science, address the concern that so-called “echo chambers” on social media platforms divide people and contribute to the current political polarization in the U.S. The researchers called this “ideological segregation.”

For years, regulators and lawmakers have called for action on what they consider political division, which they say is created partly by social media algorithms. The studies, referred to in some media as “landmark” studies, paint a different picture: the algorithms, the researchers say, are not driving the problem. Facebook, according to these studies, isn’t pushing people further left or further right.

The researchers analyzed the 2020 U.S. election, with Meta Platforms Inc. cooperating in the various tests. They examined millions of Facebook users and the algorithmically curated content those users engaged with.

When Meta tweaked the algorithm for certain people at the researchers’ request, the change had “no measurable effect” on how those people thought politically. Facebook users, it seems, have already made their choice to read or view CNN, Fox News, Glenn Greenwald or The Guardian.

The researchers did say that certain content has the “potential to reinforce partisan identity even if it is not explicitly political.” They also found that people identifying as conservative were more likely to consume posts labeled as misinformation than people identifying as liberal, although the researchers didn’t analyze that content. Turning off the share button, they said, cut down on the spread of misinformation but didn’t change people’s political attitudes.

The researchers found that when the Facebook algorithm was turned off for certain users, those users often just went to other platforms such as YouTube or TikTok, which is bad news for Meta. Interestingly, when Meta switched the algorithm to show users the most recent content rather than content tailored to them, that change didn’t affect their political attitudes but did expose them to more content marked as untrustworthy.

These and the other results will certainly give Meta ammunition the next time the company is accused of dividing America, although this “landmark” research is surely not the end of the debate.

“The research published in these papers won’t settle every debate about social media and democracy, but we hope and expect it will advance society’s understanding of these issues,” said the company. “Its findings will be hugely valuable to us, and we hope they will also help policymakers as they shape the rules of the road for the internet.”

Photo: jt/Flickr
