UPDATED 21:36 EDT / NOVEMBER 07 2023

APPS

Meta whistleblower testimony will push US closer to an online safety bill

A former engineering director at Facebook Inc. told a Senate Judiciary subcommittee today that management ignored his warnings about how the platforms he worked on could damage the well-being of children.

This is hardly the first time we’ve heard about how Facebook, now Meta Platforms Inc., prioritizes growth and profits over safety. It is, however, another insider coming out and saying it. That may at some point lead to the U.S. adopting legislation that will significantly curtail the amount of time young people can spend on social media and what they can do and see on various platforms.

A bipartisan bill of this nature has already been introduced: the “Kids Online Safety Act,” or KOSA. In October, 42 state attorneys general announced that they were suing Meta for the harm they allege such platforms perpetrate on the young, citing evidence of bullying, depression, anxiety and self-harm. The U.S. may soon see dramatic changes to how kids use social media, although privacy advocates concerned about civil rights and censorship have warned that such a law would impose a heavy burden.

The whistleblower, Arturo Béjar, is a former director of engineering who led the Protect and Care group when the company was still called Facebook. He told senators that his time at the company was spent inside what he called a “see no evil, hear no evil” culture. He worked there first from 2009 to 2015 and again from 2019 to 2021.

Béjar’s main responsibility at the company was to make Facebook and Instagram less harmful. His doubts hardened into alarm when his own 14-year-old daughter started using Instagram and was soon subjected to “unwanted sexual advances, misogyny, harassment,” he said, and he discovered this was the experience of many other young girls.

“We’re in a very extraordinary time where there’s consensus across the political spectrum about the urgency and necessity of passing legislation that protects our kids, all of our kids,” he told the committee. He explained that Meta could quite easily build a button so young people could report such issues, but this hasn’t happened in part because there’s “no transparency about the harms that teenagers are experiencing on Instagram.”

“Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” Meta said in a statement to the media as a response. “Working with parents and experts, we have also introduced over 30 tools to support teens and their families in having safe, positive experiences online.”

Given this new testimony, along with concerns about psychological manipulation designed to keep kids tapping, sleep deprivation, a culture of envy and other mental health harms, it’s becoming more likely the safety bill will become law. Connecticut Sen. Richard Blumenthal said what’s coming for Big Tech will be similar to what happened to Big Food, although there are real concerns that telling children what content is “harmful” to consume could amount to overreaching censorship.

Photo: Plann/Unsplash
