Leaked audio reveals Mark Zuckerberg’s feelings about potential Facebook breakup
The Verge on Tuesday obtained more than two hours of audio revealing how Facebook Inc. Chief Executive Mark Zuckerberg intends to fight anyone who goes up against his company.
Much of the conversation Zuckerberg had with his employees, confirmed as genuine, centered on the potential breakup of Facebook and how he feels about it. Presidential hopeful Elizabeth Warren was discussed in particular, especially her view that some tech giants are too big for the greater good.
“So there might be a political movement where people are angry at the tech companies or are worried about concentration or worried about different issues and worried that they’re not being handled well,” said Zuckerberg. “That doesn’t mean that, even if there’s anger and that you have someone like Elizabeth Warren who thinks that the right answer is to break up the companies.”
He went on to say that if she is elected president, a legal challenge may come, but he expects Facebook to win. “At the end of the day, if someone’s going to try to threaten something that existential, you go to the mat and you fight,” he said.
Following the leak, Warren responded with a tweet, attempting to mirror Zuckerberg’s informal parlance. “What would really ‘suck’ is if we don’t fix a corrupt system that lets giant companies like Facebook engage in illegal anti-competitive practices, stomp on consumer privacy rights, and repeatedly fumble their responsibility to protect our democracy,” she said.
He then joked about competitor Twitter Inc., saying it faces some similar problems but doesn’t have the money to invest in safety the way Facebook does. “Our investment on safety is bigger than the whole revenue of their company,” he said.
Zuckerberg also spoke candidly about Facebook’s Libra cryptocurrency and the rise of the social media video app TikTok, but perhaps more interesting were his thoughts on his refusal to appear before some governments to discuss privacy issues.
“When the issues came up last year around Cambridge Analytica, I did hearings in the U.S.,” said Zuckerberg. “I did hearings in the EU. It just doesn’t really make sense for me to go to hearings in every single country that wants to have me show up and, frankly, doesn’t have jurisdiction to demand that.”
He said that given the size of Facebook, he will always face some kind of criticism. Much of that criticism will be centered around regulating big tech, and Facebook will always be in the firing line. “That’s a lot of the social discussion that’s going on, and there’s a lot of merit to that discussion,” he said. “And we need to engage in it humbly.”
The conversation then turned more personal when Zuckerberg was asked what employees should do when faced with negative views about the place where they work. He said people will always accuse Facebook of simply being greedy and trying to make money, or of having biases on certain issues.
“I think it’s tough to break down these perceptions and build trust until you get to a place where people know that you have their best interests at heart,” he said. He added that in his case, he just sits down and talks with people to let them know “the problems and acknowledge that there are issues and that you’re working through them.”
Lastly, he discussed the investigations earlier this year into Facebook’s content moderation teams. Those investigations revealed that enforcing Facebook’s ever-changing community standards is a grueling job, that some moderators were on the verge of breakdowns, and that sifting through human depravity all day has led to post-traumatic stress disorder in some of them.
Zuckerberg acknowledged that those employees need support and that their job can be difficult, but he also downplayed the findings of the investigation.
“Some of the reports, I think, are a little overdramatic,” he said. “From digging into them and understanding what’s going on, it’s not that most people are just looking at just terrible things all day long. But there are really bad things that people have to deal with, and making sure that people get the right counseling and space and ability to take breaks and get the mental health support that they need is a really important thing.”
He said the 30,000 moderators across the globe have a big task at hand, with around 100 billion pieces of content a day posted across all the company’s services. He believes that more support for those people will iron things out.
Facebook Chief Technology Officer Mike Schroepfer also addressed the issue, telling employees that detection technology is being improved and that, over time, fewer people will be exposed to lurid content because the artificial intelligence will spot it first.
“We’ve done a lot of research to show how can we still get the appropriate decisions on the content without having the same sort of emotional impact on the person viewing it,” said Schroepfer. “So there’s a ton of work that I can’t represent in 30 seconds here, but it is a key focus for all the tools teams to sort of reduce dramatically the human impact it would have by looking at this terrible stuff.”
Image: Christoph Scholz/Flickr