YouTube CEO promises consequences for creators who cause ‘significant harm’ to the community
Google LLC’s YouTube has tougher punishments in store for creators who violate the video platform’s content policies, an ongoing problem that came to a head with Logan Paul’s video of a suicide victim.
YouTube Chief Executive Susan Wojcicki published a blog post today outlining the platform’s five priorities for its creators in 2018, which include more transparency and support for creators, as well as new investments in educational content and additional augmented and virtual reality features.
“And we’re also currently developing policies that would lead to consequences if a creator does something egregious that causes significant harm to our community as a whole,” Wojcicki said. “While these instances are rare, they can damage the reputation and revenue of your fellow creators, so we want to make sure we have policies in place that allow us to respond appropriately.”
YouTube’s new policies are likely a direct result of the criticism it received over how it handled the Logan Paul controversy. The company’s original statement on the matter only reiterated its policy against “violent or gory content posted in a shocking, sensational or disrespectful manner,” but because Paul had already voluntarily removed his video, YouTube did not take any immediate action against him. Following the public outrage over the video, YouTube announced more than a week later that it had decided to punish Paul after all.
“It’s taken us a long time to respond, but we’ve been listening to everything you’ve been saying,” YouTube said at the time in an open letter to its community. The company removed Paul from its Google Preferred premium advertising service, and it also dropped him from YouTube Red. However, YouTube did not ban Paul’s channel, and he can still make money from his videos through other advertising services.
The Logan Paul controversy was yet another mark against YouTube in its ongoing efforts to assure advertisers that their brands will not show up alongside inappropriate content. Last year YouTube faced a mass exodus of advertisers after the U.K. government learned that some of its ads had been shown on extremist channels. The U.K. pulled its ads from the site, and YouTube owner Alphabet Inc. had to appear before officials to explain what happened.
YouTube promised tougher moderation, but the machine learning system it uses has proven somewhat heavy-handed. The automatic moderation often flags perfectly acceptable videos for demonetization, excluding them from YouTube’s advertising program. Wojcicki said today that solving this problem is a “top priority” for YouTube this year.
“While we worked hard this year to provide an appeals system and quicker responses to creators when a video is demonetized, we’ve heard loud and clear that we need a better system,” said Wojcicki. “We’re currently working on a more accurate solution that includes more human review of your content, while also taking your own input into account (since you know your videos best).”
Wojcicki added that YouTube will continue to share updates about this system with creators throughout the year, but she did not elaborate on what consequences rule breakers might face under the new policies.