UPDATED 22:02 EST / JANUARY 08 2020


Facebook’s ban on deepfakes doesn’t impress critics

Despite Facebook Inc.’s vow Monday that there’s no room for deepfake videos on the platform, critics think its ban doesn’t go far enough.

The company acknowledged in a blog post that manipulated media is a rising concern, especially as the 2020 U.S. presidential election approaches. Last year, Facebook came under pressure after a manipulated video of House Speaker Nancy Pelosi surfaced on the platform and it wasn’t immediately taken down.

“As a result of these partnerships and discussions, we are strengthening our policy toward misleading manipulated videos that have been identified as deepfakes,” Facebook said in the post. “Going forward, we will remove misleading manipulated media.”

The criteria for removal were simple enough. Facebook said it would take down manipulated media likely to mislead an average person into thinking someone said words they did not actually say. Media will also be removed if artificial intelligence has been employed to merge, replace or superimpose images into a video.

“This policy does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words,” said Facebook. Content that isn’t immediately removed will still be subject to review and may later be taken down. If by that time the video has been shared numerous times, anyone who has shared it or viewed it will be notified that it’s a fake.

On Wednesday, the House Committee on Energy and Commerce held a hearing to discuss such deceptions in digital media, with a representative from Facebook attending. During the hearing, big tech was accused of failing to respond adequately to the “grave threat” of deceptive media.

Others have said that though deepfakes take some technological nous to create, “shallowfakes” or “cheapfakes” can be made by just about anyone who can grasp how certain apps work. The Nancy Pelosi video, people have pointed out, didn’t require a mastermind to create it. Facebook’s new policy doesn’t discuss these lower-quality fakes.

“I am concerned that Facebook’s latest effort to tackle misinformation leaves a lot out,” subcommittee Chairwoman Jan Schakowsky said at the hearing. Facebook also came under fire for not detecting such content fast enough. The company’s representative acknowledged that detection speed was still a problem but said Facebook was improving.

Photo: Alan Levine/Flickr
