UPDATED 18:51 EST / SEPTEMBER 14 2023


Deepfake cyberthreats keep rising. Here’s how to prevent them

As expected, this summer has seen a rise in various cybersecurity threats based on deepfake audio and video impersonations.

Despite warnings from the Federal Bureau of Investigation in June, it’s now quite common to encounter these types of threats. The fakes are used to lend credibility to larger exploits, such as a phishing email lure or a request from a superior. These can run the gamut from executive impersonation to various forms of financial fraud and the theft of account credentials.

One early case of deepfake audio involved the targeting of the chief executive of a U.K.-based energy company: it was used to steal $243,000 in 2019. But that seems almost quaint now, thanks to fakes that leverage better computing power, easily customized applications and tighter integration with artificial intelligence techniques and models.

Earlier this week the U.S. National Security Agency, the Cybersecurity and Infrastructure Security Agency and the FBI jointly published their suggestions in the document Contextualizing Deepfake Threats to Organizations. The posting offers an overview of various deepfake media threats, techniques and trends. The document cites numerous examples of how deepfakes have entered the cybercriminal’s toolkit, such as applications that “have the ability to lift a 2D image to 3D to enable the realistic generation of video based on a single image,” and fake audio clips that can be generated with just a few seconds of a genuine recording of a speaker.

How to spot and stop deepfakes

Tracking down the fakes isn’t impossible. Back in 2022, Kerry Tomlinson of the computer security training organization SANS Institute published an early guide to scrutinizing images carefully, such as studying the background, a subject’s eyes, strangely attached jewelry, misshapen shoulders and other clues.

But there are some new tools available, and a large section of the joint government document is devoted to offering recommendations for security professionals to spot and potentially stop the fakes from proliferating across their organizations.

These include:

  • Using real-time verification procedures and screening apps, such as stronger authentication protocols, little-known personal details, reverse image searches and biometrics (a minimal out-of-band verification check is sketched after this list).
  • Using advanced examination tools, such as those that can analyze the underlying metadata of video and audio to reveal whether the media was manipulated or the metadata stripped from the file (see the metadata triage sketch after this list). This leverages some of the same skills and tools that a typical cybersecurity forensics examiner would employ, as outlined in this paper co-authored by University of California at Berkeley computer science professor Hany Farid, who has long studied deepfake production and identification.
  • Protecting the public data of high-priority individuals, including watermarking images and conducting tabletop exercises to plan and perfect incident response strategies in case of an executive impersonation. One such watermarking tool, SynthID, was announced in August by Google LLC and embeds a watermark directly into an image’s pixels. Another helpful guide is the MIT Center for Advanced Virtuality’s “Media Literacy in the Age of Deepfakes,” a series of training modules aimed at educating students on the topic. One of the fakes used is a video production in which a deepfaked President Nixon delivers the real contingency speech prepared in case the Apollo 11 astronauts perished in their mission.
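
As a rough illustration of the first item above, the sketch below shows one way an out-of-band verification step could work: the person receiving a high-risk request reads a random challenge over a second channel and checks the response against a shared secret distributed in advance. The workflow, function names and use of an HMAC here are assumptions for the sake of the example, not a procedure taken from the joint guidance.

# Illustrative sketch only: an out-of-band challenge/response check for a
# high-risk request, such as a wire transfer asked for on a video call.
# Assumes a per-executive shared secret was distributed over a separate,
# trusted channel beforehand; nothing here comes from the joint guidance.
import hashlib
import hmac
import secrets


def issue_challenge() -> str:
    """Generate a short random challenge to be read back over a second channel."""
    return secrets.token_hex(4)


def expected_response(shared_secret: bytes, challenge: str) -> str:
    """Both parties derive the same short response from the secret and challenge."""
    return hmac.new(shared_secret, challenge.encode(), hashlib.sha256).hexdigest()[:8]


def verify(shared_secret: bytes, challenge: str, response: str) -> bool:
    """Constant-time comparison so the check itself doesn't leak the secret."""
    return hmac.compare_digest(expected_response(shared_secret, challenge), response)


if __name__ == "__main__":
    secret = b"distributed-out-of-band"  # placeholder secret for demonstration
    challenge = issue_challenge()
    print("challenge:", challenge)
    print("verified:", verify(secret, challenge, expected_response(secret, challenge)))

The cryptography is incidental; the point is that any check forcing the requester to prove access to a channel the impersonator does not control raises the bar well beyond what a convincing face or voice alone can clear.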
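
For the examination-tools item, here is a minimal triage sketch using ffprobe, the inspection utility that ships with FFmpeg. It assumes, as a heuristic only, that stripped or missing container tags merit a closer look; the specific fields it checks, creation_time and encoder, are common conventions rather than anything cited in the document.

# Minimal sketch: flag video files whose container metadata looks stripped
# or sparse, using ffprobe from FFmpeg. A heuristic triage step under the
# assumptions stated above, not the forensic tooling referenced in the
# joint guidance.
import json
import subprocess
import sys


def probe(path: str) -> dict:
    """Return ffprobe's JSON description of the file's format and streams."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)


def suspicious_signs(info: dict) -> list:
    """Collect simple red flags: missing provenance tags in container or streams."""
    flags = []
    fmt_tags = info.get("format", {}).get("tags", {})
    if "creation_time" not in fmt_tags:
        flags.append("no creation_time tag (metadata may have been stripped)")
    if not fmt_tags.get("encoder"):
        flags.append("no encoder tag recorded in the container")
    for stream in info.get("streams", []):
        if stream.get("codec_type") == "video" and not stream.get("tags"):
            flags.append("video stream carries no per-stream tags")
    return flags


if __name__ == "__main__":
    report = suspicious_signs(probe(sys.argv[1]))
    print("\n".join(report) or "no obvious metadata red flags (not proof of authenticity)")

A clean report proves nothing on its own, since manipulated media can carry perfectly plausible metadata; such a check is only one signal among the broader forensic techniques referenced above.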

“Threat actors are constantly evolving and finding ways to circumvent defensive measures. This calls for the need for a multifaceted approach,” said Veridas CEO Eduardo Azanza. Defensive measures are works in progress, and the challenge will be whether they can stay ahead of attackers who are building more advanced methods to create ever more convincing fakes.

Image: daniel_diaz_bardillo/Pixabay
