UPDATED 23:15 EDT / AUGUST 05 2021

POLICY

Privacy advocates raise concern over Apple scanning iPhones for child abuse images

Apple Inc. will now scan iPhones and iCloud in the U.S. for illegal images of children, a move that has worried privacy advocates.

According to the Financial Times, which first reported the news today, Apple will introduce a technology called neuralMatch. It will scan devices, and when it flags images it detects as child abuse, a human reviewer will be notified. That person can then assess the image and contact law enforcement if necessary.

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” wrote the Financial Times. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
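The reported flow amounts to per-photo matching against a database of known illegal images, with human review gated behind a threshold of flagged photos. A minimal sketch of that logic follows; the function names, the hash stand-in, and the threshold value are all assumptions for illustration, not Apple's actual system (which reportedly uses a perceptual hash, not a cryptographic one, and has not disclosed its threshold).

```python
# Hypothetical sketch of the "safety voucher" + threshold flow described
# above. Names, hashes, and the threshold are illustrative assumptions.
import hashlib

KNOWN_ILLEGAL_HASHES = {"a1b2", "c3d4"}  # stand-in for the real hash database
REVIEW_THRESHOLD = 3  # Apple has not disclosed the actual number

def image_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real system would use a perceptual hash that is
    # robust to resizing/re-encoding, unlike this cryptographic hash.
    return hashlib.sha256(image_bytes).hexdigest()[:4]

def safety_voucher(image_bytes: bytes) -> dict:
    """Attach a voucher to an upload saying whether it is suspect or not."""
    h = image_hash(image_bytes)
    return {"hash": h, "suspect": h in KNOWN_ILLEGAL_HASHES}

def needs_human_review(vouchers: list[dict]) -> bool:
    # Review/decryption is enabled only once enough photos are flagged.
    return sum(v["suspect"] for v in vouchers) >= REVIEW_THRESHOLD
```

The threshold is the key privacy lever in this design: a single false-positive match never surfaces to a reviewer on its own.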

In a blog post, Apple said it wants to protect children from predators by limiting the spread of Child Sexual Abuse Material, or CSAM. “This program is ambitious, and protecting children is an important responsibility,” Apple said. “Our efforts will evolve and expand over time.”

The company also said it will use new tools to detect whether a sexually explicit image has been sent or received via the Messages app on a child’s phone. The photo will be blurred when the child receives it, and a warning will be issued. The same will happen if a child attempts to send such an image. Devices can also be set up so that parents receive a warning.
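The Messages feature described above reduces to a simple on-device decision: if a child account receives (or tries to send) an image that a classifier deems explicit, blur it, warn the child, and optionally alert a parent. The sketch below is purely illustrative; the classifier, function names, and opt-in logic are assumptions, not Apple's implementation.

```python
# Hypothetical sketch of the Messages blur/warn flow described above.
# The classifier is passed in as a callable; everything here is assumed.
def handle_incoming_image(image, is_child_account: bool,
                          parental_alerts_enabled: bool,
                          looks_explicit) -> dict:
    """Return the actions the device would take for one received image."""
    actions = {"blur": False, "warn_child": False, "notify_parent": False}
    if is_child_account and looks_explicit(image):
        actions["blur"] = True          # photo is blurred on receipt
        actions["warn_child"] = True    # child sees a warning first
        if parental_alerts_enabled:     # parental alerts are opt-in
            actions["notify_parent"] = True
    return actions
```

Note that everything here runs on the device: the image itself never needs to leave the phone for this flow to work, which is why critics focus on the precedent of client-side scanning rather than on this feature's data handling.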

These new tools, despite their noble cause, have led some to ask whether they could be abused by governments. “It’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” said the Electronic Frontier Foundation. “As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”

In a series of tweets, Johns Hopkins University professor and cryptographer Matthew Green said the scanning tool is a “really bad idea,” adding that in all likelihood it will become a “key ingredient in adding surveillance to encrypted messaging systems.” He acknowledged that it will no doubt find illegal images on phones, but asked, “Imagine what it could do in the hands of an authoritarian government?”

Photo: Laurenz Heymann/Unsplash
