UPDATED 23:42 EST / AUGUST 09 2021

Apple tries to clear the air over its CSAM photo-scanning child protection technology

After a torrent of criticism hit Apple Inc. last week, the company today defended its new system for detecting child sexual abuse material, or CSAM, on its devices.

When the company announced a few days ago that it would introduce scanning technology to flag such images saved in iCloud in the U.S., the move wasn’t exactly met with much support, despite its noble intention.

Critics, including the Electronic Frontier Foundation, said such technology could open the door to privacy abuses. The monitoring technology scans images saved to iCloud and, if it flags a threshold number of them as matching known child sexual abuse material, the authorities may be contacted. Parents can also be notified if their child has sent or received images containing nudity.
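
Apple has not published the system’s full mechanics here, but its technical summary describes a perceptual hash called NeuralHash matched on-device against a database of known-CSAM digests, with cryptographic protections so that no match results are revealed below a reporting threshold. Purely for illustration, here is a heavily simplified Python sketch of the threshold-matching idea; the hash function, database contents and threshold value are all hypothetical stand-ins, not Apple’s actual protocol:

    import hashlib
    from typing import Iterable

    # Hypothetical stand-in database of known-CSAM image digests.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }
    MATCH_THRESHOLD = 30  # assumed number of matches before an account is flagged

    def image_digest(image_bytes: bytes) -> str:
        """Stand-in for a perceptual hash. A real system would not use
        SHA-256, since changing a single pixel defeats an exact hash."""
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(images: Iterable[bytes]) -> int:
        """Count uploaded images whose digest appears in the database."""
        return sum(image_digest(img) in KNOWN_HASHES for img in images)

    def should_flag_account(images: Iterable[bytes]) -> bool:
        """Flag an account only once the match count crosses the threshold."""
        return count_matches(images) >= MATCH_THRESHOLD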

No sooner was the announcement made than a letter was signed by 6,000 people and organizations, including privacy and security experts, academics, legal experts and cryptographers. The signatories asked Apple for two things: halt the technology’s rollout and reaffirm its commitment to end-to-end encryption.

“Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases,” the letter concluded. “We ask that Apple reconsider its technology rollout, lest it undo that important work.”

In an FAQ published today, Apple said that the scanning technology will not open the door to abuse by governments. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” the company said. “We will continue to refuse them in the future.”

One thing that has worried Apple’s critics is the possibility that some governments may expand the criteria for what counts as an illegal image. Apple said this will not happen, noting that the “technology is limited to detecting CSAM stored in iCloud.”

Critics were quick to point out that in 2018, Apple did accede to the demands of one country, China, even though doing so went against its own policies. “While we advocated against iCloud being subject to these laws, we were ultimately unsuccessful,” Apple said at the time.

The company said today that users’ photo albums won’t be scanned on their phones; only images saved to iCloud will be checked. To further address concerns, Apple said the chance of an account being incorrectly flagged is “less than one in one trillion per year.”
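
Apple’s FAQ doesn’t show the arithmetic behind that figure, but a match threshold makes numbers that small plausible. As a back-of-the-envelope sketch, with every parameter assumed for illustration rather than taken from Apple, the probability of an account accumulating enough false matches follows a binomial tail:

    from math import exp, lgamma, log, log1p

    # Every parameter here is an assumption for illustration; Apple has
    # not published the actual values behind its one-in-a-trillion claim.
    p = 1e-6    # assumed per-image false-match probability
    n = 10_000  # assumed photos uploaded per account per year
    t = 30      # assumed match threshold before an account is flagged

    def log_pmf(k: int) -> float:
        """Log of the binomial probability of exactly k false matches."""
        return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
                + k * log(p) + (n - k) * log1p(-p))

    # Tail probability of hitting the threshold; terms past t + 200 are
    # vanishingly small, so truncating the sum there is safe.
    flag_prob = sum(exp(log_pmf(k)) for k in range(t, t + 200))
    print(f"Yearly chance of falsely flagging this account: {flag_prob:.2e}")

Even under these deliberately generous assumptions, requiring 30 independent matches pushes the false-flag probability dozens of orders of magnitude below one in a trillion, which is why the threshold does the heavy lifting in claims like Apple’s.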

Still, this is unlikely to appease the majority of privacy advocates, who already understood most of these details. In the words of the EFF, the technology is a “slippery slope” that could lead to an “expansion of the machine learning parameters to look for additional types of content.”

Many others agree. “Apple’s new iPhone contraband-scanning system is now a national security issue,” tweeted exiled security expert Edward Snowden. “Hard to understate how disastrous this new system is for iPhone security. Tim Cook needs to intervene.”

Photo: Nick Carter/Flickr
