UPDATED 20:36 EDT / SEPTEMBER 03 2018

EMERGING TECH

Google launches new AI-based tool to help combat child sexual abuse material

Google LLC today released a new artificial intelligence tool that aims to assist organizations in identifying and removing online child sexual abuse material.

The Content Safety API is a toolkit that uses deep neural networks for image processing to identify the material quickly while minimizing the need for human inspection, a cumbersome process that often requires reviewers to sift through thousands of images manually.
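Google has not published the interface of the Content Safety API, but the triage pattern it describes, scoring images with a classifier so human reviewers see the highest-priority items first, can be sketched roughly as follows. The scoring function, threshold and helper names below are hypothetical stand-ins for illustration, not Google's actual implementation.

```python
# Illustrative sketch only: a generic classifier-assisted triage queue.
# The scorer, threshold and names here are hypothetical; Google has not
# published the Content Safety API's actual interface.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ReviewItem:
    image_id: str
    score: float  # classifier's estimate that the image needs human review

def build_review_queue(
    image_ids: List[str],
    score_image: Callable[[str], float],  # hypothetical deep-net scorer
    threshold: float = 0.5,
) -> List[ReviewItem]:
    """Score every image and return those above the threshold,
    highest-priority first, so reviewers reach likely matches sooner
    and are exposed to less material overall."""
    scored = [ReviewItem(i, score_image(i)) for i in image_ids]
    flagged = [item for item in scored if item.score >= threshold]
    return sorted(flagged, key=lambda item: item.score, reverse=True)
```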

“Quick identification of new images means that children who are being sexually abused today are much more likely to be identified and protected from further abuse,” Google engineering lead Nikola Todorovic and product manager Abhi Chaudhuri said in a blog post. “We’re making this available for free to NGOs and industry partners via our Content Safety API, a toolkit to increase the capacity to review content in a way that requires fewer people to be exposed to it.”

In testing, Google says the tool substantially speeds up the review of potential CSAM: reviewer times, the time it takes to find and take action on the material, improved by up to 700 percent.

As VentureBeat reported, the announcement comes shortly after U.K. Foreign Secretary Jeremy Hunt criticized Google for not doing enough, oddly drawing a parallel with the company's controversial decision to return to China with a censored search engine.

“Seems extraordinary that Google is considering censoring its content to get into China but won’t cooperate with U.K., U.S. and other 5 eyes countries in removing child abuse content,” Hunt wrote on Twitter. “They used to be so proud of being values-driven.”

In a canned statement, at least one group working in the area welcomed the announcement. “We, and in particular our expert analysts, are excited about the development of an artificial intelligence tool which could help our human experts review material to an even greater scale and keep up with offenders, by targeting imagery that hasn’t previously been marked as illegal material,” said Susie Hargreaves of the Internet Watch Foundation, a U.K.-based organization that fights against abuse material. “By sharing this new technology, the identification of images could be speeded up, which in turn could make the internet a safer place for both survivors and users.”

NGOs and similar organizations can request access to the tool via a form linked in Google's announcement.

Image: Google
