UPDATED 14:51 EST / DECEMBER 18 2017

EMERGING TECH

AI will review child abuse images so police won’t have to

Police must often review hundreds of disturbing images found on devices belonging to child abuse suspects, and these investigations can take a toll on officers’ mental health. London’s Metropolitan Police Service wants to spare officers from this trauma by developing an artificial intelligence system that will review child abuse images for them.

According to Mark Stokes, the head of digital and electronics forensics for the MPS, London’s police had to investigate more than 53,000 devices last year for evidence of child abuse, and any images found on these devices had to be graded for sentencing purposes.

Stokes explained that the department’s existing image recognition system is not sophisticated enough to handle this job on its own, which is why human investigators must still view the images. “You can imagine that doing that for year-on-year is very disturbing,” Stokes told The Telegraph today.

The MPS wants to replace its outdated tech with an AI that can take over most of the review process, but since its data infrastructure is already struggling to keep up, the MPS has decided to migrate its data to cloud platforms such as Amazon Web Services. Stokes said that this move will allow the MPS to apply modern machine learning methods to its data, and he believes that an AI can be trained to analyze child abuse images within two to three years.

While the goal of the MPS is admirable, the department’s AI project may have some legal hurdles to overcome, particularly when it comes to storing illegal images on civilian cloud services. Stokes said that the MPS has been working with cloud providers to meet legal requirements, adding that “we think we have it covered.”

The MPS will likely also need to prove that its AI can grade images as accurately as human reviewers, if not better, but this might not be as easy as the organization hopes. Google LLC developed a similar AI to review YouTube videos, hoping to appease brands whose ads showed up alongside inappropriate videos. Unfortunately, Google’s trigger-happy AI has incorrectly flagged thousands of videos, stripping many content creators of a large proportion of their income for no reason.

If Google, one of the world’s leaders in machine learning technology, has been unable to perfect its content-reviewing AI, the MPS may find its own AI project to be a tough sell.

Photo: “Facepalm” by Brandon Grasley via photopin (license)
