Catching child abusers with artificial intelligence

Our toolkit – iCOP – helps police catch child sexual abusers by automatically identifying new criminal files online


When I was just starting out as a junior researcher interested in computational linguistics, I attended a presentation by an Interpol police officer who argued that the academic world should focus more on developing solutions to detect child sexual abuse media online. Although he clearly acknowledged that other crimes also deserve attention, at one point he said: “You know those sweet toddler hands with dimple-knuckles? I see them online … every day.”

That statement sent chills down my spine, especially as a mother. I knew I had to do something to help, and artificial intelligence was the answer. Working with colleagues at Security Lancaster (Prof. Awais Rashid and Dr. Carl Fischer) and in collaboration with the German Research Center for Artificial Intelligence, DFKI (Dr. Christian Schulze) and University College Cork in Ireland (Dr. Margaret Brennan), we developed software that can identify child sexual abuse media online, helping police catch offenders.

In a new paper in Digital Investigation, we present iCOP: an artificial intelligence toolkit that automatically detects new child sexual abuse photos and videos in online peer-to-peer networks.

Flagging child sexual abuse media

Unfortunately, the chilling picture the Interpol officer painted is very real. The internet makes it easy for offenders to share criminal media in peer-to-peer networks. Every second, there are hundreds of searches for child abuse images worldwide, and people share hundreds of thousands of child sexual abuse images and videos every year.

Intercepting these images and videos can help law enforcement officers find child sexual abusers; the people who produce child sexual abuse media are often abusers themselves. According to the National Center for Missing & Exploited Children in the US, 16 percent of the people found to possess such media had also directly and physically abused children. Spotting offenders early by flagging new files can therefore help stop them more quickly.

But in reality this is enormously challenging: the sheer volume of activity on peer-to-peer networks makes manual detection virtually impossible. There are already a number of tools to help law enforcement agents monitor peer-to-peer networks for pedophile activity, but they usually rely on identifying known media. These tools are unable to assess the thousands of results they retrieve and can’t spot new media that appear.
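To make concrete what “identifying known media” means, here is a minimal Python sketch of hash-list matching, the classic technique behind such tools. Everything in it is invented for illustration: the hash set is a placeholder, and real deployments use curated law-enforcement hash lists, often based on perceptual rather than cryptographic hashing.

```python
import hashlib
from pathlib import Path

# Placeholder stand-in for a curated hash list of already-identified media.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c4",  # invented example value
}

def sha1_of_file(path: Path) -> str:
    """Compute the SHA-1 digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha1()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_media(path: Path) -> bool:
    """True if the file's hash matches previously identified media."""
    return sha1_of_file(path) in KNOWN_HASHES
```

The limitation is visible in the code itself: any file that has never been hashed before, or whose bytes change through re-encoding or cropping, sails straight past the check. That is exactly the gap a content-based system has to fill.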

The process needed to be automated, and we realized that to accurately identify files containing criminal content, we would need to design software that could look directly at the files.

iCOP: artificial intelligence policing

We developed an approach that combines artificial intelligence and machine learning to flag new, previously unknown child sexual abuse media, and packaged it in a toolkit called iCOP. With it, police can identify such media without having to trawl through files manually.

Our new approach combines automatic filename analysis and media (image and video) analysis – both driven by artificial intelligence – in an intelligent filtering module. The software can identify new criminal media and distinguish it from other media being shared on the same networks, such as adult pornography.
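As a rough illustration of what such a filtering module looks like in code, the Python sketch below fuses a filename score and a media score into one decision. Everything here is an assumption made for illustration: the keyword heuristic, the placeholder term list, the equal weights and the threshold are all invented, and iCOP's actual trained classifiers are the ones described in the paper.

```python
# Illustrative sketch only: the heuristics, weights and threshold below are
# invented; iCOP's real pipeline uses the trained classifiers from the paper.

SUSPICIOUS_TERMS = {"placeholder1", "placeholder2"}  # stand-in for a real keyword list

def filename_score(filename: str) -> float:
    """Toy filename analysis: fraction of tokens matching a term list."""
    tokens = filename.lower().replace("_", " ").replace("-", " ").split()
    if not tokens:
        return 0.0
    hits = sum(1 for token in tokens if token in SUSPICIOUS_TERMS)
    return hits / len(tokens)

def media_score(data: bytes) -> float:
    """Stand-in for a trained image/video classifier returning P(criminal).
    A real system would run a model over the decoded media content."""
    return 0.0  # dummy value: no real model in this sketch

def flag_file(filename: str, data: bytes, threshold: float = 0.5) -> bool:
    """Filtering module: fuse both signals and flag files above a threshold."""
    combined = 0.5 * filename_score(filename) + 0.5 * media_score(data)
    return combined >= threshold
```

Fusing the two signals matters because each one alone is weak: innocuous files can carry misleading names, and visual classifiers struggle with borderline content. Combining them is what lets a filtering module separate criminal media from the legal content shared on the same networks.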

We tested iCOP on real-life cases, and we asked law enforcement officers to trial the toolkit. One problem with automatic systems is that they can mistakenly flag innocuous content, creating extra work for the police and potentially damaging the reputation of the person sharing the media. iCOP proved highly accurate, with a false positive rate of only 7.9 percent for images and 4.3 percent for videos, showing that it can reliably separate criminal material from legal content.
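For readers unfamiliar with the metric: the false positive rate is the share of innocuous files that the system wrongly flags. The counts in this short snippet are invented purely to show the arithmetic; they are not figures from the iCOP evaluation.

```python
# Invented counts, for illustrating the definition only.
false_positives = 79   # innocuous files wrongly flagged as criminal
true_negatives = 921   # innocuous files correctly left unflagged

fpr = false_positives / (false_positives + true_negatives)
print(f"False positive rate: {fpr:.1%}")  # prints: False positive rate: 7.9%
```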

Importantly, it was also complementary to the systems and workflows law enforcement officers already use, which means it would not add to their administrative burden. And since the system can reveal who is sharing known child sexual abuse media, and show the other files those people are sharing, it will be directly useful to investigators.

I am relieved to see how far technology has come since that chilling presentation. I believe we are well on the way to being able to intercept all child sexual abuse media and catch the perpetrators. With iCOP, we hope we’re giving police the tools they need to catch child sexual abusers early based on what they’re sharing online.

Read the study

Elsevier has published this article open access:

Claudia Peersman et al: “iCOP: Live forensics to reveal previously unknown criminal media on P2P networks,” Digital Investigation (September 2016)


Digital Investigation covers cutting-edge developments in digital forensics and incident response from around the globe. This widely referenced publication helps digital investigators remain current on new technologies, useful tools, relevant research, investigative techniques and methods for handling security breaches. Practitioners in corporate, criminal and military settings use this journal to share their knowledge and experiences, including current challenges and lessons learned. This journal is published by Elsevier.

