Join the global collaboration to detect child abuse images

Let's stamp out child abuse images together

As pedophiles use advances in internet technology to their advantage, the scale of child sexual abuse material (CSAM) found online is a growing problem across the globe.

In 2016, the Internet Watch Foundation found 211 new websites containing child abuse images every day. The sheer volume of images can be staggering too: more than 30 million images were seized in a recent police sting in Scotland, which led to the arrest of 77 people, and the personal collections of those arrested averaged nearly 390,000 images each. This contrasts starkly with figures reported just four years earlier, in 2013, when the UN Office on Drugs and Crime said a collection of 150,000 images was 'quite standard' [1].

These figures show the scale of the challenge global law enforcement agencies face when trying to apprehend suspects. Illegal images are often hidden amongst huge numbers of legal images on websites, servers and hard drives, which means finding incriminating images in a suspect's collection can take police months.

The newest needle in the haystack

The real challenge, though, is spotting newly produced abuse images online. These give law enforcement and child protection agencies the fresh evidence they need to secure prosecutions, protect child victims and identify abusers.

To aid agencies in this crucial work, Crisp launched a worldwide collaboration at the 2017 Crimes Against Children Conference in Texas this week.

By partnering industry with law enforcement, this collaboration uses artificial-intelligence-based technology to help agencies quickly pinpoint new child sexual abuse material hidden amongst millions of images. This means they can apprehend suspects faster, identify victims and, ultimately, stamp out CSAM online.

A real step-change in CSAM identification

Until now, efforts to identify CSAM online have been hampered by legal restrictions on sharing material across agencies. Using advanced machine learning, our free service offers a legal route to identifying previously unknown CSAM; the sketch after the list below illustrates the underlying idea. We will:

  • share and distribute machine-learning models, without sharing images
  • ensure images never leave law enforcement or company machines
  • allow for collaboration whilst staying within the laws of your jurisdiction
  • solve the problem of needing a large corpus of data for machine learning
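In technical terms, this "share the learning, not the images" design resembles what is now called federated learning: each participant trains on its own data and only model parameters travel between sites. The following is a minimal, hypothetical sketch of one such round; the function names are illustrative rather than Crisp's actual API, and a simple logistic-regression step stands in for whatever classifier is really used.

```python
# Minimal federated-averaging sketch: images stay on each site's own
# machines; only model weights are shared with the coordinator.
# Names and model choice are illustrative assumptions, not Crisp's system.
import numpy as np

def train_locally(weights, images, labels, lr=0.1):
    """One local training step on data that never leaves this site.
    A single logistic-regression gradient step stands in for the
    real classifier's training routine."""
    logits = images @ weights
    preds = 1.0 / (1.0 + np.exp(-logits))          # sigmoid
    grad = images.T @ (preds - labels) / len(labels)
    return weights - lr * grad

def federated_round(global_weights, sites):
    """Each site trains on its own corpus; the coordinator averages
    the returned weights. No images are ever transmitted."""
    updated = [train_locally(global_weights, X, y) for X, y in sites]
    return np.mean(updated, axis=0)

# Toy demonstration: synthetic feature vectors stand in for each
# participating agency's or company's private image corpus.
rng = np.random.default_rng(0)
true_w = rng.normal(size=4)
sites = []
for _ in range(3):                                  # three participants
    X = rng.normal(size=(100, 4))
    y = (X @ true_w > 0).astype(float)
    sites.append((X, y))

weights = np.zeros(4)
for _ in range(50):                                 # 50 federated rounds
    weights = federated_round(weights, sites)
```

Because only the averaged weights are exchanged, each participant can contribute to a shared model while its images remain on its own machines and within its own jurisdiction's legal constraints, and no single participant needs to hold a large training corpus on its own.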


Join us in eradicating CSAM online

Become a partner and feed into the AI-based technology: join the global collaboration now.

We know that by working together, we can put an end to CSAM distribution online faster.

[1] Reported by Dr Joe Sullivan, a forensic psychologist who works with child sex offenders, at a 2013 UNODC meeting.