Tackling the V-word in child abuse


When more than 4,000 people come together to work towards ending child abuse, a number of strong themes are bound to emerge. At this year's Crimes Against Children conference in Dallas, one theme dominated all the rest: the V-word, Volume.

Emma Monks, Crisp's Head of Moderation, Trust and Safety, attended this year's Crimes Against Children conference, now the world's largest gathering of law enforcement, child protection and industry representatives. This year, the main topic of discussion was how to tackle the sheer volume of reports made by the public about child sexual abuse material found online.

The push by social networks, governments and non-profits to educate the public and make reporting of child sexual abuse material (CSAM) easier means that report volumes are escalating and resources are more stretched than ever.

The National Center for Missing & Exploited Children (NCMEC), one of the main avenues for reporting CSAM, estimates it will receive around 12 million reports in 2017 (over 32,000 a day), many of them from industry. Some of these reports cannot be reviewed by NCMEC analysts because of Fourth Amendment restrictions, but that still leaves millions of reports needing review every year.

Tackling report volumes is a thorny issue. Everyone wants to encourage reporting, but few have the resources to cope with the resulting avalanche. With the amount of child abuse online escalating, the report volume challenge must be overcome. But how?

The only answer is triage: ensuring the most serious reports get attention soonest. Since there aren't enough people for human triage, this means triage by technology, with Artificial Intelligence and Machine Learning crunching the volume and drawing attention to the worst issues, fast.

Crisp has long used technology to find and prioritize grooming conversations on social media. The next evolution for industry, government and non-profits is recognizing child abuse imagery and differentiating it from images depicting sexual acts between people of a legal age. To enable this, Crisp announced a global collaboration at the Crimes Against Children conference to stamp out child sexual abuse imagery. Crisp will do this by lending its expertise in Machine Learning freely and calling for others to do the same.

To find out more, read our blog, and to join the initiative, click here.