Struggling social workers need online moderation help to keep kids safe

Posted by Crisp


With the worrying revelation that fewer than half of social workers know how to spot or deal with signs of online child abuse, and with the number of cases rising, it’s more important than ever for online communities to take responsibility and protect their young users from potential threats.

The majority of cases remain undiscovered until damage has already been done, so a preventative course of action must be taken: detecting suspicious behaviour before a situation escalates.

Social media moderation, user generated content moderation, forum moderation, virtual world moderation and online moderation in general should all be in place wherever kids are likely to be communicating with faceless strangers online. No parent would send their child out to a playground where masked strangers loiter, making idle chitchat with youngsters, so why should it be acceptable online?

A survey carried out by the British Association of Social Workers (BASW) and children’s charity the NSPCC turned up some worrying results last week. It found that:

  • 50% of social workers don’t know how to recognise signs of online sexual abuse of children
  • 50% said a quarter of their sexual abuse cases involve online abuse
  • 36% feel they don’t know how to identify or assess signs of online abuse
  • 30% don’t feel confident dealing with cases involving the internet
  • A third don’t feel confident understanding the language used by young people online
  • 47% admitted they didn’t know much about how young people communicate online

It’s clear that social workers can’t and shouldn’t be expected to detect something they know so little about. The responsibility for kids’ moderation falls to the social networks that create these virtual meeting places, just as the safety of public play areas falls to the councils who provide them.

With so many social workers worried that they’re struggling to deal with online abuse, kids’ moderation can dramatically reduce the number of cases they have to handle. Using a kids’ moderation service removes risks as early as possible, reduces the pressure on social services, protects children and stops online abuse before it can ever really start.

Online communities for kids and teens need to provide a safe online environment, but it’s also understandable that they don’t want to overly restrict the interaction that brings children to their sites. There’s a simple solution: if social media platforms embrace kids’ moderation by working with social workers and social media moderation providers like Crisp, together we can make the internet a much safer place for our children to play.



Written by Crisp

Crisp’s mission is to provide the fastest detection of critical issues and crises to protect global brands and platforms. From supporting PRs in reputational management and helping pharma brands to remain compliant, to protecting vulnerable individuals from the exploitation of bad actors... wherever social media has the potential to trigger a crisis, you can be sure we have expertise to share.

