Social media moderation teams and managers can find themselves faced with content that poses a real risk to their online community. Whether it is a threat from a vulnerable user, a graphic or sexualized image, cyberbullying or a sexually inappropriate comment, the team will expect the social channel to intervene: ban the user, remove the content or help a vulnerable user.
The brand’s social media guidelines will probably tell the team to report the content to the social media platform so they can take the appropriate action. However, as there isn’t one catch-all ‘report’ button on platforms, the way you report content can make all the difference to whether it is removed or not.
Context is key
Social channels are often in the headlines for not taking action against a user or not removing offensive content. But take Crisp’s experience of reporting content to Facebook as an example: we find that its moderation team reviews only the specific piece of content you report – it does not investigate further. (Given the overwhelming volume of content reported to Facebook, you can see why.)
This means that if a brand’s social media team came across a Facebook page with a suggestive name which contained hundreds of images of provocatively posed underage girls dressed in school uniform, the way in which the team reports it is vital.
If the team selects one image and clicks ‘report image’, only that image is sent for moderation, not the whole page. Without the other images or the page name, Facebook’s Content Reviewers cannot make the right decision.
However, if the social media team clicks ‘report group’, ‘report page’ or ‘report’ next to the profile’s name, the whole page is sent for moderation. Facebook’s Content Reviewers can then see the full context, including the group name, improper comments, the volume of similar images, and the type of people who have liked the page.
To help a brand’s social community thrive, it is important to keep channels free from inappropriate content. So we’ve put together these simple rules to help social media moderators and managers report content effectively.
Golden rules for reporting social content
- Don’t report in the heat of the moment – take a few minutes to consider which ‘report’ button will best convey the full context of the offending content to the moderation team.
- Know your reporting options – social platforms have different reporting options, but on the whole these are Report Group, Report Profile, Report Page, Report Post, Report Image and Report Video. Each only reports that specific item.
- Be clear why you are reporting it – you will be asked why the content is offensive so that it can be sent automatically to the right team. Familiarize yourself with each platform’s community guidelines and state clearly in your comment which principle the content violates.
- Consider your options – as content will not be removed simply because you report it, platforms give you other options such as blocking specific people, posts or pages.