How to effectively report inappropriate content to a social channel

Social media moderation teams and managers can find themselves faced with content that poses a real risk to their online community. Whether it is a threat from a vulnerable user, a graphic or sexualized image, cyberbullying, or a sexually inappropriate comment, the team will expect the social channel to intervene: to ban the offending user, remove the content, or help a vulnerable user.


Should risky content always be removed?

We read every piece of content on our clients' social media channels, looking for complex word combinations that trigger any of more than 100 risks, such as a bomb threat, hate speech or illness after taking prescribed medication. When a risk is found, the comment and its context are reviewed by one of our skilled Risk Analysts. It’s their job to understand the real intent of the comment and to tag it appropriately so the right action can be taken.
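As an illustration only, a minimal sketch of this kind of two-stage workflow might look like the Python below: automated pattern matching flags a comment against assumed risk categories, then a human analyst judges intent and applies the final tag. All category names, patterns and function names here are hypothetical, not the actual detection rules or system.

```python
import re
from dataclasses import dataclass, field

# Hypothetical risk categories and trigger patterns; a production system
# would use far richer word-combination rules across 100+ risk types.
RISK_PATTERNS = {
    "threat_of_violence": re.compile(r"\b(bomb|blow up|shoot)\b", re.IGNORECASE),
    "hate_speech": re.compile(r"\b(slur_placeholder)\b", re.IGNORECASE),
    "medication_issue": re.compile(
        r"\b(side effects?|felt ill|reaction)\b.*\b(pill|medication|tablet)\b",
        re.IGNORECASE,
    ),
}

@dataclass
class FlaggedComment:
    text: str
    context: str
    triggered_risks: list = field(default_factory=list)
    analyst_tag: str | None = None  # filled in during human review

def scan_comment(text: str, context: str) -> FlaggedComment | None:
    """Return a FlaggedComment if any risk pattern matches, else None."""
    risks = [name for name, pattern in RISK_PATTERNS.items() if pattern.search(text)]
    return FlaggedComment(text, context, risks) if risks else None

def analyst_review(flagged: FlaggedComment, tag: str) -> FlaggedComment:
    """A human analyst confirms the real intent and applies the final tag."""
    flagged.analyst_tag = tag
    return flagged

if __name__ == "__main__":
    comment = scan_comment(
        "I felt ill after taking that medication last night.",
        context="Reply on a pharmaceutical brand's support post",
    )
    if comment:
        # Automated matching only flags; the decision rests with the analyst.
        reviewed = analyst_review(comment, tag="adverse_event_report")
        print(reviewed.triggered_risks, reviewed.analyst_tag)
```

The point of the split is that pattern matching is deliberately over-sensitive: it surfaces candidates, while the human review step decides whether the comment genuinely carries the risk and what action follows.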


UGC moderation and community management spell success for news sites

News, once simply broadcast to the public, has become an open conversation. News organisations could once just tell people what was happening; now they are expected not only to report it but also to listen to the reactions it provokes. This makes social media an invaluable resource for journalists, opening up an important line of communication with the audience and providing a platform on which to push out content on a global scale.
