Facebook moderation – A requirement or a good practice?

Posted by Crisp on


In a recent case, the Australian Advertising Standards Bureau received complaints about some of the User Generated Content posted on the Smirnoff Australia Facebook page (full case detail). The Bureau ruled that a page owner can exercise ‘a reasonable degree of control’ over the content on the page, and that a business’s Facebook page should therefore be considered the same as any other marketing communication tool. The verdict means that owners of Australian Facebook pages are responsible for moderating all of the User Generated Content posted on their page.

As a marketer, this made me think:

Facebook moderation should be seen as good practice to protect your brand, not a burden resulting from legislation.

We spend a lot of time and resources making sure our communications are perfect: that they say exactly what we want them to say, make a positive impact on our audiences and, just as importantly, offend nobody. So why should a Facebook page be any different?

Facebook pages allow you to broadcast your messages to an engaged audience. However, this audience is a community and can choose to engage both with you and with others who have also chosen to Like your page. You’ll regularly get to learn what they think about your product, your company and their experiences.

Unfortunately (for some brands) there is a negative side to this: the door is open for abuse. Abusive, profane and racist comments can be posted on your page, along with spam and inappropriate images or videos. This exposes your community to unwanted User Generated Content and is damaging to both your brand’s reputation and your social media strategy.

You wouldn’t allow profanities, sexual or inappropriate content, racism or hate speech to appear in any other form of marketing communication, so why would you allow it on Facebook, whether it is your message or not?

Failing to moderate your Facebook pages effectively will inevitably have a negative impact on your brand. You can’t expect your customers and prospects to continue engaging with you and openly receiving your messages while you allow those messages to be polluted with offensive content posted by others.

But how does removing content affect the community?

My first and primary conclusion is that it protects the community AND YOUR BRAND. People haven’t subscribed to your brand to be bombarded with inappropriate content unrelated to your company, products or services in any shape or form. It’s your company that has gone to considerable effort to create valuable content the community will be interested in. They have chosen to engage with you and other like-minded people, not with those who want to exploit the size of your community to play the joker or to broadcast their own unwanted opinions, beliefs, images or videos.

What if the content was innocently put there without any malice?

It’s simple – you must still moderate your content around the clock and remove anything that is offensive or breaches other advertising legislation. In the case of Smirnoff Australia, images were posted of people who appeared to be under 25 drinking alcohol, and legislation prohibits advertisers from using such images in their marketing (and in Australia, this now applies to Facebook pages). So I would remove the content, then post a quick message thanking the poster for contributing and engaging with your social community, while explaining that the content had to be removed because it was against guidelines and to protect the longevity of the community. After all, you’re building a community, and all communities have rules!


Written by Crisp

Crisp’s mission is to provide the fastest detection of critical issues and crises to protect global brands and platforms. From supporting PRs in reputational management and helping pharma brands to remain compliant, to protecting vulnerable individuals from the exploitation of bad actors... wherever social media has the potential to trigger a crisis, you can be sure we have expertise to share.
