YouTube has rolled out its latest moderation tools – and it’s good to see the site is making an effort to protect users and brand channels. However, despite creating the impression that it’s our spam-fighting knight in shining comment moderation armour, I can’t help but feel these developments are somewhat lacking when it comes to providing comprehensive protection for brands and their fans.
UGC is being created constantly – so protection must also be constant.
This is why Crisp Thinking offers a 24/7 comment moderation service: no matter the time of day or night, or the time zone you operate from, your website is always protected from inappropriate content and the PR crises that can easily arise at any moment.
Over the last couple of years there have been some huge PR crises for a number of major kids’ entertainment providers and social networks. Some of these PR disasters were so damaging that they led to the downfall of businesses which, up until that point, had been highly successful. The main issue behind their untimely demise was inadequate protection of their customers (or users, whichever you prefer) from threats like cyber-bullying, trolling and sexual predators. Even something as relatively harmless as spam played its part, damaging their essential advertising revenue through a lack of effective content moderation.
Here at Crisp we’re really excited to announce that we’re teaming up with new and funky forum and commenting platform Moot!
Moot is pretty new to the industry, and its fresh, forward-thinking approach completely re-imagines online discussion. The new platform is intuitive, real-time and blazing-fast – just like Crisp! So it’s clear this is set to be a match made in internet heaven.
Recent controversy surrounding social media moderation has made it more apparent than ever that online communities must take proper precautions to protect both their users and their business.
The tragic loss of lives linked with social networking sites has highlighted the need for social platforms to be moderated and regulated. With anonymous users posting abusive messages with minimal effort and maximum impact, online communities have become dangerous places where users are vulnerable to trolls and bullies, and the damage they inflict.
User generated content is critical to all digital publishers. Although comments from users drive community engagement, increase traffic to a site and generate publicity for a brand, they only have a positive effect when properly moderated. Without moderation, comments and discussions can cause significant and lasting damage – but real-time moderation is a demanding, expensive and time-consuming chore.
As the Daily Mail rightly reports the ‘fury as child abuse picture goes viral on Facebook with 16,000 “shares” and 4,000 “likes”’, we still see too little focus on strong moderation as a brand protection solution. Whilst it’s the most extreme forms of pornography that continue to make headlines, unregulated spam content of many kinds can be damaging to brands.
In a recent case, the Australian Advertising Standards Bureau received complaints about some of the User Generated Content posted on the Smirnoff Australia Facebook page (full case detail). The Bureau deemed that a page owner can exercise ‘a reasonable degree of control’ over the content on the page, and that a business’s Facebook page should therefore be considered the same as any other marketing communication tool. The verdict means that owners of Australian Facebook pages are responsible for moderating all of the User Generated Content posted on their pages.