In an industry as visual as cosmetics, social media is the perfect platform to showcase products. A quick search on Instagram for #beauty returns over 250 million posts, and each week 2 million unique users search for beauty-related content on the platform. Combine that with the fact that 74% of consumers now rely on social media when making a buying decision, and it is more important than ever that cosmetics brands ensure their social media strategy works as hard as possible to reach the right people with the right message.
After five years of development and real-world roll-outs with some of the largest digital kids platforms, Crisp has now officially launched its fully managed ‘Kids and Teens’ safety & moderation service. This service is set to transform the way social platforms deliver safe experiences for their younger users.
Read time: 5 mins
The problem of graphic images and videos appearing on social media is not a new one, but cases of unacceptable, toxic and illegal videos left visible for all to see are certainly hitting the headlines more frequently. The self-shot video of a Thai man murdering his baby daughter on Facebook Live remained up for almost 24 hours this week. Only 10 days previously, Steve Stephens posted a video on Facebook of himself shooting Robert Godwin Sr. in Cleveland. That video was live for three hours before being removed.
Social media moderation teams and managers can find themselves faced with content that poses a real risk to their online community. Whether it is a threat from a vulnerable user, a graphic or sexualized image, cyberbullying or a sexually inappropriate comment, the team will be expected to intervene: banning the user, removing the content or helping a vulnerable user.