Any company with a social media presence or an online marketing plan is likely to be affected by content it doesn’t directly produce. Whether it’s a site showing ads, a review, or a public-facing forum, a brand can end up associated with almost any content. Monitoring your brand to avoid association with vulgar, offensive, or violent material is an onerous task.
As part of an outsourcing company, Hannah Steiman, Chief Operating Officer at Peak Support, helps oversee many programs tackling this unfortunate problem. She believes that companies in all industries need to be conscious of content moderation, its potential impact on their business, and the processes that reduce the risk.
In Conversation with Hannah Steiman, Chief Operating Officer at Peak Support
In our conversation, Hannah highlights how content moderation is an ever-evolving task. New terms, symbols, and techniques for spreading harmful ideas emerge constantly. But speed of change isn’t the only challenge: content moderation is also fraught with nuance. Hannah asks: “What is offensive content?” Based on your lived experiences, your view of what is objectionable might differ from your coworker’s. Therefore, preparedness is vital. What are the decision-making guidelines?
Hannah suggests using real-world examples and providing a safe environment where people can feel comfortable sharing them. It may be difficult, but open and transparent discussions will lead to more consistent decisions.
Finally, we discuss what kind of person to hire to make this work successful, why it matters to recognize that we are all biased, and how the work-from-home trend benefits your content moderation team. Please join our important discussion on a topic relevant to us all.