Community moderation and management

It takes a lot of work to create and manage a strong, engaged online community. The challenge is that what humans do well - empathising, understanding context, making and communicating informed decisions - doesn't scale. And while AI and technology can scale, detecting patterns across vast swathes of content, they are not so good at the human side.

This theme will explore issues including:

  • How can we go further than "reactive" moderation? 'Taking stuff down' is a blunt instrument that does not tackle the cause of the problem. How might moderation tools better support the kind of individual relationship-building that brands practise on social media channels?
  • Where can the introduction of technology actively help users? For example, the deliberate use of chatbots has been effective in nudging vulnerable people - such as those considering self-harm - towards speaking to a helpline.
  • What does good look like? How can we best publicise best-practice guidelines, such as those produced by The Samaritans, and identify other areas in which moderators need support?
  • Protecting the mental health of moderators - what more can we do in this vital area?

Next steps

On 4 December 2020 we will hold an online workshop with expert panellists, at which we will review issues and needs and begin suggesting ways in which the Safety Tech Innovation Network can make a difference - for example, by developing guidance or by sharing and promoting best practice.

Sign up for the Community moderation and management workshop now
