Coral by Vox Media Guides

6. Manage a Successful Community

By Jessamyn West, former Director of Operations at MetaFilter

Adapted from a blog post that first appeared on the Coral Project blog

Communities — and community managers — can be accidental.

Maybe you used to answer support email or file bug reports or work on social media strategy, but now your role is expanding to cover parts of all of those tasks, and more. You become the inadvertent public face of the company.

I was a community member, moderator, lead moderator, and finally Director of Operations for MetaFilter.com for ten years, from 2004 to 2014. Even though I’ve moved on from my moderation position, I’m still an active, participating member of the community.

I look at the moderation role as primarily about setting the expectations of the site, whatever those may be, as clearly as possible. Behavior can fall into a number of different categories.

Behavior can be:

1. Encouraged. Users get positive feedback for this type of behavior from community and/or moderators.

2. Discouraged. This type of behavior is allowed to happen, but discouraged through community norms or responses from moderators. It may be tolerated once or twice, but if it becomes a pattern, it becomes Against The Rules.

3. Actionable. Users exhibiting this kind of behavior will find some sort of sanctions coming their way directly from the moderators, not just from the community.

MetaFilter is a massive community blog. Our only hard-and-fast rule, besides “No spamming,” is “Don’t be an asshole.” All the other guidelines spring from trying to enforce nuances of that basic rule.
Being able to build the site from the ground up and create moderation tools and features for both the moderators and the community has helped us keep the community running smoothly. Here are a few of the things that have worked for us:

Identity

Your community starts with individuals. What are you going to require from them in order for them to interact with your other community members? A photo? Their gender? A name? A real name? Credentials from another website? A phone number? Each piece of information that you make mandatory is one small piece of friction in somebody’s sign-up process. If you’re asking for information, make sure there’s a reason. And think really hard about whether you need personal information such as gender for any legitimate reason. If so, offer more than two options and “Other.”
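
To make the friction trade-off concrete, here is a minimal sketch of a sign-up schema in the spirit described above. The field names and options are illustrative assumptions, not MetaFilter’s actual sign-up form:

    # Hypothetical sign-up schema: ask only for what you need.
    # Every required field is one more piece of friction at sign-up.
    SIGNUP_FIELDS = {
        "username":  {"required": True},   # the one consistent identity
        "email":     {"required": True},   # password resets, moderator contact
        "real_name": {"required": False},  # no real-name policy
        "photo":     {"required": False},
        "gender":    {"required": False,   # only collect this if you have a reason
                      "options": ["woman", "man", "non-binary",
                                  "prefer to self-describe", "prefer not to say"]},
    }

    def validate_signup(form: dict) -> list[str]:
        """Return the names of required fields missing from a sign-up form."""
        return [name for name, spec in SIGNUP_FIELDS.items()
                if spec["required"] and not form.get(name)]

The design point is that the validator only ever complains about the two fields with a clear purpose; everything else is opt-in.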

MetaFilter is unusual in that it requires a small fee at sign-up time but has no annual costs to users. This fee is negotiable for people who come from places where it is a genuine obstacle to their participation, but it means that people have an investment, even a tiny one, in being there. It also means that people think twice before they spam the community, because there is a cost associated with spamming. We primarily use PayPal to process this fee (now expanded to Square and Amazon), which often means we have a small amount of real-life information about the people who participate. We don’t have a real-name policy; we allow people to be as pseudonymous as they want to be. However, the rule is that they have to have one consistent identity on the site (except for the occasional sock-puppet joke account).

On the back end, we keep track of log files (including IP addresses) so we can tell if people are trying to get around the one-user-one-account guideline. Of course, this wouldn’t work against a dedicated scam artist, but our basic principle is to trust people first. Trust, but verify.
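
As an illustration of what that back-end check might look like, here is a short sketch that surfaces accounts sharing recent IP addresses for a human to review. The log format is an assumption; MetaFilter’s actual tooling isn’t public:

    from collections import defaultdict

    def accounts_sharing_ips(log_entries):
        """log_entries: iterable of (user_id, ip_address) pairs from the logs."""
        users_by_ip = defaultdict(set)
        for user_id, ip in log_entries:
            users_by_ip[ip].add(user_id)
        # A shared IP is only a hint: housemates, offices, and carrier NAT
        # all look the same, so a moderator reviews every match before acting.
        return {ip: users for ip, users in users_by_ip.items() if len(users) > 1}

Consistent with “trust, but verify,” the output is a review queue for a moderator, not an automatic ban list.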

When you have this kind of information about your users, you also have to have some rules in place about the level of discretion employed by the moderators. Privacy of this sort of user information is ironclad: mods do not share it. Similarly, in a situation where you have users who may know each other in real life, you also have to be very clear about whether it’s acceptable to bring people’s real-life information onto the main site (real names, outing, doxxing, inadvertent info leakage).

Users have profile pages that contain information that’s viewable by other users but not the world at large. It’s a little tricky, but we do try to protect people’s ability to keep profile information private. Profile pages contain links to the users’ contributions on the site as well as ways to get in touch with them via other social media. Encouraging on-site and off-site bonds between users is another net positive for the site. We have a section of the site called IRL that is entirely for coordinating meetups. We have an off-site wiki that is entirely for user-generated meta-content: help files, lists, orientation pages, that sort of thing. The more connections your users make and maintain, the more your community coheres.
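
The visibility rule for profiles can be expressed very simply. This is a hypothetical sketch (the names are mine, not MetaFilter’s code), showing profile details rendered for logged-in members only:

    def can_view_profile(viewer, profile_owner) -> bool:
        if viewer is None:                  # anonymous visitor or crawler
            return False
        if viewer.id == profile_owner.id:   # you can always see your own page
            return True
        return viewer.is_member             # any logged-in member may view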

MetaFilter also has an unusual policy called the Brand New Day policy. Simply put, if you close your own account (a feature that I believe is necessary to put in the users’ hands) or if you get banned, you are welcome to come back to the site with a new account and moderators will not link your old and new identities. If a user decides to do that, it’s up to them. Of course, this is conditional on them not repeating the behavior that got them banned in the first place, and people have varying levels of ability in doing this effectively. Saying “the door is always open” is an important part of trusting our users and makes our community stronger.

Most users respect these guidelines, and appreciate that they’re in place mostly to keep people communicating and interacting smoothly.

Enforcement

It is very important to us to try to allow anybody who sincerely wants to be part of the community to be a part of the community. Sometimes someone is having a bad day or a bad week, and we really try to work with them and explain what the problems are, and not just ban first and ask questions later.

One of the tools we use the most is simply emailing a user to say, “Hey, is everything okay?” Sometimes just sending a sympathetic email to someone, even as you are deleting their comment, can soften the blow and turn an irate user into just a slightly crabby user who decides to go take a nap. Ultimately, we have very few actual technology tools on MetaFilter. Moderators can:

  • Delete a comment or a thread
  • Contact a user directly through email or private message
  • Leave a public comment in a thread of the “Hey, please cut it out.” variety
  • Leave a private note about a user in a thread or in the admin-facing part of their profile, such as “If this user starts that same old argument again, it’s time for them to take some time off”
  • Give a user a time out (night off or similar)
  • Ban a user for the short term or permanently
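
For a sense of how small this toolset is, here is a rough sketch of it as data. The action names and fields are illustrative assumptions rather than MetaFilter’s actual schema:

    from dataclasses import dataclass
    from datetime import datetime
    from enum import Enum

    class ModAction(Enum):
        DELETE_COMMENT = "delete_comment"
        DELETE_THREAD  = "delete_thread"
        CONTACT_USER   = "contact_user"    # email or private message
        PUBLIC_NOTE    = "public_note"     # "Hey, please cut it out."
        PRIVATE_NOTE   = "private_note"    # admin-facing note on a user or thread
        TIMEOUT        = "timeout"         # night off or similar
        BAN            = "ban"             # short-term or permanent

    @dataclass
    class ModLogEntry:
        moderator: str
        action: ModAction
        target_user: str
        note: str                          # the "why" that other mods read later
        timestamp: datetime
        expires: datetime | None = None    # set for timeouts and short-term bans

Recording a note with every action is what lets one moderator pick up where another left off.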

Banning is probably the easiest of all of these options, but it is our feeling that having a more high-touch approach leads to a better, healthier long-term community.

So when we have to enforce the guidelines, we always start small, maybe with an email or a single comment deletion. To make this work with a distributed team of moderators, we have multiple notes fields on the back end where we can leave each other notes about a particular user or in a particular thread.

The community helps with all of this through flagging tools (a minimal sketch of a flagging queue follows the examples below). Making it clear what the real issue is can really help in working out disputes, since some users will view any sort of sanction as an implicit “You are a bad person” statement. A few examples:

  • “We’re not the thought police and you can think what you like but on the site you need to treat other members with respect.”
  • “There is a difference between being angry and responding angrily to something.”
  • “Your behavior seems like you’re trolling. If you are not trolling, could you please change your behavior?”

These statements can seem a little strange and patronizing out of context, but they’re surprisingly effective. Ultimately, our users are young adults and adults, and are allowed to think what they want and (within reason) behave how they want off-site. However, it is appropriate for moderators to explain and reinforce what the guidelines and expected behavior are for the community.
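
On the tooling side, the flagging that surfaces these situations can be very simple. Here is the minimal sketch of a flagging queue mentioned above; the reason names and the review threshold are illustrative assumptions, not MetaFilter’s actual categories:

    from collections import Counter

    FLAG_REASONS = {"breaks_guidelines", "offensive", "derail", "spam", "other"}

    flags = Counter()  # comment_id -> number of flags received

    def flag_comment(comment_id, reason):
        """A community member attaches a reason to a comment."""
        if reason not in FLAG_REASONS:
            raise ValueError(f"unknown flag reason: {reason}")
        flags[comment_id] += 1

    def mod_queue(threshold=3):
        """Comments with enough flags to warrant a human look."""
        return [cid for cid, count in flags.items() if count >= threshold]

The key design choice, in keeping with the high-touch approach above, is that flags summon a moderator; nothing is removed automatically.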

The one other thing that I think is essential for community enforcement is having a place where the community can discuss moderator actions. We have a part of the site called MetaTalk where people can bring concerns about moderator actions that did or did not happen. In this entirely optional part of the site, the whole community can weigh in on disputes, and moderators participate in those discussions. Having moderators who are actual, contactable community members is an important part of community trust. And on the back end, having tools so that moderators can respond quickly and effectively to problems also increases community trust that the system is working.

I really believe that moderator “soft skills,” such as knowing how to re-rail conversations and interact with aggrieved users, are actually more important than any technological tool that could be built.

You also have to hire enough moderators so that you have 24/7 coverage – unless you want to say “This feature is only available between these hours” – and they should be able to respond at least somewhat quickly and personally to issues that arise. Giving your moderators a robust set of tools, as well as a good job where they are fairly paid and well-treated, may not seem like a scalable or worthwhile investment, but it returns extensive benefits in the health of your community.

Jessamyn West does library technology work in Central Vermont and works for the Internet Archive’s Open Library project. She has previously written about moderation on Medium: Bad Comments are a System Failure.

See also
Matt Haughey (MetaFilter founder) at Gel 2010
If your website’s full of assholes, it’s your fault by Anil Dash

 
