Coral by Vox Media Guides
[Photo: support system and skylight arrangement of the main dome, Jami Masjid]

Support Your Moderation Team

by Anika Gupta

The summer of 2015 seemed like the summer that online comments – at least on news articles – were finally going to die. Several news organizations and media companies abolished or suspended their online comments sections. Nilay Patel, editor-in-chief of The Verge, explained why The Verge chose to temporarily remove comments from the site: “What we’ve found lately is that the tone of our comments (and some of our commenters) is getting a little too aggressive and negative – a change that feels like it started with GamerGate and has steadily gotten worse ever since.”

Challenge accepted. I wanted to understand why comments got so bad, and how to make them better. I decided to spend a summer during graduate school in Washington DC and New York researching online comments sections on news websites, to learn about the problems and to explore some solutions.

At the time, existing research on comments had focused on some of the ways that journalism – in particular, the writing and editing of stories – had changed as a result of online comments. I was interested in a different question. Comments in almost all situations are read and moderated by human beings. It’s a job that technology can help with, but currently can’t manage without the help of moderators. Who are those moderators, what challenges are they facing, and what motivates them to keep going?

Even as they shut down comment sections, news organizations and media companies continue to invest in discussion and engagement. They expand their audience engagement teams dealing with social media and analytics, host reddit conversations around big stories, and ramp up their tweets, Facebook posts, Snapchat stories and more. Even if all comment sections go the way of the dinosaurs, dealing with users – intimately, immediately, in spaces where those users are already expressing themselves – is becoming an increasingly important aspect of journalism.

Comments aren’t dead – at least, not yet. They offer fantastic insight into how to better organize, structure and reward the rest of the newsroom’s engagement work, while giving newsrooms greater ability to control the boundaries of the conversation.

Moderation is a broad category, so to narrow my research, I defined a moderator as someone who dealt with comments in an online forum on a daily basis. I categorized my interviewees’ work in three ways: volunteer internet moderation, professional community management, and a mixture of the two. Daily tasks included banning users, negotiating with users, reinforcing terms and norms, deleting comments, and responding to comments. I soon found out that news organizations weren’t good at supporting this work institutionally – and they weren’t alone.

For several of the moderators I spoke to, their worst days at work happened when toxic online debates, usually around controversial topics like race or gender, collided with a total lack of institutional support or understanding. One moderator at a large legacy news organization, who moderated the comments on a blog post she contributed to, described the problem:

“We had one article about [a female] scientist…all the comments – it was almost universally from men – and were all about how incredibly ‘unfuckable’ she was.”

As a woman, she said, she found the experience “painful.” Moderation tools, such as banning users by their IP address, turned out not to work. Nor did these tools take into account the intense emotional burden of dealing with sexism on a daily basis. She said that her company never acknowledged the burdens on staffers, and instead prioritized the traffic and engagement that came from comments. When I asked her why she continued to moderate, she said, “I wanted [the blog] to be an inviting space. I didn’t want people to come onto [it] and see the comments and see horribleness, I wanted it to be positive. Because I loved it.”

She explained what she would have preferred – a way to lessen the burden of the work while still achieving its metrics:

“It would have been really effective and felt nice – from a straight-up feelings perspective – if the organization had been like ‘this is yours, you are fully within your rights to act as a human being.’”

Sexism wasn’t the only issue that the moderators had to handle. Confronting negative comments often challenged interpersonal relationships, as another moderator at a large digital media company told me:

“I think, when I was moderating comments, the hardest thing for me was when I got to know a writer, and then I would see someone be like ‘someone should tell that fatty to shut the hell up’ [in the comments on that writer’s article].”

These emotional conflicts were the norm, not the exception. One moderator at an online community put it succinctly: “That’s a moderator’s job – you attract the negative energy of the site, so people aren’t using it against other people.”

In listening to these moderators discuss their work, I found myself drawn to a parallel from more than 30 years ago. In her book The Managed Heart: The Commercialization of Human Feeling, sociologist Arlie Hochschild coined the phrase “emotional labor” and defined it as follows:

“This labor requires one to induce or suppress feelings in order to sustain the outward countenance that produces the proper state of mind in others – in this case, the sense of being cared for in a convivial and safe place.” (1983, p. 7)

Hochschild studied how flight attendants’ jobs required them to create positive emotions in customers. These positive emotions benefited the company, but while maintaining them, workers risked becoming estranged from their own feelings. Hochschild estimated that many American jobs, particularly those in the service industry, relied on or required some form of emotional labor.

The more I spoke to moderators who managed online communities, the more I saw useful comparisons between online moderation work and emotional labor. Many moderators’ key goals revolved around getting members of their community to experience certain emotions. One moderator told me his target was “giving [our top users] tips or incentivizing them… I follow a lot of them on Twitter, and they have this casual contact to reach out to [those] whom they feel like they know” (Member, 2015).

As with Hochschild’s flight attendants, this emotional labor had a consistent dark side, especially when moderators had to suppress their own emotions in order to manage a community. The moderator I cited above who dealt with sexist comments told me, “On some levels, I think we actually consciously disengaged.” But that disengagement led to disillusionment: “I would love to think that I could go onto a news site and actually have a conversation [in the comments]. There’s a part of me that doesn’t think that’s possible.”

Another moderator told me that “you have to be able to dispassionately relate to [a comment section] in order to effectively moderate it. That was difficult for me because I have strong political leanings.” Several moderators talked about getting “burned out” from their work.

This type of work, while essential, often isn’t well paid or particularly visible. I spoke to one moderator who’d worked on a popular subreddit, who put it succinctly:

“A lot of social media sites… neglected any sort of moderation for a long time… probably because venture capital doesn’t see value in community management. They have no problem if you want to [pay] 200k a year for your engineer.”

Her comment suggests a broader theme: a consistent undervaluing of emotional labor relative to other technology work. Policies fail to acknowledge the fact that moderation work is difficult and emotional, and that it carries different risks for different people. The moderation tools that do exist are usually created post hoc and are incomplete, and on social platforms, people who work for news organizations have far fewer tools available to moderate comments.

I heard common concerns from the people I interviewed, as well as some important steps that companies can take in order to build a stronger culture around community management and audience engagement. I hope that this list of moderators’ suggestions can be used as a starting point for a wider conversation.

  1. Create core values
    Companies need to create a core set of values when it comes to what they want their communities to deliver, and to make sure that vision goes all the way to the top of the organization. As one moderator told me, the central tenets that everyone should agree on are: a) this is what we want the community to look like, and b) this is how the community team should be achieving those goals.

  2. Identify the right skillset
    Companies need to define a clear set of hiring criteria, and consider how to promote these skills internally into management roles. When I asked what made for a good moderator, I got a variety of responses. One moderator told me, “the better writers do better,” while another said that “you have to be able to read and synthesize quickly” as well as “put [your] own ego completely aside.” But hiring for these skill sets also means recognizing and promoting them internally. An editor at one large, traditional media company explained the challenge: “We’ve struggled sometimes to retain people because they didn’t see real opportunities for promotion, for what would come next after social media or community editing. The main trajectories might be moving to be a desk editor, which isn’t the same skill set, and for people who enjoy this kind of work, that wouldn’t be the best use of their skills.”

  3. Pay for the work
    Companies need to value moderation work upfront by paying decent wages for it. Communities deliver direct business value to a company’s bottom line. Good moderators will put hours of time and energy into crafting and reinforcing community norms, which separate an actual community from just a group of people using a platform independently.

  4. Build the right tools
    Companies need to dedicate the technical resources to building tools that their moderators need, on an ongoing basis – just as they would for other crucial internal systems. As one moderator put it, “It’s so disempowering to say ‘we could run this so much better if we had the ability to move stuff instead of just delete it’.” Dedicated technology resources will go a long way to alleviating community management problems before they become crises.

  5. Articulate your values
    Companies need to reinforce their values throughout their coverage in ways that go beyond just the Terms of Service or User Agreements, including in public and internal statements by senior leadership. One moderator explained this process as “disseminating ideas and values that go beyond a user agreement that no one’s ever going to read.” When the Guardian published “The Web We Want” and released data about which of their writers received the most online harassment, it articulated an organization-wide commitment to its values.

  6. Provide support
    Managers need to offer regular opportunities for their team members to talk about the challenges that they face while moderating, and create methods of support, including between team members, for when conversations or threads threaten their health and wellbeing. One moderator told me that the attitude often conveyed by their leadership was that moderators should just “toughen up,” a mentality that promoted burnout.

  7. Back your team
    Companies need to recognize and support moderators in order to build morale, and also to negotiate with community members in order to build trust. One moderator described what this looks like in practice: when a moderator messes up, the organization needs to back them publicly and also explain to the community: “This is the plan, this is the restitution, now you need to chill out about it.” Backing moderators builds a culture of support internally, and being transparent about decision making respects the agency of the community members.

  8. Invite community members to help
    Companies should consider inviting core community members to take on roles of greater responsibility for the community. “Core” community members could be people who have a record of leaving good comments, and so their comments get automatically preferred or promoted (published without pre-moderation, or given a badge, for example); a minimal sketch of such a rule appears after this list. Companies can also invite members to take a more active role in welcoming new members, and in articulating the values of the community to them if they step out of line. Perhaps they even have the ability to flag comments for moderator attention. When such systems of visible reward exist, as one moderator put it, “people strive to be one of those [community members].”

  9. Have clear goals
    Companies need to have clear goals for their comments, or for any other form of engagement – or they shouldn’t do it at all. A lot of the frustration around comments sections arose because news organizations didn’t want to host comments sections, but felt obligated to because comments had become the norm in other parts of the Internet. They had no goals or strategy for the space, and gave no direction on what success would look like or how to measure it.
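
To make suggestion 8 concrete, here is a minimal sketch of how a “trusted commenter” rule might work, assuming a hypothetical comment pipeline. The field names, thresholds, and functions are illustrative only; they are not drawn from any particular platform or from the moderators I interviewed.

```python
from dataclasses import dataclass

@dataclass
class Commenter:
    """A commenter's moderation track record (illustrative fields)."""
    approved_comments: int  # comments moderators approved or left up
    removed_comments: int   # comments moderators removed

def is_trusted(c: Commenter, min_approved: int = 50, max_removal_rate: float = 0.02) -> bool:
    """A commenter earns 'core' status with a long, clean track record."""
    total = c.approved_comments + c.removed_comments
    if total == 0:
        return False
    removal_rate = c.removed_comments / total
    return c.approved_comments >= min_approved and removal_rate <= max_removal_rate

def route_comment(author: Commenter) -> str:
    """Trusted members publish immediately; everyone else waits for pre-moderation."""
    return "publish" if is_trusted(author) else "premod_queue"

# Example: a long-standing member with a clean record skips the queue.
veteran = Commenter(approved_comments=300, removed_comments=2)
newcomer = Commenter(approved_comments=3, removed_comments=1)
print(route_comment(veteran))   # -> publish
print(route_comment(newcomer))  # -> premod_queue
```

Any real system would need to tune these thresholds against its own community, and would layer on the badges, welcoming roles, and escalation rights described above.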

 

I spent several months interviewing moderators, as well as the people around them who supported and enabled their work. It was fascinating and difficult to see how moderators have shaped so many of our conversations and possibilities online, often without acknowledgement. One of the people I interviewed best described why moderation work still held so much fascination for them: “It gives you a window into the human condition, watching the things people say, versus the things they do, versus the sanctions we have for them.”

In the case of news organizations, these sanctions have too often been insufficient, and support for the moderators too limited. But those are temporary conditions. I believed then, and still believe now, that by recognizing their labor, by rewarding good behavior, and by empowering both moderators and commenters more effectively, we can together create a strong, open, and better web.

Anika Gupta is a product manager at National Geographic in Washington DC, where she works on Your Shot, a user-generated photo community, as well as social products. (She doesn’t moderate the comments!) She graduated in 2016 with a Master’s from MIT, where she studied online comments and community at news organizations. This post was adapted from her graduate thesis. You can find her on Twitter @DigitalAnika.

 

Photo by Sushant Savla (Own work) [CC BY-SA 4.0], via Wikimedia Commons
