Before you make any changes in your community strategy, you need to decide how you’ll know if your changes are working. That’s where metrics come in – numbers that measure some aspect of what is happening on your site.
Start with a plan.
- First, define the strategy for your community.
- Describe some of the changes you need to see from the current situation in order for that vision to be met. For example: “We need more people who live locally to be involved in our community.” “We need more people to engage with our journalists on this topic.” “We need our journalists to respond to community members more often.” “We need to receive higher-quality comments.”
- Consider how you could measure if these changes are occurring. A single data point isn’t meaningful – change over time is.
Some change is easy to measure. You could find out how many locals participate in your forums through a survey or by reviewing subscription data — and you could perform the same review six months later to gauge how your community evolved.
Other change is harder to quantify. Let’s say you want to make your newsroom more community-focused. One way to measure that would be to give reporters and editors a survey that asks what they think about the audience and how often they interact on the site. To make it quantifiable, you could ask them to rate their audience engagement on a scale of 1 to 5 — and to gauge change over time, make sure to survey them before you start a new engagement push, then again after it’s underway.
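The before/after comparison above can be sketched in a few lines of code. This is a minimal illustration with made-up survey responses — the names and ratings are hypothetical, and a real survey tool would export its own format:

```python
# Hypothetical survey data: each journalist rates their audience
# engagement on a scale of 1 (low) to 5 (high), once before an
# engagement push and once after. All names and numbers are invented.
from statistics import mean

before = {"reporter_a": 2, "reporter_b": 3, "reporter_c": 1, "editor_d": 2}
after = {"reporter_a": 4, "reporter_b": 3, "reporter_c": 2, "editor_d": 4}

def average_rating(responses):
    """Mean self-reported engagement rating across the newsroom."""
    return mean(responses.values())

baseline = average_rating(before)   # 2.00
followup = average_rating(after)    # 3.25
change = followup - baseline

print(f"Baseline: {baseline:.2f}, Follow-up: {followup:.2f}, Change: {change:+.2f}")
```

The point is not the arithmetic but the discipline: recording a baseline before the change, then re-running the identical measurement afterwards, so the two numbers are actually comparable.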
You need to focus not on improving the numbers for their own sake, but on understanding what they mean as the basis for your next steps. A drop in a number isn’t a failure – it means you need to refine your community interactions, or what and how you are measuring the change, or both.
Numbers can be interpreted in different ways, and no single metric will give you the answer to complex questions such as “Has my community improved?”. For example, as part of Talk, we have worked to make it clear how to report abusive or spam comments. This could mean that, at least in the short term, newsrooms using Talk see an increase in the number of reports of abuse.
Does that mean there is more abuse (a bad sign of community health) or merely an increase in reports of abuse (a good sign of community health)? Or both? Each metric you collect is not an answer in itself, but a path to further questions, and other measurements you can take. If your curiosity starts to tingle… follow it.
Here are some ways to measure the kinds of strategies we talk about in our Community Mission Statement piece.
- Is the behavior of our community improving?
Possible metrics: asking Mechanical Turk workers to rate the civility of 1000 randomly chosen commenters every few months; asking frequent commenters to rate the civility of the space; measuring the average number of Ignores/Deleted Comments per 1000 comments alongside the number of returning commenters.
- How many journalists are active community members?
Possible metrics: number of journalists writing comments at least x times per week; number of journalists reporting/liking comments; number of journalists who have responded in private or public to a community member’s concerns in the past y weeks.
- Is our community helping our mission?
Metrics will depend on what your stated mission is.
- Is interaction with our journalists encouraging community members to return/contribute more often?
Possible metrics: average frequency of logins/comments before a journalist replied to them vs afterwards
- Does community membership lead to subscription? Are the followers of our community more likely to be subscribers?
Possible metrics: % of commenters who are subscribers over time; time taken from account creation to subscription; correlation between comment reading/responding and subscription.
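A couple of the metrics above can be computed directly from a comment export. The sketch below uses a hypothetical log format — the field names (`author`, `deleted`, `author_is_subscriber`) and the sample records are assumptions, so adapt them to whatever your platform actually provides:

```python
# Hypothetical comment log; in practice this would come from your
# platform's database or CSV export. All records here are invented.
comments = [
    {"author": "ana", "deleted": False, "author_is_subscriber": True},
    {"author": "ben", "deleted": True,  "author_is_subscriber": False},
    {"author": "ana", "deleted": False, "author_is_subscriber": True},
    {"author": "cy",  "deleted": False, "author_is_subscriber": False},
]

def deletions_per_1000(comments):
    """Deleted comments per 1000 comments posted."""
    if not comments:
        return 0.0
    deleted = sum(1 for c in comments if c["deleted"])
    return 1000 * deleted / len(comments)

def subscriber_share(comments):
    """Fraction of distinct commenters who are subscribers."""
    commenters = {c["author"]: c["author_is_subscriber"] for c in comments}
    if not commenters:
        return 0.0
    return sum(commenters.values()) / len(commenters)

print(deletions_per_1000(comments))  # deletions per 1000 comments
print(subscriber_share(comments))    # share of commenters who subscribe
```

Run the same computation on each month’s export and plot the results side by side; as noted above, a single data point isn’t meaningful, but the trend over time is.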
It’s important to measure what is happening before making any changes, ideally a few times to generate a fairly reliable average, in order to have something to measure against. Also try as much as possible to compare like with like – engagement around an election is likely to be different in tone and quantity than at other times, for example.
Don’t try to optimize for the metric straight away. If the numbers don’t go as you hoped/expected, this leads to more questions. Are the metrics not telling the whole story? What can you learn from user interviews to help explain the behavior you’re seeing? Every measurement is an opportunity for learning and improvement, while also only telling part of the story. Once you’ve settled on some cohesive Key Performance Indicators, track them carefully – and continue to consider other contextual metrics that might tell you more of the story.
To learn more about metrics and ways to measure success, look at the work of the Engaging News Project for examples of measuring different aspects of online behavior and engagement. You can also read the comprehensive study of online comment spaces that the Engaging News Project conducted for us here.
Thanks to Kelsey Arendt at Parse.ly for her advice on this piece.