Community building is table stakes for the success of any open source project. Even outside of open source, community is considered a competitive advantage for businesses in many industries—from retail to gaming to fitness. (For a deeper dive, see "When community becomes your competitive advantage" in the Harvard Business Review.)
However, open source community building—especially offline activities—is notoriously hard to measure, track, and analyze. While we've all been to our fair share of meetups, conferences, and "summits" (and probably hosted a few of them ourselves), were they worth it? Did the community meaningfully grow? Was printing all those stickers and all that swag worth the money? Did we collect and track the right numbers to measure progress?
To develop a better framework for measuring community, we can look to a different industry for guidance and fresh ideas: political campaigns.
My metrics start with politics
I started my career in political campaigns in the US as a field organizer (aka a low-level staffer) for then-candidate Senator Obama in 2008. Thinking back, a field organizer's job is basically community building in an assigned geographical area that your campaign needs to win. My day consisted of calling supporters to recruit them for volunteer activities, hosting events to gather supporters, bringing in guest speakers (called "surrogates" in politics) to those events, and selling the vision and plan of our candidate (essentially our "product").
Another big chunk of my day was doing data entry. We logged everything: phone conversations with voters, contact rates, event attendance, volunteer recruitment rates, volunteer show-up rates, and myriad other numbers to constantly measure our effectiveness.
Regardless of your misgivings about politics in general or specific politicians, winning campaigns are giant community-building exercises that are data-driven, meticulously measured, and constantly optimized. They are well-oiled community-building machines.
When I entered the world of open source a few years ago, the community-building part felt familiar and natural. What surprised me was how little community building as an operation is quantified and measured—especially with offline activities.
Three metrics to track
Taking a page from the best-run political campaigns I've seen, here are the three most important metrics for an open source community to track and optimize:
- Number of community ambassadors
- Number of return attendees (people who attend your activities two times or more)
- Rate of churned attendees (the percentage of people who attend your activities only once or say they will come but don't show up)
If you're curious, the corresponding terms on a political campaign for these three metrics are typically community captains, super volunteers, and flake rate.
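If you are already capturing RSVPs and check-ins for each event (in a spreadsheet, a CRM, or your meetup platform's export), these three numbers fall out of a few lines of code. The sketch below is a minimal illustration in Python; the `events` structure, its field names, and the `ambassadors` roster are hypothetical stand-ins for whatever data you actually collect.

```python
from collections import Counter

# Hypothetical roster of people who have agreed to host local meetups.
ambassadors = {"alice", "bob", "carol"}

# Hypothetical per-event records: who signed up and who actually attended.
events = [
    {"rsvps": {"dana", "eve", "frank"}, "attended": {"dana", "eve"}},
    {"rsvps": {"dana", "grace"},        "attended": {"dana"}},
    {"rsvps": {"eve", "heidi"},         "attended": {"heidi"}},
]

# Count how many activities each person attended across all events.
attendance = Counter()
for event in events:
    attendance.update(event["attended"])

# Return attendees: people who showed up to two or more activities.
return_attendees = {person for person, count in attendance.items() if count >= 2}

# Churned attendees: one-time attendees plus people who RSVP'd but never came.
all_rsvps = set().union(*(event["rsvps"] for event in events))
one_timers = {person for person, count in attendance.items() if count == 1}
no_shows = all_rsvps - set(attendance)
churn_rate = len(one_timers | no_shows) / len(all_rsvps | set(attendance))

print(f"Community ambassadors: {len(ambassadors)}")
print(f"Return attendees:      {len(return_attendees)}")
print(f"Churned attendees:     {churn_rate:.0%}")
```

However you store the raw data, the point is the same: capture sign-ups and check-ins per person per event, and these three metrics become a routine report rather than a guess.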
Community ambassadors
A "community ambassador" is a user or enthusiast of your project who is willing to consistently host local meetups or activities where she or he lives. Growing the number of community ambassadors and supporting them with resources and guidance are core to your community's strength and scale. You can probably hire for these if you have a lot of funding, but pure volunteers speak more to your project's allure.
These ambassadors should be your best friends, where you understand inside and out why they are motivated to evangelize your project in front of both peers and strangers. Their feedback on your project is also valuable and should be a critical part of your development roadmap and process. You can strategically cultivate ambassadors in different tech hubs geographically around the world, so your project can count on someone with local knowledge to reach and serve users of different business cultures with different needs. The beauty of open source is that it's global by default; take advantage of it!
Some cities are arguably more of a developer hub than others. Some to consider are Amsterdam, Austin, Bangalore, Beijing, Berlin, Hangzhou, Istanbul, London, NYC, Paris, San Francisco-Bay Area, São Paulo, Seattle, Seoul, Shenzhen, Singapore, Tel Aviv, Tokyo, Toronto, and Vancouver (listed alphabetically and based on feedback I got through social media—please add a comment if I missed any!). An example of such a geographically distributed program is the Cloud Native Ambassadors program of the Cloud Native Computing Foundation.
Return attendees
The number of return attendees is crucial to measuring the usefulness or stickiness of your community activities. Tracking return attendees is how you can draw a meaningful line between "the curious" and "the serious."
Trying to grow this number should be an obvious goal. However, that's not the only goal. This is the group whose motivation you want to understand the clearest. This is the group that reflects your project's user persona. This is the group that can probably give you the most valuable feedback. This is the group that will become your future community ambassadors.
Put differently, these are your 1,000 true fans (if you can keep them).
Having hosted and attended my fair share of these community meetups, my observation is that most people attend to be educated on a technical topic, look for tools to solve problems at work, or network for their next job opportunity. What they are not looking for is being "marketed to."
There is a growing trend of developer community events becoming marketing events, especially when companies are flush with funding or have a strong marketing department that wants to "control the message." I find this trend troubling because it undermines community building.
Thus, be laser-focused on technical education. If a developer community gets taken over by marketing campaigns, your return-attendees metric won't be pretty.
Churned attendees rate
Tracking churned attendees is the flipside of the return-attendees coin, so I won't belabor the point. These are the people who attend once and then disappear, or who show interest but never show up. They are important because they tell you what isn't working and for whom, which is more actionable than just counting the people who show up.
One note of caution: Be brutally honest when measuring this number, and don't fool yourself (or others). On its own, if someone signs up but doesn't show up, it doesn't mean much. Similarly, if someone shows up once and never comes back, it doesn't mean much. Routinely sit down and assess why someone isn't showing up, so you can re-evaluate and refine your community program and activities. Don't build the wrong incentives into your community-building operation to reward the wrong metric.
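To keep that assessment honest, it can help to report the two flavors of churn separately rather than as one blended number: people who sign up but never attend, and people who attend once and never return. Here is a small sketch under the same assumptions as the earlier example—the function name and the input shapes are hypothetical, not a prescribed format.

```python
def churn_breakdown(rsvps, attendance_counts):
    """Split churn into no-shows vs. one-time attendees.

    rsvps: set of everyone who ever signed up for an activity.
    attendance_counts: dict mapping person -> number of activities attended.
    Both inputs are hypothetical; adapt them to whatever your tooling exports.
    """
    attendees = set(attendance_counts)
    no_shows = rsvps - attendees                                   # signed up, never came
    one_timers = {p for p, n in attendance_counts.items() if n == 1}
    everyone = rsvps | attendees
    return {
        "no_show_rate": len(no_shows) / len(everyone),
        "one_and_done_rate": len(one_timers) / len(everyone),
    }

# Example: 4 sign-ups, 2 never came, 1 came once, 1 keeps coming back.
print(churn_breakdown({"ana", "ben", "chris", "di"}, {"ana": 3, "ben": 1}))
# {'no_show_rate': 0.5, 'one_and_done_rate': 0.25}
```

A high no-show rate and a high one-and-done rate usually point to different problems—the former to sign-up friction or poor reminders, the latter to the content of the event itself—so keeping them separate makes the follow-up conversation more concrete.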
Value of in-person connections
I purposely focused this post on measuring offline community activities because online activities are inherently more trackable and intuitive to digital-native open source creators.
Offline community activities are essential to any project's journey to reaching traction and prominence. I have yet to see a successful project that does not have a sizable offline presence, regardless of its online popularity.
Why is this the case? Why can't an open source community, usually born online, just stay and grow online?
Because technology choice is ultimately a human decision, face-to-face interaction is an irreplaceable element of new technology adoption. No one wants to be the guinea pig. No one wants to be the first. The most effective way to not feel like the first is to literally see other human beings trying out or being interested in the same thing.
Being in the same room as other developers, learning about the same project, and doing that regularly is the most effective way to build trust for a project. And with trust comes traction.
These three metrics work
There are other things you can track, but more data does not necessarily mean clearer insight. Focusing your energy on these three metrics will make the most impact on your community-building operation. An open source community where the numbers of ambassadors and return attendees are trending up and the churned-attendees rate is trending down is one that's healthy and growing in the right way.
This article originally appeared on COSS Media and is republished with permission.