
Community-Driven Product Development - Community Engagement During Testing and Development

We're collaborating on a six-part educational series with Mark Tan, Director of Product and Community at Wyze. We'll discuss trends and takeaways for building products in tandem with your community. Mark will cover everything he's learned about community-driven product development so that you, as a product manager (or other team leader!), can get as close to your end users as possible.

This week, Mark discussed how to integrate community members into the testing and development phases of your product lifecycle. We've also excerpted questions from our follow-up Slack Q&A. New videos are posted to our Community-Driven Product Development YouTube playlist every Tuesday afternoon, and you can join our live Slack Q&As on each of the next three Thursdays. See you there!

You mentioned communicating your beta program plans far in advance to ensure you recruit the right members. How far in advance do you typically do that? And what, and how, do you communicate to them in the initial outreach?

Anywhere from two to four weeks, depending on product complexity and project-team readiness. You want to minimize scope changes and give participants a clear set of instructions, because changing your plans frequently can be very confusing for your community members.

Also, people are busy, so your community members may forget if you reach out too far in advance. If you need to ship instructions, swag, etc., you also need to account for shipping time.

We send an onboarding email welcoming them to our beta program, which then points them to the guidelines, instructions, and videos on how to help us with our testing.

Once you have the participants recruited, you mentioned you receive different types of feedback: Issues, Ideas, Praise, and Other. Once you receive the feedback, how do you share it with stakeholders? Do you share each bucket with different people in different formats?

Product managers can access the feedback directly, but we also generate reports and publish them in an email. One summary report goes out to the whole project team, and then we share specific files with the right groups. There's some project management involved to make sure we keep track of progress.
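As a rough illustration of this kind of routing, here's a minimal Python sketch. The four category names come from the question above; the data structure, field names, and summary format are assumptions for illustration, not Wyze's actual tooling.

```python
from collections import defaultdict

# Hypothetical feedback items; the field names are illustrative.
feedback = [
    {"type": "Issue", "text": "App crashes when pairing the camera"},
    {"type": "Idea", "text": "Add a dark mode to the dashboard"},
    {"type": "Praise", "text": "Setup took under five minutes!"},
    {"type": "Issue", "text": "Firmware update stalls at 90%"},
]

def summarize(items):
    """Group feedback by type; return per-bucket counts and the buckets
    themselves, so a summary can go to the whole team while each bucket
    is shared with the right group."""
    buckets = defaultdict(list)
    for item in items:
        buckets[item["type"]].append(item["text"])
    counts = {t: len(msgs) for t, msgs in buckets.items()}
    return counts, dict(buckets)

counts, buckets = summarize(feedback)
print(counts)  # e.g. {'Issue': 2, 'Idea': 1, 'Praise': 1}
```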

It looks something like this. (Note: this screenshot isn't ours; I grabbed it from the web to illustrate a tool similar to the one we use.)

Related question - How do you measure the impact of your beta and pilot programs? Are there certain metrics you track and report to leadership internally?

We measure impact by collecting survey responses and computing a product readiness score (similar to NPS). A score of 4.0 or above is a good sign that testing is going well, and we report this to leadership as part of project tracking.
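The post doesn't specify exactly how the score is computed, but a simple approach that averages 1-5 survey ratings against the 4.0 bar might look like this (the function name, rating scale, and threshold handling are illustrative assumptions):

```python
def readiness_score(ratings, threshold=4.0):
    """Average 1-5 survey ratings into a product readiness score.

    Returns the score (rounded to 2 decimals) and whether it clears
    the threshold that signals testing is on track.
    """
    score = sum(ratings) / len(ratings)
    return round(score, 2), score >= threshold

# Hypothetical survey responses from one round of beta testing.
score, on_track = readiness_score([5, 4, 4, 3, 5, 4])
print(score, on_track)  # 4.17 True
```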

In the video you discuss the importance of setting expectations with testers up front. Are there key pieces of info you'd suggest everyone include?

I’d break it down into the following:

  • Overall objectives
  • Scope (features that will change so that your testers know which part to focus on)
  • Out of scope and limitations (especially since test environments don’t always simulate production environments)
  • Frequency and duration of testing (include the most critical dates that your participants should be aware of)
  • How to provide feedback

These are the most important elements; each one then has its own set of instructions.

You end the video with three things to help ensure a successful beta/pilot program - Rapport, Information, and Encouragement. Specifically for encouragement, you call out rewarding those members who have been very helpful. How do you do that?

Yes! A simple thank-you note or shoutout goes a long way. Track the number of issues, the number of messages, and the quality of feedback submitted, and use those as the basis for rewards. You want to keep it simple, because you don't want to turn your testing into a competition. Your community will appreciate simple gestures, so start by paying attention to their contributions.
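One lightweight way to combine those signals into a basis for rewards is a weighted score like the sketch below. The weights, field names, and scoring formula are assumptions for illustration, not an actual rubric from the post.

```python
def contribution_score(tester, w_issues=2.0, w_messages=0.5, w_quality=3.0):
    """Weighted score combining issues filed, messages sent, and average
    feedback quality (rated 1-5). Weights are illustrative assumptions."""
    return (w_issues * tester["issues"]
            + w_messages * tester["messages"]
            + w_quality * tester["avg_quality"])

# Hypothetical testers and their tracked contributions.
testers = [
    {"name": "alice", "issues": 6, "messages": 20, "avg_quality": 4.5},
    {"name": "bob", "issues": 2, "messages": 35, "avg_quality": 3.0},
]

# Pick the top contributor for a shoutout or small reward.
top = max(testers, key=contribution_score)
print(top["name"])  # alice
```

Keeping the formula this simple fits the spirit of the answer: it surfaces helpful members without turning testing into a leaderboard competition.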

And on the flip side, you also mention providing support for those who need help along the way. How do you do that?

By reaching out to them when they're stuck or need help. Those are usually compelling cases, because they help you uncover functionality or usability issues. When that happens, you need to prioritize them even more.

Check each tester's activity level, the number of issues they've posted, and their feedback to see who needs the most help.