
239 Zest member survey completions: how we did it and what we learned

January 21, 2019



Some users stick, some churn. This is the story of every product, isn’t it? Last year, I decided that I wanted to know why. So in December of 2018, we asked 1,254 Zesters to fill out a survey that would, ultimately, take their Zest experience to new heights. I documented the whole flow and thought process so I could fill all of you in on the details later.

That’s what this article is about. You’ll find a clear process and to-do list that you can apply when approaching such a large-scale, intrusive operation as a user survey of your audience. Along the way, I also share visualized data, methods, and advice for your own survey.

Bottom line:

  • We segmentized 1.3k Zest members based on their usage
  • Reached out to them via email (see templates below)
  • Got 239 completions (19% completion rate)
  • Re-worked our roadmap according to their feedback
  • Got an NPS score of 65, which is as pleasing as drinking fresh lemonade on a white sandy beach
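For context, NPS is derived from the standard 0-10 “how likely are you to recommend us” question: the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). A minimal sketch of the calculation, using made-up ratings rather than our actual survey data:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical ratings from ten survey respondents
ratings = [10, 9, 9, 8, 7, 10, 9, 6, 10, 9]
print(nps(ratings))  # 7 promoters, 1 detractor -> 60
```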


August of 2018 was the kick-off date for our 2019 strategy with retention-based growth as the backbone. (Check out the Zest Marketing Strategy Gantt for details.) Our roadmap was constructed with the help of our Member Advisory Board (MAB). The preparation and the kick-off stage moved a bit slower due to our turtle-ish transition to the new mindset.

Nevertheless, the goal was for each member of the project to choose a member segment to represent and fight for (we have 3 segments: consumers, contributors, chiefs). I’m Team Consumers and Katarina is Team Contributors. Yam is the CCO (Chief Chief Officer).

Since we do not have a dedicated Product Manager to work with, our roles included making sure every step in the process of growing our segment was covered. That meant developing a “secondary-product roadmap” per segment which should then be integrated with the main one.

But as many of you can guess, the results of this many-map strategy can become a clusterf**k of uber-specific features and how-to briefs that deliver little to no value to either the team or the end user.

On top of that, we were all muddled by the fact that there was no survey, no feedback, nothing to support the “roadmap” which at this point we were still calling ‘a list of requests’.

Because of that, work got very messy and disorganized.

Eventually we mustered our thoughts and said, “Alright, this isn’t going anywhere, let’s stop, and conduct a survey”.

The main elements we used for our survey

Authenticity, Transparency & Empathy

The best thing you can do to get people to help you with a survey is to be honest, authentic, and transparent. People don’t like frauds, and they can see right through them.

We acknowledge the individuality and differences between us. And, we’re a remote team. That means, whenever we communicate, we approach members in our natural, personal tone of voice. There is no fabricated emotion. Our default is to come across authentic and transparent.

However, transparency and authenticity can be a sign of vulnerability and weakness if your audience doesn’t feel the pain and see your survey as the ultimate solution.

That’s where empathy comes in.

If you’re a Product Manager, you must know how to empathize with your users. Flip the sides and try to make people understand and empathize with you. (Hint: the shared pain and the promised solution will most likely be the reason that brought users to you in the first place).

How does that help?

Being authentic, transparent and talking in a personal tone of voice helped us spark a connection with our members. Honesty and transparency are simply being yourself, and that’s very natural. Because of that, our members click with us on a deeper level. They’re not helping a company, they’re helping a person. In fact, they’re helping a friend, and that friend is working hard for a cause that provides a solution to the user. And that amplifies the need for help.

For example, Zest members are marketers who share the pain of drowning in content overload. We are the team that is passionate about relieving this pain, as we experience it ourselves. Any communication with our members matters to us: we’re both marketers with the same pain, and we’re both in it for the ultimate solution. We need the feedback, hence the personal, authentic, and transparent call: “I need your help with the survey”. What pain do you share with your users?

Actionable Takeaways:

  • Find out what moves your users. (Hint: It will most likely be the reason they’re using your service in the first place, the pain, the end vision of solution).
  • Approach users in a personal, authentic and transparent manner. Let them empathize with you and understand the need for the survey.
  • And most importantly, use your heart <3

Bonds & Sense of belonging

What I didn’t mention before is that you can also use deep pain to create bonds with users and build an unbreakable tribe which will benefit you in the long-term.

Shared pain brings people together to cooperate for a greater good.

Bonding wasn’t used specifically in the email. But it’s a seed we plant right from the moment when someone first joins Zest as a member, and it consistently helps us get higher engagement.

How do I know?

The most engaged and helpful members on Zest are the ones we communicate with regularly. The people who we congratulate and provide with feedback are the same ones who thank us for resolving their content overload pain. Many of those people become not just fellow members of the Zest tribe but also our friends. (If only they knew–Zest wouldn’t exist without them <3).

How to create these ‘bonds’?

Whatever we do, we always focus on interaction with the community. We love the Zest community and, because of that, the community loves us. Our interactions strengthen the bond between our team and the Zest tribe and encourage them to contribute to the Zest cause and vision of defeating content overload.

To further strengthen these bonds, we create a sense of belonging. A bunch of standalone things we’ve done (and some we still do) include:

  • Connecting with members on LinkedIn as soon as we make contact. There may not be a lot of interaction through this channel, but members attach sentimental value to the connection.
  • Creating a Slack group to cross-connect bloggers with writers, individuals and brands
  • Mentioning our members in articles, quoting their expertise and insights
  • Connecting members with one another through email whenever there’s a relevant connection between them
  • Hosting a physical gathering of our closest Zest members (Yam and Idan’s juicy idea) which brought over 130 people together! Without any advertising!
David Yahid, one of our dear Zest members, asks a question in a Facebook group

yesssss, David you know it ✌️

But most importantly, we create a…

Sense of belonging

“Humans are lonely, and they want to be seen and known. People want to be part of something. It’s safer that way, and often more fun.” – Seth Godin

Everyone wants to belong somewhere. Whether it’s a tight group of friends or an online community, it is a basic and necessary human need that gives us a sense of fulfillment and helps us pull through the day.

Zest is a community of 15000+ WAM and 7 passionate Zesteam members who juggle all of the communication alongside their other tasks.

This community is full of people with whom we interact on a daily/weekly/monthly basis. I wouldn’t be far off saying it’s somewhere around 4-6% of our total members.

We have created several micro-tribes of hundreds of Zesters whom we look to for insights on how we can give back to the Zest community. And we are always on the lookout for opportunities to invite the members who are eager to participate. The survey was a perfect way to do so.

Creating a sense of belonging is a valuable tool that doesn’t involve offering members a monetary reward. But it isn’t without cost. Managing an engaged sub-community is much more difficult than giving away a $100 gift card. We chose to offer community membership via micro-tribes to the respondents to our survey because we were in it for the community from the start and our goal was to develop long-term relationships.

A monetary reward is a great way to engage a bunch of people, as it’s an extra incentive (maybe you’ll win a PS4), but it takes users away from the survey and puts their eyes on the prize. That means if you prepared an extensive survey with open-ended questions to really dig deep, chances are your users will only scratch the surface of what they could actually say. Why should they?

So, when choosing how to engage your users, think about your goal – are you in for a short-term gain or long-term relationships?

Anyway, to enhance Zest members’ belongingess (yeah, I’m just going to make up my own words), we created the Zest “Le Móns” micro-tribe. Le Móns Zesters get early access to new features, mentions on Medium and other cool things. Now, here’s how this relates to our survey. Before we surveyed Zesters, this group only had 2 people in it because we had slowly hand-selected who to invite. After the survey, we realized that there were a lot more Zesters who wanted to belong there!

Actionable takeaways:

  • Use the pain your product solves to build stronger bonds with the users. They didn’t come to you because of a specific feature you built, they came for the vision and story.
  • Find ways to create sentimental value for the user.
  • There’s no limit to the number of users you can create bonds with, and it won’t feel like an extra load at all. It’s human communication, and it helps you and your company grow.
  • A monetary reward for a survey is fine if you’re looking for a short-term gain, but that won’t win over your community, only exhaust it.
  • Create a sense of belonging for your users. Look for ways to segmentize your total audience into fragments you could engage more on a regular basis. If you can make at least 1% of your total community happy, you’re in line for great brand advocacy.

The email invitation

The email we used for invites had a bunch of elements in there, so let’s dissect it one element at a time.

From the top:

  1. As you can see, the personalization for this email was horrible. We couldn’t even get a name in there. Time was short, speed was the key. Sorry 😭 (bad practice)
  2. We took care to mention other members of the segment and the cause of needed help. Empathy!
  3. We let people know what they were getting into by highlighting that it was a Typeform survey (well-known for its amazingly fluid UX) and adding the time it would take to complete. (At first, we expected 2.3 mins and we actually said so. But some testing proved us wrong, so we changed it.)
  4. Extra incentive – note that the email doesn’t say anything about monetary rewards, it is an invite only.
  5. True to our commitment to transparency, we included one short sentence of what the survey’s about. There’s no need to fluff.

Overall, the email was very short and simple. Since our audience is super-busy professionals, we’re always in a race against time. So we cut down all the unnecessary fat, kept it very straightforward, authentic, and transparent and made sure it required as little cognitive effort as possible.

We could keep the email sparse because most of what we wanted to say was in the subject line: “yellow experience”. That phrase let our audience know that something important was on the line. We kept the subject line as short as possible for readability. Then the email copy further explains the context engaging the receiver and helping them to empathize with our pain.

The subject line also served to put recipients in the right mood to provide us with relevant results. We continued this messaging throughout the survey.

Don’t forget follow ups

To encourage people to respond, we followed up once with the high, mid and low-active consumers and twice with the churners. We hadn’t planned to do a double follow-up, but not surprisingly, our churners were less invested in responding.  

Based on our experience, though, I wouldn’t suggest following up more than once. Chances are that if someone got your email the first two times and didn’t bother to respond, the third time’s not going to change their mind. Following up 2 or more times is only acceptable if you can filter out people who have already opened your email and follow up with those who haven’t. We didn’t have this luxury, so we didn’t risk annoying our members.

Let’s have a look at that follow-up email.

  1. Again, we refer to the Typeform survey due to the UX and give members a specific time for completion.
  2. We added a sense of missed opportunity – we weren’t lying when we said we’re making good progress. One glance at the results had me insanely excited. 🤩 – authenticity, transparency!
  3. Adding the touch of personal communication – I emphasize that it’s me (your good buddy, Karolis) who really wants your help on this project (and there’s plenty of people in the lists that I had communicated with) – bonds, authenticity, empathy!
  4. Finally, the incentive is mentioned to remind the member. In hindsight, we failed on this one as it indicates there’s some magical surprise, not an invite. This led to some disappointment. What? No T-shirt? (bad practice)

Personally, I don’t think these emails were as good as they could’ve been – there were so many slip-ups in the process, and we’d like to apologize to our members (next time, we promise no slip-ups):

  • No personalization
  • Copy issues (typos, mistakes)
  • Following up with members who had already completed the survey
  • Lack of clarity in places
  • Still lacked sense of benefit for empathy (why do we REALLY need you)
  • Intro is very dry: “You read a few articles recently, right?”
  • Closing is pretty dry: “It’s about how you use Zest and your likes/dislikes.”

And the worst of all – the timing.

We set up the survey in three days, one week before Christmas. We launched it by bcc’ing the people we wanted to survey on a Thursday, just 5 days prior to Christmas. Needless to say, our expectations for response rates were not high.

But, we were surprised with what we saw…

Bonds, senses, emails, whatever – why does it matter

When I wrote that we build relationships with the community, maybe you thought I was pulling smoke and mirrors. However, when we read the responses to our survey, we saw something interesting.

We sent emails to 4 different segments: High-active consumers, Mid-active consumers, Low-active consumers and Churners (who weren’t actually churned, but use Zest on a (bi-)monthly basis). These 4 segments were separated using the most simple metric we could grab: clicks. Clicks for Zest mean viewing frequency. The more clicks a member makes in a week’s time, the more frequently they are engaging in consuming content from Zest.  
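The segmentation above can be sketched as a simple bucketing function on weekly clicks. The thresholds here are hypothetical (the article doesn’t publish the actual cut-offs), but the mechanism is the same: one metric, four buckets.

```python
# Hypothetical thresholds -- the article doesn't publish the actual cut-offs.
def segment(weekly_clicks):
    """Bucket a member by weekly click count (viewing frequency)."""
    if weekly_clicks >= 20:
        return "High-active"
    if weekly_clicks >= 5:
        return "Mid-active"
    if weekly_clicks >= 1:
        return "Low-active"
    return "Churner"  # still around, but only uses Zest (bi-)monthly

# Made-up members with their weekly click counts
members = {"ana": 42, "ben": 7, "caz": 2, "dee": 0}
segments = {name: segment(clicks) for name, clicks in members.items()}
print(segments)
```

The appeal of a single-metric segmentation is that it’s cheap to compute and easy to explain; the trade-off is that clicks only proxy one kind of engagement (consuming), not contributing or sharing.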

Since we had 4 segments, we could easily measure the difference in engagement between them and how bonds, transparency, authenticity, empathy, and a sense of belongingness affect it. It wasn’t difficult, as all members received the exact same email, same offer, and same survey. The key difference was how much they conformed to the Zest vision and how well they knew and empathized with the Zesteam. Unfortunately, because we sent bcc’ed emails, we had no idea how the email performed with each segment. Until, that is, we looked at the Typeform metrics. The click-through rate drawn from Typeform showed a veeery sleek curve of engagement.

In the end, this email invite and the follow-up got us a total 19% survey completion rate, incredibly in-depth and relevant feedback, and inspirational ideas from a tribe who understands the Zest mission. (Thank you, guys).

After reading all this, you might think Zest is a psychological communication war machine. But it’s the contrary. As I’ve said before, it’s a group of people who are extremely passionate about alleviating this pain for marketers.

One thing you might’ve noticed, though, is that the three factors I’ve mentioned have crossovers between them. Think of it as a Venn diagram with the middle being the pinnacle of user love for you and your product.

Let me help you visualize that

How to prepare and analyze surveys, the right way

As important as it is to get people to the survey, improper analysis will doom your product roadmap to gather dust in the basement.

There are 10 things you want to keep in mind when preparing and analyzing surveys:

1. Define a clear and precise goal

The goal you define directly translates into what you want to ask your users. So, if you decide your goal is to discover what users want, then you are likely to start and stop with questions that limit your survey to just that. But, consider the possibilities if you look for a higher goal instead. What do you need to achieve?

Our goal was to find out two things: what makes our members stick and churn. By using these questions as our North Star, we were able to develop the right set of questions (see below) and to deeply analyze our strengths and weaknesses (similar to a SWOT analysis).

2. Watch your language

Not every user knows the terminology that you do. If you use jargon, or feature identifiers that your audience isn’t familiar with, they will find their own interpretations. And their word is final. Failing to adapt your survey language to suit your users can heavily affect the accuracy of its results.

We messed up by not differentiating our ‘tag’ feature. Zesters started to mix up the meaning of ‘tag’ with that of categories, filters, and industries. That means when we have other categorizing features, such as ‘sort by clicks’ or ‘filter by ebooks’, and a member says “I really love your filters”, there’s no clear evidence whether they’re referring to tags, filters, or sorting.

To avoid a disaster, write your survey questions as if the person responding has used your product only once. Once you’ve cleared up the uncertainties, make sure no features overlap when interpreted. Consider possible synonyms, and if you have product-specific terms, simplify them (e.g. Content Stream = feed).

3. ‘Ghost-guide’ with Multiple Choice -> Open-ended Qs

Asking a sequence of multiple choice questions followed by an open-ended one is the equivalent of delivering a jab, hook, and the knockout. Following the idea that the person you’re surveying has used your product only once, you can do what I call ‘Ghost-guiding’, a tactic to dig deep into the pain of the user. (Sorry for the bad name.)

Ghost-guiding is different than leading. Leading a survey is wrong. Leading is the type of question that forces the respondent to answer in a specific way (e.g. ”Are you worried that we will never send a Zest t-shirt to you again?”) It results in straight up bias and poops on your results. If you look up what not to do in a survey on Google, you’ll find leading. However, as I mentioned, Ghost-guiding is different.

Ghost-guiding is an alternative way of saying “I embed context, guidelines and suggestions in the survey which let the user predict what they are going to be asked, without bias”. It takes the form of a multiple choice question first, inserting predefined answers to help the user understand the context of the questions.

When using multiple choice questions, though, it’s important to have as few options as possible. Otherwise, your survey respondents will experience decision fatigue. But you still need to provide users with enough contextual options, plus the choice to select ‘other’, so they aren’t forced to make a random selection just to get past the question.

Check out this example of what I mean.

one of the survey questions which asks - what are Zest's key weaknesses?

In the question, we took 4 essential features that make up Zest from a consumer’s perspective to scratch the surface of what they felt wasn’t good enough.

After this warm-up, we asked the open-ended question. Being asked to select a weak feature began the process of asking members to think about why a particular feature was weak or strong, which is what we asked them in our follow-up question.


one of the survey questions, asking why a member thinks the way they do

Now, as I’ve since learned, asking ‘why’ is considered provocative and bad form. That’s a fail on my side. (Oops) A better approach would be to ask, “What makes you think so…?”, which is a less aggressive way to get the same answer.

4. Warm yourself up before the survey

Observe any behaviour with your product in relation to your goal, if possible. To warm ourselves up before writing the Zest member survey, we spent multiple 30-minute sessions watching Hotjar recordings to see where Zesters’ mouse movements indicated that they were thinking “how the hell do I find this” and “ooh, I like this.” This pre-survey research helped us understand and anticipate what members might say in response to our survey.

5. If a question is optional – mention it

Your users will get used to answering the required questions of the survey, but you might ask something, such as “Do you have anything else to say about us?”. To spare your users’ time and cognitive effort, mention if the question is optional. Trust me, you’ll avoid plenty of “No, I don’t” answers.

6. Make sure the survey is fluid

Remember that Ghost-guiding I mentioned? Your survey should smoothly transition respondents along a journey. Make sure the questions in the survey are related to one another to keep your audience focused and moving forward. Nothing is more off-putting than having to answer a cluster of random questions that confuse and exhaust you.

We found that the best practice is to connect your questions in relation to the user journey. We used Typeform because of the great UX flow it provides which prompted us to start by questioning usage, then likes/dislikes, followed by any ideas for new features, and finally asked what stopped the member from reading Zest more.

7. Avoid reacting to user feedback

I’ve said before that it’s very beneficial to find a pain you share with your users. That helps amplify the need for their help and your users to empathize and understand. However, that applies only to the preparation of the survey. It is natural to react to individual answers to the survey–particularly the answer to your open-ended questions. But if you do this, you will end up focusing on the small details and not the bigger picture.

The amount of information conveyed in the feedback might make you feel overwhelmed or that some feature of your product is outright crap and needs fixing. Slow down. There was no screaming red alarm before the survey, neither is there one now. Calm down, assess the feedback, and re-use it as context.

What I did to help myself avoid getting lost in the minutiae was to read through all the feedback first – the good and the bad – and then re-sort it into categories. (more on that below)

8. More data = more noise

As you create your survey questions, keep the goal of the exercise in mind. Try to highlight the questions that will answer your core questions (remember, we had two) and keep your total number of questions to a minimum.

For our survey, we ended up with a total of 17 questions (quite a lot). 7 of those were necessary to cover the user journey and uncover our gaps, but only 3 were essential to our roadmap. The rest were wanted and useful questions, but weren’t absolutely necessary. We’ll know next time not to bombard our members with so many.

9. Don’t take what your users say word for word

As you assess the answers to your survey, bring out your inner Sherlock Holmes and build your own ‘crazy wall’. Whenever a user mentions some specific feature, keep in mind that their ‘want’ can be different from the ‘need’ you have to satisfy.

Giving users what they say they want isn’t always the right fix. Often, their want is more of an indicator that can point you to a bigger pain. Take those pains, create a vision that one-ups your user’s wants, relates to the bigger picture of their needs, and double down on the execution.

As I evaluated what survey respondents said they wanted, I identified and then categorized every pain by segment. Most wants related to the Content feed, search, the content-suggestion process, and the New-tab extension. Then I connected those pains with the underlying need they expressed (e.g. multiple requests for more tags indicated a need for content personalization). I used the survey’s wants to identify product-based problems that we could solve for our members.

10. Don’t forget your team

Once you’ve finished your analysis of the survey results, remember that you are filled to the brim with information, but the rest of your team isn’t. You need to deal them in. Your job isn’t to convince them to agree with you. In fact, it’s to share the data and let your team interpret it in the context of their own knowledge and department needs.

As part of our survey analysis, we had a team meeting and at the end we played a minigame. After I introduced the survey results, we set aside 5 minutes and each team member (department) was asked to list what they thought the survey revealed as being the most important issues for members.

Funny enough, each of our answers showed how we focus on what’s relevant to us (and our department):

  • Idan, CTO, found technical stuff (speed, efficiency, getting from A->B quick) to be the most important;
  • Yam, CEO/CMO, was focused on creating value for the member (how can you teach me to be a better professional?);
  • Katarina, Community Growth Manager/Contributors segment, highlighted how our suggestion process needs to be improved.

Witnessing this process helped me understand that what we think the roadmap needs is completely based on how we consume the data and our own biases. They were all right, and pointed out key facts as to what needs to be done, but it was a zoomed-in way of interpreting the survey.

To help your team look at the other parts of the picture and avoid potential disagreements, try doing what I did (in this order):

1. I took respondents’ answers to the survey’s open-ended question (asking why they had named a particular Zest weakness) and added them to one spreadsheet. Since it was an open-ended question, the results were not easy to quantify, but I gave it my best shot. I then transformed it all into an affinity diagram.

2. Next, I took the list of features Zesters would like to see (“If you could add any feature, what would it be?”) and compiled them into a list and categorized them by the pain point they expressed.

3. Then, to get to the relevant information, I filtered out the zero-value answers like ‘none’, ‘don’t have to add’, and ‘no idea why’, and separated the irrelevant answers (e.g. someone asking for a Zest billboard) into ‘Others’, which you could potentially use for inspiration some day, creating a mental backlog.

4. For my next step, I quantified everything by category and kept the answers. The answers you gather in this step are the requests and prayers of users who put in effort to help. I later used the list of requests that I had distilled from the answer set to create ideas and support them.

5. I took the rest of the data (usage, demographics, etc.) and compiled a customer persona (and had some fun with it.)

For the customer persona, I narrowed the information down to the simplest visual representation of what I thought was needed the most. To do this, I summarized every single pain revealed for my segment (consumers) and then connected each one to the existing Zest features.

For example, a month before the survey I had hypothesized that Zest consumers’ needs are a combo of added value and user experience:

Luckily, the survey answers matched my hypothesis. So, I didn’t need to create a new graphic. But if I did, it would’ve looked quite similar.

If you think about it, the steps in this process represent something of a funnel – you take lots of qualitative data, compress it, filter it out, and narrow it down into categories. Then use the information to create a customer persona, and then – a simple, digestible image.
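The filter-and-categorize part of that funnel (steps 2-4) can be sketched mechanically. The category names and keyword mapping below are hypothetical; the real sorting was done by hand, and keyword matching is only a rough stand-in for human judgment:

```python
from collections import Counter

# Hypothetical category keywords -- the real mapping was done by hand.
CATEGORIES = {
    "personalization": ["tags", "personalize", "recommend"],
    "search": ["search", "filter"],
    "mobile": ["app", "mobile"],
}
ZERO_VALUE = {"none", "no idea why", "don't have to add"}

def categorize(answer):
    """Map one open-ended answer to a pain category, or None if zero-value."""
    text = answer.lower().strip()
    if text in ZERO_VALUE:
        return None                      # filtered out entirely
    for category, keywords in CATEGORIES.items():
        if any(k in text for k in keywords):
            return category
    return "Others"                      # the 'mental backlog' bucket

answers = ["More tags please", "Search is weak", "none",
           "A mobile app", "Zest billboard"]
counts = Counter(c for c in (categorize(a) for a in answers) if c)
print(counts.most_common())
```

The quantified counts per category are what feed the persona and, eventually, the roadmap prioritization.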

Once this process is complete, it’s time to share it with your team. You can present your team the thinking behind it all by going through the five steps above, from 1 to 5.

However, I chose to:

  1. I let the team read through the open-ended answers to fully absorb what the members had to say.
  2. Then, I introduced the customer persona that squished the data into easily understandable sentences. (Be sure to ask your team for feedback, as there may be bias all over the place in this step.)
  3. After that, I showed the team the visualization of what the members need, to fully simplify everything (this visual makes a good guide later on).
  4. And lastly, I created a suggested Product Roadmap to grab features, discuss and accept/reject them, or simply draw inspiration for the future. The roadmap includes a bunch of vague and very specific features, picked from the insights of the survey.

You might ask, why a new roadmap? Well, this roadmap had one purpose – to re-prioritize the existing roadmap, adding new features and removing unnecessary ones, if needed.
Here’s what it looks like (it was created by taking inspiration from Jeff Patton’s User Story mapping):

The ones in gray have been released in various forms (Alpha, Beta, Public) 🙂

I split the roadmap into 4 parts, based on the user’s journey within Zest that we had picked up from the data. The essentials are really within the “make it or break it” parts. These main points serve as a guideline so that any features listed below them make sense.

Then, I split the features into three different releases on an abstract timeline, just because anything can happen and sticking to hard deadlines is a self-stab in the back.

The results we saw

General survey results
  • In-depth, relevant answers to open questions
  • NPS score: 65
  • 65% of total users said lack of time stops them from reading Zest more often
Open-ended question answers
  • 18% of total users indirectly asked for personalization
  • 20% of total users indirectly asked for more community interaction
  • 11% of total users directly asked for a mobile app
“Why do you think so?”
  • Contribution Process: unknown criteria, speed, notifications;
  • Search: product (not advanced enough);
  • The Content Stream: content quality, content overload
  • Chrome Extension: intrusive, makes it hard to do other things

What’s coming next?

  • Making the content smarter: personalization and tighter quality control
  • Making search smarter
  • Email newsletter
  • Mobile app
  • Improved sharing utility
  • Improved saving utility
  • Community interaction (TBA)

As for the team’s highlights: everyone is finally 100% aligned to the goal, current situation, context, and the solution. Our Asana projects are filled with tasks and we never looked happier despite being this busy. Our communication has significantly improved and meetings are now more productive, time efficient and we are better aligned across all departments. The confusion in the air has completely diminished and we feel reinvigorated.

Want to join the hyper-speed train?

As we’re moving hyper-fast to make your lemony experience better than ever, we need your help. There’s only so much a 7 lemon team can do alone. We have a bunch of things going on, and surely we seek to reward our helpers in one way or another.

Feel free to email me to chip in and help the Zesteam. 🤘🍋🤘

Neil and Eric recommending