How to Optimize Your Customer Satisfaction Surveys
Also published at projectmanagement.com
Customer satisfaction surveys are one of the most widely used feedback mechanisms. I have conducted several surveys for internal tools used by engineers at the companies I have worked at, and here I summarize my experience. While I focus on internal surveys, most of what I describe is applicable to external surveys as well.
Before starting any survey, think through the three questions—why, what and how:
1. Why are we counting?
Creating a survey, administering it, analyzing the results, and acting on them all take valuable time, and respondents must spend time as well. Without a clear “why,” it is all wasted effort. So always start with the “why.”
2. What are we counting?
The next obvious question is the “what.” Determine what you are going to count. Ensure there is no ambiguity in the attributes you plan to count.
Also determine which metric you are going to use. There are several: Net Promoter Score (NPS), Net Satisfaction Score (NSAT), Customer Satisfaction Score (CSAT), etc. In my experience, NPS is common for external surveys, where it is often just one question followed by an optional open-ended question for feedback; that might not give you a strong enough signal for internal tools. NSAT and CSAT are the metrics most commonly measured for internal tools.
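The arithmetic behind these metrics is simple. The sketch below uses the standard NPS and CSAT formulas (percentage of promoters minus percentage of detractors on a 0-10 scale; share of 4s and 5s on a 1-5 scale). The response lists are hypothetical, and NSAT definitions vary by organization, so check your own organization's formula before reusing this.

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    from a 0-10 'how likely are you to recommend this tool?' question."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def csat(scores, satisfied_threshold=4):
    """CSAT: share of respondents answering 4 or 5 on a 1-5
    satisfaction scale."""
    satisfied = sum(1 for s in scores if s >= satisfied_threshold)
    return 100 * satisfied / len(scores)

# Hypothetical responses from eight respondents.
recommend_scores = [10, 9, 8, 7, 10, 6, 9, 3]   # 0-10 scale
satisfaction_scores = [5, 4, 4, 3, 5, 2, 4, 5]  # 1-5 scale
print(f"NPS:  {nps(recommend_scores):.0f}")     # NPS:  25
print(f"CSAT: {csat(satisfaction_scores):.0f}%")  # CSAT: 75%
```

Note that NPS can be negative (more detractors than promoters), while CSAT is always between 0 and 100.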
3. How are we counting?
To eliminate any biases or fallacies, we need to determine how we are going to count. Here are some sub-questions to think about:
- How many people are we going to survey? This is to make sure we have a statistically significant sample size before we draw conclusions, rather than generalizing from a handful of responses.
- Do we have a representative sample? We need to make sure the survey covers the different personas that use the tool. Example: if the tool is a reporting tool, executives, engineers, and researchers might be some of the personas involved.
- Are the definitions clear? This is to ensure that respondents do not interpret terms differently. If you use any abbreviations or acronyms, spell out what they mean in the survey.
- How are the questions framed? Framing will impact the responses. Keep the following in mind:
- Pseudo opinions – People will offer an opinion even when they do not actually have one. To prevent this, include options like “Don’t know enough to say” or “Don’t know.”
- Answer sets – Open answer sets capture respondents’ spontaneous perceptions; closed answer sets suggest options the respondent might not have thought of, get higher completion rates, and can produce more extreme answers. Ensure the survey mixes closed and open questions.
- Response scales – The scale you choose will skew the data. Example: if you want to know how often users use the tool, the answer set could be daily/weekly/monthly, or once/twice/three times a week; each captures a different granularity. Think through which scale makes more sense for your question.
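For the first sub-question above, the standard sample-size formula for estimating a proportion gives a quick sanity check. The 95% confidence level, 5% margin of error, and user population of 500 below are hypothetical defaults; the finite-population correction matters for internal tools because the user base is small.

```python
import math

def required_sample_size(margin_of_error=0.05, confidence_z=1.96,
                         p=0.5, population=None):
    """Sample size for estimating a proportion:
    n0 = z^2 * p * (1 - p) / e^2, where z is the z-score for the
    desired confidence level (1.96 for 95%) and p = 0.5 is the
    most conservative assumption. An optional finite-population
    correction shrinks n0 when the total user base is small."""
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# Hypothetical internal tool with 500 users, targeting a 5% margin
# of error at 95% confidence.
print(required_sample_size(population=500))  # 218
```

Note how the correction helps: an unlimited population would need 385 responses, but 500 total users need only 218.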
Here are some dos and don’ts to keep in mind when creating a survey:
Dos:
- For every question you want to include in the survey, think about what you are going to do with the responses.
- Keep the number of questions to the absolute minimum.
- Anonymous surveys encourage candid responses; the drawback is that if you have follow-up questions, you will not know who submitted the feedback. My recommendation is to go with non-anonymous surveys for internal tools.
- Always follow up on the feedback coming out of a survey and publish the results. Let the respondents know how the results have been used; this encourages them to respond the next time.
- Be mindful of the number of times you send out a survey and carefully choose the cadence. I have seen quarterly, half-yearly and yearly cadences. Choose the one that gives you enough time to act on the feedback.
Don’ts:
- Do not ignore survey fatigue. It is real, particularly for internal surveys.
- Do not use a survey if there are other ways to get meaningful feedback.
- If you are not going to use the responses to a survey question in any meaningful way, do not include that question in the survey.