Do You Employ Actionability Thinking in Survey Design?
Guest Blog Post by
Annette Franz, CCXP
You’ve been running your voice of the customer (VoC) program for the last couple of years, and you’re frustrated because you can’t seem to move the needle on the customer experience.
Why are your customers still unhappy? Why is the service so bad? When will those product issues be resolved? Why don’t your customers recommend your company? Why don’t they buy again? What’s going on?
We need to get to the root cause of this situation.
I think I can safely narrow it down to this: your surveys. Don’t be offended. I’ve seen a lot of bad surveys over the last 25 years, and I can tell you that if you don’t get them right, it’ll be a challenge to improve anything. Consider this: Are you asking the right questions? Are the (right) questions you’re asking actionable? Do you know what to do with the feedback you’re getting?
To improve the customer experience, you definitely need to listen to your customers. That’s a given. You need to understand who they are, what they’re trying to do, and how well you’re helping them achieve it. But you also need to structure your survey questions so that the feedback is meaningful and actionable, so that it truly helps you understand the experience and how well you’re helping customers do what they need to do.
Data that is not actionable is just data. -Unknown
When you’re designing your surveys, are you thinking “actionable”?
What does that mean?
When you’re thinking “actionable,” you’re considering the following as you propose and design the questions:
- What will we do if this question is rated low (or high)?
- How will we act on it?
- Who owns this question?
- Who else needs this information?
- Who will act on it?
- How quickly can we make changes?
- Is this something we can actually change?
- Why are we asking this?
Asking for feedback about something you can’t change – or in such a way that you’re not sure what you need to change – is pointless. You’re wasting your customers’ time and your company’s time. If you can’t succinctly answer these “actionability” questions, then reconsider what you’re asking.
Once you’ve thought about – and clearly answered – these higher-level questions, it’s time to think about question design. How are you going to ask your survey questions to ensure that you can effect real change for the customer experience?
Here are a few survey design tips – still using our “actionable thinking” approach – to make sure you’re asking meaningful questions.
1. Don’t ask double-barreled or compound questions. If you’re not familiar with this phrase, I’ll give you an example: “How satisfied are you with the speed and quality of the solution you were given?” You’re asking about speed of the solution and quality of the solution, two very distinct things. First, this will confuse the respondent. What if speed was great but quality wasn’t, or vice versa? Next, it will frustrate whoever needs to act on it because it’s not clear whether one, the other, or both need fixing. Keep your questions to just one thought/concept.
2. Don’t ask leading or biased questions. “We know you loved our new soft drink. How much did you love it?” OK, silly example, but you get the point. Don’t bias the question wording by putting a positive or negative spin on it. Simply ask what you want to ask; don’t lead the witness.
3. Don’t ask generic, high-level questions that aren’t specific enough to drive change. For example, asking customers to “rate your overall satisfaction with our website” without additional detailed attributes about the site or without an open-ended question to understand the why behind the rating is not helpful.
4. For open-ended questions, be specific. Ask exactly what you want to know, e.g., “What can we do to ensure you rate us a 10 on overall satisfaction the next time you do business with us?” Or, “Tell us the single most important reason you recommended us to your friends.”
5. Make sure your questions are not ambiguous. Write questions clearly. If a respondent pauses and says, “What do they mean by that?” then the question is poorly constructed. Poorly constructed questions result in responses that are not actionable; nobody really knows what they mean.
6. Your question response choices and rating scales should be mutually exclusive. For example, age ranges of 18–25 and 25–34 overlap at 25, leaving respondents unsure which to pick. When response choices overlap or don’t make sense, they become meaningless, and meaningless responses are not actionable.
7. Do your homework. Make sure you provide a complete list of response choices. I hate when the one answer that should be there is missing. Be sure to provide an “Other (please specify)” when appropriate. Not offering this latter option either forces people to skip the question or to select something that may not be accurate – and that’s not actionable; it’s misleading.
8. Only ask questions that are relevant to that customer and their experience – don’t mix in a bunch of marketing research questions or other nice-to-knows. Those questions are out of context and not relevant to what you’re trying to achieve, which means they aren’t actionable for your cause.
9. For future question ideas, review verbatims for emerging and actionable topics. These verbatims are a rich source of information, for a variety of reasons!
If you really want to improve the customer experience, then you need to start with good data! You need to ask the right questions: relevant, meaningful, and actionable questions. Then analyze to identify the key drivers and next best actions. And don’t forget to act!
Insight alone does not cause change. Change requires action. -Lolly Daskal