Improving customer experience should be an iterative, seasonal process that frequently tests new initiatives and drives decision-making. This process consists of seven intricately linked steps. So far in this series, we’ve covered program design, project design and sample design. Now, we’ll turn to the fourth step: survey design.

The most important question when authoring a survey deployed as part of your CX program is an internal one:

“What am I really asking?”

While a customer will never see this question, they will instinctively know when it was not asked, and the health of the program will suffer over time. Neither compliance nor methodology is likely to impress your customers in the long run. A clear outcome or benefit will. A conversation focused on benefits that results in a visible, positive change for the respondent can drive engagement, goodwill and revenue.

A Question to Guide Your Customer Survey Design

Customer research is no longer a novel experience. In the worst-case scenario, it is a numbers game where metrics become detached from reality. Optimally, it amplifies each customer’s voice and results in changes made by and for the customer base.

In order to use your CX program as a customer engagement tool, each survey goal and each survey item should answer the question: “What am I really asking?” This question appears to be straightforward, but it is not. Some important benefits of asking this question to evaluate each survey item include:

  1. Careful Planning: Definitions matter. They matter even more when speaking to 384 people at once. Taking the time to operationalize your concepts is paramount. The question that contains the dependent variable must be carefully planned and considered in context. Each subsequent question a CX practitioner creates will drive insights based on the depth of that consideration (or lack thereof).
  2. Clarity: The unspoken words in a question and the assumptions made about the answer can substantially influence the aggregate answers and the actionability of those results. The human brain will find patterns and fill in gaps naturally. Your job is to state a question in a way that is unambiguous and answerable so that a respondent does not have to think. (Note that this improves data quality while respecting the time of the customer!)
  3. Style: The words you do use (especially when they are commonly used) can have the same significant influence. Respondents have varied experiences, opinions, backgrounds and interests. Surveys require universality. Pretest with a diverse group to make sure they all read your words the same way.
  4. Perspective: Every question is in equal parts simple and complex, depending on who is reading it. A practitioner may ask a question a certain way simply because that is how they have always asked it, assuming the listener understands all the accumulated nuance. The asker and the respondent may interpret both the question and its answer with vastly different motivations and expectations.
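The "384 people" in point 1 is not an arbitrary number: it is the sample size produced by Cochran's standard formula for estimating a proportion at a 95% confidence level with a ±5% margin of error, covered in the sample-design installment of this series. A minimal sketch of the arithmetic (parameter names are illustrative):

```python
def required_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> float:
    """Cochran's sample-size formula for estimating a proportion:

        n = z^2 * p * (1 - p) / e^2

    z: z-score for the confidence level (1.96 for 95%)
    p: expected proportion (0.5 is the most conservative choice)
    e: acceptable margin of error (0.05 = plus or minus 5 points)
    """
    return (z ** 2) * p * (1 - p) / (e ** 2)

n = required_sample_size()
print(round(n))  # the commonly cited figure of 384 respondents
```

With a smaller margin of error or a higher confidence level, the required sample grows quickly, which is one more reason to make every question earn its place.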

Related Article: How Thoughtful Surveys Generate Valuable Customer Feedback

The Art of Asking Questions

There is an art to asking questions, and every survey author should be continually honing the skill. Context should lead your decision-making. A question generic enough to have been used anywhere is unlikely to yield actionable information that meets the needs of decision-makers. In any context, undefined concepts or too many questions asked at once drive up cost and ambiguity, and can sour perceptions of future research efforts.


Reconsider your objective in terms of what you are really asking for and why. What assumptions are you making about respondents? Is the question relevant to each individual’s experience? Is it designed for the customer’s benefit or for yours? What do you need to know? What is “nice to have” information that goes unexamined? What is a client willing to change? What are you willing to change? What will be the business impact of those changes?

Many practitioners are dismissive of the influence of survey design on a CX program, usually because it seems so straightforward: “Ask a question, get an answer.” Yet we tend to be too cautious about changing the words in questions. If the story is always the same and the result never changes, the question is a security blanket that provides no real value for you or your customers. It is far better to change a question to improve its quality than to keep an unproductive one for fear of recalibrating your results.

A fundamental question about communication is, “Do we really ever hear what another person is saying?” Communication is more difficult than it appears and is often noisy. Here are some practical suggestions about how to separate the signal from the noise in CX:

What Am I Really Asking?

  1. Clear questions are clear for everyone. It takes time to make a concept clear.
  2. Ask about measurable, evident and noticeable activities or behaviors.
  3. Make rating scales easy. Avoid long lists, confusing scales, and “don’t know” or “not applicable” options. If a customer doesn’t know, skip logic should have routed them past the question.
  4. Your survey should make sense. Logic and flow are noticeable to a respondent. Write from general to specific, or vice versa. Skips based on previous answer choices save frustration and emulate a good conversation. A preamble “break screen” can help change gears.
  5. Read your question aloud. How does it sound? 
  6. Pretest. Ask a few uninvolved individuals to answer. See if you can clarify further.

Survey design is about reducing the cognitive effort for your respondent. Organizing around this “most important question” will also have a far-reaching impact on customer engagement, response rates and material improvements to delivery of your product or service. Happy surveying!

Editor's Note: This is the fourth article in a seven-part series on customer experience design. Check back soon for the next installment.
