It was an interesting week of customer experiences in the fast food world.
A Hot and Cold Customer Experience
On the way to a concert Sunday evening, my wife and I stopped to grab a quick meal at one of our regular Tex-Mex fast food restaurants. After patiently waiting in line, we got to the front and the server asked, “Can you wait a few minutes while I fill these online orders?”
I was not impressed that they gave priority to online customers over those who had taken the time and effort to be in the store, so I tweeted my displeasure. By the time we were in our seats at the concert an hour later, I had received an apologetic response from the company’s social media team and a promise to follow up.
The follow-up was an email on Monday with a generic “please rate your experience” survey attached. The good feeling created by the social media team’s quick response was undermined by the apparent disconnect with the customer service process.
Bland Food, Seasoned Service
Contrast that with a few evenings later, when I ordered delivery from another local fast food chain. They emailed me when the order left the restaurant with a link to an interactive app where I could track the driver’s progress toward our house.
The food arrived ahead of schedule, hot and well presented, with a printout on each container detailing the individual order. But it wasn’t as tasty as we felt it should be based on our in-store experience: none of the food had enough of the sauces that give the chain its distinct flavor.
The following day I had a survey call from them too. Not a generic email, not even a robo-call automated survey, but an actual person who listened. When I mentioned the lack of flavor she said she’d pass it on. An hour later I had a call from the manager of the local restaurant asking for details of why we weren’t 100 percent satisfied with the taste of our meals.
Guess which chain will get our business next time we want some fast food?
Suffering From Survey Fatigue
There’s a well-worn saying that we have two ears and one mouth, so we should listen twice as much as we speak. Too many companies use surveys to try to measure the customer experience, and in doing so claim they are listening to, and capturing, the voice of the customer. But the truth is that these surveys don’t really work.
We are all suffering from survey fatigue. Every single store on a recent shopping trip asked us to fill in a survey by going to a URL the sales associate helpfully circled on the receipt. It sometimes feels as if you can’t undertake any retail transaction these days without being surveyed.
And how many of these do you fill in, or respond to? I’ve seen and heard industry statistics that suggest that up to 90 percent of customer experience surveys are ignored. Why?
Some of the most common reasons for the low response rates include:
- Too many surveys from too many sources
- Surveys sent long after the customer’s interaction
- Expecting the customer to initiate the action (e.g., “Please go to this website and let us know what you thought”)
- Customers don’t see any changes as a result of the feedback they give, so they stop giving it
The last point speaks to the heart of the problem: companies are collecting data, but they are not listening to what the customer is saying.
A survey on voice-of-the-customer programs revealed that:
- 75 percent of companies only collect or analyze data without deriving actionable insights from it
- 46 percent only collect data without analyzing it or doing anything relevant with it
- 23 percent collaborate around this data with other groups
- Only 2 percent transform their business using the collected data and the insights derived from it
The 'What' and 'Why' of Customer Experience
How can you change that? First, by acknowledging that there are two distinct types of questions you can ask to measure the customer experience: “What?” and “Why?”
"What" questions may be the best way to capture data. They pose questions such as: What is the level of customer satisfaction? What is the likelihood to recommend? By collecting answers to these and similar questions, surveys can provide answers related to why satisfaction ratings are high or low, but are often without context.
"Why" questions provide context and sentiment: Why is this customer calling? Why is that customer pleased or upset? Why, exactly, does this customer want to return, cancel or upgrade? Interaction data includes customers’ open cases, phone calls, helpdesk tickets, sales orders or any other customer interaction information that gets recorded and tracked.
In order to derive the most meaningful insights from collected data, companies need not only to understand the “what” and the “why” of customer interactions, but must also be able to correlate the two. Customer experience managers need to take a holistic approach and consider both the feedback and interaction data as one unified data set.
By taking this holistic view of measuring the customer experience, it becomes easier to identify and plan actionable changes. A customer experience manager can start from a single customer’s mention of canceling service in a call recording and roll up from there to view survey scores, Net Promoter Score responses and other related feedback.
The magnitude of the problem can then be assessed by rolling up to satisfaction levels for the particular features of the product or service in question. If the issue is compelling enough to merit a response beyond a reply to the individual customer, defining an action plan becomes straightforward.
The Start of a Relationship
When customers know they are being listened to, and that their feedback is producing measurable results and changes, they are more likely to keep responding and to develop an ongoing relationship.
After all, listening is the key to any successful relationship.