Today's website success metrics rarely include the quality of the site experience as a distinct concept, and according to a recent report from Forrester, that’s a big mistake: "Without dedicated customer experience metrics, companies can’t tell whether the site experience actually got better or how changes in the quality of that experience affected the site’s business performance." Here's a summary of three practices you can adopt to fill this gap:
Forrester says that, at a minimum, web customer experience professionals should track what customers thought of the following:
- The overall visit experience: Overall satisfaction is the most common metric, and it can be gauged through signal- or emotion-based surveys.
- The basics: Customers tend to focus on three basic things when evaluating a website: usefulness, ease of use, and enjoyability. Metrics should measure these criteria with completion rates and survey questions.
- Other criteria for a “good” experience: "Companies should use observational research to figure out what else their customers need to deem a site visit “good” and include at least one metric for each of those criteria," says Forrester. "For example, visitors to a financial services site might expect it to be secure and up-to-date. In that case, customer experience pros would need a way to measure whether or not visitors felt as though the site lived up to those descriptions."
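As a minimal sketch of how these perception metrics might be rolled up, the snippet below averages overall satisfaction and the three "basics" from post-visit survey responses. The field names, the 1–5 rating scale, and the sample data are all assumptions for illustration, not anything specified in the report:

```python
# Roll up post-visit survey scores for overall satisfaction and the "basics".
# Field names and the 1-5 scale are hypothetical.
responses = [
    {"overall": 4, "useful": 5, "easy": 4, "enjoyable": 3},
    {"overall": 2, "useful": 3, "easy": 2, "enjoyable": 2},
    {"overall": 5, "useful": 5, "easy": 5, "enjoyable": 4},
]

def mean_score(responses, key):
    """Average rating for one survey question across all responses."""
    return sum(r[key] for r in responses) / len(responses)

for key in ("overall", "useful", "easy", "enjoyable"):
    print(f"{key}: {mean_score(responses, key):.2f}")
```

Tracked over time, averages like these show whether the perceived experience is actually improving.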
Even if a complaint hasn't been made, today's web analytics tools are advanced enough to reveal when an experience or visit isn't as good as it could have been. Forrester says recognizing these tiny red flags is essential to improving experiences, and can be done by measuring how often customers:
- Complete processes: Check in with multistep processes, like a shopping cart or an online application. Track the percentage of people who start the process and actually finish it. Low start-to-finish completion rates are a sign that you could be asking too much of your visitors.
- Have technical problems: Behavioral analytics systems often record errors that may not keep visitors from doing what they came to do but are still signs of distress (404s, 500s). Counting the number of technical errors, as well as the percentage of visits during which there was a problem, and making sure these numbers decrease over time is an obvious but healthy practice.
- Make data entry errors: Measure how often visitors make formatting mistakes, and look for where the patterns are. Detecting problematic fields is useful, as they may need clearer labels or contextual help to make them easier to fill out.
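The first two "red flag" metrics above can be sketched from a simplified event log. The visit IDs, event names, and log shape here are hypothetical, just to show the start-to-finish completion rate and the share of visits with a technical error:

```python
# Derive "red flag" metrics from a simplified event log.
# Visit IDs and event names are assumptions for illustration.
events = [
    {"visit": "v1", "event": "checkout_start"},
    {"visit": "v1", "event": "checkout_complete"},
    {"visit": "v2", "event": "checkout_start"},
    {"visit": "v2", "event": "error_404"},
    {"visit": "v3", "event": "checkout_start"},
]

def completion_rate(events, start, finish):
    """Share of visits that started a multistep process and also finished it."""
    started = {e["visit"] for e in events if e["event"] == start}
    finished = {e["visit"] for e in events if e["event"] == finish}
    return len(started & finished) / len(started) if started else 0.0

def error_visit_share(events):
    """Share of visits during which at least one technical error occurred."""
    visits = {e["visit"] for e in events}
    errored = {e["visit"] for e in events if e["event"].startswith("error_")}
    return len(errored) / len(visits) if visits else 0.0

print(completion_rate(events, "checkout_start", "checkout_complete"))
print(error_visit_share(events))
```

A real analytics pipeline would compute the same ratios over millions of events, but the arithmetic is the same: abandoned starts and error-tainted visits, each as a fraction of the whole.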
Post-Visit Behavior
Forrester claims knowing the quality of the website customer experience is only half the battle. Professionals should also be able to connect the dots between experience quality and the site’s value to the business, which requires tracking data such as:
- What visitors did after their visit: Prove that visitors who have better experiences do more of the things that benefit the firm. This can be accomplished by using a unique member ID or customer number that links survey responses to records of future behavior in point-of-sale, customer relationship management, or other back-end systems.
- Their resulting intentions: If it’s too hard to track actual behavior, asking customers to report what they intend to do can be a useful substitute. Measures such as likelihood to recommend a site, likelihood to purchase from the site again, or likelihood to return to the site are among the most common.
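The linking step described above is essentially a join on a shared customer ID. As a sketch (the data shapes and score scale are hypothetical; in practice the inputs would come from a survey tool and a point-of-sale or CRM system), post-visit spend can be grouped by the satisfaction score each customer reported:

```python
# Link survey scores to later behavior via a shared customer ID.
# Data shapes are assumptions for illustration.
surveys = {"c1": 5, "c2": 2, "c3": 4}                    # customer_id -> satisfaction score
purchases = [("c1", 120.0), ("c1", 30.0), ("c3", 60.0)]  # later (customer_id, amount) records

def spend_by_satisfaction(surveys, purchases):
    """Total post-visit spend grouped by reported satisfaction score."""
    totals = {}
    for customer, amount in purchases:
        if customer in surveys:  # only customers we can link back to a survey
            score = surveys[customer]
            totals[score] = totals.get(score, 0.0) + amount
    return totals

print(spend_by_satisfaction(surveys, purchases))  # {5: 150.0, 4: 60.0}
```

If higher scores consistently line up with higher spend, repeat visits, or referrals, that's the experience-to-business-value connection the report is asking for.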
In the end, it comes down to common sense and leaving no stone unturned. Forrester's report, What Are The Right Web Customer Experience Success Metrics?, can be found in full here.