Here's the dilemma: We know that great customer experience is tied to speed. Historically, we've looked to the content management system (CMS) to provide that speed.
When it doesn’t, we diagnose the causes in terms of the CMS.
We note that content can become too large (a fact that endangers my own welfare), that transactions can become too slow, and that design can become too haphazard. It seems like a simple enough proposition that the designers of a website should take the necessary steps to make things faster and easier for the customer.
When that doesn’t happen, we attach an application performance monitor (APM) to our websites, and we measure the traffic flow, transaction rates and display times. We create hard numbers.
Surely those numbers will communicate the message, and somebody will finally do something.
Meanwhile, under our feet, the foundations of our websites are being renovated. Web applications are moving to the cloud, and we're migrating the databases that contain our customers’ personal and private data from data warehouses to big data stores.
The core components of our web apps are moving — or have already moved — off the browser on the customer end and onto the server on our end.
And “our end” may have even moved elsewhere, outside of our own data centers and onto those of cloud service providers.
All of a sudden, those hard numbers are communicating the wrong message to the wrong people.
“There are multiple aspects of defining the technical quality associated with the delivery of that experience,” explained Aaron Rudger, senior director of product marketing for APM maker Dynatrace, “that live in the IT organization, that they use to help guide them in terms of investment, process, remediation and service management decisions.
“All of those tend to be very arcane to the IT discipline. And that’s what ends up creating this disconnect.”
Reconciling Marketing, IT Perceptions
In a very candid discussion with CMSWire, Rudger acknowledged that this “disconnect,” the gap between the marketing division’s perception of performance and IT’s, is getting worse. In many organizations, the outcome is two sets of largely incompatible key performance indicators. Progress, ironically, is the cause.
“Because the two hemispheres tend not to focus on sharing both sets of KPIs — the ones that relate to the attainment and achievement of certain digital outcomes,” continued Rudger, “and what it takes to actually deliver the technical quality that supports the delivery of those digital outcomes, that’s how we end up having this disconnect.”
It would be wrong to say IT has ended up measuring the wrong things, he believes. The business goals that IT aims to serve continue to be laudable. But the challenge today for the APM field, including Dynatrace, as he articulates it, is to rephrase performance in modern, applicable terms that serve the same goals.
Consider Shipping and Handling
It’s outside the scope of this publication (thankfully, as my editor here might interject) to explain the technology shift that led to this disconnect in technical terms. (Heaven knows I’ve tried.) So forgive me if I resort to an analogy that makes the basic point.
I’ll use 1970s technology for this analogy. Suppose our customer service mission were to help consumers produce more effective family budgets. We’ve built our reputation on shipping our customers the best desktop calculators for free. We provide them with the most explicit instructions on how to use those calculators, by way of that most competent of services, the post office. And we demonstrate those calculators on a regular TV show on public broadcasting.
We’re measuring our effectiveness in terms of the speed of postal delivery. Yet let’s say that, in just five years’ time, the following technical advances happened:
- Instead of a one-way TV show, we could now communicate with our customers using live, two-way video.
- So we don’t have to ship our customers free calculators any more. A customer can tell the operator what she needs, and the results of her calculations appear on the screen. She still sees her calculator, but it’s at the other end of the discussion.
- The customer can share her appreciation (or her disgust) directly with the operator, instead of filling out a form and mailing it back — so we have immediate customer sentiment analysis.
'Customer Transactions' Have Evolved
The entire meaning of “customer transaction” has changed in this hypothetical analogy. The extent to which it has changed in reality, for modern websites, is at least this broad.
So here’s the problem: Measuring our real-world digital transactions the way many organizations continue to do today is the equivalent of expecting the post office to speed up our interactions with customers.
“It’s a big problem. We’ve been thinking a lot about how we start decomposing it,” said Rudger, “so that customers can start moving toward a better, more holistic, integrated relationship.”
As this series continues, we’ll explore how that decomposition and the subsequent re-composition might work.
For More Information:
- Website Obesity is Eating Your Business by David Hsieh
- How are We Coping with Technology in the Digital Workplace? by David Lavenda
- Can a ‘Customer Performance Index’ Be for Real? by Scott Fulton
Title image: DX Service ad (circa late 1960s)