View all the CX Decoded podcast episodes.
Sean Albertson, head of client experience measurement and analytics with Charles Schwab, began his CX journey on the front lines in contact centers in the mid-1990s. He managed service teams, quality and training programs, and created knowledge management tools early in his career.
Moving into this century, Albertson began to place a focus on surveys, analytics and market research. He continued to expand his influence through the early 2000s to improve customers’ experiences in roles across marketing, product and pricing, and even spent some time in IT translating CX concepts into technology and digital solutions.
Today, he focuses on developing CX programs that balance quantitative and qualitative research to identify and execute CX solutions across all facets of the organizations he supports. He's leveraging advanced language and journey analytics to reach well beyond survey results to conduct deep analysis of all experiences at both the physical and emotional level.
Albertson caught up with CMSWire Editor-in-Chief Rich Hein and Managing Editor Dom Nicastro of CMSWire's CX Decoded to discuss his approach to measuring CX success.
Note: This transcript has been edited for space and clarity.
Rich Hein: Welcome once again to another edition of CX Decoded. As always, I'm joined by my co-host and colleague, CMSWire Managing Editor Dom Nicastro. How are you doing today?
Dom Nicastro: I'm doing fine, Rich. I hope all is well. Great to be here again.
Let's waste no time. We don't have a lot of preamble here. We want to get right into information on our guest, rapid fire. Who do we have on today, Rich?
Rich: It is Sean Albertson, head of client experience measurement and analytics with Charles Schwab.
Dom: Excellent. Why is he here today?
Rich: He's going to discuss with us how he and his CX teams measure success, Dom. Sean co-leads Schwab's Client Research Center of Excellence. These teams integrate Schwab's Net Promoter Score program and the Customer Effort Score his team initiated in 2019, along with other survey insights, marketing research, brand tracking and industry research, and they use all this to form a holistic understanding of clients, their needs and their experiences.
In addition, leveraging advanced language and journey analytics, these teams reach beyond survey results to conduct deep analysis of all experiences at both the physical and emotional level. It's pretty interesting stuff.
Dom: It is, and that's actually so similar, Rich, now that you read it out and I listened, to our last guest, Luis from American Express. These CX practitioners are in the trenches and really want to go beyond surveys. So I am excited for this. Sean, welcome to CX Decoded.
Sean Albertson: Hey, thanks for having me. I really look forward to talking with you guys today. And I think this is an exciting time for us in the CX space and love to share what Schwab's doing.
Dom: It absolutely is. So happy to have you on. We know a little bit about you, but our listeners might not. One thing we know for sure is that you were one of CMSWire's CX Leaders of the Year, the inaugural program that we highlighted last fall at our event, the CMSWire DX Summit.
Hey, congratulations on that one, Sean.
Sean: Absolutely, really appreciated that, both the nomination and recognition. And absolutely, I think we are super excited about what we're doing and want to share it. But gosh, I was humbled by the nomination. So really appreciate it.
Rich: Yeah, I got an opportunity to read all of the award nominations, and yours really stuck out and I knew as soon as I read it, we wanted to get you in here. It's taken us a while to do it. We had to jump through some hoops. But we're here today. And we're super excited.
Related Article: Celebrating the First Customer Experience Leader of the Year
Lessons from Early Days of Contact Centers
Dom: Yeah, that was a lot of fun, going through those nominations. We got story ideas, Rich, we got podcast guests. Here he is, Sean, right? Case in point.
So all right, Sean, let's get into it. We know you began on the front lines in contact centers in the mid-'90s, managing service teams, quality and training programs. You've told us before you have a passion for enabling great customer service for your employers and employees.
Tell us about those early days of contact centers, and how they differ from the challenges of 2022. Any lessons still applicable today?
Sean: Yeah, absolutely. Well, as you mentioned, I started very early on in customer service and the call center — you know, that was almost pre-"contact center," right? It was all about phones, because that's really what we had back then, maybe storefront, but really it was about call center support. That's how we took care of customers, for most of us at the time. The concept of customer experience itself was still in its infancy. We weren't really measuring a lot of things, definitely not in any sort of holistic way around experience. Our focus was, we want to answer the phone, we want to keep handle time low, and we want to train agents on process and procedure.
Well, ultimately, those basics — treat our customers right, resolve their issues and do it fast — are still alive today. There's nothing really wrong with that. That is what continues to drive the work.
But a lot has changed, right? I think, first off, we've added a lot more channels to the mix of our client contacts. A recent Microsoft study, for instance, found that 66% of customers use at least three different contact methods with their businesses for customer service. With digital investments in websites, apps and social media, those now have to work alongside the traditional channels of phone, chat and text messaging. Add to that AI tools like chatbots and other kinds of self-service options, and this experience environment has gotten a lot more complex.
So not only do we have a lot more channels, but we also now measure everything, right? I mean, data is everywhere. There are hundreds of different experience metrics you can deploy to analyze your experience across these contact channels. We have more today than we can probably do anything with. And with all that data, sometimes it's harder to find the right data to really identify the improvement projects you need to do.
And then lastly, it's also about this modern acceptance that everyone in an organization owns some part of the customer experience. Earlier in my career, it was kind of one group or another that really drove the experience. Across my career alone, I've always focused on studying the customer experience and working to implement improvements. But in doing that, I've had roles, like I said, in customer service, in marketing, field operations, and product and pricing. I even did a stint in IT.
But ultimately, it's really that modern understanding that experience is driven at some level by everyone. So not only do you have to work across all the channels, you've got to work across your entire organization to be successful.
Related Article: Call Centers vs. Contact Centers: Understanding the Key Differences
How Teams Split Customer Experience Mandates
Rich: So you said that everybody in your organization is kind of responsible for CX. I'm curious to know, is there a person or a team within your organization where the rubber meets the road?
Sean: I would look at that and say that's probably our team, as the hub, if you will — hub and spoke. We are the group that brings together all those different experience teams. Think of them more as the action-oriented teams. And then we're the researchers that bring them all together to say, here's what we need to focus on, here's what's for you in this channel, here's what's for you in that channel, here's how you can work together.
We're kind of a hub that helps that dialogue be successful. But ultimately, each group takes that knowledge, that analysis, and runs with it based on what's important and specific to them via their channel or business unit.
Dom: Excellent. Thanks for giving us the lay of the land there, Sean.
What's the North Star of CX Metrics?
You guys have won J.D. Power awards and other recognition for customer experience. But at the end of the day, your team is responsible for measuring that success with particular CX metrics. So is there that one North Star metric? Or are there a lot of metrics across the board here?
Sean: We use a little bit of a blend. We do look at a lot of things, but we really focus on a few things. I've been at Schwab for four years now, so I wasn't there from the beginning, but I'm really excited about being able to build on such a great history, as you said.
Schwab was recognized in 2011 in Fred Reichheld's second book about Net Promoter Score, The Ultimate Question 2.0. Schwab was a very early adopter of NPS and worked closely with Bain's research organization back in the beginning. And ultimately, our story is we increased our Net Promoter Score by 60 points from 2004 to 2011. Huge improvements there, and we're 20 points higher today.
So ultimately, although there have been a lot of public articles and talk about the demise of Net Promoter Score as a measure, for Schwab, it really works. We've got not only that history, but we can literally show the value of it today. For instance, we can statistically prove that our Net Promoter Score predicts actual loyalty behavior by our clients. We know that if someone is a deep detractor in Net Promoter Score, they're four times more likely to attrite and/or four times more likely to have negative net assets — to reduce their assets.
So again, we know there's a lot of that focus on Net Promoter Score, and a lot of history there. The organization did a ton of things to improve those scores over the years, from small ones like price reductions to major transformation initiatives like digital transformation. But we've had that constant focus on doing what's right for the clients, increasing their loyalty and helping them achieve better results. In fact, from Charles Schwab himself — Chuck — the "Through Clients' Eyes" strategy is paramount to what we do at Schwab. That really says we look through the clients' eyes in the decisions we make, many of those things being measured through Net Promoter Score, etc.
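For readers keeping score at home, the NPS math Albertson references is simple arithmetic: the percentage of promoters (9-10 on the 0-10 likely-to-recommend question) minus the percentage of detractors (0-6). A minimal sketch, illustrative only and not Schwab's implementation:

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 likelihood-to-recommend ratings.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    toward the total but neither add nor subtract.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives, 2 detractors -> NPS of 30
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 3, 6]))
```

On this scale, the kind of 60-point gain described above means shifting a large share of respondents from the detractor bucket into the promoter bucket.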
Related Article: What Is the Net Promoter Score?
Launching the Customer Effort Score
However, people talk about the negatives of Net Promoter Score, and sometimes we struggle to get to actionable insights. It's a relationship metric: likely to recommend. Well, in 2019, as was mentioned in the preview, we launched the Customer Effort Score, CES — the one presented by Matt Dixon in The Effortless Experience. We did find that the ability to measure whether a transaction was easy or hard was key to those actionable insights. In fact, what we found was if a client said it was hard for me to do something, they're four times more likely to be a detractor, and therefore four times more likely to have disloyal types of behaviors. That's almost right out of the book. We were able to replicate that in our industry and in our business: when we make things hard, we create disloyalty, and the Net Promoter Score and the actual physical results come to follow.
So there's nothing more actionable for our teams than to rally around that experience and focus on making things easy. That really has enabled us so well. We measure a lot of different things and we use them in different ways, but for us the two cornerstones are the Net Promoter Score — really that relationship, that loyalty measure — and the Customer Effort Score, to really understand, physically, how we're impacting that relationship in the day-to-day activities.
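The "easy score" framing — the share of clients who rated a transaction easy — can be computed per channel from transactional survey responses. A toy sketch; the 1-5 effort scale, field shapes and channel names here are assumptions for illustration, not Schwab's survey design:

```python
from collections import defaultdict

def easy_score_by_channel(responses):
    """Percent of interactions rated 'easy' (4-5 on a 1-5 effort
    scale), grouped by contact channel."""
    totals = defaultdict(int)
    easy = defaultdict(int)
    for channel, rating in responses:
        totals[channel] += 1
        if rating >= 4:
            easy[channel] += 1
    return {ch: 100.0 * easy[ch] / totals[ch] for ch in totals}

responses = [
    ("online", 5), ("online", 4), ("online", 4), ("online", 2),
    ("phone", 4), ("phone", 3), ("phone", 2), ("phone", 5),
]
print(easy_score_by_channel(responses))  # online: 75.0, phone: 50.0
```

Tracking the score per channel is what makes comparisons like "digital runs about five points easier than the phone channel" possible.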
Rich: I know you can't get too specific, but can you share some real-world examples of ways organizations can improve NPS and Customer Effort Score?
Sean: I mean, the devil's in the details, right? So think about client effort score and the ability to really understand the experiences you create as an organization. One, you've got to be able to measure it first, right? For instance, we measure across all the different channels. And then ultimately, as we study that content, as we study the results of the client easy score metrics, we understand what those issues are. In fact, there's a whole lot we'll continue to talk about as we go, but it's really about understanding and driving the measurement.
So ultimately, we have a lot of people at Schwab that call us, and, you know, especially the older clients, they still like to engage with that human connection. We're also seeing accelerated engagement across digital. We had about 3.6 billion retail digital logins in 2021, up 33% year over year, and 90% of our retail accounts were opened digitally. So we're starting to see that digital transformation take place, which in many parts was driven by COVID. Got to bring it up, because we have to. But yeah, a lot of that digital focus.
But CES has really allowed us to measure across those channels, and then ultimately measure across what clients are trying to do, to identify what was easy, and many of those tasks can really be understood. So a good example — again, to your point, I can't get to the real specifics — but we know if someone goes online and gets their issue resolved, they have what I'll just say is a really high easy score, a high percentage of easy activities, because they were able to successfully complete their task online.
However, if they call us directly, they're going to be about five points lower. So already you can see, by channel, digital is easier than the phone channel. Now, there are reasons for that: digitally, I do a lot more self-service things online, maybe more task-oriented, transaction-oriented items, and when I call, maybe that does involve a little more challenge. But what was really interesting was understanding and seeing the results that said, if I was online and then I called, I was a full 10 points lower in overall effort. We would see that. That goes right back to channel switching: the more channels I use, the more it takes away from my client easy score — my customer easy score, or effort score.
So ultimately, just like in Matt's book, The Effortless Experience, we're able to look at repeat contacts and channel switching, those things, at the activity level, to really understand where do I need to make improvements. Taking the day-to-day research, the understanding of what clients are doing, and really driving that into client easy score, in our view, gives us the opportunity to understand where we need to make those improvements, and what the things are that are, quote unquote, broken in our customers' minds.
Related Article: How Do You Make Customer Effort Score Data Actionable?
Adapting Customer Experience in a Post-Pandemic World
Dom: Well, Sean, you did say the C word, so the obligatory COVID question is now coming. It's an important question, though; we ask it over and over again. But we really need to help our listeners understand how everyone's made pivots with their CX teams. So how would you compare collaboration with your colleagues, your CX team, now, post-2020, versus before? What kinds of things have you learned about how you guys work together and go about CX in a post-pandemic world?
Sean: I think it really goes to that understanding, back even to the earlier introduction, that our channels have to work together. Once the pandemic hit — more people at home, working from home, etc., and, for instance, our branches not open for a good year plus — it became much more common to ask, all right, what's the online solution? We had already started our digital transformation a year-plus before the pandemic, but man, talk about ramping up that work.
What changed for us from a CX perspective wasn't just how much more important the digital channel was becoming because of COVID. It was almost more important to understand how those channels came together. A lot of our work traditionally was, we worked with the call center about their issues, we worked with the digital team about their issues. But during this whole evolution of service models, it became much more focused on omnichannel support, where our clients felt their journey was taking them across multiple channels and multiple events.
It was really understanding it at that level. And that's really where our focus changed, from CX or even UX around how's my web page working for clients, to how does my web page impact the overall experience?
Managing CX in Multiple Customer Channels
Rich: You talked earlier about the statistic that customers typically use three channels to contact customer support. Do you find that's the case for Charles Schwab's customers? Are they reaching out to you in multiple ways?
Sean: Absolutely. And in fact, there's an unfortunate result that in many cases, they do all three at the same time. We see that a lot.
Anyone who runs call centers where the wait time is longer than you want it to be — you're not hitting your service levels, you've got volume coming in from really unscheduled and unrecognized sources. What you find a lot is that they're on hold waiting to talk to an agent, they're trying to click and talk to an agent on chat, and they're on the website trying to figure out their answer. That's actually been one of our bigger ahas during this period of time: it's literally not just I try one, then the other, then the next. In many cases, I'm trying all three at once, trying to get my service issue resolved by whoever can answer first.
The challenge with that is, not every channel is conducive to every activity. And so by default, you might be disappointed in two out of those three experiences, even if they're the first one to pick up and give you a solution.
Rich: I can tell you that in the retail industry, I have done that very same thing, where I'm on the phone waiting on customer support and I think, oh, there's a chat function, let me just jump on that chat real quick.
Dom: First one to the finish line of CX wins, you know? It's true. But great point, Sean, because it might not be the answer you're looking for. So it's interesting to see those metrics coming through — like you have a customer that was simultaneously on all these channels. It's interesting how you measure that going forward.
Related Article: Omnichannel Customer Experience: How Much Is Too Much?
Customers Simply Don't Follow a Straightforward Path
Sean: Yeah, absolutely. And you know, this is something I've been working on for 15 years in my career. So first of all, I'm in Colorado. I love Colorado, big fan of the mountains. One of our family's favorite places is Moraine Park, in Rocky Mountain National Park. We go there multiple times every summer as a family and picnic and just enjoy it. It's become one of our favorite places. Well, there's a river, the Big Thompson, that runs through the middle of it. It's one of those rivers you see that doesn't know where it's going, back and forth, left and right. At one point a couple years ago, one of my sons, my youngest, asked, well, why doesn't this river go straight? At the time, being a dad, I'm going to make something up, I'm going to sound confident. He takes the answer, and we move on.
However, that very week, I was sitting down with some of my partners, and we were doing journey mapping exercises on these processes. The guy's telling me, here's the path — a step one through five, very linear process I'd created. But as we were digging into the results, we're like, yeah, nobody follows that path. And he's like, why the heck don't the customers follow that straightforward path? Why do they bounce around? Why do they try three channels at the same time? Well, boom, that conversation with my son about the river just jumped out at me.
And I was like, wait a second. If you think about the client journey, it's in a lot of cases like those rivers: customers wander around, trying to find a solution. So I started digging into that — why is this? Everybody knows a river is going to flow wherever the least resistance is; it's going to avoid a rock and go into sand, things like that. As I really started to understand it, that got me thinking about it in the business sense. Just like that winding river, our customers face a lot of those same rocks in a business sense. That's why they bounce around.
So over the last many years, I've been focusing on a program called the ROCKS analytics program, ROCKS being an acronym for "resolving our clients' known struggles." For us at Schwab, and for myself, ROCKS is the intersection of text analytics, operational metrics, journey analytics, survey programs and management, with a sprinkle of data science on top. It really has helped us understand the experiences at an even greater level. But more importantly, it helps us understand why customers don't follow the path we gave them. Why do they bounce around? Why do they do things differently? And it really is about the rocks we create.
Rich: Also, when you think about people's lives, nothing really happens very smoothly. You're working on it at your desk, you run out of time, and next thing you know, you're like, oh, I'm in my car now, so I'll call them. There's always something going on. People just do whatever they have to do to resolve the problem. And if they don't find the answer with you, they'll find it somewhere else.
Dom: Yeah, you know what's crazy, Rich? Just today I was on with my cell phone provider trying to get an upgrade for my wife's phone. And I just lost track — I was trying to do work and talk to the chatbot at the same time, and I ended up leaving. I just totally forgot about it until right now.
Now, you would think they'd have some metrics that would trigger an action — an email follow-up, right? Or how about a text? Wouldn't it be amazing if I got a text: Hey, Dom, we missed you, what's going on? You still want to continue this talk or what? That would be special. If I got that, that would be a CX victory, I think. Huge.
Sean: Yep, absolutely. Well, that's a great example. In so many groups, especially larger organizations, your channel owners have kind of got blinders on in what they're focused on — they're focused on their piece. And ultimately, you can get a lot of great results with a lot of people individually working on their pieces and making them better.
But wow, what we've found is if you don't bring them together and really think about it in the context of the journey — my part of the greater whole — you lose that. And that's ultimately, to your point, where some of those ROCKS get created. Some of those challenges, those known struggles, if you will, are the things that get in the way of a really smooth transition, or follow-up, or whatever that might be.
And it's not because people within the business don't want to work together and find solutions. It's just that people are busy, and so by default, you focus on what you focus on. Again, a big push for us at Schwab, as I mentioned earlier, was a lot more focus on the overall experience and ultimately how the pieces fit together. The more you can do that, the better you're focused on the journey.
And ultimately, that's what's important to the customer. In your case, you're like, hey, it would have made sense to send a quick email or a follow-up that says, pick up where you left off here. Those are the types of things that definitely become the opportunity of the future as we continue to mature.
But it's hard to find those opportunities, because the data is kind of all over the place. And if you haven't prioritized one set of work over another, it may be a while before a solution like that really comes to fruition if you're not thinking about it more holistically.
Dom: Yeah, I wonder if they even have any record of that conversation, because it wasn't finished. I wonder if any data on that conversation even exists.
Sean: Yeah, absolutely.
Related Article: What's Killing Your Customer Journey? Friction
Finding the Known CX Struggles
Rich: So Sean through all this, what have you come to learn about the struggles that your customers go through? And what specifically was your organization able to change to help resolve these issues?
Sean: A lot of our focus, like I said, especially more recently, has been on this ROCKS program. And for us, it's all about the Customer Effort Score; it revolves around that. Again, our focus at Schwab is that we can drive up loyalty and reduce disloyalty by making things easy. So that's really helped us center around our primary metric.
But just knowing something's easy versus hard is not the answer; that just points you in a direction, right? So fundamentally, as we think about the ROCKS, our focus has been on finding these ROCKS, these things that get in the way, these known struggles that get created.
So again, as I mentioned earlier, there are any number of metrics we can use to evaluate the client experience. We focus at Schwab on a couple of key areas. For us, effort, or ease, is absolutely one big category of work that we measure. We also measure a ton around resolution, we measure a lot around efficiency, and we're even starting to be really successful at measuring emotion.
So as we look across all the different channels, we start thinking, how can I measure ease across channels? Of course, one way is the client easy score, but there are other things you can measure. For instance, channel switching: did I go from chat to phone? Did I go from digital to the phone? We can measure that as part of ease. We can measure digital containment — that I stay online and finish my process.
So going back to your example, where you kind of abandoned your digital session, we really focus on that, to understand that containment and understand whether we got that straight-through processing completed.
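Signals like channel switching and digital containment can be derived from an ordered log of the contacts a client made during a single task. A hypothetical sketch — the channel names and the data shape are assumptions, not Schwab's systems:

```python
def journey_signals(events):
    """Given a client's ordered contact events for one task,
    e.g. ["web", "web", "phone"], derive two ease signals:
    - switched: did the client change channels mid-task?
    - contained: did a digital-first journey finish digitally?
    """
    digital = {"web", "mobile", "chatbot"}
    switched = len(set(events)) > 1
    contained = events[0] in digital and all(e in digital for e in events)
    return {"switched": switched, "contained": contained}

print(journey_signals(["web", "web"]))    # stayed online, contained
print(journey_signals(["web", "phone"]))  # abandoned digital, called in
```

Aggregating these flags per contact type is one way the channel-switching and containment rates discussed above could be tracked alongside the survey-based easy score.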
Checking in on Repeat Calls, Predictive Resolution
For resolution, we look at repeat calls. We look at closed-loop follow-up — things like, once we say we're going to complete a ticket and do a process, did we ultimately follow up with that resolution in a timely manner? We look at predictive resolution, where we even look at whether it appears we might have resolved the issue in the conversation, using text analytics. For efficiency, we look at metrics like handle time, like everybody does. We look at time on page from a digital perspective, hold time — a lot of those efficiency metrics continue to drive things, and a lot of that is traditional.
But we also look at sentiment, we look at the emotional triggers, because what we find is, especially when there's a high-effort situation layered on top of very low sentiment — not only is the effort hard, but it's a very emotional effort — we know putting those two together is a big challenge.
So as we looked at finding solutions, it's really about bringing those pieces together. And ultimately, the other piece that we are heavily focused on is getting everybody on the same page with what our clients are trying to do.
So again, going back to channels potentially looking at their own focus, a lot of times we may not even track the same activity across the multiple channels. What we did, in a very strong sense, was create this enterprise classification for Schwab that said, hey, no matter which channel you use, if you're contacting us to do X, we're going to measure that consistently across all the channels. So whether you call, go online, chat, or use the mobile app to do the same thing, we're measuring you equally across channels so we can better understand that experience.
Knowing Why Customers Contact the Brand
And so that has given us a really good classification of why customers, or clients, primarily contact us, and then we tie that back to the metrics that we measure. It gives us a more holistic view: what is the effort among different activities, what is the transfer rate among different activities, what is the time on page or the straight-through processing from a digital perspective across these efforts. It really enables us to look at that, understand it and study it to find and pinpoint the hotspots, those pain points, those ROCKS that we need to get in front of. And that's been a great opportunity for us to rally around, and it ultimately provides a lot of the detail that we need.
Rich: It sounds like your organization has a lot of channels; you guys cover a lot of ground. So with all these different channels, can you share an example of how what you're measuring in one channel differs from another, and how it all adds up at the end? Because it seems like when you have that much information, making it actionable would be very difficult.
Related Article: How Predictive and Prescriptive Analytics Improve the Call Center Experience
Dealing With Onslaught of Customer Experience Data
Sean: Oh, absolutely. It goes back to: we've got more data today than we need, or can do anything with, so for us it's really about focusing these efforts. For instance, when I talked earlier about metrics, the first step we took was some data science and modeling to identify which metrics have the most correlation and/or causation with the client easy score, the Customer Effort Score.
So we know now that certain metrics have a greater trigger, a greater influence on overall effort, than others. That allowed us to down-select the metrics we're looking at. We're not looking at hundreds; we're looking at a handful — a handful for different purposes, in fact. So what we really focus on is measuring those most important metrics. You've got to be able to do that first off.
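That kind of down-select — ranking candidate operational metrics by how strongly they move with the effort measure — can be sketched with a simple correlation pass. The metric names and numbers below are invented for illustration; real causal analysis would go much further than a Pearson ranking:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic per-contact-type data: easy score vs. candidate drivers.
ces       = [82, 75, 68, 90, 60]   # percent of contacts rated "easy"
transfers = [5, 12, 20, 3, 28]     # transfer rate, percent
hold_time = [40, 90, 60, 30, 120]  # average hold, seconds

drivers = {"transfer_rate": transfers, "avg_hold_time": hold_time}
# Rank drivers by correlation strength so the strongest signals surface.
ranked = sorted(drivers.items(),
                key=lambda kv: abs(pearson(ces, kv[1])),
                reverse=True)
for name, vals in ranked:
    print(f"{name}: r = {pearson(ces, vals):+.2f}")
```

In this toy data both drivers correlate negatively with ease, and the transfer rate correlates more strongly, so it would survive the down-select.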
The second piece is really bringing to bear the opportunity to integrate. One of the ways we do it — and I'll try to use this explanation, because everybody uses Excel these days — think of the traditional pivot table: you've got rows and columns, you've got some filters in the upper left. What we've really done is create a self-service tool our business partners can use to very easily, very quickly identify the hotspots, the challenges they need to look at.
And the way we do that is, if you look at the column headers, those are the metrics, and if you look at the row headers, those are the different contact types. So basically, you can look across the two dozen different primary contact types and see how they stack up against each other across all those different metrics: which has higher volume versus lower volume, which has higher CES versus lower CES, which has a higher transfer rate versus a lower transfer rate. It allows you to really see that in a picture and rank those items in a way that says, let me see how different call types stack up.
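The pivot-table layout described here — contact types as rows, metrics as columns, sorted so hotspots surface — is easy to mock up. A toy version with invented contact types and numbers, not Schwab's actual tool or data:

```python
# Toy hotspot grid: contact types as rows, metrics as columns,
# mirroring the pivot-table layout described above.
grid = {
    "password reset": {"volume": 5000, "ces": 88, "transfer_pct": 4},
    "wire transfer":  {"volume": 1200, "ces": 61, "transfer_pct": 19},
    "account open":   {"volume": 3000, "ces": 74, "transfer_pct": 9},
}

metrics = ["volume", "ces", "transfer_pct"]
header = f"{'contact type':<16}" + "".join(f"{m:>14}" for m in metrics)
print(header)
# Rank rows worst-first by easy score so hotspots float to the top.
for ctype, row in sorted(grid.items(), key=lambda kv: kv[1]["ces"]):
    print(f"{ctype:<16}" + "".join(f"{row[m]:>14}" for m in metrics))
```

Sorting worst-first by CES puts the hardest contact type at the top of the view, and the neighboring columns hint at why it is hard.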
Defining Where the CX Blockers Exist
And so ultimately, as we dig into that data and understand it, we realize it's not just about something being easy or hard. It's about why it is easy or hard. You start looking at those other metrics and say, hey, it's hard, but it's primarily hard because it's got low resolution. So now you know it's not just hard, it's hard because of resolution. Maybe transfers aren't an issue, maybe there's no issue with digital containment for that specific topic, but it's hard for lack of resolution in the phone channel.
Whereas for another contact type, all those metrics within the phone channel may be really good, but where it really struggles is straight-through processing online; there's a product problem because clients aren't finishing online and are having to call as a result. So this really allows us, in one easy view, to identify by channel, by client type, by activity type and by metric where those hotspots are.
And it's not the end of it, because then we go into even deeper text analytics and deeper survey analytics to understand why. But it really pinpoints for the collective team, and the channel teams, if you will, where they need to focus: not just by the kind of activity, but in what way, looking at the metrics.
Related Article: Outlook for the CX Profession
Putting CX Consulting Into Practice Across Teams
Dom: Yeah. And speaking of your team, Sean, earlier, did I catch that you said your team creates a narrative based on the data and research and then passes it along to other teams? If that's the case, I'd love to hear how one of those teams has used the picture painted by your team to effect actual change.
Sean: Yeah, absolutely. And you're exactly right. That's what we do, we consult. We're not just a data and analytics team, we're also a consulting team. So we take all that great research and measurement and we consult with the business partners to say, here's where we need to focus. In fact, we have a cross-functional ROCKS team, with key participants from the call center, from the digital team and so on, who participate with us to look across those cross-channel ROCKS and help us as part of that champion team, if you will, on this work.
But I can't give you a detailed real example, so let me use a fictitious one from the world of retail. Let's assume that returning a product is a problem. It's one of the highest-effort situations you've got, which probably makes sense; it's a breakage, something at that level. So if you look at that product return in this heat map, you may identify some hotspots that show up. It's higher volume, and that tells you it's a really common call. A derivative of being high volume is that agents don't put clients on hold very much, probably because, again, "I handle this call every day, I don't have a lot of questions about it." So that tells me I don't need the training team to focus on that process, because everybody really understands it.
Whereas with very low-volume activities, what we find more often than not is that they really struggle with hold times, and that means there's not enough training. Agents have to ask questions, things like that. So this is where, when we work with and consult with, say, the training team, we can go in and say, hey, these are areas where hold time is high because agents don't really know how to answer the question, so they have to ask others. You might even look at how handle time is much higher; obviously that may mean the activity is more complex, or it may not.
But you can work with that business partner and really rally them around: here are the things you can control across all these different contact points and within your channel, but it's part of the whole.
You go to the process team, the team that owns the products and maybe determines the processes, and you say, hey, these certain activities have a much greater failure of resolution and a lot more repeat calls. Well, I can take that knowledge to two different groups. I can take it to the process team, saying, hey, is there a way to get this done faster? I could also take it to the digital team, saying, hey, repeat calls: there are a lot of status checks.
Where am I in this process? Well, if digital created a tool that would allow clients to track their progress, we could avoid those calls. And we've done a lot of work like that, working with different action groups within the org and the channels to really hone in on the metrics most relevant to them, across the activities they can both influence and positively control, to, again, remove those ROCKS, get rid of the ROCKS, and really focus on making things easier.
So in our consulting business, we're very active with all sorts of different groups within the org. Any one of them can have activity that comes out of this ROCKS study, and they can make that overall contribution to making things collectively easier for our clients.
Converting Gut Feel Into Data-Driven Decisions
Rich: Okay, Sean, you've mentioned ROCKS a couple of times, and I really want to get into this program and what you guys do. Why do you reference known struggles in your ROCKS acronym? And what are some of the common things you find when you go through this program?
Sean: Yeah, I think the known part goes back to the fact that there's a lot more qualitative content around experience out there, right? Again, I grew up in call centers, and I've had a lot of different roles as well, but you always know why people are calling; you ask your team in a team huddle or whatever: yeah, we've got all these people talking about these problems.
The problem you always have with that is: is that a really loud minority, or is it really a big issue? That's where the quantification is so important, because you've got to make data-driven decisions. The gut feel has been there forever. In fact, we all do that and say: I think this is probably more frustrating than other things. This is probably a challenge for our clients, a known struggle. And we take action on that. We've been doing that forever.
What we try to do now is focus on saying, well, that's great, but let me truly analyze that gut feel: is it supported by data? Is it actually happening at a higher percentage? Or is it just really loud?
Those are the types of things. So as we look at the program I outlined with ROCKS, it's all about converting what we feel and what we think is happening into more data-driven decisions. So instead of going to the business leaders and saying, hey, I think you need to focus on this because it seems to be often asked about or commented on in the surveys, it's more about: this other issue over here, you don't hear as much about it in the survey comments, but it's a lot more common, and actually, if you fix it, you're going to have a better result.
And that's how blending those pieces together, the analytics and the journey studies and the operational metrics, helps you go from that gut feel of "here's what we think we should do" to "here's what you should do, as dictated by real data," and not just the gut-feel experience.
Related Article: Getting to the Heart of Data-Driven Experience Optimization
Establishing a Common Language of CX
Dom: Yeah. And ultimately, Sean, the bottom line is the formula is insights to action, insights to action. So with ROCKS, how would you say that specifically helps you uncover those insights around your client experiences? And what are some of the high points of the program?
Sean: The first is that it's something to organize around. It goes back to: I don't care what metric you use as an organization to measure experience. Over my career, I've probably used every single one that's out there in some form: overall satisfaction, Customer Satisfaction Index, obviously NPS, client ease score, any number of metrics. It's not as much about the metric as it is about having a common dialogue and something to, quote unquote, rally around. You've got to make that connection with business partners so that everybody can do that together.
That was one of the big pieces of making sure we could rally around at Schwab: everybody was going to focus on client ease score, and that allowed us all to have that same dialogue. ROCKS is like that in many ways, too. It's a strategy and a concept that says we all share in the absolute understanding that we want to make things easier than they are today, and we want to keep doing that because we know what the result is going to be. So that's number one: get everybody talking the same language and really focused in the same way.
As we think about the continued development of the program, it then gives you a framework. There are a lot of companies where data science teams are just pulling out insights and sharing them with business partners, saying, hey, go work on this or go work on that. What we've done within ROCKS is set up a framework.
So again, that heat map, that ability to look across channels, across metrics and across activity types and really see how some things matter more than others, now has a framework around it. And that framework means the business partners can engage, they can understand the key areas of opportunity, the key known struggles, and then, using that same dialogue, that same language and that same focus, say: here's how we're going to make improvements.
And what's even more important is knowing we're going to continue to measure as we go; we can track progress. It's always challenging to say, I found an issue, I've told someone it's a problem, and it's up to them to fix it. But we can now measure those results. We don't just say, here's what you need to work on, and then ignore it. We partner as we go, with different programs and different groups around continuous improvement or digital capabilities and accelerators, things of that nature. That's where we can partner to really pinpoint the right activities, and then work with the business to make sure we're all rowing in the same direction.
The Key to Getting Started With CX
Rich: Sean, if someone is leading CX for their organization at a small or midsized company and may not have the budget to gather all the research data, what would you recommend they start with: NPS, CES? You mentioned many other metrics, and I'm just curious where you think a great place to start would be. I'd also like your advice on how larger enterprise organizations can start building these programs themselves.
Sean: So I came up with another acronym, and actually, it's exactly what you said: the key is getting STARTED. We all know that from a CX perspective it's not a sprint, it's a marathon; it's going to take a long time. In fact, it's an ultramarathon, if you will, because it's never going to end, right? You're going to constantly try to make things better, constantly try to make the experience better, and you're going to constantly face new ROCKS, new kinds of things hitting you. So the key is getting STARTED.
And so as I think about that acronym of getting STARTED, I'll just go through it, and then I'll cover each one: Surveys, Transcriptions, Associate notes, Resolution, Transactions, Efficiency and Digital. S-T-A-R-T-E-D.
So, surveys. The first is you've got to have a good survey program. Most organizations have one. What you need to make sure is that you're aligned. This goes back to: if your call center measures OSAT [overall satisfaction], but your digital team measures client ease score, and then you have something else out there being measured by a different group, that doesn't work, because you're not looking consistently across them. So the first step for CX and CX research is to make sure at least your survey programs are aligned and you're measuring the same metrics, so you can create an apples-to-apples evaluation by channel, activity, etc.
And there's a lot of great data you're going to get out of the survey comments and out of the metrics themselves, and you're going to track that over time. So for me, the foundation of getting STARTED is surveys. Most companies are already there; maybe it's just a slight strategy change, a little more alignment, because you definitely want to make sure you're measuring across your main channels as well.
The second is text analytics and using transcriptions. Most of us record at least a sampling of our calls and transcribe them, whether it's with Verint, NICE or one of the other products out there. Ultimately, there's a lot of great text data out there, not just in the survey comments, as mentioned, but about the conversation itself. So use the transcriptions you've got, whether call transcriptions or chat transcripts, and study them; tie that capability to your surveys to see what else is being said in the dialogue between agent and client or customer. That's number two. Again, most companies have it; it's just about integrating it or aligning it with your strategy.
The third is associate notes, another great text resource, and usually an even easier source to understand than transcriptions, because a transcript of a long call gives you a lot of content. Associate notes really hone things down, if your agents are using your CRM to take notes and accurately record, hey, why did they call, what was the main outcome? That's invaluable when you look across the experience.
Rich: So just real quickly, when you talk about associate notes, are you basically talking about a leader or an analyst who listened to a transcript of a call and then took notes from that call?
Sean: Well, this is more the notes the agent took on the interaction while taking the call. So maybe in Salesforce, I'm talking to a client and I'm entering notes: client called about X. Or a lot of times there are drop-down fields: why did they call, what did I do? Things of that nature.
So it's really about the agent's notes from when they were talking to the customer. And if you can tie those notes to the transcript itself, and, honestly, if that transaction also got surveyed, tie those three pieces together using tools (many of the text analytics tools can do this now), then you really have a good view of the event. You know what they said in the survey, you know what the dialogue in the transcription was about, and you know the notes the associate took.
Rich: Would that larger tool you're referring to be something like a CDP?
Sean: You could do that. I've seen groups like Medallia start doing that, some of our more traditional, early-on survey vendors. But you've also got Clarabridge, Stratifyd, Luminoso, a lot of these groups that do text analytics, where you can bring different sources together and use some ETL to actually combine or connect them. So when you're trying to really study an event, you're seeing all three of those perspectives on the same event.
Dom: I've got my AP style hat on right now. What does ETL stand for, Sean?
Sean: Good question. ETL stands for extract, transform and load; it's basically a process for connecting data. You've got data from different systems, and an ETL process brings them together. A lot of vendors are now able to do that for you, so you don't have to hire a whole IT team to do it within your databases; they can do it for you, as long as you have a key to go off of, like the call ID, let's say.
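The joining step Albertson describes can be sketched simply. This is a toy illustration of combining sources on a shared call ID, not any vendor's pipeline; the call IDs and field names are hypothetical:

```python
# Hypothetical exports from three systems, all keyed by call_id.
surveys     = {"C-101": {"ces": 4.0, "comment": "took three calls"}}
transcripts = {"C-101": {"text": "...I'll have to call you back..."},
               "C-102": {"text": "...glad that's sorted..."}}
notes       = {"C-101": {"note": "client called about a failed transfer"},
               "C-102": {"note": "status check"}}

def etl_join(*sources):
    """Extract records from each source, then merge them on the
    shared key so every view of the same event sits in one record."""
    combined = {}
    for source in sources:
        for call_id, fields in source.items():
            combined.setdefault(call_id, {}).update(fields)
    return combined

events = etl_join(surveys, transcripts, notes)
# events["C-101"] now holds the survey score, the transcript and the
# associate's note for the same interaction; "C-102" has no survey,
# which is the common case since only a sample of calls gets surveyed.
```
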
Related Article: Call Center Technology Trends for 2022
Bringing Resolution, Transactions and Efficiency Into CX Measurement
Sean: Yep. So those first three involve a lot of text, and again, this is where we're going in the CX industry. All those surveys are key, but at the end of the day you usually only survey a small percentage of clients, or a small percentage of interactions. You might even get a certain level of bias, because certain people are likely to take surveys and others aren't.
But if you have transcriptions and associate notes, you're getting the rest of the story, the rest of that content. As you continue through getting STARTED, this is when it starts to get a little harder. The first three are pretty straightforward, and most companies have access to them. Now you get into the R and T of STARTED: resolution and transactions. And the E is about efficiency. This is really where you start to bring other metrics into play.
Now, we do a lot of things where we measure resolution from the survey; we ask the client, did you get your issue resolved? But we also look at the transcript: was the dialogue, "hey, I'm going to have to call you back"? So even if they didn't take a survey, I can start to look at whether the issue got resolved or not. Even better is if I can physically look at the back-office tools to say, did this transaction take place or not?
That might be a little harder for some groups to do. But you need to measure resolution; along with ease and effort, it's very key to your understanding. The transactions, again: what are the actual transactions? What physically took place? That's key. And then ultimately you go into efficiency metrics, whether it's handle time, or how long it took for a ticket to get resolved, things of that nature. Those three things, resolution, transactions and efficiency, are where you start to bring your operations metrics into your study.
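Inferring resolution from transcripts when no survey exists, as described above, can be approximated crudely with phrase matching. This is a deliberately simple sketch; a real program would use a text analytics tool, and the phrases below are illustrative, not a vetted list:

```python
# Illustrative phrases suggesting an interaction was not resolved
# (call-back or escalation language in the agent-client dialogue).
UNRESOLVED_PHRASES = ("call you back", "call back", "follow up", "escalate")

def looks_unresolved(transcript: str) -> bool:
    """Flag a call as likely unresolved if the dialogue contains
    any of the call-back or escalation phrases."""
    text = transcript.lower()
    return any(phrase in text for phrase in UNRESOLVED_PHRASES)
```

Run across all transcripts, a flag like this gives a resolution signal for the large majority of calls that never get surveyed, which can then be cross-checked against survey answers and back-office transaction records.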
And then the D in STARTED is digital. It's the hardest to deal with, but probably one of the most important these days, because so much of the focus is online. That's where you start to think about, whether it's through Adobe Analytics or literally digging into weblogs, things of that nature: it's becoming a must-have to know what the digital activity is around X, Y and Z in the context of these other transactions and activities that may be tied to traditional channels like phone or chat.
Dom: Rich, Sean came to play today. The energy, the excitement is palpable, Sean, and I don't know if that's because you're passionate about CX or because you're happy about Russell Wilson coming to the Broncos; I'm not sure.
Rich: It's got to be the Russell Wilson.
Sean: I'm a little bit happy about both of those. But no, I am passionate about CX, that is absolutely for sure.
Rich: Sean, I can't say thank you enough for joining us today. As always, we'd like to give our guests the opportunity to share where our audience can follow them and learn more about Charles Schwab.
Sean: Yeah, absolutely. Well, I hope most of you are clients of Charles Schwab; the experience we create is a huge differentiator for us as an organization, so we'd always love more investors and more clients. But ultimately, I'm on LinkedIn, Sean Albertson. And as we continue to grow in this space and continue to work within the ROCKS program and other things, I'm also speaking later this summer at the Next Generation Customer Experience conference, which is part of the Customer Experience for Financial Services conference in Boston.
So join me there if you're interested. But I'd also just love to continue to work with partners in this space and find the next-generation ways we're going to solve these problems together.
Rich: I would use Charles Schwab, but they don't take my Dogecoin or other made-up forms of currency. When they do, I'll be right there. Thank you, everyone, for joining us on this episode of CX Decoded, and we will see you next time.