Katrina Schiedemeyer, senior engineer of supplier development engineering at Oshkosh, on CX Decoded
CX Decoded Podcast
July 13, 2021

CX Decoded Podcast Episode 13: How Valid Customer Experience Data Tells a Great Story

Most organizations strive to have an excellent customer service program. Or we hope so, at least. But it's one thing to talk about strong customer experience, and it's certainly another to act on it and measure it, especially in terms of how it adds value to the company by proving ROI. Measuring CX and its return on investment remains a challenge today.

According to data from CMSWire's State of Digital Customer Experience 2021 report, CX leaders are not satisfied with their ability to quantify the impact of CX on business metrics and outcomes. It's incredibly challenging to provide bottom-line results and value for CX programs.

CMSWire's CX Decoded podcast co-hosts Rich Hein and Dom Nicastro caught up with customer experience practitioner Katrina Schiedemeyer, senior engineer of supplier development engineering at Oshkosh, to discuss proving ROI and the impact of CX metrics.

Episode Transcript

Note: This transcript has been edited for space and clarity.

Rich Hein: We got to meet at our DX Summit. I think it was 2019, and your story really stuck out in my head. I knew I wanted to get you on the podcast. Could you share a little bit about your background? You know, how did you arrive at your current role? What you've done in the past, etc.?

Katrina Schiedemeyer: Yeah, absolutely. So I had started my career with Oshkosh Corporation in the continuous improvement and quality space. So focusing in on quality analytics and understanding how we can improve the overall quality of the products that we're making.

Before I go a little too deep, I'm going to give a quick overview. Oshkosh Corporation is a Fortune 500 company located in Oshkosh, Wisconsin. And the company really prides itself on making a difference in the world that we live in. So we make military vehicles, fire trucks, access equipment, refuse vehicles, etc. So some of the coolest machines and heavy vehicles that you'll see around are some of the amazing things that were made here. So as I talk about the company, you're going to hear a lot of references to military vehicles and fire trucks. I thought it was important to give a quick background.

Rich: And you're definitely talking about some of my favorite things there. I mean, military vehicles and fire trucks. I served in the Navy. Not only that, but I was also a volunteer firefighter for several years here in Florida.

Katrina: Oh, wow. That's amazing. Thank you. You've done amazing work for the community.

Dom Nicastro: Well, Katrina, let's get going here. I mean, you know, when Rich introduced you he had me at practitioner because I just love when we can get the people that are doing the work on this podcast and in my articles and CMSWire. It's like my No. 1 goal every year, every day. So thank you. So let's jump in. So the CX program in Oshkosh, tell us from a general standpoint, how the organization views the program, does it have sort of like a formal recognized title with customer experience? And, you know, how did it get started and who kind of is involved?

Katrina: Yeah, so customer experience actually started quite a few years ago. I had a leader who said, 'Hey, we'd like to try and do something with customer experience.' Again, this is back when I was in my old function, and they said, 'Let's give it a shot. Spend six or so months learning everything you can about customer experience, see what you can develop, and we'll help support you as much as possible. And if it's successful, great, we'll continue to expand it. And if it's not successful, well, all we did was gain some amazing lessons learned.'

We started small. I started with no budget and ad-hoc tools and information, attended a lot of conferences and Googled my way through it. And as we were developing the program, it actually became pretty successful. We now have a pretty formal customer experience program at many of our subsidiary companies. So you can log on to our website, and on some of them, you'll get a survey that asks you about your experience, or about your purchase transaction, or about the quality of the product, etc. So it has become a pretty formal program.

And since then, we've expanded it to link it in with employee engagement, which is by far the most important part to our organization. We truly believe in people first and putting our employees first. And then ultimately, we have now expanded that to a supplier experience program as well.

Rich: I guess looks can be deceiving, but it seems from your story that it was relatively easy to get organizational leadership to buy in on CX. But a lot of times what I hear from my sources is that the people in the trenches have difficulty or feel challenged because their organizational leaders find CX difficult to quantify. So what would you say to that? And how do you obtain that leadership buy-in at the get-go?

Katrina: You're right. I think it definitely is difficult to get some of that leadership buy-in, especially to get it to move from a small little pet project to something that's a little bit larger. And so I think the most important thing is to start small, have some really successful wins and then continue to expand it bigger and bigger. Had I started it with a really large project and the large project failed, it would have put a really bad taste in everyone's mouth about customer experience.

So instead, I started really small and started to learn a lot. And we continued to expand and expand and eventually the results speak for themselves. And I believe if you have good results, it's hard for people to deny that it's something that's successful and beneficial for the organization. And so that's kind of what we did is we did a lot of research and referred to a lot of white papers. But at the end of the day, you have to put your own version of the story together and make it successful for your company.

Dom: You mean you just can't share a white paper with the C-Suite and just say read this and give me money?

Katrina: Yeah, no. And I think that's where I get questions from a lot of people, like, 'Alright, well, give me an article that I can use to go to my leader and have them give me funding for it.' And it doesn't work that way. I wish it did. But some days, you've got to go through the learning and continuous improvement cycles and then get the money.

Dom: It's insane, right? You go all night, citing research and case studies and who did it well. And at the end of the day, they're like, 'I don't care. How can you do this?' It's hard to put that case forward. It's in our nature to do all this research. But at the end of the day, it's like, oh, this wasn't helpful.

Katrina: That's so true. And I think one of the areas that we had struggled a little bit at the beginning was making things tailored for our company. I'm going to take the airline industry as an example. The airline industry, customer experience programs are well done. You get a survey after you've taken a flight, you can chat with them, or email them, etc. And they have good benchmarking standards. In the industry that we're in, it's difficult to benchmark and to understand what that could look like and tailor it specifically for your area.

Rich: Yeah, I would say the airlines had that customer experience completely nailed except for the actual flying.

Dom: Right?

Katrina: We've got to give them some feedback on that.

Dom: But that's a great point, Katrina, it's like, you would classify yourself as heavily B2B, correct?

Katrina: Yes.

Dom: Yeah. So it's like, Yeah, great. The airline has some great CX standards, some great CX metrics, but how do we put that into a B2B world, you know?

Katrina: Exactly, exactly.

Rich: Yeah, I mean, I'd really like to know, and I don't know if you remember, but can you share the details of any of those pilot programs that you started just to give people an idea of how you started small.

Katrina: So we had started by taking a look at the number one defects that were happening in our inspection process at our locations. So, before we were shipping our product to the customer, we started to understand what the number one area of a defect was, and then we said, let's take a look at some of these defect areas and see if people are talking about them. And so, just by word of mouth, we asked: do you know anyone who was complaining about or complimenting these certain areas?

The coolest project we actually worked on was one to fix radiator leaks. So our military vehicles have radiators, of course, and we were trying to reduce the amount of fluid, the hydraulic fluid that would leak out of them. And so it was a project that was like ... this seems like a good area to start, let's start by fixing radiator leaks and improve that experience for the customer. So when they came and got their product, it wasn't leaking anymore.

And then we just continued to move on to the next one. And as soon as we had a success in one area, then we'd work on the next thing. We didn't start by trying to tackle the whole universe. We started with one very small project where we could get a subject matter expert to partner with us. We'd help them out, help them be successful. And then we'd move on to the next one.

Dom: Earlier, you said something I wanted to make note of. It's something that Rich and I have been watching, trying to get sources on CMSWire to tell us if it's happening. You've mentioned that you're trying to tie customer experience outcomes into employee engagement. I thought that was a pretty powerful statement, because we have heard that people want to do that, but we haven't gotten that 'wow, look at this' case example. This is exactly how you do this. This is how we quantify employee engagement and tie it back to customer experience beyond the whole cliché, 'happy employees means happy customers.' So how are you guys measuring that? It's a pretty big task.

Katrina: Like I said, we really believe in making the employee experience as positive as possible. I think all of us have probably called a call center before and had a disgruntled employee who was like, I don't want to help you right now. And so you're cranky, because you've had a bad experience, they're cranky, because they're not having a good work environment. And it's just a bad situation together. And so of course, we want to make that program really successful.

So our customer experience platform that we use actually has a lot of employee engagement feedback in it. Year-to-date, I have sent out 947 surveys, and almost 40% of those are employee engagement related. So they are small surveys that we send to teams: how is your team doing? You know, do you have feedback on last team meeting or training information, etc.? And so we get a lot of feedback and information.

We have a really innovative team located out east, and that team will collect a lot of customer engagement feedback and understand, after you've purchased a product from our website, what your experience is like. But then we started to ask the actual customer experience employees what their experience is like. And we got feedback from them and used their feedback to make their experience better. And as soon as they were happier and had all of the things that were frustrating them lifted, like difficult reporting or the really hard login process, they were happier. And then you can immediately see call satisfaction increase, because if you have a happy employee, that passion spreads through the phone.

The second one that we did, we had worked with our production employees. And so we know that having a safe work environment is really important. And so we had done some surveys to understand how we could improve ergonomics and safety for our production team members. And as soon as they gave us feedback on it, we were able to say these team members are safer at work, which means they're happier, which means that there are fewer defects, which means that their customer experience is increasing. So it's a lot of work to get to that. But it's so beneficial because then you get the true buy-in.

Dom: Now as a customer experience professional, did you have to consult HR to make sure you're asking the right questions? Or do you take your customer experience chops and just apply that to the employees and ask similar questions that you would ask customers? Is it easy to use? How frustrated are you, you know, things like that?

Katrina: Yeah, so we had actually done it the second way: applied CX concepts to employees. HR still does all of their great work with employee engagement surveys; they use their own platform to try to improve the holistic experience. But then we do a lot of focus groups and surveys with our team members on a smaller scale, some of those more pulse-based surveys, or focus groups, to understand what the current flavor is.

And actually, when I first developed our research papers and kind of the questionnaire guides, I pulled our customer experience questions and mapped them to what we could change for employees. So instead of asking, 'How do you like the product that you're buying?' we asked, 'How do you like coming into work every day?' and just kind of changed it that way.

Rich: We're talking about ROI of CX here. What was the hard metric on the support center that changed? Were you seeing lower times on your support calls? Were you seeing customer satisfaction on the calls increase? I'm just curious to know what the actual hard metric was.

Katrina: Yeah, so we use customer effort score (CES) and customer satisfaction score (CSAT). So customer effort score decreased significantly, because now they didn't have to be transferred a lot of different times to get the right person with the right information. As the team was working together, we learned, hey, did you know ABC person has the skills in this area, and XYZ has skills in that area? And as we had those team-bonding and team-building relationships, they now knew who to transfer the right calls to and who would have the right conversations, and so customer effort really decreased, which ultimately then increased customer satisfaction.

Dom: Katrina, I was wondering if you can give our listeners a general definition of those two metrics, customer effort score, and customer satisfaction score. You know, we do have a pretty sophisticated audience of customer experience professionals. However, we welcome all listeners, maybe someone who just wants to learn right from the beginning with customer experience. So tell our listeners a little bit about those two scores and how they work.

Katrina: Yeah, absolutely. And I do remember my first customer experience conference, they had used the acronym NPS 100 times. And I didn't know what it meant. And I'm sitting there trying to Google in the back of the audience, so good callout. So customer effort score is asking the customers how easy it is for them to complete their transaction, or how easy is it for them to complete whatever process they were doing. So we use it if you're on our website, how easy was it for you to find the solution you were looking for, or how easy was it for you to complete your purchase.

The goal of customer effort score is to have a low effort. You want it to be really easy. So on a one to 10 scale, if you're asking how easy it was, you want it to be a 10. Or if you're asking how difficult it was, you want it to be really low.

On the flip side, customer satisfaction is asking how satisfied are you? We traditionally use customer satisfaction to talk about the end product that the consumer is getting. So how satisfied are you with your purchase? How satisfied are you with your equipment that you got? We found that if we use customer satisfaction to talk about how satisfied were you with your experience on the call today, it has a lot of bias in it because some people will say, well, I'm satisfied with the agent I talked to but was really dissatisfied that you guys didn't give me $100 gift card or something. So we break them out. And that helps a lot.
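For readers who want the arithmetic behind these two metrics, here is a minimal sketch. The scales and cutoffs used, a 1-5 CSAT scale with 4 and 5 counted as "satisfied," and CES reported as a mean rating, are common industry conventions, not necessarily the ones Oshkosh uses:

```python
def csat_pct(scores, satisfied=(4, 5)):
    """CSAT on a 1-5 scale, reported as the percentage of respondents
    who chose a 'satisfied' answer (commonly the top two boxes)."""
    return round(100 * sum(1 for s in scores if s in satisfied) / len(scores))

def ces_mean(scores):
    """CES reported as the mean rating. Whether low or high is 'good'
    depends on whether the question asked about ease or difficulty."""
    return sum(scores) / len(scores)

# Illustrative responses, not real survey data.
print(csat_pct([5, 4, 3, 5, 2, 4]))  # 4 of 6 satisfied -> 67
print(ces_mean([2, 3, 1, 2]))        # low mean 'difficulty' -> easy experience
```

Separating the two questions, as Katrina describes, keeps the effort signal from being contaminated by overall-satisfaction bias.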

Dom: That's where a lot of these metrics are flawed. They're not perfect. And that's the exact case study. That's what you just said.

Rich: And I think you bring up another great question here. When you get down to the point when you're building your project, how does the process around choosing KPIs work within your organization?

Katrina: Yeah, so the first KPIs were just industry standard customer effort score, customer satisfaction and Net Promoter Score (NPS). I'll tell you a kind of funny story about Net Promoter Score. So we have been using Net Promoter Score for a while and we saw differing results. For those of you unfamiliar Net Promoter Score asks on a one to 10 scale, how likely are you to recommend this product to a friend or a colleague? It's a very standard question, you don't change the wording of it very often.

So you know, it's got a lot of benchmarking information. And then it also has a scale associated with it. So anyone at a 9 or 10 is typically a promoter, which means that they're going to go around and talk positively about your product, and 7 or 8 is passive or neutral, which means that they're not going to say anything good, but they're also not going to say anything bad. And then 6 or below means that they are dissatisfied. And they're likely actually going to complain about your product to a friend or colleague.
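The promoter/passive/detractor arithmetic Katrina describes can be sketched in a few lines of Python. This is the standard NPS formula (promoter share minus detractor share), not a reconstruction of Oshkosh's tooling:

```python
def nps(scores):
    """Net Promoter Score from 0-10 ratings.
    Promoters (9-10) minus detractors (0-6), as a percentage of all
    responses; passives (7-8) count only in the denominator."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Illustrative responses: four promoters, two passives, two detractors.
print(nps([10, 9, 9, 10, 8, 7, 6, 3]))  # (4 - 2) / 8 -> 25
```

Because passives dilute the score without moving it, a sample full of 7s and 8s yields an NPS near zero even when nobody is unhappy.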

So we had started to use it. And after they bought a fire truck from our fire truck division, we said, let's send out a survey and ask Net Promoter Score questions. So we asked them how likely they'd be to recommend this product to a friend or a colleague. And we had seen really low scores, which was surprising to us, because we're by far the industry leader in making it. So we're like, how in the world are you rating this poorly when you keep coming back? Over the past 30 to 40 years, every time your municipalities are ready for a fire truck, you come back. So why?

And we realized that as we're asking it, I don't know about you, but I don't go to my friends and my family members and recommend buying a fire truck on a daily basis. So that was the issue with this metric in itself. It was flawed. You're not going to go to a friend or a colleague and say, let's go buy a fire truck today. Like that's not your normal BBQ conversation. And so right there, we realized we needed to change some of our KPIs.

We needed to change it to something that matches our industry. While we were trying to benchmark others, that was a good idea, but sometimes it's a little bit flawed. And so we had started to look at what's important to leadership. And we realized that our leadership team was doing some good metric reporting around warranty claims, and we latched on to that and said, great, let's see if we can improve the customer experience to decrease the number of warranty claims that we have. And so we just started pulling what's important to leadership, took our story to see if it could match, and did a ton of Six Sigma correlations. I've looked at data for more hours than I can imagine, and it ended up really helping tell the story.

Dom: Oh, I have another pullback question, Six Sigma. Let's tell the listeners what that is broadly.

Katrina: Ah, so Six Sigma is utilizing statistics to identify correlations in the data. Your goal is to have a certain number of defects per a certain number of products that you make. So you can do a lot of statistics to understand what's working well and what's not working well. So you see what story your data is telling you, and then if you couple that with continuous improvement efforts, you can say, hey, we're having 10% more defects, so let's work on an improvement project to reduce the defects, which ultimately increases customer satisfaction.

Rich: How do you identify when the KPI isn't working for your business?

Katrina: Yeah, so sometimes we identify when it's not working by customers being confused about it in a survey or a questionnaire. I like to do focus groups with our customers. Let's talk about the service that you just got. What's going well, what doesn't make sense, what makes sense? So often as CX professionals, we get so into the weeds, and we use words like Six Sigma, customer effort score, and all these things. And we have to remember our customers are people, and they may not be as embedded in the customer experience space as we are.

So I call random people and see if they can tell us how the metrics are working and kind of how it makes sense on the customer side of things. That helps me to reduce the risk of bias and ensure that I have good data validity. And then after my data is valid and I know that it's driving toward something, we'll work with our leadership team to understand what's important to them, and start taking some of their key initiatives, their strategic visions, and see if we can ask questions that align to it to see what those metrics are telling us.

Dom: I'm sure you're constantly tweaking and rearranging, but is there like a formal like, Hey, we're gonna adjust, we're gonna look back at this point in time, or is it really just kind of week-to-week, day-to-day in terms of how you adjust CX?

Katrina: So we like to give it a little bit of time, because sometimes you change things so quickly, you can't tell what was happening. Think of a science experiment when you're a kid: say you're trying to make the baking soda volcano, where you pour baking soda and vinegar in. But say you don't know that vinegar is the catalyst for the reaction, and you try water, and then milk, and then juice and all these other things. If you change it so fast, you might not be able to tell, when you pour in vinegar, that vinegar was the actual catalyst that was changing it.

And very similarly, if you change your CX program too often and too fast, you may not be able to understand what was successful about it. But more importantly, you're going to confuse your customers. And the last thing you want is to confuse your customers and bombard them with surveys.

So instead, what we do is we typically give 30, 60 or 90 days for the particular project that we're working on to grow some legs. You don't want to go past that point, because if you let something go past 90 days and you're not checking in on it, and it's bad, you're 90 days out and stuck with a bad thing for a long time.

So we do, depending upon the size and the scope of the project, frequent check-ins to see how the data is coming back and how it's looking. And then the tool that we use for our customer experience platform gives us the data back in real time, and it's mapped to dashboards. And so the second that the data is coming in and starting to look weird to me, and the KPIs aren't matching what we should be matching, I immediately go back and say, did I have a typo in this? Did I do something wrong? What's the issue here? And how can we fix it?

Rich: Once you have the data, do you have to convince people to take action on these insights? Or is it just something that somebody already bought into at that point?

Katrina: They have. That's an excellent question. I think it's a mixture. So when we have the data back, a lot of times the data tells a story by itself. I like to say my talent is highlighting data, or transforming data to tell a story, because at the end of the day, it's like a really good chapter book: you sit down with it, and you can imagine it in your head.

So if the data looks really clean and has a lot of validity behind it, it will tell the story itself, and people typically latch on to that. The most important part, though, is you have to build that brand for yourself, that your data is really telling a story, because so often we are barraged with data that is skewed one way or another. I mean, look at any time you see the news say 56% of people think vanilla is the best ice cream flavor. Well, I could flip it around and say, well, 44% of people don't think it's the best ice cream flavor. Depending on how we tell it, it can tell two very different stories. So building that reputation and that brand for yourself as a CX professional is so important, because as soon as you have valid data, the story will start to tell itself and people will latch on to the story.

Dom: Katrina, you know, you mentioned one of your specialties is looking at the data, slicing and dicing. How much of your job is actually connecting with customers, like one-to-one or like a group advisory board? Because I know, in the past world, when I used to go to conferences as a reporter for CMSWire, it was seeing those people in the trenches, talking with them at the cocktail hours; those were always pretty exciting. Do you connect with customers directly?

Katrina: Yes, a lot. So my actual very favorite way to connect with customers is we have a really small airport located in the town that I'm from. And so we will bring in our customers, and they will come and purchase a product from us and pick up the fire truck, so it's all the fire chiefs coming in. And I travel a lot for my job to go meet with customers. So I'll be sitting on the plane, and more times than not, we're all wearing gear from the same company, because they've come to pick up their product. So they'll have gear that has, you know, the Pierce logo on it, and I'll be wearing a Pierce jacket or something, and so we'll start talking and I'll be like, hey, how was your experience? Like, oh, you know, it was great. And they'll start telling me all these things about it. I'm like, that's cool. Can I ask you more? I actually work there in customer experience.

I find that if you can get really candid information from people, like you said, at a cocktail hour or over a beer on the flight, you're not going to have to worry about having things skewed through a survey. If I was writing in an open-text comment, I could write the words 'great product,' and you're like, yes, this is awesome, it's a great product. But a survey doesn't tell you what an in-person conversation does. So I spend a pretty significant amount of my time traveling all throughout the world. I've been everywhere from China all the way to Mexico, and everywhere in between, working on customer and employee experience type things, because I believe that those in-person connections are by far more valuable than anything you get in a survey.

Rich: We're almost out of time here, I just wanted to make sure I give you an opportunity to share the best practices that you want to share with our audience around this topic.

Katrina: So I think the No. 1 thing is to just keep trying as your customer experience program is getting up and going, whether you're a small business trying to sell something online about yourself, or you are a giant Fortune 500 company. There are things that are going to work and there are things that are not going to work, and you cannot get defeated. It's very easy to say, well, the CX principle didn't work, and so CX isn't valuable and the funding is not there, and it can be defeating. Instead, you have to keep going and understand that it's worth having that perseverance.

The second thing I would recommend is, remember that your customers are humans, too. Think about what you would like to hear in a survey, what your experience was like when you purchased something, and see if you can match what you liked about it to what your company's doing, instead of thinking in the big corporate terms. Realize that they're people and just ask them how they're doing, what's going on, and truly show compassion and understanding. And if you do that, I guarantee you that your results will speak for themselves.