Kevin helps organizations acquire, develop, and retain their most valuable asset: their people. He also helps people manage an aspect of their lives that plays a key role in their happiness: their work. He’s spent the last decade of his career building leaders at every level and creating scores of engaged, high-performing, strengths-based teams. Prior to founding Lifted Leadership, Kevin served as a Lead People Scientist for Culture Amp where he helped organizations like Airbnb, Palo Alto Networks, and ServiceNow reinvent and optimize their performance management and employee engagement initiatives.
In this episode, Kevin talks about how we measure employee experience to drive customer outcomes.
[0:00 - 6:43] Introduction
[6:52 - 13:23] What is an Employee Experience Scientist?
[13:35 - 27:10] What are the first steps to connecting EX data to business and customer outcomes?
[27:19 - 36:14] What’s been learned from connecting EX data to customers?
[36:23 - 39:09] Final Thoughts & Closing
Connect with Kevin:
Connect with Dwight:
Connect with David:
Resources:
Announcer 0:02
Here's an experiment for you. Take passionate experts in human resource technology, invite cross industry experts from inside and outside HR. Mix in what's happening in people analytics today. Give them the technology to connect, hit record, pour their discussions into a beaker. Mix thoroughly. And voila, you get the HR Data Labs podcast, where we explore the impact of data and analytics to your business. We may get passionate and even irreverent, but count on each episode challenging and enhancing your understanding of the way people data can be used to solve real world problems. Now, here's your host, David Turetsky.
David Turetsky 0:46
Hello, and welcome to the HR Data Labs podcast. I'm your host, David Turetsky. Like always, we try and find fascinating people inside and outside the world of human resources to give you the latest on what's happening in HR data, analytics and HR technology. Today, we have with us Kevin Campbell, who's an employee experience scientist, that is an awesome title, Kevin, tell us about it.
Kevin Campbell 1:09
Yeah, you know, it's a new job title that oftentimes is called people scientist at different companies. So Humu, where Laszlo Bock, the former CHRO of Google, employs people scientists; Culture Amp, my former employer, has people scientists. Essentially a people scientist, or an employee experience scientist, is someone who sits at the intersection of organizational psychology, consulting, and practice, actual real-world practice. It's a little bit different than people analytics: in people analytics, you might have a data science background or a computer science background, and you just work with people data as opposed to other types of data. People scientists, i.e., EX scientists, tend to have that organizational psychology domain knowledge, in addition to knowing just enough to be dangerous on the analytics and consulting piece.
David Turetsky 2:05
And so you work at Qualtrics?
Kevin Campbell 2:08
Yep.
David Turetsky 2:09
And what kinds of things are you really diving into at Qualtrics? Or is that kind of one of the things we're gonna get into today? Within our topic?
Kevin Campbell 2:17
Yeah, it's one of the things that we're gonna get into today. But I love to talk about it. I love my work. The things that are exciting me these days are connecting different points of the employee journey together to have employee journey analytics. So being able to predict what's going to happen to an employee down the road, how engaged they're going to be based upon their initial onboarding experience, what initial experiences will inform their level of performance or performance reviews?
David Turetsky 2:44
Sure.
Kevin Campbell 2:45
But the thing that's really, really interesting is how you can make predictions and be able to drive impact for customer outcomes, customer experience, by virtue of what's happening in the employee experience.
David Turetsky 2:57
My hope is that's actually happening in people analytics, as well, or with people who kind of are in the people analytics realm. Probably not to the extent that you are. So I'm going to really be excited about the topic for today, because our topic is really, really close to that. But before we get there, Kevin, I have to ask you, what is one thing that no one on this Earth knows about Kevin Campbell.
Kevin Campbell 3:28
The one thing that nobody on this Earth knows about Kevin Campbell....
By the way, we can exclude the internet. The internet is on this Earth; even if it is in the ether, in the cloud, it is still on the Earth. So just wondering.
So I'm going to break the rules of the question, because there are several people on the Earth that know about this. If anybody was climbing Machu Picchu on this particular day, they know this: I proposed to my wife on the top of Machu Picchu, one of the seven wonders of the world. And I almost got kicked out of Machu Picchu because of it, because if you've ever been, there's a big sign above the entrance to the city that says no clapping, no making a commotion. There was no way I was gonna bring an actual diamond ring onto the Inca Trail and into the Andes, so I had a silicone Qalo ring stuffed into my hiking hat for three days. The minute I kneeled to present her with it and proposed, everybody on the mountain stopped what they were doing, turned to us, and started clapping, and the people who were making sure that no commotion was happening were not very pleased with that.
But the no-commotion police were there.
Yeah, yeah.
David Turetsky 4:54
Well, that is awesome. I love it. Did you hear recently they said that Machu Picchu is actually not its real name?
Kevin Campbell 5:02
That is very unsurprising. But interesting....
David Turetsky 5:06
Because I think there was something where it says, in the original language, the name was something like soft mountain, instead of old mountain or hard mountain, something like that. But yeah, it's new news on Machu Picchu. So there you go. But that is really cool, one of the most unique things someone's said. So that's applause. Oh, I can't, I gotta be quiet about it.
Kevin Campbell 5:34
I see what you did there, David.
David Turetsky 5:37
Thank you. Thank you. So today's topic, Kevin, is again near and dear to your heart, of course. But it also strikes a chord with me: how do we measure employee experience to drive customer outcomes? For those people who listen to the HR Data Labs podcast, we love talking about how we can transition the conversation from being employee centric to being about the business, and how HR and those of us who work with HR can influence the business outcome. So this is a really cool topic for us.
Kevin Campbell 6:10
Yeah, it's exciting stuff. It's interesting, because we've always known intuitively that these connections exist. You know, happy employees create happy customers. And there's research, published peer-reviewed research, that demonstrates that there's a connection there. But we don't often see it in our organizations. And we don't often pinpoint exactly what needs to be done in order to drive that connection, and in order to take action on the connections when we find them.
David Turetsky 6:43
So Kevin, question one is: what is an employee experience scientist? Because I personally had never heard of that before. What is that kind of job? And I'd love to hear the fact that you love your job. And I'd love to use the word Mon-yay instead of the name of the first day of the week.
Kevin Campbell 7:11
Yeah, Happy Mon-yay, I'm going to replace Happy Monday with Happy Mon-yay.
David Turetsky 7:17
Exactly!
Kevin Campbell 7:17
My job is to identify and eliminate experience gaps in the employee experience, and to drive other elements of experience management through organizational workforce dynamics. So thinking about the experience over and above the operational data, and the experiential data that goes along with that, right? Like the operational data associated with onboarding: was the laptop delivered? What was the start date? What's the time to ramp, the time to hit quota if it's a salesperson? But then there's the experiential side: how aggravating or easy or delightful was the experience of setting up your laptop for the first time? Did you feel like you were truly enabled and empowered to hit your quota as quickly as possible? Bringing those things together allows you to make decisions and have insights that you wouldn't be able to make with just one or the other.
David Turetsky 8:17
Are you able to do things like correlations between, say, met their team, interviewed with their team, basically got introduced to their team prior to start date, and experiences on the back end like satisfied, engaged, performant? Are you able to do things like that, where you can actually draw those really simple, obvious correlations, but do it with hard data?
Kevin Campbell 8:45
It's not always as obvious as you would think, actually. And yes, so a large retailer, this is real recent, too, so it's top of mind, wanted to understand what early onboarding experiences lead to greater feelings of belonging, right, which is a hot topic...
Sure.
Six months or a year into a role. And they wanted to drive it down to specific behaviors that the manager partakes in as part of that initial introduction. So does your manager introduce themselves to you and bring you on board with a text message, a phone call, an in-person meeting, or a Zoom meeting?
David Turetsky 9:28
Right.
Kevin Campbell 9:29
And, as you could guess, the text message had a negative impact on belonging, but the in-person meeting and the Zoom meeting were just as strong as each other, and the phone call was somewhere in between. Being able to see those differences and take that hard data back to frontline managers and say, here's what we found has a stronger relationship with your employees feeling like they belong on a team, feeling like they can speak up, feeling like they're going to stick around for a while. And if you have a lot of turnover, sometimes it can be really tempting to say, I don't know if this person is even going to stick around, so I'm just gonna shoot them a quick email or a quick text, rather than taking the time to set up an in-person meeting. But in the end, you might actually be creating the problem that you're finding yourself in the middle of by virtue of doing that.
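The retailer analysis Kevin describes boils down to a group comparison. Here is a toy sketch of what that could look like; the data, the channel effect sizes, and all column names are fabricated for illustration, not taken from the study he mentions.

```python
# Toy sketch: compare mean belonging scores (1-5 scale) by the channel a
# manager used for a new hire's first introduction. All data fabricated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
channels = ["text", "phone", "zoom", "in_person"]
# Assumed true means, chosen to mirror the pattern described in the episode:
# text lowest, phone in between, zoom and in-person tied at the top.
true_means = {"text": 3.0, "phone": 3.6, "zoom": 4.0, "in_person": 4.0}

rows = []
for ch in channels:
    for score in rng.normal(true_means[ch], 0.5, 100):  # 100 hires per channel
        rows.append({"channel": ch, "belonging": score})
df = pd.DataFrame(rows)

# The simple comparison a frontline manager can act on.
means = df.groupby("channel")["belonging"].mean().sort_values()
print(means)
```

In this fabricated data, `text` surfaces at the bottom and the two face-to-face channels land roughly together at the top, which is the kind of table you could take back to managers without any further modeling.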
David Turetsky 10:28
I think one of the more amazing things is that, you know, people really do know the difference between a good interaction and a bad interaction. I'm sure they think it's intuitive in their own mind, but when they experience it firsthand, it probably comes across a little bit differently. And I'll give you an example. You brought up the manager texting versus the manager making a call, or the manager having an in-person meeting versus a Zoom meeting, and having that be maybe not the first interaction, because they maybe did do an interview, but one of those onboarding interactions: hey, let me get some time with you to talk about what's going on with the group. You'd think it would be obvious, but it may not be obvious because of differences in culture, differences in age, differences in background. And you talk about belonging too; there are a lot of people who probably have very different ways of dealing with other people. So experientially, it would be really cool to actually see the differences, to me, because I love more data, by function, by age, by certain demographics, to understand the best way of having those best first interactions. Because as they say, and they've been saying it a long time, the first impression is usually the one that lasts, right?
Kevin Campbell 11:53
Absolutely. And a couple of things come to mind as you say that. One is, from an analytics perspective, putting those demographic variables in as control variables in the model. We like to do that kind of stuff on our end to make sure that we're keeping ourselves honest: we throw those demographics in and ask, do these relationships still exist? I don't always agree with necessarily sharing that information back with line managers, because it's just going to confuse them. Sometimes simple bivariate correlations are way more impactful than that. And another piece around this, and I'm guilty of this as well, is that a lot of times when we're trying to close experience gaps, we try to do that by improving the experience itself, which is absolutely warranted. But a lot of times the gap comes from the gap between expectations and experiences. So another way of mitigating that is not necessarily to instruct all of your managers to, hey, have a sit-down meeting instead of sending a text message, because there might be instances where that's the only thing they can do because of other constraints.
David Turetsky 12:57
Sure.
Kevin Campbell 12:57
But if you at least have that initial conversation of setting the expectations: hey, we're moving fast and quick in this organization, and there are going to be times where I'm not gonna be able to have the same kind of deeper interaction that I would like to have with you. And I want you to know that it's not because of you, it's because of the constraints that I'm facing...
David Turetsky 13:19
Right.
Kevin Campbell 13:19
That can go a long way in terms of closing that gap as well.
David Turetsky 13:23
Absolutely.
Announcer 13:24
Like what you hear so far, make sure you never miss a show by clicking subscribe. This podcast is made possible by salary.com. Now back to the show.
David Turetsky 13:35
So Kevin, the next question I want to ask you is about connecting the employee experience data with actual customer outcomes. Because we know there should be a connection. But how do you actually start making those connections? And at what part of the employee experience do you really like to dive into that?
Kevin Campbell 13:52
I think the most important thing to start making those connections is to collect data from both experiences: to have a transactional experience data collection process for your customers. It could be relational, but the transactional piece tends to be even stronger. So they pick up an order from your restaurant, they check in at the hotel, they check out of the hotel, and you're making sure that you're thinking about that shared journey. There's a lot of work being done around thinking through the customer journey, and thinking through the employee journey. But what's that combined journey? And what are those touch points where you want to have the experiential data from both pieces? So that's the first thing. The next is to make sure that you're collecting the data at enough frequency to be able to make those connections, and being intentional about where you set up those listening posts, and what's the leading indicator and what's the lagging indicator. Because if you believe that customer experience drives employee experience, then you would want to make sure the data you're collecting and analyzing is the customer experience data first and then the employee. But more often than not, we're trying to link the other way around, and if we're trying to make a causal inference, you obviously want to have the employee engagement or employee experience data come before the customer experience data, or have some sort of overlap between the two. And then it's really helpful to have some sort of a priori hypothesis going into the analysis. But even if you don't, and this is where some people in my line of work might disagree, sometimes just doing a bivariate correlation of everything you ask on the employee experience side with some of your major customer outcomes, like net promoter score or likelihood to recommend, you're gonna find something. And it may not necessarily make sense.
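The bivariate scan Kevin describes, correlating every employee experience item with a major customer outcome, can be sketched in a few lines. This is a toy illustration on fabricated data; the item names (`teamwork`, `onboarding_ease`, `manager_support`) and a unit-level `nps` column are assumptions, not from the episode.

```python
# Toy sketch: Pearson correlation of each EX survey item with a customer
# outcome (NPS), ranked by absolute strength. All data fabricated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 200  # e.g., 200 store/team units

# Per-unit mean scores on three hypothetical EX items.
df = pd.DataFrame({
    "teamwork": rng.normal(3.8, 0.5, n),
    "onboarding_ease": rng.normal(3.5, 0.6, n),
    "manager_support": rng.normal(4.0, 0.4, n),
})
# Fabricate an NPS that genuinely depends on teamwork, plus noise.
df["nps"] = 20 + 15 * df["teamwork"] + rng.normal(0, 10, n)

# Correlate every EX item with NPS and sort by absolute strength.
corrs = df.drop(columns="nps").corrwith(df["nps"]).sort_values(
    key=lambda s: s.abs(), ascending=False
)
print(corrs)
```

In this synthetic data `teamwork` surfaces at the top, which is exactly the kind of lead worth a double click; as Kevin goes on to say, the scan finds something interesting, not something proven causal.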
And it may or may not be a true causal relationship, but it's going to be something interesting that you're going to want to do a double click into. And that's where the magic happens: it's sometimes not that top-down, I-want-to-test-this-theory approach, but maybe a grounded theory that you're creating based upon the observations that you're making. One really interesting example: a quick service restaurant found that restaurant teams with better teamwork and collaboration had a strong relationship with customers saying that the food tasted better. Now, obviously...
David Turetsky 16:36
It's possible, it is possible, right? Because if your heart's in it, you'll probably follow the SOPs more, which should lead to better outcomes, because those SOPs have been honed to get the best outcome in terms of texture, taste, and potentially revenue. So you'd hope that that's actually true, right?
Kevin Campbell 17:03
Absolutely. And that was the thing: when we presented this finding back to the stakeholders, we were like, ah, you know, there's a good chance this is a spurious correlation, we didn't really have an a priori hypothesis going into this. And they were like, no, no, this makes perfect sense. Because the way that they do it at this restaurant is that the people are set up sort of like an assembly line.
David Turetsky 17:21
Yeah.
Kevin Campbell 17:21
And you know, someone's making your burrito, and they line the beans up just a certain way, and then they line up the machaca, or whatever it is, and the guacamole. Having that layering so that every bite gives you a little bit of each can make a big difference.
David Turetsky 17:37
Absolutely, totally agree. So let me dive in, double click into this, though, because this is really important: correlative versus causal. We get dinged a lot in people analytics when we try to make too heavy a statement. How do you, because I think with the N you're dealing with you might be able to get there, but how do you get there, where you can distinctly call it causal versus correlative? Where do you make that distinction? And how can you basically prove it?
Kevin Campbell 18:08
I don't think you need to prove it. And I don't think that we need causal data in order to use this information to make good decisions. And I also think it depends on the use case and the audience. So if your use case is trying to predict attrition and you're trying to do workforce planning, then you might need a very sophisticated model that accounts for things that are outside of your control, like labor market fluctuations. But if you're trying to convince your frontline manager that this employee experience stuff is actually going to impact the bottom line, you don't necessarily have to have that same kind of sophistication in your analysis.
David Turetsky 18:54
Right.
Kevin Campbell 18:56
And you don't have to have causal data, especially if the things that we're calling drivers, like teamwork and collaboration, are something that you already want to see within your workforce aside from what they correlate with. There's no risk in making a type one error.
David Turetsky 19:19
Well, yeah, that's true.
Kevin Campbell 19:21
There's the risk of making the error. But there's no...
David Turetsky 19:26
The outcome. Right?
Kevin Campbell 19:27
Right. Yeah, having people be more collaborative is a good unto itself.
David Turetsky 19:31
Such a terrible thing, Kevin, gosh, what are you saying? Well, but to be honest, though, we make this mistake a lot, especially in people analytics, where we say that because we did this, because we paid people more, they stuck around, which you can't possibly say. There are so many other factors involved. And to your point, you know, it's not such a terrible thing if you do that, but the problem is you cannot make that causal claim.
Kevin Campbell 19:47
You're absolutely right that you can't. Yeah. But you can say there's something there, right? You know, even if you use the classic example: sunscreen has a high correlation with, I don't know, ice cream sales. Right, when sales for sunscreen go up...
David Turetsky 20:24
Right
Kevin Campbell 20:24
Sales for ice cream go up.
David Turetsky 20:26
Absolutely
Kevin Campbell 20:26
Right. And we all know....
David Turetsky 20:28
The reason why I put it on is because I want to get ice cream.
Kevin Campbell 20:31
There's no, there's no causal link there, right? Like, there's no causal relationship. We know that. But it's enough for us to go, huh? What might be happening there? Oh, hotter days.
David Turetsky 20:40
Hotter days, Sun out. Yeah, exactly.
Kevin Campbell 20:42
Right. So it's more of a diagnostic to make us do that double click.
David Turetsky 20:46
Absolutely. But still, it's a delicious correlation. Definitely.
Kevin Campbell 20:53
But you know, as an ice cream parlor owner, if you see the sunscreen shop owner, not that there is such a thing, but bear with me, if you see that that person is increasing sales, because maybe they had better analytics and they know it's going to be a hotter day, you can kind of tag on to what they're doing, right?
David Turetsky 21:13
Absolutely. But I want to go back to one thing before we transition to the next question. You had made the statement of not really having a hypothesis. We know as a scientist, you're at least trawling for something. And even if you're trawling for some interesting insight that will come out of the analysis, you have to have some kind of bias going in, right? There's something that you're looking for, some kind of pattern you're trying to find. And one of the things that I'm trying to relate to the listeners is that you don't have to be a scientist, you don't have to call yourself a scientist, to at least have the investigative mind to say: when I'm looking at this data, I want to drive a story out of it. And that story could either be proven or disproven based on what I find, the facts I find. And that's exactly what you're saying, right? It's not that you have nothing going in; you have a predetermined notion. And that's a story, potentially. Right.
Kevin Campbell 22:07
Yeah, the implied hypothesis is: what relationship, if any...
David Turetsky 22:14
Right.
Kevin Campbell 22:14
Is there between this employee experience data that I have and this customer experience data that I have? So, in a sense, yeah, that's definitely the catch-all hypothesis. And when you frame it that way, you're almost always going to get to confirm that one, right? Because there's always some relationship; the question is the directionality of that relationship, and whether there's a mediating third variable, or a bunch of mediating variables.
David Turetsky 22:48
Or a ton of noise that you're getting, because you're just regressing way too much stuff.
Kevin Campbell 22:52
Well, that's a good point, too, right? Sometimes you can overfit your model by throwing too many things into it, and then you can't even make sense of it anymore. Sometimes the simple bivariate correlation makes way more sense. So, you know, a classic example: I was working with a large tech company. They had a beautiful people analytics department, made up of folks with an econometrics background from elite universities. And they had an OD person who was very well educated in organizational development, but didn't know the first thing about stats. I was showing them a simple bivariate correlation between their current employee engagement drivers and their employee engagement index, and it was these five items. The people analytics team took the previous three years' worth of engagement surveys and put them into a weighted multiple regression with different coefficients, different weights, and they controlled for things like tenure and gender and a couple other variables. And four out of the five drivers on both lists were the same; the top three were identical. So, yeah, was their model more accurate? Absolutely. Did the organizational development person understand why they were a little bit different, or what the difference in the analysis was? No, they couldn't explain it. Mine, they could explain: when this number goes up, this number goes up, right? When this number goes down, this number goes down.
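The side-by-side comparison in Kevin's story can be sketched as follows. This is a minimal illustration on fabricated data: the driver names (`growth`, `recognition`) and the tenure control are assumptions, and it uses plain numpy least squares rather than any particular survey platform's weighted model.

```python
# Toy sketch: a simple bivariate correlation of each engagement driver
# with the engagement index, next to a multiple regression that controls
# for tenure. All data fabricated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "growth": rng.normal(3.5, 0.7, n),
    "recognition": rng.normal(3.7, 0.6, n),
    "tenure_years": rng.exponential(4, n),
})
# Fabricated engagement index driven by both items, minus a tenure drag.
df["engagement"] = (
    0.6 * df["growth"] + 0.3 * df["recognition"]
    - 0.05 * df["tenure_years"] + rng.normal(0, 0.5, n)
)

# 1) The simple, explainable version: bivariate correlations.
simple = df[["growth", "recognition"]].corrwith(df["engagement"])

# 2) The "sophisticated" version: OLS with an intercept and a tenure control.
X = np.column_stack(
    [np.ones(n), df["growth"], df["recognition"], df["tenure_years"]]
)
beta, *_ = np.linalg.lstsq(X, df["engagement"].to_numpy(), rcond=None)

print(simple.sort_values(ascending=False))
print(dict(zip(["intercept", "growth", "recognition", "tenure_years"], beta)))
```

In well-behaved data like this, both approaches rank the same drivers at the top, which is Kevin's point: the regression is more accurate, but the correlation is the one a non-stats audience can retell.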
David Turetsky 24:33
And I think there's beauty in that simplicity, more than anybody who's not a stats person can imagine. Because what we've failed to do as econometricians a lot of times in the people analytics realm, I'll just say it, is recognize that sometimes the simplest answer is the best answer. And the story you tell from it is more meaningful. Because when you start throwing in the kitchen sink, you cause too many collinearity problems, too much noise, and too much error. You try to make the most elegant thing in the world; it may seem like a BMW, but it actually runs like a jalopy, because the results fall apart. And you can't even explain it to people, which makes them upset.
Kevin Campbell 25:17
The jalopy analogy is great, though, right? Because I was in a meeting, and there was a difference of opinion about whether to use a more sophisticated analysis or the simpler one. And someone said, well, I don't want to sell manual transmission cars in the days of automatics. And I said, you know what, actually, I think it's the opposite. I think the simpler analysis is the automatic, and the more sophisticated analysis is the manual transmission on a sports car. It's the PC that has so much more power, that can do all the different things, versus the Mac that just works. It's the iPhone: you just pull it out and you know how to use it, versus the Android that can do way more cool things. So, you know, are we building things and approaching things for analytics departments? Or are we making solutions for HR leaders and frontline leaders? And I think both. I mean, I know this is a people analytics podcast, HR Data Labs. I want to build tools for people analytics people, and I want to help them, because they're way smarter than I am in most instances. But I also want to be the translator between those folks and the frontline leaders and the HR business people, to make it actionable and easy.
David Turetsky 26:41
And I think that skill is really key, Kevin: being able to explain the really sophisticated stuff to the people who can use it and need it, and then to be able to explain the sophisticated-yet-automatic to the people who need the automatic, but want to be able to build it into how they work. They may not have looked at stuff like this before, but now that you've explained it to them, they can build it into their thinking.
And I think that kind of gets us into the next question, which is, what have you learned that can help you make that connection between the HR data, the people analytics, technologies and techniques, and being able to get the right outcomes for customers?
Kevin Campbell 27:37
The very first thing is looking at your data. And I know that sounds almost too simple. But what I mean by looking at your data is to describe the shape of the distribution, to look at what the data are telling you before you even analyze it, before you even compare one data point or a set of data points to another. As an example, if you're running a customer service survey, and you're asking people to rate the employee they interacted with on a seven-point scale, and more than half of those ratings are a seven, you know there's invariance in that data. Right? Same thing for an employee engagement item. If you know that more than half of your people are giving a five out of five to "I know what's expected of me at work," you know that there's a significant difference between somebody who gives a five on that item and somebody who gives a four, somebody who gives a strongly agree versus an agree. But then when you report the data, you're collapsing the strongly agree and agree into a favorability score.
David Turetsky 29:05
Yeah.
Kevin Campbell 29:06
When maybe the connection between that question and the customer outcome doesn't happen until the employee gives a five out of five. Right? And you're losing that extra variance, and that's all valuable information. That's the difference between a "yeah" and a "heck yeah." And sometimes you only see the connection to business and customer outcomes when people say heck yeah.
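The reporting choice Kevin is questioning, collapsing agrees and strongly agrees into one favorability number versus keeping the top box separate, is easy to make concrete. The response distribution below is fabricated for illustration.

```python
# Toy sketch: favorability (4s and 5s collapsed) versus top box (5s only)
# on a 1-5 agreement item. All data fabricated.
import numpy as np

rng = np.random.default_rng(2)
# 1,000 skewed-high responses to an item like "I know what's
# expected of me at work."
responses = rng.choice(
    [1, 2, 3, 4, 5], size=1000, p=[0.02, 0.03, 0.15, 0.30, 0.50]
)

favorability = np.mean(responses >= 4)  # collapses 4s and 5s together
top_box = np.mean(responses == 5)       # the "heck yeah" share only

print(f"favorable: {favorability:.0%}, top box: {top_box:.0%}")
```

The gap between the two numbers is exactly the variance the collapsed score throws away: the difference between a "yeah" and a "heck yeah," which may be where the link to customer outcomes lives.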
David Turetsky 29:34
Is that one of the reasons why there's a customer service survey that we both know, which I won't name, but that promoter score on its....
Kevin Campbell 29:45
Net promoter score.
David Turetsky 29:50
Okay, he went there. All right.
Kevin Campbell 29:53
Let's do it.
David Turetsky 29:54
All right. All right. Let's call it what it is. So is that the reason why, on net promoter score, we only count the tens? Right?
Kevin Campbell 30:03
I mean, I have a love-hate relationship with net promoter score. I like to think of it more in terms of what it actually is, which is likelihood to recommend. Because who's to say that your seven is the same as my seven, right? Or your six. And for those of your listeners, I might assume that most of your listeners know this, and I'm totally mansplaining right now, but with Net Promoter Score, you ask folks: how likely would you be to recommend our product, service, or business to your friends and family, on a zero-to-ten scale, with ten being very likely and zero being not likely at all. You take all the nines and tens and call them promoters, right? The sevens and eights are passives, and zero through six are detractors. But for a lot of people, six is a good score.
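The bucketing Kevin walks through is simple enough to write down exactly; the sample scores below are made up for illustration.

```python
# Standard Net Promoter Score bucketing: 9-10 promoters, 7-8 passives,
# 0-6 detractors, NPS = %promoters - %detractors.
def nps(scores):
    """Compute NPS (-100 to +100) from 0-10 likelihood-to-recommend scores."""
    n = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / n

sample = [10, 9, 8, 7, 6, 6, 10, 9, 3, 8]
print(nps(sample))  # 4 promoters, 3 detractors, 10 responses -> 10.0
```

Note how a six, a good score to many respondents, still counts fully against you as a detractor, which is part of the love-hate relationship Kevin describes.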
It is. And that's one of the reasons why I asked it the way I did: because when you talk about the fives versus the fours, the gulf does exist, and it's huge. But there's a problem with the fives, because there are a lot of people who automatically choose five, because they just want to get through it.
That's true. And you know, there are different ways of accounting for that. But that's why, and we're totally in the weeds here, but this is the point of geeking out in this way, I'm a huge fan of labeling every scale point. On net promoter, as an example, you're only labeling the extremes, right? So you can say that maybe some people are just clicking through and saying all strongly agree. But there's no doubt about whether they clicked on strongly agree, right? Or when they clicked on agree. Some people will only label strongly disagree and strongly agree, but I think there's a lot of value in knowing that that three was labeled as a neutral, that that two was labeled as a disagree, because then we're all on the same page about what was clicked on when they clicked on that two. If you don't label that scale point, they might think of anything north of strongly disagree as being somewhat favorable. So it's important to label every one of those scale points, so that when we're reporting the data back, we can be in agreement about what's being talked about.
David Turetsky 32:21
But even in your example of the four versus the five, and even if four is strongly agree and five is outstanding, let's just say for argument's sake, the difference between the four and the five is dramatic, right? Because your interaction with the person, or the employee's interactions, and how they feel when they click a four versus a five, is very meaningful, especially when it comes to things that are very important to them. One of which is, you know, does your leader support your vision of the company? Or does your leader communicate the goals of the company, right, from an employee attitude or opinion survey? When you look at those fours versus fives, there's a dramatic difference. And yet, to your point, a lot of us will lump them together, because that's where you get the n from. That kind of speaks to how many, or how often, those ratings were happening, versus the three, which is just "I'm on the fence," which is really bad when it comes to those kinds of outcomes, or that question.
Kevin Campbell 33:25
Yeah, and I think that's one of the things I take issue with with NPS: those neutrals, and the fours that could go to fives, those are the folks where you have the best opportunity, because they're not so disenchanted that they're in an unfavorable place. They're not so cynical, right? In many instances, it's a lot easier to take a four to a five, or a three to a four or five, than it is to get a one or a two to a three. Right? So I like seeing a lot of passives, I like seeing a lot of neutrals, because that's a huge opportunity to move someone into being a supporter, whereas before they were on the fence.
David Turetsky 34:13
If coached the right way. Because if coached the wrong way, then leadership gets back the report that says you had a lot of ones and twos, and they feel mad, they get depressed, and then they're wondering who they have to fire on that list. And I'm being kind of serious, because as a leader, having been given those employee attitude surveys back, and then been given coaching on how to deal with them, it's less about trying to change their minds and more about what you can do, what actions you can take to make it happen. But there's the culture and the feeling of angst, I guess, of the leader: now I've been documented as being a one or a two, even though it's a very small proportion. And, you know, how do you deal with that? What do you do? And does my psychiatrist need to know those things?
Kevin Campbell 35:08
Well, I think a lot of it has to do with the way that you think about this data, right? Are you thinking about it like grades on a test, or are you thinking about it like the speedometer on your car? Is it an evaluation of your performance, or is it information that you're using to adjust your behavior in order to get the outcome that you're looking for? I've heard people use the Fitbit analogy, right? It's just making you aware of how many steps you've taken. There's no need to incentivize people to hit their Fitbit steps; just by virtue of tracking it, the behavior is going to come. So I think both of those things are important: not over-incentivizing people to hit their scores, and being careful about the way you talk about the score, sometimes even about using the word "score."
David Turetsky 35:56
Right, absolutely. Sure. So I think we're going to change the title of the episode to how you get ones and twos to become fours and fives. Just kidding, just kidding.
So Kevin, we could probably talk about these topics all day, and I really appreciate you being here. One of the things that I want to do is try and summarize for our listeners what we talked about. First of all, you introduced the concept of an employee experience scientist and how that's different from people analytics, and I think that's really cool and very important. Then we talked a little bit about how you get EX data to show us the correlations and the causal effects of employee experience on customer outcomes, really cool stuff as well. And then we got into a discussion around NPS, which we probably ruined, but that's okay. And we talked a little bit about what people can learn from those measurements and from what you do. I'm going to have to ask you back. But the one thing I always ask people is: is there anything else you wanted to tell our listeners before we end today?
Kevin Campbell 37:20
I think the big thing to keep in mind is that humans are full of heuristics and shortcuts and biases. And I don't think that's a bug; I think it's a feature. The more that we can lean into that, as people in HR, as people who work with data, and understand that that's what makes us human, the better, because that's where the magic in experiential data lies. That cup of coffee is so much better when you see that Starbucks label on it, not because of the quality of the actual coffee, but because of the quality of the experience that you have. So I don't know if that has anything to do with the rest of what we talked about. But...
David Turetsky 38:15
It does, it does! It doesn't matter, it's okay. That's a wonderful, really cool way of ending, and I learned a lot today. Really appreciated it; it was a lot of fun.
Kevin Campbell 38:27
Yeah, I appreciate you.
David Turetsky 38:28
So thank you, Kevin. And thank you guys for listening. And if you liked the episode, please send it on to people who you think might enjoy it. And please hit subscribe. Thank you very much. Take care, and please stay safe.
Announcer 38:42
That was the HR Data Labs podcast. If you liked the episode, please subscribe. And if you know anyone that might like to hear it, please send it their way. Thank you for joining us this week, and stay tuned for our next episode. Stay safe.
In this show we cover topics on Analytics, HR Processes, and Rewards with a focus on getting answers that organizations need by demystifying People Analytics.