#225: From Stakeholder Buy-In to Stakeholder Knowledge of What That Means

This topic was such a big deal that we managed to have no guests, and yet we had five people on the mic! Why? Because this episode doubles as a marker of a shift in the show itself. Beyond that, though, we had a lively discussion about how every business stakeholder professes to be committed to being data-driven. That should make every stakeholder super easy to work with, right? And yet, analysts often find themselves struggling to get on the same page with their counterparts due to the realities of the data: what it can and can't do, and how it is most effectively worked with. Not a small topic! There were even pop quizzes (feel free to let us know how you'd score the answers)!

Songs Mentioned in the Podcast

Photo by Immo Wegmann on Unsplash

Episode Transcript


0:00:05.6 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

0:00:15.0 Michael Helbling: Hello everybody, this is the Analytics Power Hour, and this is Episode 225. Each moment is small on its own, but some of them loom large, and for all of us here at the Analytics Power Hour, this is a big one. A great one. As you know, we recently were able to bring Valerie Kroll and Julie Hoyer in to guest co-host while Moe was out on leave. It was great. So great, in fact, that we thought, how could we keep this going in some fashion? So I am pleased to announce that we made the offer, and they both accepted, to officially join the show as co-hosts. Now, this won't mean that we'll have five of us on every show; we'll rotate somewhat. But I couldn't be more honored and proud, along with Moe and Tim, to officially welcome Julie and Val to the show. Yeah.


0:01:10.0 Julie Hoyer: Why, thank you.


0:01:11.3 MH: Alright. So anyway, this episode is a little unique, 'cause we'll all be on it, and that'll be great, because to kick off this new phase of the show, we're talking about stakeholder buy-in. It's funny, 'cause as I was thinking about this topic, the song "They Don't Care About Us" by Michael Jackson was going through my head the entire time, which is sort of weird. But I do think there are misperceptions and some illogic around this topic, and there are also opportunities that maybe we can uncover in a conversation about it. So we assembled a team of experts. That's us. So let's dig in. Welcome to the show, Julie.

0:01:49.8 JH: Hi, happy to be here. Back again.

0:01:52.8 MH: Yeah, back again. And now in an official capacity.

0:01:53.8 Tim Wilson: Back for good.

0:01:55.2 JH: That’s right.

0:01:56.3 TW: That’s right.

0:01:56.4 JH: Never leaving.

0:01:57.5 MH: Welcome Val.

0:01:58.5 Val Kroll: Hello. Hello. Hello.

0:02:01.2 MH: Awesome. Welcome Moe.

0:02:02.1 TW: This is awesome.

0:02:02.2 Moe Kiss: Howdy. Although, Julie, now I have Eminem back again playing in my head, so, thanks for that.

0:02:08.2 MH: Okay. And welcome Tim, who probably doesn’t get either of those two song references from the past two minutes.

0:02:14.7 TW: Nope. Michael Jackson, I’m vaguely familiar with.

0:02:18.4 MH: Okay. Anyways, alright, and I'm Michael, not Jackson. Alright, well, let's dive into this, 'cause we all have stakeholders, and most of them do care about us, but we do sometimes run into problems, I guess, in that regard. So I think if we talk about stakeholder buy-in, stakeholders are obviously people who need access to the information the data provides. They need insights, they need information, they need to be influenced sometimes to choose the right direction, they need input on strategy. And as data and analytics professionals, that's kind of the role we oftentimes play: sort of advisors to people who are in charge of making decisions, or needing to make decisions, in the business.

0:03:05.6 TW: I mean, to me, the fundamental… As we were kicking this topic around, it's the challenge that you cannot find anyone in business today who doesn't say that it is a top priority for them to be data-driven, and that they are bought into using the data to get those insights and make decisions. Which all sounds great, but in my experience, I feel like they often think, great, I'm bought in, so here you go, analyst, go get me those insights. Like there's just this gaping chasm, because they say, I agree to use the data, so now you, the analyst, just give me some insights and recommendations driven by the data, and we're off to the races. And that's a challenge 'cause it's an impossible ask. They have made an assumption that, and I know I'm kind of caricaturing it a little bit, but I've got so many experiences playing through my mind where that was the case, they're like, we're bought into it, so go make some magic.

0:04:17.1 MK: But I don't think that's even true. Like the fact that sometimes stakeholders are like, we wanna be data-driven and we're bought into it… Like that's what they're saying, but that doesn't necessarily mean that that's how they're behaving. Because sometimes you do come to them and you say, either we didn't find something, or the data's really inconsistent, or we don't have a read on this. Like I had this exact situation happen this week, and they're like, well, that's not what we wanna hear. And then you're like, okay, well, then you're not actually that bought into it, because if you were, you would be, quote unquote, receptive to the message, whether it was good, bad or indifferent. And I think it's the same thing, right? People always say that they want total honesty, but the truth is they really don't. It's the same: everyone wants to use data to inform their decisions, but there's a very big difference between what you say and the reality of how you behave. Right?

0:05:16.2 VK: I was gonna say the same thing as Moe, though: they will say they wanna be data-driven, but when you bring it to them, it actually doesn't change their behavior. They still are acting as if you hadn't told them, or they're looking for you to just give them the result they expected. So it's like they were gonna do it anyways, no matter what.

0:05:35.8 JH: It's kind of like they've done their part by saying, "I will be data-driven," like, that's my role. I'll help you invest in technology, even make a case for it if you want me to, and just let me know when those [chuckle] insights arise and are cooked up to send to my… Drop into my inbox. [laughter]

0:05:53.6 MH: I think, Moe, there's a key part of it, the data being inconclusive. And that then, that's like, I wanna be bought in, but you can't give me absolute… You can't give me definitive answers. I wanna be data-driven, which my expectation means any question that I ask, you have a lot of data for, 'cause I know we have a lot of data, I've paid for a lot of data. And then you come back and say, well, it's a mixed bag, or it could be this or it could be that, or we don't actually have the data, or we need to go collect new data to answer that.

0:06:28.2 MH: All of that, I think, falls in… I think that's actually a really big one: the data is not gonna give you this perfect, definitive answer. I feel like it's less that, if there is a definitive answer and it's not what they wanna see, they won't accept it. I think, just in general, the answers the data provides always have some degree of uncertainty, and they'll look for a shred of uncertainty and say, well, but you're saying you're only pretty sure about this? Well, that's my out to go do what I think, because you need to give me the truth.

0:07:06.8 TW: Yeah. Or the other side of it was sort of like, well, I was gonna do it anyway, so this is about the slimmest shred of evidence that I can find that gives me the justification for the thing I wanted.

0:07:19.7 MK: Someone actually referred to it the other day as panning for metrics, and I was like, oh, that is a great term of like…

0:07:30.8 JH: It’s good.

0:07:32.0 MK: Like they literally are like shaking their gold pan, trying to figure out which metrics best support the narrative and…

0:07:38.4 MH: And the best metrics filter out to the bottom. That's what you… Yeah.


0:07:44.4 MK: I wanna be clear, like, this is obviously not every stakeholder, and I don't think stakeholders are the devil or anything like that. I actually think, and we've talked about this lots before, these are intrinsic traits of being a human, of looking for information that confirms what you believe is true, and things like that. It's not like our stakeholders are stupid or anything, I generally think they're phenomenal people. But it's like we're trying to overcome the ways that our minds work, and that is even more difficult, I think, for someone that doesn't specialize in data and understand the nuances with uncertainty and things like that. And so… Yeah, I do wanna get to stakeholder empathy, but that's a later-in-the-show point.

0:08:34.5 VK: I think especially for more senior leaders who have been around the block and have some experience under their belt, they were making decisions long before they had access to this type of data, and so it's a lot of unlearning of behaviors too. And if they're unable to model it, that's like perpetuating it to their teams, and so it's a lot of change management. And I don't think an analyst starts on day one like, okay, first thing I'm gonna do is put together my change management plan, that's more important than getting access to the systems. Right? But it's a huge component.

0:09:08.1 MK: Do you know the funny thing is, so, very controversial, we just killed a multi-touch attribution model a couple of weeks ago, and I've been checking in with all the team leads, and I'm kind of like, how'd it go? And then like, no one's…


0:09:26.1 MK: I was waiting for it.

0:09:26.2 MH: Somebody needs to take this away.

0:09:28.4 MK: I was waiting for it. [laughter] But like the feedback from all the team leads, 'cause I wasn't here when this happened, was like, oh, no one really seemed to care, like it wasn't a big deal. And I'm like, it wasn't a big deal because we've been telling them we were gonna do this for 18 months. We have told them and told them again, over and over and over, that this was coming. And it's the same when you're delivering news that people don't wanna hear, like you can't just walk into a meeting and be like, "Here's a bunch of numbers that completely refute your 12-month road map." Like you need to… It's change management. Totally, totally. All this stuff is change management.

0:10:06.2 MH: Yeah. How about this question? Have you observed, 'cause I certainly have, that you're not the only one, the only voice in the conversation when you're communicating data results or insights, and that other people are also contributing? So other agencies maybe, or other partners, coming from other places, like people will be like, "Oh, I analyzed this too, and I came up with this." Do you run into that pretty regularly?

0:10:39.3 JH: Yes.

0:10:39.4 MK: Julie’s feeling very courageous.


0:10:44.0 TW: Oh yeah.


0:10:46.3 JH: Yeah. Let me say yes one more time. [laughter] All the time.

0:10:48.5 MH: Well, talk about that a little bit, Julie, 'cause I think that's more common than we maybe even realize, and maybe that's part of the problem sometimes too.

0:10:58.5 JH: Yeah, definitely. I think it's hard when you have your group of stakeholders at your client, and you are one of the agency or consultative voices at the table. And in the background sometimes, what I've run into too, is like they're talking to us saying, we believe you. You kinda share things ahead of time, what's going on, your thoughts. They seem very aligned. You get on the call, you're presenting what you found, maybe questioning some things about different strategies they're using for media or what the intent behind some of them is, and then you get the other agencies coming in and saying, "Oh, well, actually, that's what it's meant to do."

0:11:38.4 JH: And your stakeholders kinda stay silent, even though they were questioning it, and it gets very hard, because it's not my job to tell the agency they're wrong, and I can't hold that other agency accountable to what I heard my stakeholders say on another one-on-one call. So it gets very tricky, and it very much becomes a question of, okay, so have you changed strategy for the coming weeks or months? Are we gonna see a change? Are you saying everything's staying the same, stakeholder? Are you saying this is what success looks like? 'Cause from what we talked about, it didn't seem like it. From our point of view, it seemed like there should be some shifts. Yeah, it's not a fun monthly cadence to be on, it's hard.

0:12:20.2 TW: Well, that goes back a little bit to what's in the mind's eye of the stakeholder: like, sure, it's data, it's objective, so no matter who looks at it, and in which data set, it's all gonna point in the same direction. And then in that sort of client multi-agency scenario, there are different agendas, whether malicious or conscious or not, different data sets available, all of that. And the stakeholder comes in saying, I am ready, I'm expecting agency A, agency B and consultant C to all basically arrive at the same conclusion. And when they don't, they're like, "Oh, wait a minute. But I wanna be data-driven." And again, it kind of bounces back to, "Well, you guys need to go figure it out, 'cause you're not all giving me the same answer," when the reality is, well, everybody may be right, and it's more…

0:13:15.6 VK: And the fastest way to kill that trust too, is all you have to say is, well, in another system, the number looks different, that’s not what I thought. [laughter] Immediately…

0:13:22.7 MK: Yes.

0:13:23.6 JH: You've killed all decision making or trust that might have come out of that meeting. It's like the easiest way to do it, whether it's intentional or not, used for good or evil, I feel like that just kills it. And to your point, Tim, what blows my mind is in these monthly meetings that you get stuck in, of like, let's talk about last month, which ended three weeks ago, and talk about what happened to make decisions moving forward. You know, they get so caught up in explaining what happened, and they want you to magically tell them which direction to go down the road, but they haven't told you if the road has a left and a right, or it's a three-path fork, or there are five directions they might go. They're just like, "I'm standing at the edge of a cliff, like, which way should I jump?" And they're expecting you to come out with, "Here is exactly where you should go," and it's never gonna happen.

0:14:12.1 TW: It does, but I need to have like a timer or egg timer in this thing with… This is not to say that stakeholders are bad or dumb or evil. It goes back to the empathy, that's the challenge, 'cause they kind of have an expectation and they're not necessarily armed with the tools. And I think they're usually really trying to help. Like every time a stakeholder gives a very, very specific data request, that's kind of an alarm bell, because it makes me… They're trying to be helpful, but they may not actually be asking for the right thing, and that winds up being another… But in the case where numbers are diverging, the stakeholder says, I can help by decreeing that we need to figure out which system is right or understand the discrepancy, and all of a sudden multiple resources are heading down the path of trying to reconcile numbers, because it's tangible. And even the analysts get sucked into that.

0:15:09.9 MK: Do you know, it's almost like my experience where there are different numbers: they can't take a step forward until we can explain why the numbers are different. Generally, I'm like, if it's less than 10%, I don't give a shit. Like I'm not spending… Or I would tell someone on the team not to spend time on it. But for stakeholders, they're like, "No, I need to know why it's different," and that's not always a question we can answer. Even thinking about it now makes me wanna pull my hair out because…

0:15:49.8 VK: One of the times, and this kinda blends a little bit of the pain you were talking about, Julie, and what you're referring to, Moe, I was on site with a client, and we were dealing with multiple agency partners, different consultants in the room. And we took a quick break, and I could tell that my primary client was nearly in physical pain, because he was so distraught about numbers not lining up and matching. And so I actually went to one of the agency partners on the side, and was like, "Hey, can we go grab a Starbucks real quick?" And I was trying to get him on board with, before we meet next time, I wonder if you and I could connect, and we can just try to figure out what the story is, whatever. And I'll never… I was so surprised by his reaction. He's like, "You know, it just creates more questions for me, which means that my contract will probably renew, so I don't mind the questions." And I was like, what? What? I can't even take this back to my client. I was like, this is gonna be this hard every single month, every single time. This poor client.

0:16:47.9 MK: That is gross.

0:16:51.6 VK: Yeah. That was rough.

0:16:51.7 TW: That’s very callous.

0:16:52.4 VK: That was rough.


0:16:55.3 MH: Alright. It’s time to step away from the show for a quick word about Piwik PRO. Tim, tell us about it.

0:17:01.4 TW: Well, Piwik PRO is easy to implement, easy to use, and reminiscent of Google’s Universal Analytics in a lot of ways.

0:17:06.6 MH: I love that it's got basic data views for less technical users, but it keeps advanced features like segmentation, custom reporting and calculated metrics for power users.

0:17:15.8 TW: We’re running Piwik PRO’s free plan on the podcast website, but they also have a paid plan that adds scale and some additional features.

0:17:21.8 MH: That’s right. So head over to piwik.pro and check them out for yourself. Get started with their free plan, that’s piwik.pro. Alright, let’s get back to the show.

0:17:32.9 MK: Okay, Tim, can we talk about… In prepping for this episode, you said the fundamental question is how to fight the frustration that they just don't get it. And I feel like you're spot on, that is the core of the issue. One of my bugbears at the moment, and I'm trying so hard to get everyone around me to stop using this language of, "Oh, we need to educate the stakeholders." I find it so fucking condescending. And I know that when people say we need to educate the stakeholders, it's coming from a good place, but it's almost like, you're an idiot and I need to teach you a thing or two. But it comes back to, they don't get it, that is the reality. And we are trying to help them understand it, but it's like, fuck, I can't even talk about this without having this really visceral physical reaction to the topic.

0:18:33.5 TW: I mean, I do think that empathy is like so critical, I agree 100%. I don't like data literacy either, 'cause it gets defined as, we're gonna teach them how to log into the system, or we're going to teach them how hard our jobs are, basically, 'cause we want to walk them through it.

0:18:55.9 MH: Yeah.

0:18:56.5 TW: But I think back, just in the last couple of years, as I've been more involved in some more advanced statistical or machine learning type things, there were concepts where we were trying to figure out what the comfort was for a Type I versus a Type II error, and putting a lot of work into it, or even with the CausalImpact package, trying to say, "I don't wanna show the work." But if we do this well, we can say, here was your ask, and we need to take you along just enough so that you get some intuition about what we're doing, and you'll pick up, in that case, that there's some uncertainty in it, and you'll get visuals that you can internalize.

0:19:45.9 TW: So then, I like to think that's taking them forward as well. It's like, in the moment you had a specific ask, we came up with a way to address it, really laboring hard on the communication and the visuals to say, I am not gonna walk through and try to teach you statistics, but I do want you to be really comfortable with what we did and what conclusions we can and can't draw from it. And none of that has to do with, "Well, you're dumb. And I need to…" 'Cause it's the flip side: I need to be completely inside your head so that I'm communicating to you really effectively, even though maybe the back of my brain is like, "You just don't get it."

0:20:32.5 MH: I find myself asking the question directly more and more, of like, how close do these numbers need to be? Or how much confidence do you really need to have to be able to make a decision? Because that sort of helps me then know, okay, for this stakeholder, for their decision-making process, how comfortable are they? And I don't deal with the advanced concepts you deal with, Tim, as the quintessential analyst. Yeah, you knew I had to slip that in there.

0:20:57.6 S1: You think that’s cool? Fuck you.

0:21:01.6 JH: We gotta get a sound for that.


0:21:03.2 MH: Yeah. But I think that's one of those things where, yeah, they're trying to solve a problem they have, and I'm trying to bring the data as close to their problem-solving realm as possible, to make it as simple as possible. 'Cause more times than I care to admit, I'm completely mystified by why people choose to behave the way that they do around the data that they have. And that's disappointing to me, 'cause I'm trying to understand, I'm trying to be empathetic, but I'm sort of like, why would anyone rational act the way you're acting right now? And sometimes it's because they don't care that much. Sometimes that happens. But for people who actually care, a lot of times it's because there's sort of like this misinformation or inability to kind of like… I'm running out of ideas of how to say it.

0:21:55.1 TW: Or you don’t understand or you don’t know something yet. That’s the other like…

0:21:58.3 MH: Yeah, something I don’t know about what they’re trying to do or how they’re trying to decide. A missing component from my side.

0:22:06.2 VK: I almost wonder, to the data literacy part, it's a little painful because, exactly, we're not trying to teach them our job. They don't need to be able to look at the graph of the results and pull out the exact same summary we may give them. It's almost like building the trust between the analyst and the stakeholder, right? It starts with the analyst understanding the business deeply enough to say, what are you really trying to achieve? Get after? What do you really need to be asking of your data? And help guide them that way. And then when they ask, they trust you. When you bring back results, and the way you communicate the results, they're either going to question how you got that, and then I feel like that puts you down the route of, oh, data literacy, I need to explain exactly how I got this and blah, blah, blah. But if you can build that trust, then I feel like that partnership works better, and you fall less into, "Oh, we need to make our stakeholders better at looking at data."

0:23:00.1 TW: I agree. And there's a piece of that: if they ask for something very specific, the way to respond is to say, help me make sure I really understand what you're trying to do. So there's a little bit of, I'm framing it as my responsibility to understand what you're trying to get to, and through the course of the discussion, the questions that I'm asking can soften them up a little bit. One, it builds trust, like, oh, they really care and they're trying to figure it out. But two, oh, I guess the time-frame I'm looking at does matter, or, oh, I guess that unexpected event maybe does matter. So it softens them up a little bit. And then coming back with both: here's exactly what you asked for, but here's also maybe something that's a little more nuanced, maybe a little messier, maybe a little less conclusive, but it's appropriate. It's kind of trying to walk down the path with them and not trying to solve it all at once. Like, oh, my first deliverable, I'm going to completely transform the way that we interact.

0:24:08.6 MK: Tim, pop quiz. I literally had this conversation happen yesterday, and I feel like this is the perfect time to revisit it.

0:24:16.6 TW: Oh, shit.

0:24:17.8 MK: With you as a quintessential analyst. So stakeholder…

0:24:21.0 TW: Got to get the sad trombone sound effect.

0:24:23.9 MH: Thank you.

0:24:25.9 MK: Okay, stakeholder Johnny comes to you, Tim, and goes, hey, I want you to look at metric A for June 2022. I don't know, people, give me more specifics. I want you to break it down by platform, by time. I want to look at different countries. And can you also add in some, I don't know, people, help me here. Give me a very specific…

0:24:49.3 MH: Device category.

0:24:50.8 MK: Device category.

0:24:51.6 MH: Browsers.

0:24:52.0 JH: And I want percent change year over year.

0:24:52.8 MK: I want percent change year over year. So, Tim…

0:24:58.7 MH: I mean, it goes without saying, right?


0:25:00.9 MK: This, obviously. Tim…

0:25:01.0 JH: It’s a given.

0:25:03.0 MH: It’s a given.

0:25:03.2 MK: I’m curious, right? So, let’s say you know Johnny pretty well. What pops into your head? How do you handle it?

0:25:11.4 TW: I mean, I'll admit, what's still gonna pop into my head is, I'm gonna immediately go trying to figure out how logistically I can actually pull that off and how I might visualize it. And it's a problem. Which, I don't think I'm the only analyst who has that…

0:25:25.8 MH: Interesting.

0:25:25.9 TW: If you're giving me something that's challenging and specific, I'd probably start doing the math of saying, "Well, if you want it broken down by this and by this and by this, let me explain math to you. You just said six times four times 12, you're going to have a shit…" That's not the right way to go. Even though that is what my…

0:25:44.3 MH: It’s fascinating.

0:25:44.4 TW: What?

0:25:45.0 MH: No, I immediately try to figure out, why are they asking me this?

0:25:50.3 TW: No, that’s the right thing to do. So that’s where…

0:25:52.0 MH: No, I just didn’t realize the brain works differently for different people.

0:25:55.5 MK: Totally.

0:25:58.5 MH: Which I knew, but I didn't totally get until you just started talking about it. It's just so cool.

0:26:00.3 TW: But I think there are plenty of analysts who don't recognize that's not how they… They're like, "Oh, they gave me specific requirements. Let me go meet the requirements. If I meet the requirements, then they will be happy." And they are, initially, 'cause they're like, this is what I asked for, but I can't actually do anything with it. And then that heads down the path of not helpful.

0:26:21.5 MK: You’re avoiding my question/pop quiz.

0:26:25.8 TW: I mean, I would literally turn it around to say, okay, one, let me make sure I capture everything you said, I want to write it down, you rattled off a bunch of stuff. Can you help me understand kind of where you're going? Is it because… I would start turning it into hypotheses that I'm trying to come up with on the fly. Do you think that year over year… Is your expectation that this year, because we came out of COVID, was wildly better? And if so, just help me understand what you're gonna do with that. And I'm not challenging you, I am just trying to make sure I actually understand what you're trying to do, 'cause I'm going to have to make some judgment calls as I pull this, and I don't want to be peppering you with questions on Slack constantly. So can we talk about this a little bit more?

0:27:16.5 MK: I love this.

0:27:17.7 TW: So one of the things I do hate about, it can…

0:27:22.9 MK: Tim’s exhausting. Yeah.

0:27:23.1 MH: It is. Yeah.

0:27:24.5 JH: Too much pressure.

0:27:25.0 VK: Johnny’s a pushy guy.

0:27:28.2 MH: That's actually, Tim, I've seen you do this, and it's one of the things I really do hate about you: you always have like three or four more really great questions than I do in that moment. I do think it's a really big strength of yours, and that was the perfect way to say it, 'cause you are able to go in and dive really deep into that really quickly. Whereas for me, I play the conversation back in my head and be like, well, what was their emotional state like? Were they stressed out? Were they fine? But you actually go and ask some very detailed questions, which I think gets you to actual real analysis faster. I tend to sit and think about it for about four or five hours and then come up with a hypothesis of my own that I then use to answer the question.

0:28:19.3 TW: But the thing is they can be, they can be dead wrong.

0:28:21.0 MH: Well, both of us could.

0:28:21.3 MK: But Helbs, I love that you focus on the emotional state, right? Because sometimes when someone comes to you with a really specific request like that, it's actually not their request, it's that they're under a fuck ton of pressure. Someone has asked them for something, for like a board deck, or like a founder or something intense, and they've been asked something very specific. So they are in this pressure-cooker environment already, and they might not even have the information to answer Tim's questions. And so I actually think that giving focus to their emotional state, and what's going on with them when they ask, is phenomenal. Like that's such a lovely approach that I hadn't really thought of.

0:29:01.7 MH: Being an analyst in an e-commerce environment during the holiday season? Well, boy, you learn all about matching intensity.

0:29:09.5 TW: But that is a huge one, 'cause that comes up a lot. And it's like, well, if I'm asking for more and you get the, well, so-and-so needs it, management needs it, that is challenging. And that is one where you say, well, how important is the speed? How important is the accuracy? What's the minimum viable product? But it's also an opportunity to say, you're now on my team, 'cause now we together need to make our best guesses about what will solve that. And this is one where they kind of fucked up: somebody came down and threw something at them, 'cause the analyst is often at the end of the chain, and you may be two or three levels removed, and you're being told you gotta plug some number into this thing that makes no sense.

0:29:56.4 MK: Totally.

0:29:57.7 TW: If it's easy to pull, you can be like, I mean, you have to use judgment and say, I'm gonna just plug it in. But I think that sometimes, kind of the problem is the stakeholder didn't take a breath and ask a little bit of a deeper question, and you're not gonna solve it for this scenario, but it's another opportunity to say…

0:30:19.4 MK: Sorry, pop quiz.

0:30:20.8 TW: Yeah. Shit. Okay, Julie’s answering this one.

0:30:23.5 MK: No, no, no. This time I'm gonna go to Val or Julie, like, I'm happy for whoever wants to jump in. In this scenario where it has come down the line and the analyst is the last person, what are your views? I like Tim's approach of, okay, stakeholder, we're now on the same team, we're trying to answer this question together. But do you think it's important for the analyst to just work with that stakeholder, or do you think they should go back up the line and be asking questions so they can better understand the question?

0:30:54.0 VK: I absolutely think going up the line, and maybe even going up the line on your own side as well. Because I was thinking about like how I would react, and I would actually struggle to even listen after like the third metric or cut of the data, because my mind would be thinking about, like, what meeting was he or she just in, or preparing for? Or did I hear about something in that town hall? I would just be trying to connect the dots, because there’s just no chance that whatever we’re trying to squeeze out of this is gonna ultimately answer those questions. And so I love the, we’re on the same team, let’s do this together. And that might even be, let’s frame up some questions just to make sure that we’re getting at exactly what we need for whatever said high-pressure, high-intensity meeting. But I do think that’s critical, because otherwise you’re just perpetuating the cycle. It’s losing the opportunity for a teachable moment.

0:31:49.4 JH: And it’s always kinda scary when you’re the analyst at the end of the line, ’cause I agree with what Val said. Like, I would push back, and if I needed to, I would figure out a name, someone that maybe I could go directly ask some questions to. Because when you’re the analyst at the end of the line, now I’m responsible for the number that’s getting tossed around randomly. And if somebody disagrees up the ladder, guess who it comes back to when you had no contact. So my…

0:32:10.8 MK: Totally.

0:32:11.9 JH: My spidey senses always go up of like, okay, I’m gonna need a little bit more before I feel comfortable tossing a number over the fence that’s going to be tied to me.

0:32:20.3 MK: Totally.

0:32:22.4 MH: Yeah. I’ve definitely been the person to throw the pin and hold the grenade. And that’s not been a positive experience.


0:32:31.7 TW: Oh, man. There was, somebody was actually, I can’t, this is gonna come to me about four and a half minutes after, ’cause I cannot remember who, but they were giving an example, and this was within the last month or so, where the numbers they were pulling were wildly out of whack with some other number. And it was basically things had been lost in translation, and the cut that was being looked at was actually a very, very narrow slice, was what the ask was for. And this person was never told that and could not get historical numbers to reconcile at all. It was off. And just the amount of churn and spin, and you would like to think, oh, the organization will learn that this daisy chaining of requests never ends well. And instead it’s like, nope. It’s often like, well, I don’t know, it took the analyst forever to figure this out. It’s like, well, you kind of set them up for that, but.

0:33:33.5 MK: Failure.

0:33:34.5 TW: Oh, man.

0:33:34.8 VK: Well, just really quick, to go back to something you were saying earlier, Moe, about how you don’t like setting aside an initiative for, like, educating stakeholders, which I agree with as well. Back to taking advantage of teachable moments. One thing that I really like, because, okay, let me pause. If I were to sit down and put together a deck, like, we’re gonna educate the stakeholders, it’s so easy to be like, okay, we’re gonna talk about these two topics, and it’s like, well, if we talk about A/B testing, you kind of have to add multivariate, and now we need to add this. And now I’m teaching you so many things that actually aren’t critical for you to be able to, in your role, use data to help inform decisions. And so with the teaching within the context, when you have the time and space of a request or a project that you’re collaborating on, one thing that I’ve found very helpful is like the test plan, or if it’s an analysis, the analysis plan, so that you’re talking about, what assumptions are we walking into?

0:34:26.3 VK: What is the business case that we’re trying to prove out? And being able to step through those questions together when you have a little bit more breathing room, and even going all the way through to the end, like, what is the outcome matrix? So if this is the outcome, what actions are we gonna be taking? And when Julie talks about this with clients that we’re on together, she talks about it as the soft contract: if this is what we see, this is what we’re gonna go do. And so it’s really making sure that all your efforts are gonna be tied to action, and there’s just a lot of goodness there. But you’re not doing all the education at the end when you’re presenting the results or the analysis; you’re kind of bringing them along that journey. And so then you’re making them aware of some of the caveats of this approach or, you know, some loopholes with this data that we’re gonna have to work around. But I’m curious, like, you know, if that’s something that you’ve seen in the past, or if others have, you know, different ways to approach the education without it being a 165-slide deck where you try to teach them to become an analyst.

0:35:24.8 JH: I do think the education upfront helps a lot, because I think a lot of the issues we’re talking about come from, again, the analyst being tacked on at the last part. Like, they’re not brought into the process and actually brought to the table as somebody understanding the data, owning that responsibility for the project, and being looked at as a key person on the team to be there from the beginning, when someone is talking about planning. Because even with the idea of that analysis plan that we have used as the soft contract, I’ve had experiences where we outlined the whole thing with assumptions and how it was gonna be done, and it was dependent on how certain media was run, and the stakeholder got it, they understood it. We got them to say, yes, the question I’m asking is this, this is our hypothesis about it, this is what we’re trying to validate by doing this analysis.

0:36:17.5 JH: And then unfortunately, the way the media was actually rolled out didn’t happen that way at all. And so what we ended up having to do is go back to the stakeholder, and they were freaking out saying, “But I need something to show leadership, because I had just spent a lot of money on this media campaign. What can I learn from this?” And so we actually had to go back to that approach, that documented soft contract, and say, okay, well, because of what happened, we can’t answer or validate this exact hypothesis. What we can help you answer are maybe some of these other questions. It’s not the key question you cared to answer, but they trusted us to say, here are your viable options given the outcome and what happened. You know, these are the learnings, they’re not what you hoped for. But I feel like that was such a good example of when we were able to partner early on in the level of education we were giving them. They didn’t need to see all the charts or know all the math or understand all the statistics behind it, but they got it at that business level of what they were getting out of it and what decisions they could make based on it.

0:37:17.0 MK: Well, fuck, Julie, you are just so good at your job. Like I’ve never even worked with you, and I’m sitting here in awe being like how…

0:37:25.1 MH: Pretty good. Pretty good. Pretty good.

0:37:26.5 MK: Pretty fucking good ’cause everyone else here has worked with you.

0:37:30.4 TW: But the other piece is, because you had that discussion up front, when it loops back around… ’Cause if you hadn’t done the upfront and the media got screwed up, then you’d be like, I can’t. And they’re like, well, what the hell? You’re the analyst, they execute, you analyze. And instead you could be like, we talked about this enough up front, don’t you remember how the critical thing was that they executed this way, and they completely missed it, and we can’t just wish that away. If it didn’t matter, we would tell them to execute however they wanted, and we’d punch the magic R code in and it would solve for it. So it’s like, having that upfront interaction then sets you up to go back. It’s like, well, this isn’t what you wanted to hear, and I love the, “Let’s see what we can find.”

0:38:24.3 TW: Couching it carefully and like… But this isn’t answering the big question, so let’s not think that that’s okay. ’Cause that’s the thing, I think we forget that if we say, well, this is screwed up, and we’ll just do the best we can, and we don’t want to throw anybody under the bus and we want to move ahead. You want people to feel some pain, because you do want to drive the change, and so walking that… You’re not trying to inflict pain, but there’s the constant, “Oh, we’ll set targets next time. We’ll just do something, we’ll set targets.” That’s the tough thing, is that there needs to be friction and some discomfort to get the upfront discussions happening better the next time.

0:39:06.7 JH: That’s definitely one of my biggest pain points. Like, you want to help them, but how are you realistic without just saying, nope, I’m not doing anything for you, while also not giving in to the bad habits and giving them false hope or bad information? It’s not fun.

0:39:23.4 MH: Yeah, and not come out looking like the bad guy at the same time. Alright, I want to flip it around a little bit, ’cause we’ve been talking, and I think it’s been really great. But I’m sure all of us have, at various points, worked with stakeholders who were amazing to work with and were delightful. And so think about those experiences a little bit: what were those stakeholders doing that enabled you, that made that such a great interaction and such a great partnership? ’Cause I feel that’s kind of… We can kind of end this whole thing on a little bit more of a positive note, you know.

0:39:54.8 MK: Please can I go first?

0:39:55.1 MH: Yeah, of course, yes. Please do.

0:39:57.7 MK: Okay. ’Cause I actually… Yeah, I have had a gazillion amazing stakeholders. What Tim said about, like, we’re on the same team and we need to work this out together, it’s generally people that are, “Here is the problem that I’m trying to solve,” and not just, “Here is what I want you to do.” It’s here is the problem, and starting there and then working it through together, that is where you always see… You see the analysts really push the boundaries on what’s possible, and you see, generally, a really good outcome for the stakeholder too.

0:40:34.0 JH: In Tim’s words, I have had the best interactions with stakeholders, and the best outcomes, when they view you as part of their team. We’re on the same team, we’re working towards the same goal, and they outline, here are my goals, and they kind of open it up to: how can you help me reach those within your wheelhouse? How do we make this partnership as strong as possible?

0:41:00.1 MH: Yeah. It’s interesting, ’cause I was thinking about this too. I think one of the things for me is if they can tell me what they’re trying to do, like, big picture, in addition to just this one thing we’re working on now. I’ve found other ways to add value throughout the relationship, where I’m kind of like, “Oh, actually, because I know this about what you’re trying to achieve, I’ve also now done this, this and this, that you don’t even think about, ’cause you’re not in the data, you don’t think about these things in this way, but here’s how I can kind of set you up for success.” And so that’s kind of cool too, and it feels nice, ’cause then I feel, “Oh, I might have added some value there.” So that’s one way I thought about it.

0:41:42.1 VK: One of my favorite clients that I worked with when I was internal, his name is David, he was super open to a major reset when we first started our interaction. And one thing I think we should all just keep in mind is, when we’re working with these people and these marketing teams, these product teams, it’s likely that they’ve worked with analysts in the past, and it’s also possible that those have not been positive interactions. And so you’re dealing with the trauma, the data-trauma, of previous interactions, and that’s something to keep in mind.

0:42:10.3 VK: And so David was open to just taking a quick pause, and we talked a lot about what we each have to do in our roles to be successful, and I got a really great understanding of what he was accountable for within the store. And so when we’d get together to meet, he would always say things like, “The last time we released all those new products, you said that that did something to the tags, so I just want to let you know that that’s gonna be changing in a couple weeks. I don’t have a date yet, but I’ll send you that. Is that helpful information?” And so we were always checking in on, I’m saying this, is this helpful, is this valuable, and it was just really open and honest with each other, and so.

0:42:45.3 MH: You’re like, no, I’m crying ’cause I’m happy.


0:42:49.6 VK: These are happy tears.


0:42:50.6 VK: Yeah. But that really enabled us to just… No conversations were ever charged. I felt what I was doing was what he needed, I was able to do it in the appropriate time frames, no one ever felt rushed or crunched. And so, again, back to partnership, but also the reset, because again, there was some baggage being brought into that conversation. It was good, it was healthy after that.

0:43:13.4 TW: I’ll throw mine in. I’m thinking of a couple of people specifically, and it goes beyond them talking about the business problem: it’s the ones who are really enthusiastic about their job and what they’re doing, and you can’t get them to stop giving you all the background and the context. One, I generally find that interesting, if it’s somebody saying, so here’s how this works, and they take a big step back and they’re just so excited. And a lot of times they have to explain that to lots of people, but the fact that they’ll take it and say, “Oh, forget about the problem. Forget about the data that I’m asking for. Let me just talk about what I’m doing and why I’m so excited about it.” That’s gonna then get to what the challenges are, partly ’cause those are just interesting conversations. And it goes back to my contention about actually trying to get inside their head; when they’re trying to scope it down to just the problem and give you just enough additional context, it gets…

0:44:21.8 TW: It gets a little contorted. And the ones that have been willing to not only say, “Hey, the first time we’ve met, let me give you the big picture about how the business works. Here’s where I fit within the business,” but then, ongoing, “Oh, hey, something happened last week, a lot of craziness going on inside, ’cause this is what happened,” and then if there’s the room for them to spend five or 10 minutes explaining how those dots connect and what that means… ’Cause I think ultimately, I’m always more successful as an analyst when I actually understand what’s going on and I’m interested in it, and it’s tough for me to be interested in it if somebody hasn’t told me a story that makes me think it’s worth being interested in.

0:45:06.8 VK: One more to add, I would be remiss if I did not say this one: when you get someone that’s super excited, like what you’re talking about, Tim, and then they become your evangelist, it can make space for the good work that you do inside the organization, where you don’t have to be your own cheer squad while doing the work. Because there’s a lot of meetings that you’re not gonna get invited to, especially at first, and so if someone can be there to talk about the value you can bring to a conversation or a problem, you really, really come to appreciate those people too.

0:45:36.8 MH: Oh yeah, giving you the inside track on like, “Hey, in this meeting, we really gotta make sure we get this point across or something.” Oh man, that’s so valuable.

0:45:45.5 TW: Those are the people where, if I’ve got any spare bandwidth, those are the ones that I’m going to try to figure out how to go above and beyond for. There’s only so much time, but I want to make them really see things. I know they’re on my side, I know we’re on a team, so let me invest in that relationship more, because they’re going to be a lot more authoritative than I am when it comes to talking to their peers.

0:46:12.2 MH: I think the other thing for me is also when they really know their metrics. So we talked a little bit earlier about sort of data fluency, data literacy, whatever. I don’t need people to know Type I, Type II errors and that kind of stuff, but if they understand, okay, my metrics are this, and when they’re moving like this, I already know what’s happening with it, then we avoid so much conversation about why did this number change. As an analyst, sometimes it is so annoying to go do all of that work, and then you could sort of be like, yeah, I saw this number moved. Oh yeah, we did this change to the code base, so it’s going to do that from now on. Oh, nice. Okay, thanks. Wow, you already thought about this. You already know the answer. So let’s just keep rolling. It’s kind of nice.

0:46:55.8 TW: I don’t know, I mean, there’s the flip side of the people who totally know their metrics, but they’re the wrong metrics, they’re misguided, and you can’t get them to move off of it, right? They’re like, “Oh, our CPC benchmark, click-through… you gotta look at the click-to-open rate.”

0:47:08.9 MK: I love that Tim has brought us full circle back to…

0:47:13.4 TW: We’re not gonna end on a positive.

0:47:13.6 MK: How did we end this on a negative note?

0:47:17.1 MH: There’s no way this show can’t end on a positive because look at us, we’ve got two brand new co-hosts. Everything is right with the world.

0:47:26.4 MK: That’s true.

0:47:26.5 JH: But can I do one…

0:47:27.6 MH: Yeah, Julie go.

0:47:30.0 JH: To shoot off of what you said, Helbs: I think the ideal state, or a great stakeholder to work with, is one that is less focused on how their metrics moved, only looking backwards because they’re so fixated on explaining what happened. If they are really integrated into the business and they understand some of those bigger storylines happening, they’re more focused on, how do I make a decision moving forward based on my data? Like, these are the things I control and I can do, and I need you, my analyst, to help me decide which lever I should pull. Or, “Hey, I really think my gut’s saying A, but I trust you to help me test that.” There’s a lot of nuance there, but if you have a stakeholder that can actually say, I’m less interested in looking at what happened six weeks ago, where you have to tell me exactly why a metric moved up 10%, and I’m more interested in this bigger ecosystem, and I’m looking forward because I want to go optimize something, I want to make something better, I feel like that is your golden stakeholder. Because they’re not stuck in this more classic way of thinking about data where you’re always looking backwards.

0:48:38.4 MH: Yeah, having a plan. So have a plan, be smart, care and look out for your analyst. That’s the four stakeholder golden keys.

0:48:50.8 TW: The first Analytics Power Hour, e-booklet, well, e-pamphlet will be coming out…

0:48:55.1 MH: Oh geez, now you’ve gone too far. Speaking of going too far, we do have to start to wrap up the show. We’re not gonna do last calls this episode, as we just wanted to soak in the newness of this whole thing. But Val and Julie, what an honor, thank you both for joining the show, it’s pretty exciting to have you.

0:49:14.3 JH: Yeah, this is gonna be fun.

0:49:17.7 MH: And Tim, I know this is gonna mean you’ll have to figure out how to ask shorter questions and talk less, and… no, I’m just kidding, that’s probably not gonna happen anyways.

0:49:29.5 TW: It’s just more work for Josh.

0:49:31.9 MH: That’s right. And Moe, it’s great to have you back as well after leave and everything. So, yeah, I’m pretty excited about the future. But if you’re listening, we would love to hear from you, via whatever way you’d like to communicate with us, and that’s best done on the Measure Slack group or Twitter, or our LinkedIn page. And so, yeah, we’d love it. Tell us your thoughts. Have you dealt with great stakeholders? Is there a fifth golden rule? I don’t know. Let us know. Alright, and no show would be complete without a mention of Josh Crowhurst, our awesome producer. His job just got so much more difficult, but he’s up for it, and we can’t thank him enough. Josh, thank you for all you do to make the show possible. Alright, I know that you probably work with many stakeholders, and as you’re working with them, you’ve got a lot of things to figure out. But remember, from all of my co-hosts, Julie, Tim, Val and Moe, I know I can say with confidence they would want you to keep analyzing.

0:50:41.7 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @AnalyticsHour on the web at analyticshour.io, our LinkedIn group and the Measure chat Slack group. Music for the podcast by Josh Crowhurst.

0:51:00.0 Charles Barkley: Smart guys want to fit in so they made up a term called analytic. Analytics don’t work.

0:51:06.5 Kamala Harris: I love Venn diagrams, it’s just something about those three circles and the analysis about where there is the intersection, right?

0:51:17.0 TW: So I have a mechanical keyboard, so I when I type…

0:51:20.7 MK: Oh boy. It sounds highly aggressive.

0:51:27.4 MH: Yeah, we know, we know.


0:51:27.4 JH: We are aware.

0:51:30.6 TW: So what I usually do is I usually mute, if I’m going to type something as best I can, or I try to type super softly, which sometimes…

0:51:38.0 MK: Which never works.

0:51:40.3 TW: Even when I do it…

0:51:41.1 MH: Let’s see if you can hear this.

0:51:44.6 JH: Yes, we can hear…

0:51:45.4 TW: Oh shit.

0:51:46.5 MH: It’s mechanical.

0:51:49.3 TW: Yeah. Well, you know what? No regrets.

0:51:55.8 VK: ‘Cause there’s five of us and you have dot dot dots between welcome and all of our names, are we responding to your welcome message?

0:52:01.6 TW: Yeah, yeah, yeah, you just be like, “Hey, what’s up.”

0:52:04.1 JH: This ones…

0:52:04.9 MH: Yeah, yeah, no I’ll be like, “Silence.”

0:52:07.6 MK: The last time Julie be like…

0:52:11.6 JH: Well, the one I was choking on my coffee, I was not prepared, I was like…

0:52:16.1 MK: Yeah, exactly.

0:52:17.0 TW: What did I say before, Julie? On this podcast, you speak when spoken to. No, I’m just kidding.


0:52:25.9 JH: Well, fuck.

0:52:27.4 TW: And mark up.

0:52:29.0 MH: Yeah, there you go. That’s right. Yeah, ’cause that’s right. Next on Business Insider: analytics podcast’s abusive environment. They get quotes from poor Josh, it’s like, “Yeah, for no money at all.” Okay, here we go. Five, four… I got this. Okay. Five, four…


0:52:52.7 MH: So who wants to kind of kick us off with in terms of what are we trying to solve for here?

0:52:58.9 MK: Wow.

0:53:01.8 MH: Oh jeez, we have a podcast that’s five people on it and nobody wants to talk. I love it.

0:53:03.7 MK: No, but like you can’t just start a topic like stakeholders and be like, “Here it is.”

0:53:10.0 MH: Alright, I’ve been a stakeholder, so let me tell you some of my problems.

0:53:15.7 JH: Stakeholders, have you heard of them?

0:53:16.7 MH: Stakeholders, so first off, what if you’re vegan, and then there’s people with stake… No, okay. Let’s just…

0:53:26.3 VK: Took way too long for you guys to…

0:53:29.5 MH: Yeah, I was like, wow, so we have a lot of work to do to make this whole new group gel to get all my jokes, but it’ll be fine.

0:53:40.2 TW: Rock flag and new co-hosts.

One Response

  1. Cleve says:

    Speaking of having multiple agencies, or even different internal analyst teams, looking at the same data and coming up with slightly or very different conclusions: what we’ve found is that it frequently comes down to the different groups making different assumptions about what the data actually represents. They often base their assumption simply on the name of a variable or event, but what it actually represents on the data collection side of things is slightly different from their assumption. It may sound like a small difference, but when you start building reports and analysis upon a misunderstanding of the data, you can get wonky results.

