#258: Goals, KPIs, and Targets, Oh My! with Tim Wilson

<yawn>KPIs? Really? It’s 2024. Can’t we just ask Claude to generate those for us? We say… no. There are lots and lots of things that AI can take on or streamline, but getting meaningful, outcome-oriented alignment within a set of business partners as they plan a campaign, project, or initiative isn’t one of them! Or, at least, we’re pretty sure that’s what our special guest for this episode would say. He’s been thinking about (and ranting about) organizations’ failure to take goal establishment, KPI identification, and target-setting seriously enough for years (we found a post he wrote in 2009 on the subject!). He also really helped us earn our explicit tag for this episode — scatologically and onanistically, we’re afraid. But solid content nonetheless, so hopefully you can hear past that!

Articles and Other Resources Mentioned in the Show

Photo by Famartin on Wikimedia

Episode Transcript

0:00:05.8 Announcer: Welcome to The Analytics Power Hour, analytics topics covered conversationally and sometimes with explicit language.

0:00:13.0 Michael Helbling: Hey everybody, welcome. It’s The Analytics Power Hour. This is Episode 258. Setting a goal or a target can be a surprisingly challenging thing in a business. And as analytics professionals, we’re often involved in that process; some might say not involved enough. All too often, goals or KPIs can seem like magical thinking, not connected to reality, or so laughably small as to not elicit a sense of purpose or effort. But when measuring things, how do we set the bar against which they’ll be measured? And should we, and how? Well, let me introduce my co-hosts, because we’re going to talk about it. Julie Hoyer, welcome.

0:00:57.6 Julie Hoyer: Hello, happy to be here.

0:01:00.9 MH: Awesome. And Moe Kiss, how you going?

0:01:04.7 Moe Kiss: I’m going pretty great.

0:01:05.0 MH: Awesome. And Val Kroll, how’s it going?

0:01:07.0 Val Kroll: Very excited to be here.

0:01:12.5 MH: Awesome. I’m excited to talk about this. I’m Michael Helbling, and for this episode, well, we needed a guest, someone who’s got a long track record thinking about and working with this topic. And Tim Wilson is the Head of Solutions at facts & feelings. Prior to that, he was the Senior Director of Analytics at Further. He’s the co-founder of the industry-leading podcast, The Analytics Power Hour, and a frequent speaker at conferences all over the world. But most of all today, he is our guest. Welcome to the show, Tim.

0:01:41.2 Tim Wilson: This is a podcast?

0:01:43.6 MH: That’s right, Tim.

0:01:44.8 TW: My publicist was trying to explain to me kind of what this is. Okay.

0:01:49.5 MH: Yeah. It’s all part of the whole Fourth Floor Productions team. Ken Riverside’s doing a whole lot there for Chicago podcasting, so.

0:01:56.1 TW: Is it like live? Are people listening like right now?

0:01:58.6 MH: Yeah, let’s just assume they are.

0:02:00.2 MH: Okay. Hey, Tim, you also have a book that’s coming out early next year, so we’re going to plug that a little bit too, called “Analytics the Right Way: A Business Leader’s Guide to Putting Data to Productive Use” and you co-authored that with Dr. Joe Sutherland, who’s also been on the podcast. Tell me, Tim, did you talk about this in the book?

0:02:21.2 VK: Did it come up?

0:02:21.3 TW: A little bit. A little bit.

0:02:22.6 MH: Did it come up? A little bit? Okay.

0:02:24.3 TW: Actually, I think maybe this is the unofficial kickoff to the book tour that… It might be this… It’s also the conclusion.

0:02:32.8 MH: Thanks so much for supplying us all advance copies of the book, Tim, so we can take care of the…

0:02:38.2 MK: Yeah, Tim.

0:02:39.6 MH: Yeah. Where’s our [0:02:41.1] ____ galleys? As they say in the biz.

0:02:45.8 TW: I’ll get those right over. This way you didn’t have to pretend that you’d read it. That way… It works better this way for everyone.

0:02:54.4 MH: We have never done a podcast with a book author where I have not at least partially read the book.

0:02:58.3 TW: Okay.

0:03:00.1 MH: So yeah.

0:03:00.8 JH: What counts as partial?

0:03:01.6 MH: Most. You speed read it and then you make notes.

0:03:06.9 VK: Are you trying to get him to set a target, Julie?

0:03:09.6 MH: Yeah. Exactly.

0:03:10.6 JH: Maybe.

0:03:11.6 TW: I really like the vibe you guys have. It seems like you guys really kind of like give each other shit.

0:03:15.1 MH: Don’t… I don’t like it at all.

0:03:18.2 VK: Hopefully, you fit right in. We’ll see.

0:03:20.3 MH: We’ll see…

0:03:21.4 JH: See if you can hang.

0:03:22.2 MH: We’ll see how you do. This is… Yeah, we’ll see if you can hang. All right. Let’s talk about, Tim, goals and KPIs, and sort of what they mean in a business and like, where do you start? Let’s say you walk into a client or walk into a company and you’re like, okay. Let’s see what you’re measuring. See what you’re doing. What sticks out to you when you first talk to people?

0:03:45.7 TW: I mean, one thing that does stick out to me, and it goes back years and years and years: I think I came into the analytics world thinking about analytics as measurement, that the most important thing we need to do is measure whether we’re achieving what we set out to achieve, which I think is maybe a little different. I was a little naive. It took me a little while to realise that in my mind, there’s kind of fundamentally two different things we can do with data, and they are very, very different. One of them is the world of analysis and validating hypotheses. The other is measuring performance, which, like, sounds less sexy, but to me that’s where goals and KPIs and targets sit, and they get kind of mashed together in companies.

0:04:36.2 TW: So I would say when I work with a company, or even when I was in-house, it was like the dreaded weekly report or monthly report that, in my mind, often tries to cram those two things together. They’re like, let’s do the campaign readout. And it never really clearly answers the question of “did this campaign meet the expectations we had for it?” (sometimes because there weren’t expectations) and “what did we learn? What worked, what didn’t work? What can we take forward?” Those just get, like, crammed together. So I think the…

0:05:12.5 MK: What do you mean by, “they get crammed together”? Like, what does that look… What do people actually do?

0:05:17.5 TW: Well, I mean, I think people, they’ll… I mean, the campaign’s a simple example. They’ll show a slide with, sort of, search traffic, and it’ll show what happened to click-through rate over the course of the campaign. And they’ll say, here’s what happened with clicks. Call it clicks.

0:05:33.3 MK: Whatever. Insert…

0:05:34.1 TW: Or call it conversions, whatever.

0:05:36.2 MK: Insert other metric. Yeah.

0:05:38.6 TW: Insert some metric. And they’ll say, here’s what happened. And you can see that it spiked at this point, or it dipped at this point, or we made this change. And that doesn’t fundamentally answer the question: did it get as much of that metric as we expected? The “here’s what we expected to hit for that metric” part gets skipped, because the data’s there and there’s this kind of assumption that people can just look at it. So you wind up with slides like that, where somebody’s saying, but was it good? Like, was it good or not? And so it winds up talking about what happened and kind of what we think the drivers were, and maybe what we learned. But it doesn’t just objectively answer that question, “Did it deliver the results we expected?”, separate from whether there were hypotheses to be validated or learnings to be had. So.

0:06:33.1 VK: What’s one of the best ways then to get at the goals discussion so that people aren’t shrugging their shoulders, “Was that good?” when you get to that conversation?

0:06:44.4 TW: I mean, I’d love to hear from you… Like getting them to acknowledge that they should have that discussion. And Michael, you set it up in the intro a little bit: analysts, a lot of times, when they get asked the question, “Was that good?”, they say, see, this is why I need to be in at the beginning, in the planning. And yet they still aren’t pulled into the planning. And then I think sometimes when they are pulled into the planning, they themselves get caught up in not recognising what they need to do. So I mean… Yeah.

0:07:17.0 MK: Okay. Can I just… I’m going to throw a scenario at you, and I actually love this like post-campaign report scenario because it’s my fucking life. So we’ve got the dream team, the analyst does a great job. They’re at a meeting months before the campaign launches. They agree on the goals, the objectives of the campaign, they have targets, everyone’s aligned, everyone signs off on the brief. Great. We have a measurement plan, we know how we’re going to collect the data. Life’s a dream. We get to the post-campaign report. It’s beautiful. Perfect visualisations. You can see where I’m going with this. Everyone’s…

0:07:58.0 TW: No, but it sounds great.

0:08:00.0 MK: Everyone’s… Oh, everyone’s in alignment. What… I guess what do you do when the marketers and the analysts have this really great relationship, they’ve agreed what good is, but the wider business expectation is “metric went up, good.” And that’s what they care about because they haven’t been part of that conversation the whole way along. They just see like the shiny post-campaign report and they’re like, anything where there’s an improvement is good. That’s their baseline. Whereas for the marketing team and the analysts, they might have a different kind of understanding of what the goal is.

0:08:39.3 TW: I mean, I don’t think I’ve ever been in a case where the broader business is saying that was good and the marketers are saying, yeah, but it’s not as good as we expected it to be. But I mean, generally, point taken. I think that upfront planning, I guess there’s a couple of things. One being really, really taking it seriously, and we sort of kind of kitschily use the two magic questions of: What are we trying to achieve and how are we going to know if we’ve done that? Like really capturing that and making… And socialising it. Like encouraging the marketers to… It’s kind of on them. Like they need to communicate up as well. If they’re doing something and they have a stakeholder and they’re waiting until the campaign is fully run and then they’re saying, here’s what we did, and that’s the first that someone’s aware of it, that’s a little bit of a miss on their end.

0:09:34.9 TW: But I think where the analysts can step in is giving them… And even as you’re kind of laying out, they set goals, they set targets, they set what data’s going to be collected. Like, there’s part of that that I think is critical to the campaign planning. And you want the agency to understand, you want everyone to know that we’re running this campaign to drive up retention or drive up MAUs or drive revenue, and have that used as a communication tool during the planning and the discussion and the reference to the campaign so that when the report comes out there is kind of like slide one is very objectively, this is what we were trying to do. Like the business thing we were trying to do, increase retention, satisfaction, revenue, whatever, here’s how we were going to measure it and here’s how we delivered against target. And that’s it.

0:10:34.6 TW: Therefore, of these five KPIs, two wildly outperformed, two were pretty much on par, one really tanked. And that’s it, like that’s the extent of the discussion. So that’s kind of like, to me, the point where if somebody says, but wait, the one that you said tanked was actually up year over year, or I would’ve thought that would be pretty good, or that exceeded benchmark. You’d be like, look, no, this is… If we hopped in a time machine and went back three months, this is what we agreed success… How we were going to measure success. So that’s when I… It’s kind of back to where I said things get mashed together. If you’re having that discussion on slide seven, then it does kind of open itself up to, now we’re discussing whether it was successful or not? Like, no: have the discussion, was it successful or not? Hard stop. Now what was the other stuff that came in with it?
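To make Tim’s “slide one” readout concrete, here’s a minimal sketch, assuming Python with pandas; every KPI name, target, and actual below is hypothetical, invented for illustration rather than taken from the episode:

```python
import pandas as pd

# Targets agreed during campaign planning (hypothetical names and values)
plan = pd.DataFrame({
    "kpi": ["retention_rate", "mau", "revenue", "qualified_leads", "nps"],
    "target": [0.40, 120_000, 3_000_000, 1_500, 45],
}).set_index("kpi")

# Actuals pulled after the campaign ran (also hypothetical)
actuals = pd.Series(
    {"retention_rate": 0.52, "mau": 118_500, "revenue": 2_100_000,
     "qualified_leads": 2_200, "nps": 44},
    name="actual",
)

readout = plan.join(actuals)
readout["pct_of_target"] = readout["actual"] / readout["target"]

def verdict(pct: float) -> str:
    """Verdict relative to the pre-agreed bar, not history or benchmarks."""
    if pct >= 1.10:
        return "wildly outperformed"
    if pct >= 0.95:
        return "on par"
    return "tanked"

readout["verdict"] = readout["pct_of_target"].map(verdict)
print(readout)
```

The design point is that the verdict column is computed only against the targets agreed before launch, never against last year or an external benchmark; drivers, learnings, and hypotheses belong on later slides.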

[music]

0:11:33.8 MH: It’s time to step away from the show for a quick word about Piwik PRO. Tim, tell us about it.

0:11:41.3 TW: Well, Piwik PRO has really exploded in popularity and keeps adding new functionality.

0:11:46.8 MH: They sure have. They’ve got an easy to use interface, a full set of features with capabilities like custom reports, enhanced e-commerce tracking, and a customer data platform.

0:11:58.3 TW: We love running Piwik PRO’s free plan on the podcast website, but they also have a paid plan that adds scale and some additional features.

0:12:03.9 MH: Yeah. Head over to piwik.pro and check them out for yourself. You can get started with their free plan. That’s piwik.pro. And now let’s get back to the show.

[music]

0:12:16.1 MK: I think though what I’m trying to point to is, the complicating thing inside a business is sometimes success means different things to different people, and try as you might to get, like, the best alignment known to mankind, it doesn’t always work out that way. Sometimes there’s a really good understanding between the core group, but then once you get further out into the business, people might sign off on the report, the… Sorry, the measurement plan ahead of time, but like six months later, it’s all kind of forgotten and people’s expectations change because they fall into the sunk-cost fallacy and they’re like, and now we’re going to try and just show that it was good because we’ve already done it and put all the effort in.

0:12:55.7 TW: But it’s why I’ve spent 15 plus years. I mean, I will use analogies like a time machine, in saying this is, if you have the discussion in the abstract with an organisation and say, look, we are going to be at a point down the road and we’re going to want to agree whether this was successful or not. We don’t want to spin our wheels debating whether it was successful or not because everybody’s going to want to say it was successful except somebody who wants to bludgeon somebody else because they don’t like them. So it… It’s not something that like an analyst can just say, oh, I’m going to check this box and do it. It is a real cost and damage to an organisation if that’s what they wind up in. I mean, that’s… So, I absolutely get that businesses are messy, but I also watch those planning discussions kind of skate around really focusing on saying, we’ve got to figure this out.

0:13:50.1 TW: Everybody look me in the eyes. Who’s going to forget about it? Who’s going to undermine it later? They need to be brought in. They need to be looked in the eyes, not like an overly onerous process and sign off. But the fact is, if somebody signs off on something and it’s documented, and it was clearly stated as a meaningful outcome, and they come back later and say, well, but and it’s like, look, what do you want me to do? Literally, you told me three months ago that this is what success looked like. Do you understand that this is never going to be effective if this is the discussion that we’re always having? So it’s not… You can’t just, it’s not a one-time check it off and it’s done. It is hard. But to me, yes, organisations struggle with this mightily, and it’s not like an analyst can just fix it with kind of the nature of what their report looks like. So I agree with you, but I also don’t disagree with myself.

0:14:46.4 JH: Well one thing, too, if that case was true, Moe, and they found that, the analysts and marketing duo that are really in lockstep, they know what success looks like, they set expectations and they find that it didn’t hit expectations, my guess is that the next step would be to make a decision of how to make it better and hit the expectations, right? There would be a change. And so I’m almost wondering like, would leadership really look at it and be like, no, I think that was good enough. Because I would hope that if they were actually working that proper process, the outcome of that would be that they’re looking forward to make decisions about the next campaign they run, to do better to meet the expectations. And so I just wonder like would it come out in the wash that if those teams, boots on the ground, have control over it, are working that and continually trying to get better, then their story to leadership is like, yeah, we were doing well because we realised we could improve these different things and kind of course correct.

0:15:41.8 MK: I really like that. Yeah.

0:15:43.5 JH: Rather than they kept doing the same thing and they’re like, oh, it underperformed and then leadership’s like, no, it looks good enough. Because I’m guessing they would be like, oh yeah, let’s change, let’s change, let’s optimise.

0:15:54.5 MK: Yeah. I like that.

0:15:55.8 VK: And I also think that like the measurement planning discussion and like the target setting and the metrics and everything, which hopefully we’ll dive into all those topics discretely throughout this conversation. But that can’t be used as just a contract to save the data team’s story. This is also saving those business partners from their future selves, right? Like this is saying like, if you can say right now how you want to feel about this performance when it comes back so that you are better informed to make those next decisions exactly how you described, Julie, how can we set that bar before we already have the sunk-cost of being within the campaign, right? So I think there’s also this idea sometimes it’s seen as like the shield and really it’s like positioning it to like the partnership discussion of like, we want to be able to objectively have that point on slide one as you illustrated, Tim, and then move on to like the parts that get really interesting about adding some of the color. So that people aren’t like, well what if you cut it by new versus returning? Is it successful then if…

0:16:53.8 MK: Oh, yeah.

0:16:53.8 VK: It’s like, oh my god! I’d lose my mind.

0:16:57.4 MH: But I mean, I think of… No, I’m pretty sure all of us had little mini trauma reactions when Tim was talking about that before.

0:17:07.8 MK: Can I throw to the group then? Okay. So you go into your measurement planning session. I love this because, not going to lie, everyone can just like shadow me along to work after this at my next measurement planning meeting. But like, let’s just, let’s scenario this out. So you get into a room and you do the, so what are we trying to achieve and how will we know if we’ve done that exercise? And you get everyone to silently brainstorm and put on a poster what they’re trying to achieve. And one person goes, enterprise sales. One person goes, brand awareness. One person goes, monthly active users. One person goes, incremental revenue. Over to you guys.

0:17:51.3 TW: I mean, to me that’s perfect because you’d say, well, clearly…

0:17:56.2 VK: Moe’s face.

0:17:58.1 TW: Clearly…

0:17:58.8 MK: I’m like, my face is like, sorry, you want me to drive brand awareness and enterprise sales with the same thing?

0:18:08.4 TW: That’s why it’s perfect. Because your reaction right there is saying, well, clearly, clearly this campaign is going to fail if we don’t get on the same page now. I mean, to me doing that, like that is the, that is where I also feel like the value in measurement planning, which… It’s a loaded word because it means different things to different people. But having that exact exercise, you can kind of sit back and say, okay. Can we all agree that one campaign is not going to be successful at all of those? And it’s kind of up to your role in the room whether or not you’re the one to facilitate, how do we get on the same page? But imagine this… The opposite scenario because I feel like that doesn’t happen. Everybody’s thinking that. And then you’re three months down the road and looking at the metrics, and then people are arguing about the execution and what’s successful.

0:19:02.7 TW: So I’m like, way, way, way a thousand times better to uncover that lack of alignment before the execution has started and saying, hey guys, we clearly should not be handing anything off to a creative team if we’re not on the same page with that. So that’s why I’m saying, it’s great. It’s great, because you’re surfacing it under the guise of, I just want to measure this successfully for you, but what you’re surfacing is that the business is not aligned. So does that make some sense?

0:19:35.6 MK: It makes total sense. Like I said, you want to come to my next meeting?

0:19:44.0 MH: I think I want to pick on a word there, Tim, which is alignment or aligned, because it sounds like that is a very important predicate for what we’re talking about. So can you talk a little bit about that part of it?

0:19:58.9 TW: Yeah. And I think it is kind of sneaky because the data sort of… Getting people to articulate what they’re expecting a result to be is a good way to kind of abstract the emotion a little bit and surface that they’re not on the same page. But I mean the… And no, I’m assuming it’s usually not quite that extreme. Generally… Well, no, I guess I’ve seen paid search being like, no, this is a branding play or this is a direct response play. Or maybe not paid search, but with a campaign, is this a branding, top-of-funnel thing or is this a direct response conversion? And all of a sudden, you’re saying, well, I just want to figure out what the right metrics are. So clearly, but now you’ve articulated out loud that you’re not on the same page, and marketing 101, I mean, we know that if we’re not clear on what we’re doing, we’re running in multiple directions. So it’s like trying to figure out what success looks like forces a discussion about what the business purpose is for making an investment. And I know we’re saying a campaign, but this can apply to a channel. It can apply to a line of business. It can apply to anything.

0:21:16.4 MH: From an analyst perspective, how important is understanding kind of historical and other things like that going into these kinds of conversations? Like is that crucial or could somebody walk in brand new? Talk about that maybe a little bit.

0:21:29.0 TW: I think the… I mean any… The extent to which the analyst can have an understanding of the business, I’d rather the analyst understands how marketing or operations or product or CX or whatever the topic is. If they were brand new, I would put them having an understanding of the domain over understanding historical performance of the metrics specific to that company. Because that is what’s going to come next: people are not going to want to set targets, they’re going to want the analysts to… Well, what do we normally do? And pull that data and make that an analyst task, which is kind of a separate, sort of downstream topic, that I think people don’t like to actually set targets to define success.

0:22:18.1 JH: Can we talk about… Because we’ve been talking about alignment and what they want to achieve, and what does success look like. But can we actually break that down a little and talk about, like, how are you defining a goal? Because we were throwing the word around. So like, what’s a goal here? Because I think some people will confuse the three: goals, KPIs, and targets. And in an ideal world, what level goal are we talking about here? Do you want me talking about clicks? You want me to talk about impressions? Where should they be?

0:22:50.7 TW: Yeah, I feel like goals is a… Target’s a tough word too, because sometimes when people talk about targets, they mean what audience, like their target audience versus what value do they want a metric to hit. So we… Val and I ran into that with a past client, where we had to make sure we weren’t saying target…

0:23:06.5 MK: I think people sometimes also confuse targets and forecasts.

0:23:10.1 TW: I have… Oh, that’s… I…

0:23:11.6 MH: Also a huge store that people buy stuff at.

0:23:16.0 TW: Target. Yeah. That’s “tar-jey.”

0:23:18.5 MH: Oh, sorry. How do we keep them…

0:23:23.3 TW: Have you been saying it wrong all this time?

0:23:23.6 MH: Yeah, I have.

0:23:25.4 TW: But it’s, I think “goals” is a loaded word. I tend to think of goals as being what is the business outcome, not necessarily tied to a metric. Absolutely, if somebody, if I say, what’s your goal? And they say our… If they say their goal is to drive a million dollars’ worth of revenue, I’m okay with that because that’s a business outcome. It has a metric and it has a target. If they say that their goal is clicks, email click-throughs or email open rate, that I have a… I’m like, okay. Well I got to use a word other than goal. I got to say, well I’m going to ask what your business outcome, what is it you’re trying to achieve? Because I don’t think there’s a right or a wrong when I use goals, but it has bit me a few times because I have this assumption that people are thinking goals as a business outcome thing, not even necessarily with how it’s going to be measured, just something that is valuable to the business. Which gets back to the earlier discussion, like if everybody’s not on the same page on kind of fundamentally what the business purpose of doing this thing is, then that’s tripped up, but I… Yeah.

0:24:35.2 JH: Can I give you an example and ask what you think about this kind of wording of a goal, too? Because I’ve had it, where it’s so…

0:24:42.3 TW: My publicist is definitely hearing about this, because I’m, like, getting peppered with questions. I’m supposed to be articulate.

0:24:49.7 MH: This is what it means to be a guest on the show, Tim.

0:24:53.9 TW: On this podcast. Okay.

0:24:57.3 MH: It’s not for the faint of heart.

0:24:57.6 TW: Hard-headed.

0:25:00.0 JH: How would you advise someone… Or would you say this goal is fine as it’s worded? Because I hear… Sometimes an objective is so big and grey or a goal that they state that it’s like, I don’t even know if you guys know what success of that looks like. And it’s hard to get them to like narrow it down, because one I’ve run into recently is similar to saying like, digital growth through personalised experiences. Like that’s what they will say their goal is.

0:25:29.9 VK: Tim is triggered, visually triggered.

0:25:34.5 TW: Yeah. Because that’s…

0:25:37.5 MK: Isn’t that… Is that a vision?

0:25:38.1 JH: And then they’re trying to break it down farther. I know it’s really hard. It’s really hard and I’m like, is it maybe just reflective that you guys are feeling like the alignment to the rest of the business is unclear, but it’s hard then to come in and try to help them, how do I further clarify that? Because it’s just so… It’s got no railings on it.

0:26:00.6 TW: But I would… Every time Val and I ran into this with one of our large clients recently, where they’re combining the how we’re going to achieve the goal or the objective with the objective itself. And so stepping back and saying, okay. What is the outcome? It’s good that you have, call it a strategy or a vision or an idea of how you’re going to, but really what we’re trying to talk about right now is what result is it that matters? Not how you’re going to achieve the result. You’re doing the campaign or you’re doing whatever. So I’ve definitely gone in when you said what are your goals? And they’re like, well here they are. We’ve got these four pillars and these are the goals. And it’s got language in them that… If I say, but wait a minute, you mostly care about some outcome. Let’s talk about what the outcome is. And that can get wordy because people get really proud of the way that they have come up with some alliterative way or some sort of cool way that rolls off the tongue that… What was it? Digital growth through what?

0:27:10.0 JH: Personalised experiences.

0:27:11.6 TW: Through… Yeah. I’m sure by the time it hits you, everybody is saying, that’s the thing for 2025. Digital growth through personalised experiences. And it’s just, you’ve got to like, okay. What… Are you trying to… Even digital growth, like exactly what do you mean by…

0:27:28.0 MK: Yeah. Unpack it.

0:27:31.8 TW: Growth.

0:27:31.9 MK: That’s how you start: unpack it.

0:27:33.9 TW: Yeah.

0:27:33.9 MH: Because probably a goal lives a layer or two down from that, which is sort of like, okay. So how do you know you’re impacting those things, like digital growth and personalised experiences? And then your knowledge of, okay. Well if this changes, then we know we’re impacting digital growth. Okay. Good. Then we could set a goal on that, because then we know we’re getting to that.

0:27:55.9 VK: Okay. I’m going to challenge that just a little bit, because like, would they be happy with digital growth through non-personalised experiences? I think like unpacking it, because like the through personalised experiences are like the things that you potentially could do to achieve that digital growth. I think that’s what Tim was talking about. Like there’s too much in there, or like sometimes it will also have a target by X percent. Then you’re like, Jesus, let this all breathe. And so I think like the digital growth, just even defining those two words, like what… Exactly what your line of questions were, Michael, but for those two words. So like what does digital growth mean? How will you know, yeah, you’ve achieved that? Because like that’s going to be a whole conversation and potentially some alignment in and of itself, let alone how you’re going to get there.

0:28:40.9 MH: But I also think that just sort of exposes, Val, sort of like some of the tenuousness of the overall statement, which is really not sure if those two things should even be connected or are really aligned in the first place. And so that’s part of what I would call a goal-setting process is like basically picking your strategy apart a little bit.

0:29:03.2 TW: But it’s back to your point earlier of like, what do you want the analysts to come in armed with? I want them to have the critical thinking and the communication skills to not… We can kind of shit on that now, because we don’t have to actually have the conversation and be like, well what the fuck does that even mean? Is kind of the…

0:29:20.5 MK: That was my thought.

0:29:25.1 TW: That’s what’s running through your head, but you can’t do that because everybody’s kind of in love with it, and it’s going to live all through 2025. Like you’re not going to be able to kill it. So that’s where then the analyst has to say, cool. Got it. I’ll be sure to put a tag slide that says those words, but then we’re going to get to, but don’t you understand that what I’m trying to do is make sure that we can objectively say, are we achieving business outcomes? We can worry about how… Because really the how you’re going to do it like this, we would say somebody has a hypothesis that may be really well-grounded. The personalised experiences are one of the most effective ways to achieve digital growth. Again, tough to… It takes a while for an organisation to get there, but to say there’s a whole other process we want to follow with the personalised experiences because that means a lot.

0:30:18.4 TW: We can do MVPs on personalisation. We have lots of hypotheses and we want to validate those along the way. And we know that what we’re trying to get them to do is to support digital growth, but let’s just talk about the outcome of digital growth, which, what does that mean? What do we mean when we say digital growth? Because chances are they don’t even know what that means either. So, great point. It’s like, it’s a process. It’s not a… I think you’re… Done well, there are a lot of questions being asked where people say, that is a valid question. I don’t know the answer. We’re going to have to figure that out, and there’s value in figuring that out. I understand that we really do need to figure that out, because that is going to pay dividends in every future conversation we have as we’re executing over the course of the year. And oh, by the way, it makes the reporting of results much, much smoother as well, but it pays dividends all over the place.

0:31:14.3 VK: So on the other side of the spectrum, because if this is coming from a place, I’m assuming, Julie, those examples are set by like non-data people, that those are business stakeholders. And so on the other side of the spectrum, if you were to get into that discussion and say like, well what are the goals of this campaign? And they just say, I want to beat benchmark or like, I want to exceed benchmark.

0:31:34.3 MK: Oh, no, no, no, no, no. I’d want it to be higher than last year.

0:31:41.1 VK: Yeah. I want… Yeah.

0:31:42.3 TW: Yeah.

0:31:42.5 MH: Yeah.

0:31:42.7 VK: So that’s like the other side. So like if we could just triage that one, I think that could be helpful too.

0:31:46.6 TW: Yeah.

0:31:49.0 VK: Tim, again, also visually triggered.

0:31:51.9 TW: A lot of times… Also triggered. Because there is a whole other piece of this where I get that there is a human reaction. It gets to the… Michael, the whole targets versus forecasting as well, that fundamentally when we’re asking people what would success look like? Or, as Val says, like, what would put a smile on your face down the road? And that’s very uncomfortable, because there’s people… Like forecasting, they’ll say, well, can’t you, the analyst… What’s our historical? We just want to do better than that. And you want to just, like… You’ve learned nothing? You don’t think you can… You think nothing’s more efficient, more impactful? You literally just want to do better than last year? No, no, no, no. We’re doing… We’ve got way better targeting. Like how much better, like what would be an indication? Like there’s a discomfort in the business of setting those numbers because there is a feeling that I’m… They’re so used to looking at historical data, and having this false sense of accuracy and precision on their historical data. And now we’re asking them to predict the future.

0:33:00.0 TW: And we’re not just asking them to predict the future, we’re asking them to predict the future contingent on them impacting the future. So they all of a sudden… They may ask for, well, can’t you just run a forecast and tell me what it’s going to be? It’s like, well, you want to quit your job? If you don’t have any influence over what’s going to happen, you want just to run a forecast, then what are you doing? Like you’re supposed to be doing something that drives forward, how good do you think that is? But recognising that they’re very… It is uncomfortable, but I think there’s… It’s unnecessarily uncomfortable. People are just freaked out that it’s going to go on their permanent record if they miss their target. And the reality is, it’s not. It’s usually not that big of a deal.
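Tim’s targets-versus-forecasts distinction is worth pinning down. A minimal sketch with made-up numbers: the forecast just extrapolates history (the number people want the analyst to hand them), while the target layers on an uplift assumption the team explicitly owns:

```python
# Six months of (hypothetical) monthly revenue
monthly_revenue = [210_000, 195_000, 230_000, 240_000, 225_000, 250_000]

# Naive baseline forecast: what we'd expect if we changed nothing
# (here, just the trailing three-month average)
baseline_forecast = sum(monthly_revenue[-3:]) / 3

# The target is NOT the forecast. It's the forecast plus a stated,
# defensible uplift the team commits to driving (e.g., "better
# targeting should be worth about 8%").
assumed_uplift = 0.08  # hypothetical, owned by the campaign team
target = baseline_forecast * (1 + assumed_uplift)

print(f"Baseline forecast: {baseline_forecast:,.0f}")
print(f"Target (forecast + owned uplift): {target:,.0f}")
```

If nobody can defend the uplift number, that discomfort is exactly the conversation Tim is arguing for, surfaced before the campaign runs rather than at the readout.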

0:33:48.2 MH: But I do think this is an area of sensitivity, because organisations do treat this in different ways. And I think that’s a huge part of why this is such a tricky thing for people a lot of times, because it’s like, if you didn’t hit that goal, or if you didn’t hit that target or whatever that is, it’s like, oh, that’s the worst thing in the world that could happen.

0:34:14.1 TW: I don’t… People… That card gets played like, oh, but there are organisations… I’ve worked with a lot of organisations and Mike, you may… I think we’ve had this discussion before.

0:34:32.4 MH: Probably.

0:34:32.7 TW: I was saying, Michael, I mean it came out as Mike and I realised I’m just… I just want you to know that I do know you go by Michael. That was me just losing my train of thought.

0:34:38.6 VK: Did you listen to an episode before you joined us today? Is that how you knew he goes by Michael?

0:34:43.2 TW: That’s… Again, yeah.

0:34:46.5 VK: Sometimes helps.

0:34:48.1 TW: Sort of helps. That… That card gets played like, oh, but no, no, no. The reality is, on an organisation-by-organisation basis, asking them… Like really, even when companies have MBOs and they have targets set, and things they have to hit, even those, they tend to get to the actual period and say, yeah, but there’s some manager judgment in that. But when we’re talking about a campaign, I just haven’t actually seen it. I’ve had people tell me, “Some organisations…” Well, really… I’m like, really? Which one?

0:35:25.2 MH: I’m not saying it’s real. I’m saying the fear of it is real for people.

0:35:30.6 TW: Okay. Well then I just teed off on you for no good reason.

0:35:34.1 MH: So the perception is like, oh, if I show up to this business meeting, review meeting with this poorly performing campaign, or a campaign that doesn’t show digital growth through personalised experience and continue to pick on that one, then I’m going to look bad and my performance is going to be duly… I’ll be impacted, right, in terms of my standing in the company or my job or my bonus or whatever. And I think that’s a…

0:36:01.6 TW: It’s a psychological safety thing.

0:36:02.9 MH: It’s a huge driver, right? And so like I see people hedging their bets constantly in these kinds of discussions for that reason. They’re just like, I’m not willing to make a commitment, or if I make a commitment, it’s going to be like so loose that I can slide my way out of it somehow or not kind of commit. Like at the beginning, Tim, you said like, can we just decide if this was a success or not, and then talk about more of the details? Like that kind of language is just sort of like, absolutely not. I will not commit to that kind of clarity in the discussion. And I think that’s a big, big problem.

0:36:44.2 TW: That, I agree with. And I think it does take modeling a behaviour. It does take time. It does take… If anything, dinging people for saying, but you’re telling me you can’t tell me if it was objectively good or not. It’s okay if you missed. I was like, but I just want to know. And I watched… I was in an agency years ago that actually put some performance targets. I had to sort of… I think it was the client sort of insisted that we put performance targets in the proposal. And boy, the client team was not happy about that, but it actually forced a discussion and they put targets in, and the client came back and said, it’s got to be way more than that. And all of a sudden, the agency was saying, but wait a minute, with what you’re spending and what you want to do, seriously? Like that’s… We can’t possibly hit what you’re thinking.

0:37:37.3 TW: And they said, you know what? You’re right. Let’s do something different. Still spend money with the agency. Thank you for actually having that discussion. So I don’t think it takes too many times to get over that hurdle to say, wait a minute, I missed. And because I stood up and said, I missed this, and here’s the ideas we have, and here’s what we might do differently, and here’s what we’ve done, all of a sudden, they’re, like, lauded, and it’s like, oh my God, that person didn’t beat around the bush and try to spin a turd into a skein of gold. They said it was a turd, and in hindsight, yeah, of course, it was going to be a turd, but you know what, we all sat around and thought it was practical. You know what we’re going to do? We’re going to not do that again. And they’re like, thank God you’re not going to keep pushing out turds. Okay. That was unnecessarily…

0:38:23.5 MK: What about… Okay. I feel like we’ve said the word turds enough.

0:38:27.4 TW: Did not know where I was going.

0:38:28.6 MK: I have a three… Already I have a three-year-old. So what about the opposite? We keep talking about missing. What is your view on when you are partway through a campaign, let’s say the goal is 3 mil and you’re forecast to hit it, maybe you’re at 2.8 already, and everyone goes, let’s put the goal to 4 mil. What’s your perspective on that?

0:38:52.4 TW: I would generally want to maintain the original goal and say, sure. We can… Maintain it and say it for the record, so even when we’re doing the readout at the end, say, this was going so well, it wildly exceeded our expectations. I thought you were going to go down the path of, if it exceeded expectations, and then people are asking why it exceeded expectations…

0:39:13.1 MK: Oh no, no, no.

0:39:14.2 TW: And that I have little patience for. Because I think that happens, too. Like, what was the analysis? Like, who the hell cares?

0:39:22.9 MK: Why do you have little patience for that?

0:39:23.8 TW: Because there’s limited… There’s finite resources. And if you said it is… This investment, we’re going to put these dollars and time behind something with the hopes of achieving some thing, and we achieve it, and then saying, well, why? Why did we achieve it or why did we exceed it? It’s like, but you did, like you paid money and you hit it. Now, you may not have a, quote, “perfect causal… Causal relationship.” You said we did X and Y happened. So that’s actually technically not causal. I get it. But there’s only so many hours in the day. I think I need to go focus more on things where we’re not hitting. Now, if it blew it out of the water and somebody says, I wonder if that’s because it turns out that Millennials in the northern hemisphere like absolutely love portrait videos, or something. And I wonder if that was it. Okay. Cool. Now you have a hypothesis. Go and validate it. Use that going forward, but not in the service of explaining why you paid X, you got Y, and that’s what you wanted. That’s just like, that’s just intellectual… Okay. That’s intellectual masturbation, I guess that’d be… To see how many…

0:40:32.4 MK: Oh, wow. You’re on a roll!

0:40:34.7 TW: You said, is there like… Is there like some flag that a… It’s like, can a podcast be like flagged as being only for like explicit or something? Is that something…

0:40:42.5 MH: No, we’ve got an editing process, Tim.

0:40:44.6 TW: Oh, okay.

0:40:44.7 MH: So we can bleep all that. Don’t worry about it.

0:40:47.2 MK: Clean it up in post. Yeah.

0:40:50.2 TW: Okay. That’s fascinating.

0:40:51.9 JH: So some of what we were just talking about, I feel like, is assuming we have good KPIs, measures of success, set, and I know that you have written some blogs and ranted a few times about outcomes versus output KPIs, because I was thinking, Moe, your example of, our goal was 3 mil. Well, what if somebody had a goal set with a bad KPI of a million impressions? You know what I mean? So can we talk a little bit about the difference there?

0:41:20.5 TW: Sure. Yeah. This is literally… Okay. My publicist won’t get fired.

0:41:27.7 JH: I’m hitting all the buttons.

0:41:28.4 TW: I’m going to like stop.

0:41:32.5 MH: It’s physically painful. It’s so funny.

0:41:32.7 MK: I’m actually quite enjoying it.

0:41:36.0 TW: [0:41:38.8] ____ a little story, this goes back to… I don’t know, 2004. I learned that distinction. I think a lot of people know it. If you search for outcomes versus outputs, you’ll get kind of varying definitions, but I learned it when I was working with United Way in Austin, and we were trying to figure out who, which agencies we were going to fund in, like, emergency food, shelter, and financial assistance. And there was this guy who was a retired social worker who was leading my committee, and we’d get these proposals in. And he would say, look, this program sounds good, like what they’re doing, but the metrics they’ve proposed are just output metrics.

0:42:15.0 TW: And he’s like, we need to have… Push them to come back to us in their revision with outcome metrics. And the example that I loved was for a soup kitchen. And they would say, we’re going to, now, I’m going to botch this. We’re going to serve this many meals. And that’s kind of an output because they could directly control it. Ultimately, what they were trying to do was to remove… Reduce food insecurity. Harder to measure that. And their output was in the service of that outcome. So generally somebody would say, I’m measuring impressions, but I’m really trying to grow awareness. But that distinction between an output is just something that is a very specific tactic. And a lot of times, in marketing, you can drive that in so many artificial ways, your CPM, you want to drive that down, just open the floodgates and let the bots roll through. Those are outputs. And you’re not… An email… The business doesn’t give two shits about a click-through, like that matters not at all. Click-through rate doesn’t matter. Open rate doesn’t matter. Opens don’t matter. They matter, but they’re just outputs.

0:43:30.4 TW: Ultimately, you’re sending this email because you’re trying to deliver some outcome, and an outcome is more… It may be, you’re marketing, in saying, the outcome I’m trying to deliver is to generate, to deliver qualified leads to the sales team, because I really don’t have control beyond that. But we as a business recognise that qualified leads have value. So things get blurry. Sometimes you have to measure, you have to use an output measure, because there’s just nothing else there. So I tend to talk about trying to have outcome-oriented measures and just have a bias towards saying, it may be harder to measure, but if it’s actually much, much more aligned with the business outcome that you’re trying to achieve, it may be worth it to do that. Because outputs are just so easy to manipulate and they don’t necessarily lead to something that matters.

0:44:27.1 JH: And is it fair to say, like, if you weren’t hitting an outcome, maybe go look at your outputs and see if you’re like, you could be more efficient in some of those? Like I’m thinking about like, what if your click-through rate is shit and you’re not driving the leads they care about because your email sucks?

0:44:41.7 TW: Thousand…

0:44:41.8 JH: Measure the outcome, you’re not hitting it. Check your outputs.

0:44:44.6 TW: Absolutely. So definitely not saying that those metrics don’t matter, they are a very useful diagnostic, and even modeling out, kind of, okay, what has to happen? So if I want to optimise the delivery of those… But that’s kind of how we’re delivering the outcome, putting… But if you put them all on the same report, then, kind of to Moe’s point, people are like, they’re going to cherry-pick the ones that are up or above, or it’s like, no, no, no. Do that head-scratching offline. Ultimately, I want to tell the business, are they delivering the outcome that they want?
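One way to picture the “modeling out what has to happen” aside: output metrics decompose the outcome, which is why they’re useful as offline diagnostics even though the outcome is what gets reported. A hypothetical email-funnel sketch in Python (every volume and rate here is invented):

```python
# Outputs (sends, opens, clicks) are levers that decompose the outcome
# (qualified leads). All volumes and rates below are hypothetical.
sends = 200_000
open_rate = 0.22           # output: easy to measure, easy to game
clicks_per_open = 0.03     # output: diagnostic, not the goal
leads_per_click = 0.15     # share of clickers who become qualified leads

qualified_leads = sends * open_rate * clicks_per_open * leads_per_click
print(f"Expected qualified leads (the outcome): {qualified_leads:,.0f}")

# If the outcome misses, walk this chain offline to find the weak link,
# rather than putting every output on the report to be cherry-picked.
```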

0:45:22.9 JH: Can I hit him with a third one…

0:45:25.3 TW: Yeah. Of course.

0:45:26.1 JH: And just keep railroading all of you and all of your questions?

0:45:28.1 VK: Good.

0:45:30.2 JH: Because I’m leading you down a path. Because I really want to ask you about this.

0:45:35.6 TW: Oh dear. Now I’m scared.

0:45:37.2 JH: I’ve been setting this up.

0:45:39.3 MH: Oh wow.

0:45:39.4 JH: Okay. So when it comes to… We’ve obviously been talking, like, analytics reporting, performance measurement, but we have been saying there’s this intuition with a KPI that it needs to be a little more, like, down the path, down the funnel, closer to something the business cares about. It’s inherently valuable in some way, right? Now, when we talk about A/B testing and running tests on websites, a lot of times the best practice is to actually choose a measurement for your test closest to your change, i.e., clicks. So how come those seem to be so at odds? Because I get into conversations with people and they want to be hypothesis driven, but the measurements we talk about, we’re like on the ends… Different ends of the spectrum.

0:46:28.6 TW: So isn’t that the best practice? I mean, that’s the easiest signal to detect, right? So it’s the cheapest. And the other extreme is saying, well, I want to drive revenue. It’s like, well, no experiment is going to be powered enough because there are so many other things that happen. So to me, driving more clicks doesn’t matter if it’s not actually moving a needle. Something like depending on where the clicks are, saying, well, I want to, if this gets them more PDP views that should get more people added to the cart, then maybe still not a…

0:46:55.4 MK: What’s a PDP view?

0:47:06.1 VK: Product description page.

0:47:07.5 TW: Sorry. Product details, product details page.

0:47:07.6 JH: Product description page.

0:47:10.0 MK: Oh, Jesus Christ. I’ve not worked in [0:47:11.1] ____ e-comm for a while, Tim. Keep us all up to speed.

0:47:14.0 MH: All right.

0:47:14.7 TW: You know what? I never have. I am just so full of shit. I can throw out these… I was just saying it so somebody could get it defined for me. I had no idea what it meant either.

0:47:24.6 MH: Yeah. I didn’t stop the… Stop you to define skeins of gold, but anyways, keep on going.

0:47:30.3 TW: So, I mean, because there’s… It’s easy to get caught up in saying, well, we’re… I mean, that I think is a problem in A/B testing, where you get so caught up in, we want to be able to detect a signal, so we’re going to pick a measure where we’re most able to detect the signal, and it gets divorced from something that actually matters. So, I mean, social media was kind of the same way when I was working in that. We’d be like, well, what’s the value of a retweet? And you’re like, well, that’s… Like, what’s the dollar value of a retweet? Well, that’s garbage. But if you have a social media team that is trying to increase awareness or perception, and you’re doing that through social media, you’re not going to tie it to a tweet, but maybe that’s another one where you need to field some form of a brand study to ask about that.

0:48:25.7 TW: Or… And I do not know enough details about how Conductrics does this, but the fact that Conductrics put voice of the customer into their platform, I think, is absolutely amazing. If you are changing something because you’re trying to improve the user’s experience or their perception, then yeah. Run your test and then fricking ask the split. As opposed to saying, well, they clicked on it, who cares? It’s harder, but that goes back to the person who’s coming up with the test. Like, there’s the problem, the hand-wringing in the testing world, of button color changes. Somebody was telling me last week, they knew somebody did a button color change because they were probably looking at clicks, and who the hell cares? So even when you’re running tests, generally I think you want to do things that are going to have a real impact. Otherwise, you wind up doing a high velocity of these minor, minor little things. And I’m out of my league on the experimentation front. I feel like those are the sorts of debates that people come to blows over in the TLC or Experimentation Island.

0:49:39.6 VK: So I’ll take one additional crack at adding nuance to that too, which is all in the same spirit though: making sure that the activities that are being done are actually in service of something that the business cares about. So when picking your metric, you obviously don’t want to pick something that’s, like, super naturally volatile, so that you’re reading bad signal. It has to be robust enough. You also want to pick something that’s sensitive enough to the change you’re making on the page, which is why people often say, pick something that’s closest to the change of the behaviour you’re trying to modify. And I think that turns into a little bit of a debate of, what is the metric you use for your statistical t-test versus how do you define success of the broader hypothesis of the change of what you’re doing in the first place? And those can be two separate concepts. So if the best way… There was a large telecom company years back that said, you know what? I’m sick of using clicks. We’re going to make the goal of every single test revenue.

0:50:41.1 VK: And they wanted to implement a chat bot and they still used revenue, not, like, how many calls to the call center decreased or how many people were actually able to complete their task. No. It was revenue. So guess what? The test failed. They never implemented the chat bot, which is just wild. If you knew how large this company was, your head would spin. But all that to say that there’s a difference between what you need to make sure that your test is adequately powered, that you’re able to detect the types of changes, versus defining that in the broader context of what success is. Because sometimes you can also set a secondary metric to say, but this action can’t cannibalise this other really important interaction on the page or something else that I really have to keep my eye on. Because it’s all an ecosystem, right? That you’re trying to balance. If someone is going to take on a new behaviour, sometimes it means it’s at the sacrifice of other things. And so I think, again, being really specific and thoughtful about the broader definition of success will save you from the scenario where you have to choose something that is more of an output but… All right. Now I’ll hop off my soapbox. Sorry.
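The power trade-off Tim and Val are circling is straightforward to quantify. A rough sketch using statsmodels, with hypothetical baseline numbers: the same 5% relative lift needs far more traffic to detect on a noisy revenue-per-visitor metric than on click-through rate, which is why “pick the metric closest to the change” persists as the standard advice:

```python
from statsmodels.stats.power import NormalIndPower, TTestIndPower
from statsmodels.stats.proportion import proportion_effectsize

alpha, power, lift = 0.05, 0.80, 0.05  # aim to detect a 5% relative lift

# Click-through rate: a well-behaved proportion (hypothetical 10% baseline)
p0 = 0.10
es_clicks = proportion_effectsize(p0 * (1 + lift), p0)
n_clicks = NormalIndPower().solve_power(es_clicks, alpha=alpha, power=power)

# Revenue per visitor: mostly zeros and heavy-tailed, so the standardized
# effect size is tiny (mean and standard deviation are hypothetical)
mean_rev, sd_rev = 2.50, 25.0
es_rev = (mean_rev * lift) / sd_rev  # Cohen's d
n_rev = TTestIndPower().solve_power(es_rev, alpha=alpha, power=power)

print(f"Visitors per arm to detect the lift in CTR:     {n_clicks:,.0f}")
print(f"Visitors per arm to detect the lift in revenue: {n_rev:,.0f}")
```

Under these made-up numbers, the revenue test needs roughly ten times the traffic per arm, which is Val’s point about keeping the statistical test metric and the broader definition of success as two separate concepts.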

0:51:42.4 MH: Wow. That was great.

0:51:44.2 TW: That was way smarter.

0:51:45.0 MH: Why couldn’t you say something like that, Tim?

0:51:47.2 VK: Don’t berate our guest, Michael.

0:51:50.7 MH: Sorry. All right. We do have to start to wrap up. And yeah, wow, Tim, it seems like you’re pretty good at this whole podcasting thing so you should really do more of it.

0:52:00.7 VK: You should look into it.

0:52:00.8 TW: Maybe I’ll start one.

0:52:02.9 MH: Yeah.

0:52:03.0 TW: Yeah. And people, they like watch it? Is it like on broadcast television? Like how do people actually hear this?

0:52:09.3 MK: Oh Jesus Christ!

0:52:12.2 MH: Oh boy.

0:52:13.0 VK: Moe is so over it.

0:52:14.2 TW: It’s like, commit to the bit!

0:52:20.2 MH: Nope. All right. Regroup here. All right. We do have to wrap up the show. This has been actually a really great conversation and reminded me of a lot of really useful things and things Tim has yelled at me over the years about this topic. So it’s good. This is a much pleasanter version of some of our conversations, which in fairness to Tim, I’m usually the problem because I’m like, but what [0:52:44.9] ____ So it’s good. But one thing we like to do is go around the horn, share a last call, something that might be of interest to our audience. Tim, you’re our guest. Do you have a last call you’d like to share? And also, Tim, just so you know, for the show, you can never share more than one.

0:53:02.1 VK: Hard ball.

0:53:03.1 TW: Okay. So I’ve got like six.

0:53:06.7 MH: Yeah.

0:53:06.8 TW: This would be the only time that I’ll stick with one. So I definitely have had, or I think I’ve listened to the podcast enough to know, that people have referenced Benn Stancil’s work before, but I haven’t heard anybody reference this piece from a while back called “Do AI Companies Work?” And it was just a… I have not seen it anywhere else, where he just breaks down the arms race of, like, the billions and billions and billions that are getting flooded into the Anthropics and the OpenAIs, and the Googles and the Microsofts, to build the next model. And those models then very quickly get overtaken by another model. So he draws kind of a contrast to when cloud infrastructure was stood up: when Google or Microsoft or Amazon poured billions of dollars into cloud infrastructure, they were building a really sustainable competitive advantage. And he was just like, but how is this working when it’s going to cost you multi-billions of dollars for something that within six weeks is now fourth on everybody’s favorite benchmark? And he’s kind of questioning, how is that going to be a sustainable model, given how easy it is for organisations seemingly to stand up and compete in that space, and just throw a ton of money at it? So it was just a little bit of a head-scratcher, kind of made me think. So, “Do AI Companies Work?” And that’s it.

0:54:37.1 MH: Awesome. All right. Well, Val, what about you? What’s your last call?

0:54:43.6 VK: Well, inspired by our guest, for the first time in a long time, my last call actually relates to the topic of the show. It’s a Medium article by Eric Sandosham and it’s called “Whose Job Is It To Produce Actionable Insights?”

0:54:58.6 MK: Ooh!

0:55:01.7 VK: And it was actually inspired by something that Tim wrote, but Eric was a guest not too many episodes back and it was an excellent episode. But I’ll just give you a couple little tidbits of things in there to kind of like scratch your interest. So the first subheading is called “What the Fuck Is Actionable Insights?” And he completes that paragraph by saying, so you can immediately appreciate that there’s no need to use the term actionable insight because there is no such thing as a non-actionable insight, but it’s just a whole joy. And he gets into like the ownership of actionable insights and it’s just… He’s, I love everything he writes, but this was an especially good one. So highly recommend.

0:55:41.5 MH: That’s awesome. All right. Moe, what about you?

0:55:46.5 MK: Okay. I have got one out of left field. And if anyone wants to go, I'm going to fight them on it.

0:55:58.9 JH: Ooh!

0:56:00.0 MK: Yeah, it's really controversial, but I just read a book that I thoroughly enjoyed. It really surprised me. When it was recommended to me, I rolled… I believe I rolled my eyes, and then was like, yeah, you know what? I'm going to read it. And I'm just going to talk a little bit about the book first and then I'll tell you what it is. So it does have a lot of, I guess, trauma in it. She had a very difficult childhood. But it talks a lot about ADHD in women. It talks a lot about the sexualisation of women and how they were treated, particularly in the '90s by the US comedy scene and things like that. And it's actually given me so much to think about across so many different areas of my life, both as a mum and as a woman, and how you show up. Yeah, I've really enjoyed it, and it completely knocked me for six, because I didn't expect to like it. I do also recommend listening to it on Audible, because the author narrates it.

0:57:08.6 MK: And the book is Paris Hilton's memoir. I am now a very big Paris Hilton fan, which is not something I ever thought would leave my mouth, but it's very evident from her book that she had a far more difficult childhood than I ever realised. We're around the same age, so when I was a teenager she was just becoming famous, and I just thought she was this snotty rich woman. But she's clearly worked really, really hard and is definitely the OG influencer, which was also just a really interesting thing to read about. So anyway, for anyone that's interested, I thoroughly enjoyed it.

0:57:52.7 JH: I wasn’t expecting that title. That was a surprise.

0:57:52.8 MK: Me either.

0:57:53.5 MH: Yeah. All right, Julie, keep the streak going.

0:57:58.1 JH: Ooh, I'm up. All right. We'll see if this one holds up to all the other ones. I recently read an article by Emily Oster on her ParentData site, and it was about what studies about screen time often get wrong. The topic itself, as a parent, was cool to read about, because constantly all you hear is, screen time is evil, this and that, you know?

0:58:17.1 MK: No screens! Yeah.

0:58:20.7 JH: Yeah, no screens, all the stuff. So once I got past my parental guilt, I really liked the article, because she has a great way of talking about making causal conclusions from studies that aren't properly designed to support them. And she shows it in such a simple way. She has a diagram with three circles and three arrows. So here, for example, the circles are screen time, child test scores, and parental ability. She has two arrows going from parental ability to the other two circles, and then a single arrow going from screen time to child test scores.

0:58:57.2 JH: And the way she talks about it is that there are plenty of studies relating parental ability to both of those other items. But what's interesting is that when you try to study that third relationship, screen time to test scores, the one everyone's so worried about, studies end up forgetting the confounding variable: that third circle of parental ability. So I just love that simple diagram, and her point that sometimes you can't forget the underlying variables that are truly driving the relationship you're trying to observe. And I found it so applicable to a lot of the work I do with clients, where I find myself saying, I feel like this is a self-fulfilling prophecy, because the two circles you're looking at are how much marketing I give to this audience and their outcome, but the third circle we're not talking about is that you segmented them by saying they were already high performers. There are plenty of other examples, but I just loved the way she broke it down and made it easy to digest.
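To make the three-circles picture concrete, here is a minimal simulated sketch of the confounding pattern Julie describes. Everything in it is assumed for illustration: the variable names, the coefficients, and the convenient linearity. "Parental ability" drives both screen time and test scores, the direct screen-time effect is set to exactly zero, and yet the naive correlation between the two downstream circles comes out strongly negative until the confounder is controlled for.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# The confounder: "parental ability" (name and scale are made up).
ability = rng.normal(0.0, 1.0, n)

# Arrow 1: ability -> screen time (higher ability, less screen time, by assumption).
screen_time = 3.0 - 0.8 * ability + rng.normal(0.0, 1.0, n)

# Arrow 2: ability -> test scores. The direct screen-time effect is ZERO here.
test_scores = 100.0 + 5.0 * ability + rng.normal(0.0, 3.0, n)

# Naive view of the third arrow: screen time looks "bad" for scores...
print(np.corrcoef(screen_time, test_scores)[0, 1])  # clearly negative

# ...but after removing what ability explains from both variables,
# the apparent relationship vanishes.
def residualize(y):
    # Subtract the linear fit on ability, leaving what ability can't explain.
    return y - np.polyval(np.polyfit(ability, y, 1), ability)

print(np.corrcoef(residualize(screen_time), residualize(test_scores))[0, 1])  # ~ 0
```

Residualizing on the confounder is just the simplest linear stand-in for "controlling for" it; the point Julie pulls from the article is that studies often skip that third circle entirely.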

1:00:00.2 VK: That’s a good one.

1:00:01.0 TW: That does. I mean, I was listening to, I think, the last episode, where one of the co-hosts talked about something like eight rules for causal inference. And it actually used DAGs, sort of those circles and confounding, and the number of different ways those arrows can go; there's a lot more than just confounding that comes in. But also, I will say, in our book we have basically a three-circles example late in the book, with counsel to people that they should try to draw that sort of thing out to think about it. I'm curious whether you've thought about actually trying to draw the potential confounders that you cannot control for when you're working with a client. Like you said, if it works well for her, speaking to kind of a lay audience, then that's an intriguing thing to try: actually illustrating the thinking and saying, I can't tell you about this causal relationship, because that's not data that I have. Sorry.

1:01:06.5 JH: Yeah. Great point, I love it.

1:01:06.6 TW: Sorry. Is the guest not supposed to weigh in with more commentary?

1:01:09.8 VK: Yeah. We might have to cut that out.

1:01:11.1 MH: Yeah. We don’t usually get into it on specific topics on the podcast though. But it’s okay. You’re a guest. You didn’t know. Oh, Michael, what’s your call?

1:01:21.8 TW: I was going to say, Michael, it’s weird, is it only you who… I was going to jump in and save you on that…

1:01:25.5 VK: Julie…

1:01:28.7 TW: It seems weird, do you never have to do one?

1:01:28.7 JH: I got sidetracked by our guest going off the script.

1:01:33.7 MH: That’s okay.

1:01:34.2 JH: Michael, what about you? What’s your last call?

1:01:36.0 MH: Well, thanks. As a matter of fact, continuing the trend of former guests on the show whose articles we like: Cedric Chin, the main writer for commoncog.com. They have a case library where they go into detail on a lot of companies, and they just wrote this really awesome case study on Brooks, the running shoe company. It goes through their transformation: they were doing really poorly for many years, then got a new CEO, kind of reinvented the company, and flipped their distribution channels on their head. And the vision required to do some very counterintuitive things reshaped that company into the success it is today.

1:02:18.8 MH: It was a really fascinating read. And I bring it up because, a lot of times, as analytics people, we're not necessarily in the position to make those kinds of decisions for our businesses. But knowing that that kind of thinking exists, and encouraging it, is something really good for all of us to engage in. So anyway, I'm a big fan. There are lots of other great cases there, but that one specifically was just a really excellent read.

1:02:46.4 MH: All right. I’m sure as you’ve been listening, you’ve been thinking to yourself, wow, this Tim guy is so good at analytics. One might even say quintessential. How do I reach out to him and find out more or communicate with him in some way? Well, we’d love to hear from you here at The Analytics Power Hour. And you can reach out to us. The Measure Slack group is a great place for that. I think the TLC was mentioned and I think we’re all there too. Also on LinkedIn. And we also have a YouTube channel. So you can check us out there as well. And so, hey, Tim, wow. Thank you. Thanks for coming on the show. This has been a really great experience.

1:03:23.6 TW: Yeah, it’s been okay.

1:03:25.2 MH: Jesus.

1:03:28.1 TW: All right. Well, you know.

1:03:31.0 MH: I don’t know who’s gonna be in charge of sending you your…

1:03:34.0 VK: Your speaker guest [1:03:36.5] ____.

1:03:37.7 MH: So yeah. All right. But actually, as we wrap up, no show would be complete without a huge shoutout to Josh Crowhurst, our producer, who does so much behind the scenes to make the show happen. Thank you, Josh. And thank you, Tim, once again, for coming on the show. And of course, I think I speak for all my co-hosts, Val, Moe, Julie, when I say, hey, no matter what those goals are or what those KPIs are, or how you’re setting it up, remember, keep analysing.

1:04:10.7 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions, and questions on Twitter at @AnalyticsHour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

1:04:27.6 Charles Barkley: So smart guys wanted to fit in, so they made up a term called analytics. Analytics don’t work.

1:04:33.7 Michael Wilbon: Do the analytics say go for it no matter who’s going for it? So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.

1:04:46.3 MH: All right. So let’s just go to a couple of logistical things. So, Tim, we do go through an editing process on the show, so we can stop and start. So whatever needs to happen, if you get interrupted, whatever, we can just…

1:05:06.8 TW: I’m a lot more comfortable if I just keep going and just find my way back or don’t.

1:05:12.1 MH: That’s…

1:05:12.2 TW: Yeah. No, I’m more…

1:05:17.8 JH: Rock flag and turds.

1:05:19.9 TW: What the fuck was that?

1:05:20.0 MK: Oh, wow!

1:05:21.4 VK: That’s good.

1:05:23.1 TW: Rock. What is rock flag?

1:05:27.2 MH: Rock flag. Yeah. All right. Tim, you’re hamming it up too much.

1:05:34.9 MK: At least he can commit to a bit, right?

1:05:38.2 TW: Most guests just sit in stunned silence, like what the hell just happened? That’s what they’re thinking. They don’t say it.

1:05:43.2 VK: Yeah. No one’s ever said it, Tim. Sorry.

1:05:46.8 TW: Yeah.

[music]
