#263: Analytics the Right Way

Every so often, one of the co-hosts of this podcast co-authors a book. And by “every so often” we mean “it’s happened once so far.” Tim, along with (multi-)past guest Dr. Joe Sutherland, just published Analytics the Right Way: A Business Leader’s Guide to Putting Data to Productive Use, and we got to sit them down for a chat about it! From misconceptions about data to the potential outcomes framework to economists as the butt of a joke about the absolute objectivity of data (spoiler: data is not objective), we covered a lot of ground. Even accounting for our (understandable) bias on the matter, we thought the book was a great read, and we think this discussion about some of the highlights will have you agreeing! Order now before it sells out!

Links to Items Mentioned in the Show

Episode Transcript

0:00:05.8 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

0:00:13.7 Val Kroll: Hey everyone, and welcome to the Analytics Power Hour. This is episode 263 and I’m Val Kroll from facts & feelings. You know, writing can be hard. While I am absolutely just opening the show with some totally off-the-cuff extemporaneous remarks, it’s not hard at all for me to imagine a world where the intro that we do for every episode is carefully written out ahead of time. But that definitely wasn’t done here. Nope, I’m totally freestyling and free associating. And that’s how this Tim-style rambling I’m doing, which just happens to be on the topic of writing, is a nice transition to what this episode is all about. It’s a first for the Analytics Power Hour. And no, I don’t mean because it’s the first time I’ve done the show opening. It’s because we’ve secured an exclusive designation as the official podcast for what is sure to be the most talked about analytics book of 2025. The book, you might ask? Analytics the Right Way. Or the full title, Analytics the Right Way: A Business Leader’s Guide to Putting Data to Productive Use. I’m joined today by Julie Hoyer from Further for this discussion. Julie, are you excited to talk with these book authors?

0:01:24.7 Julie Hoyer: Oh my gosh, absolutely. Have been waiting for this all what, month?

0:01:32.5 VK: Very nice. I’m also joined by Tim Wilson, my colleague from facts & feelings. And he’s more of a guest today than a co-host because he’s one of the co-authors of the book. Tim, welcome to the show, I guess.

0:01:45.4 Tim Wilson: Hopefully this is the last time we’ll use this little gimmick, maybe.

0:01:49.7 VK: We’ll stop doing cool shit. We won’t have you on as a guest. How about that? No, never. You’d never. And we’re joined by Tim’s co-author, Dr. Joe Sutherland. In addition to working with corporate executives as a consultant and advisor, Joe founded the Center for AI Learning at Emory University, where he also teaches. And as it happens, Julie and I both got to be his students in a way when we worked with him together at Search Discovery. Now, further, Joe has a list of credentials that is, frankly, kind of intimidating. Let’s see if I can get through it. He has one political science degree from Washington University in St. Louis and three more, including a couple of doctorates from Columbia. He’s a fellow at The Weidenbaum Center on the Economy, Government and Public Policy at WashU. He worked in the Obama White House from 2011 to 2013. Casual. He’s published academic papers all over the place. He’s been on this podcast three times now, believe it or not.

0:02:50.2 TW: That’s an accomplishment.

0:02:51.9 VK: Sure is. Yeah. But intimidating. Not really. If you know Joe, he’s not scary at all. Today we get to welcome him as our guest. Welcome to the show, Dr. Joe.

0:03:01.9 Joe Sutherland: Thank you very much. It’s good to be back. That’s the reason we wrote the book, actually, was because Tim dangled the podcast appearance and he said, “hey, you’ll actually… ”

0:03:13.0 TW: All he had to do. You’ll let me on as a guest.

0:03:15.3 VK: I love it.

0:03:16.1 TW: I just need to bring somebody with some real credentials. That was the…

0:03:19.8 VK: Yeah, that’s the hook. Yeah. I love it. So excited for this one. So I guess a good place to start would be asking you guys just a little bit about how this book came to be. I know you guys worked together at Search Discovery ’cause I was there to see it, had the privilege to see it. But this didn’t come together till a few years later. So I’m curious kind of how it started. A little bit of the origin story. And what did you guys see that was not out there in the space that you wanted to address with Analytics the Right Way?

0:03:50.0 JS: That is a great question. I actually have a specific memory of when this book, like, hatched in my mind, which is I was, like, on my back patio on the phone with Tim. This is, like, years ago. And I think one of us just goes, we should write a book. And I mean, it’s true.

0:04:08.9 VK: Simple as that.

0:04:10.0 JS: And the truth is, like, I do think we’re ideologically aligned in so many ways when it comes to, like, the practice of data and analysis and machine learning and artificial intelligence, all these things that you hear about today. And I just knew that by coming together with Tim, something wonderful would be made. And where it went to, right, was I get a lot of these customers, clients, or folks, I guess I encounter a lot of them at the center all the time, who go, I’m ready for AI. Can I get into it right now? Let me just buy it. Let’s do it. And they never ask the question, like, “well, what are you actually trying to achieve? And how do we get there first? And do you even have the data availability? Have you thought through where your investments need to go?” And I actually think that the principles behind making our way towards these artificial intelligence projects and capabilities at companies, which are truly transformational, are universal. I mean, you can really link them back to any data or analytics question. And I wanted to give the corporate executives of the world, and any sort of business leader, a book that would basically say, hey look, read this, or give it to your people and have them read it, right, and you’ll get there. That’s kind of what I was hoping to get out of it when we started.

0:05:28.3 JH: And that’s no small task either. That is a lofty goal.

0:05:33.9 TW: Well, I mean, I think part of what happened, when Joe and I met, when the introduction happened, I remember sitting in Atlanta in a conference room thinking, this guy’s gonna make me feel stupid. We hit it off, and then as we worked together, I have some very clear memories of having an expectation of what happens when you bring in a data scientist. And that’s kind of the role, the branding, we were using for Joe at the time: data scientist. And I had gone through this journey on my own where I was going to try to become a data scientist a few years before, and kind of realized after a few years, no, I can do really useful stuff, but I’m never really going to be something that I would consider a data scientist. But I had this expectation that when you talk to a data scientist, they’re going to start immediately talking about models and methods and, you know, the vast quantities of data. And the number of times that Joe would get brought in and there would be somebody saying, we want to do X, we want to do AI, we want to do machine learning, we want to build a model…

0:06:43.2 TW: And he very consistently would say, wait a minute, we first have to define the problem, we have to frame the problem. And so here was someone who had all the horsepower to go super deep. And I think, Julie, you might have even lived it more than I did. Like, yeah, he can really go deep, super deep, on the technical. But he was always saying that the way companies tend to fall down is they skip that clarity on what they’re trying to do, what their ideas are. And so while traveling, while just doing catch-ups, there are many memories in my mind of Joe and I sitting across from each other at a coffee shop, at a restaurant, at a bar, having these discussions where I was actually learning a lot. He introduced me to the fundamentals of causal inference, which kind of blew my mind. And I was like, oh, this is a very important idea. Not all of the mechanics and the details that go into it; just the basic ideas behind what you’re trying to do and why you’re trying to do it are really powerful.

0:07:48.9 TW: So I’d had an idea to write a book eight or nine years ago, and this book has very prominent vestiges of that. It is a much, much richer book because there was a lot more depth of thought, a lot more experience, a lot more collaboration on a much broader and deeper set of projects going into it. But it’s not a book that says, this is going to teach you data science. It’s also not a kind of lofty, hand-waving book that just says get all the data and get all the data super clean. We really wanted to write one, as Joe said, for the business manager, the business leader, the business executive, so that they are positioned to actually get value out of their data and their analytics in a productive and efficient way.

0:08:38.1 VK: So that’s interesting. You both called out the audience kind of in your description there. And I think that that’s a really interesting choice because you think, oh, I’m going to write an analytics book. I’m going to write it to my people, to my analytics cohort and professionals. How come you guys made that choice? Was that kind of always there from the beginning or did that kind of come together as you were starting to frame out what some of the topics you were going to dive into were?

0:09:00.8 JS: Good. I mean, there’s a lot there. I think one of the points we make in the book is… I mean, we make so many points, right? And I think that they’re all, like, just new mental models for thinking. That was one of the reasons I loved the collaboration with you, Tim: we just developed some really cool new mental models for how to think about the world and how to think about data and analytics and all those exercises that we go through in corporate America. But a few thoughts. One is, I’ve realized more over the past few years that there is this zeitgeist in the analytics or IT or technology industry vertical, what have you, where in a lot of ways you feel like you can just purchase insight. And I don’t know, I feel like it comes from a variety of forces, right? And we talk about this in the book, where it’s not like there’s some sort of bad actor out there who’s trying to convince you to buy their product when it really doesn’t create any value at all. Right? There is a reason why these things happen. But I just don’t get the sense that as a business leader these days you can always trust everything that comes from your tech or analytics or data folks without understanding sort of the more fundamental concepts. I’d be curious to know your thoughts about that, Tim.

0:10:21.2 TW: Yeah, I mean, we definitely had a lot of discussions about this. And I’m in a spot where, in many ways, facts & feelings, and the drive behind facts & feelings, the consultancy that Val and I and Matty Wishnow started, is aligned with that. There are just forces naturally happening in the business world that over-index towards collect more data, run fancier models, find more technology, hire more data scientists, push to do more. And it just seemed to us, when we were working with clients, that they were trying to start on step four and they’d skipped steps one, two, and three. There’s not a literal six-step thing in the book, so those are metaphorical steps one, two, and three. But even if the analyst or the data scientist was trying to go back to steps one, two, and three, that’s really where the most opportunity to redirect an organization’s investment is: getting the business owners who are trying to get value out of the data engaged, rather than letting them off the hook to just lob it to the analytics team and say bring me some value. Having grown up in that analytics world, and feeling how difficult that is, I slowly realized that, oh, it’s because we’re not putting enough upfront thought into it.

0:12:04.4 TW: So even though the audience is kind of the business leaders, we certainly think analysts and data scientists who read it will hopefully think differently as well. It’ll give them confidence to say, no, no, no, I have to go engage farther upstream. We have to have clarity of, wait a minute, this is a data ask, this is an analytics ask. Am I trying to just objectively and concisely measure the performance of a campaign, or am I actually trying to figure out something to make a decision going forward? And it gives everyone a little more clarity of language and ways to interact. But it really does come down to this: a lot of that burden falls on the business, with the layer of, I think we agreed, some fundamental misconceptions that the industry has. Analysts have them. Often the business tends to have them as well. More data is better. If I have all the data, you’ll build me a perfect model.

0:13:08.6 TW: You’ll get to an unambiguous truth. So I think there is a level of statistical fluency where they’re not super difficult ideas, but they’re kind of mind-blowing. That’s the nerd in me. The potential outcomes framework. Boy, give me that second cocktail and get the wrong person in the corner. Yeah, I talked about counterfactuals like four miles into a seven-mile run with my trainer. Where was she gonna go? You know, she…

0:13:49.1 JS: So no, no, but just to jump in on that, there are more pieces to this book that I want to communicate to the audience. Number one, my wife actually reviewed the manuscript, and she goes, Joe, this is kind of like your philosophy on life in a treatise, like in a statistics framework. And I think there are just a lot of really cool… It’s not dispensations, like, we dispel a lot of the misconceptions, and that will help you, I almost feel, live your life better. It’s hard to describe. Like, I think back to, you know, I got all these degrees, right? And I only have one doctorate, by the way. Just one. I don’t have multiple doctorates. Just one, just to be clear.

0:14:37.6 VK: Oh, shit, we’ll have to rerecord that.

0:14:39.6 JS: No, no, you don’t have to rerecord, it’s fine. But it’s like, I always thought, why did I get a degree in statistics, or at least, you know, with a methodological focus on statistics and applications of machine learning? Like, why did I do that? And I really do think it was to sort of self-soothe and cope with the natural, like, OCD impulses and anxieties of life that I’ve experienced my whole life. Once I understood the world in probabilities, sort of through the framework of a probabilistic approach, it made my life better. I took things less personally, I made better decisions. And I really do believe that the way that we think about the world through this book is actually going to be really helpful to people. So that’s one point that I wanna make.

0:15:29.4 TW: I mean, I just took up photography for self-soothing. But, you know, if you’ve got to go get, you know, some variable number of advanced degrees, to each their own.

0:15:39.9 VK: That’s right. Well, I’m glad you guys called out too that this is still valuable for analysts to read. Especially because now I can’t wait to just buy this and make all the analysts I know read it. I’m so excited about that. So I’m glad you touched on that because that was going to be my follow-up question. But the other thing you guys already mentioned that I really wanted to touch on was the misconceptions. Going through those was one of the opening parts that I loved the most: you guys broke down, like, what are the misconceptions, how did we get here, why is it the way it is? And so without giving away too much, I didn’t know if you guys wanted to dive into the ones you mentioned.

0:16:14.2 TW: We can’t. I mean, it’s so eloquently put in the book that even if we just kind of off the cuff try to rattle them off… You’re still…

0:16:21.7 VK: We definitely couldn’t do it justice.

0:16:24.9 TW: I mean, I think, and I’ll give a little credit to Matt Gershoff on this as well, years ago at a Superweek he said these three things: business is about making decisions under conditions of uncertainty, there’s a cost to reducing uncertainty, and uncertainty can’t be eliminated. So I sort of had that. He kind of introduced me to this idea that the goal was not to eliminate uncertainty, and there are diminishing returns. And I still think that is a huge thing. Like, we’ve lived that, where people say, what is the answer? And you have way too many data professionals walking around quoting Deming, saying, “In God we trust; all others must bring data,” and they just kind of wield a misunderstanding of that as though you have it.

0:17:18.5 TW: “Without data, you’re just another person with an opinion.” And I’m like, well, that has perpetuated this huge misconception that the data gives you an objective truth. And it’s just never perfect data. So even truths about the past, which aren’t that useful, are never truly perfect. And truths about tomorrow you’re certainly just not going to get. And even though people say, yeah, that totally makes sense, we just operate in a world where the analyst says, I don’t know, I can’t give you a definitive answer. So to me, that’s probably one of my favorite misconceptions: this gold rush for data is happening because it’s supposedly going to let us eliminate, or essentially eliminate, uncertainty, which is just a fool’s errand. But that is what the industry is doing. So there’s one of my favorite misconceptions. I don’t know, you want to do one, Joe?
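[Editor’s aside: the “cost to reducing uncertainty” point Tim cites has a simple arithmetic core. A minimal sketch, not from the book, using an assumed standard deviation of 10: the margin of error on a mean shrinks only with the square root of the sample size, so each halving of uncertainty costs roughly four times the data, and it never reaches zero.]

```python
import math

def margin_of_error(std_dev: float, n: int, z: float = 1.96) -> float:
    """Approximate half-width of a 95% confidence interval for a mean."""
    return z * std_dev / math.sqrt(n)

# Quadrupling the sample size only halves the uncertainty -- the
# diminishing returns behind "uncertainty can't be eliminated".
for n in (100, 400, 1600, 6400):
    print(f"n={n:>5}  margin of error = {margin_of_error(10.0, n):.2f}")
```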

0:18:16.8 VK: Yeah, your take, Dr. Joe.

0:18:19.4 JS: Let me tell the Ye Olde economist joke, ’cause I actually love this one. And it does link back to the misconceptions. Yeah, so one of my favorite misconceptions is this idea that data are inherently unbiased. And as a trained statistician and economist, I can tell you that’s just totally false. There’s actually a great economist joke that goes as follows. The CEO of a major company is hiring for a role. He brings in three folks to interview for the job: a mathematician, a statistician, and an economist. The CEO calls the first guy in, the mathematician, and says, look, what does two plus two equal? And the mathematician goes, well, it’s four, of course. CEO goes, four? Are you sure? Yes, exactly four. That’s exactly what the answer is. The CEO is not pleased. Calls in the statistician. He asks, what does two plus two equal? The statistician goes, on average, it’s four, you know, give or take 10%. The CEO is still not pleased. So he calls in the economist and gives him the same question. And the economist gets up from the chair, looks around very sneakily, closes the door, closes the shade, sits right next to the CEO, and goes, what do you want it to equal? And I just… It’s so true. [laughter] Oh, oh, oh, there’s a laugh. Who did that? That’s cool.

0:19:42.2 VK: We’ve leveled up the production of this since your last time you came on the show, Joe.

0:19:47.2 JS: I feel like we just entered the new millennium, like. But I actually love that joke because there’s this old adage, too: if you torture the data enough, they will confess to whatever you want them to confess. Right? And that’s just the truth about data. So stop thinking about it as something that’s inherently unbiased. It’s how you deal with it and how you build confidence in your methodology that really lets you get to the right answer.

0:20:14.2 VK: I love that. It sounds like you guys have a lot of things that you’re packing into this book that you’re packaging for these business leaders. How did you walk them through this? Was there an overarching framework that you leveraged? Because I think that I intuited one from working with you all over the years. But I think it would be helpful if you guys talked that through a little bit. If that was one of the mechanisms that was kind of driving the narrative and how you were packaging it up.

0:20:42.5 TW: Sure. I mean, when it comes to the outline that was in the book proposal, this just kind of amuses me. I get sort of irritated with business books that feel like there’s way too much wind-up, where you’re into like the fourth chapter and they’re still telling you what they’re gonna tell you in the book. We actually did have to add an additional introductory chapter ’cause we had so much to say. So I’m sure every author says, well, we’re not guilty of that. But, you know, from the structure of the book, there are a couple chapters up front trying to say a lot of the common ways of behaving are problematic, and let’s help you understand why those are problematic. Then the core of the book is kind of a framework, trying to keep things as simple as possible. And I’ve talked about pieces of this on many episodes of the Analytics Power Hour podcast in the past, but fundamentally, when you’re trying to put data to use for an organization, there are three discrete things you can do. You can be trying to measure performance objectively and concisely, which so many organizations really, really struggle to do well.

0:22:03.7 TW: They may have a lot of reports and dashboards, but they’re not really doing a good job of objectively saying how they’re doing relative to their expectations in a meaningful way. Then there’s validating hypotheses. That’s the analysis or testing, and that’s got multiple chapters devoted to it, ’cause that’s where we’re trying to make better decisions going forward. So lots of ways to validate hypotheses. At least in marketing, there’s a lot of talk about, if you’re doing A/B tests on a website, they’ll say, what’s your hypothesis? Well, everything that we’re doing, with lots of different techniques, really should be grounded in validating a hypothesis. And then the third is you have data that is just part of a process. It’s enabling some operational process. And those three fit together interestingly. And we did do a lot of thinking and talking about how to talk about AI. This is not a book that is AI, AI, AI, AI, AI. Because we went in saying AI has purpose, it delivers value, because it is part of an operational process. So it actually fits in this one area. So if you’re super excited about AI, it’s not doing a whole lot in the other two. It might be code-assisting or something on validating hypotheses, to develop some code for a model, but it’s not like AI replaces the analyst, because those other two, measuring performance and validating hypotheses, really are much more about human thought.

0:23:43.0 JS: So I 100% agree. 2024, in retrospect, was the year of agentic AI, right? Everybody was very interested: how do we use large language models to replace analysts and replace people? And, you know, the truth is, it really preys upon the super lazy impulse that I think we have as a human species, right? Which is like, man, if I could just create a machine that could do my work for me and delegate the work to the machine, I can go golf, you know, while it does all the super valuable stuff that I was doing, right? I’ll just go golf. And if you read through the book, it demonstrates why that is just not true. You could never really do that. Actually, I’ll jump in on something we talk a little bit about in the book. There’s a guy who got the Nobel Prize back in the ’70s. His name was Herbert Simon. And he had this idea that, as a society, what we do when we’re looking for the answer to make a decision is we sort of just look in our local area and space and talk to our friends. Here’s a good example.

0:25:01.1 JS: Like, if you’re trying to find your ideal soulmate so you can marry them, most of us don’t go and date the 8 billion people in the world, right, and find the best one. What we do is we ask our friends, and it’s somebody who’s on the periphery of your social circle or that you knew growing up. We look to find the best possible alternative that’s just in the local area where we’re looking. And we can do that super well, because we were baked in with these impulses and intuitions. But machines, to find the best option and to make the best estimate of what might happen in the future if a decision is made, have to search the entire space of possible outcomes and opportunities. We often refer to it as boiling the ocean. And it’s virtually impossible, right, to make really incisive decisions and insights with an approach that boils the ocean. It’s actually not even feasible within the amount of time we have available to make those decisions. And I’m not sure why I got into that… Oh, agentic AI, that’s what it was. The takeaway there was, I do think that people, with this artificial intelligence revolution happening, over-assume that we can delegate to the machine. But the truth is you’re still going to have to go through the decision-making processes that we articulate in the book.
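[Editor’s aside: Simon’s “satisficing” idea that Joe describes, looking at a small local sample and taking the first option that clears an aspiration level rather than scoring the whole space, can be sketched in a few lines. This is an illustrative toy with a made-up scoring function, not anything from the book.]

```python
import random

def satisfice(options, score, aspiration, sample_size=100, seed=42):
    """Simon-style satisficing: inspect a small local sample and return the
    first option that clears the aspiration level, instead of exhaustively
    scoring the entire space of possibilities."""
    rng = random.Random(seed)
    for candidate in rng.sample(options, min(sample_size, len(options))):
        if score(candidate) >= aspiration:
            return candidate
    return None  # nothing in the local search cleared the bar

# Toy usage: find *an* option scoring at least 0.9, not *the* best of a million.
score = lambda x: (x % 1000) / 1000  # hypothetical scoring function
pick = satisfice(list(range(1_000_000)), score, aspiration=0.9)
```

Only `sample_size` candidates ever get scored; evaluating all million to certify the single best one is the “boiling the ocean” approach.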

0:26:27.0 TW: I literally saw a post on LinkedIn, ’cause there’s so much around, like, the generative BI and, oh, the simple questions. It’s like, for instance, if I wanna know how many leads came from California last month, I should be able to do a natural language query. And I’m like, literally no one is saying, I have very simple, straightforward, defined questions and it’s taking me so long to get to them. So to me, it’s this saying, well, these dots, they’re close enough to connecting, let me go ahead and make the leap that I could just ask the AI to give me insights. And then it’s like, well, getting told how many leads there were from California last month is rarely the type of question that takes you anywhere.

0:27:17.0 JS: Let me actually dig in on this, ’cause what you’re describing, I actually wouldn’t think of as insight generation using artificial intelligence. What you’re describing is the process of doing the query to get the answer just being replaced, right, by some sort of generative AI technology. So that actually is consistent with our book. We kind of break out insight generation from operationalization, and operational technologies enabling automation. And that example you just gave, I actually would throw in the bucket of, yeah, it is sort of an operational enablement problem, right? Which is just, oh, we need to get to the query faster. Right. And that to me is consistent with the use of AI, but…

0:27:57.8 TW: Yeah, it’s fine to do it. It’s just, to paint that as saying this is what’s going to replace the analyst… I’m like, no, you’re actually missing the boat on what should go into actually getting real value out of this, if you think that following that path is what’s going to do it.

0:28:20.2 VK: Well, I want to draw back on something you guys mentioned too. You called them decision-making frameworks, and I’m lucky enough to have worked with you all, so it’s very ingrained in me. But I run across this a lot still, where people talk about the value of a new product, a new technology. It’s like, it’s going to give us insights. And then you ask them more about it and they say, oh, it’s going to give us knowledge. We’ll know what’s going on. There’s value in knowing. And it’s like, to a point, there’s value in knowing, but the real value comes when you act on the knowledge. Right? And you guys make a very clear distinction about that, especially in this framework. And I think that’s how we as a small group here today think about it. But it’s still shocking to me that I run into a lot of people that I have to make that argument to, and really say, I think there’s one step farther. So when we talk about the value to clients of different services and things as consultants, them being able to go take action on what we’ve helped them learn, that is really the end point. And I think a lot of people’s minds will be very blown and opened to that by reading this book, which I am so excited about.

0:29:27.8 TW: Well, and I do think what happens… and we have, well, tried to explain sort of the counterfactuals, the potential outcomes framework, which, I mean, I think Joe was like, rein it in, Tim. Rein it in. I know you’re excited about this, but…

0:29:42.5 VK: Spin-off book. Yeah. Analytics the Right Way, Part Two.

0:29:46.8 JS: Are we doing another book now? Book number two. We just have to. Right now. I remember when we were on the…

0:29:53.5 TW: Well, I mean to be fair, there were lots of things where I was.

0:29:56.0 VK: Joe just wants to come back a fourth time.

0:29:57.8 TW: I was the one who was like trying to…

0:30:00.3 JS: If I come back five times, I heard there’s a gift.

0:30:01.5 TW: There’s a, get the jacket.

0:30:03.0 JH: Yeah, there’s a jacket. Okay, okay.

0:30:06.5 TW: But I would often take a crack at saying, I’m going to try to describe this, because I am closer to the reader who’s less deeply immersed in the mechanics of this. Joe would come back, and basically it would go in the footnotes in the book. We had fun with the footnotes, but there are lots of places where we say, if a trained statistician is reading this, we are taking a shortcut. It is not material for what we think people need to know. Joe does have his reputation, so it would be in the footnote saying, look, this is technically not quite correct, but it’s good enough. Which I think goes to a lot of what we were trying to do with the book. But I’ve seen it again and again, when someone says, oh, we’re just going to make this change and we’re going to see what happens. We’ll make a change and then we’ll just see if it worked or not. And we sort of walked through an example of saying, well, what if you make a change and this is what the data looks like?

0:31:04.6 TW: ‘Cause it usually isn’t some abrupt, massive step function that says, look, we changed the button color on this page and revenue jumped way up. I deeply believe that’s what, as human beings, we think is going to happen. We’re going to do something and it’s going to have this abrupt, sudden, immediate impact. And we’ll look at the chart, and the chart will kind of go along, and it’ll have a big jump, and go on after, and we’ll say, see, that’s what happened. That doesn’t happen. And so helping people understand that, it’s like, no, if you’re going to have an intervention, if you’re going to do something and you want to see whether or not it worked, you can’t just say, let me do it, and then I’ll wait and look at it afterwards, and it’s going to be obvious. It’s not obvious. And then it gets dumped on the analyst to say, well, figure out the answer anyway. And, well, the easiest way to figure out the answer would have been to think about how you were going to answer that question before you actually made the change.

0:32:05.5 JS: Well, it’s amazing, right? It’s like, think about how you’d want to answer the question before you even try to answer it or get into it, right? But if you don’t do it, and then you force an analyst who, God forbid, hasn’t had the experiences that we’ve had in the wild west of data analytics, like, you might end up having somebody who looks at what happened and draws the wrong conclusion. Right? That’s kind of the risk. Like, oh, when we cut our investment in sales professionals in the Southeast, our efficiency went way up. Well, let’s just cut more. The conclusions can be wrong. And if you don’t think about, like, the appropriate inferential framework, you might get to the point where you say, well, we made that decision. We’re going to skip the process where we vet it and figure out if the inference was a solid inference. We’re just going to go right ahead to automation. We’re going to throw this into the machines. We’re going to have them automate it to oblivion. Right? And then all of a sudden you get somebody who’s got no real-time feedback or consideration for what’s going on, just implementing whatever random decision you made, like, to kingdom come. Right? And I really think there’s a risk, especially in this era of automation, that we skip to this human-out-of-the-loop stuff way too fast simply because we drew the wrong inference. And part of the book is thinking about how to slow that process down.

0:33:31.3 VK: One of the parts too. And we have teased it here before and I know we’ve had a lot of conversations about it and like other sidebars. And so I’m really excited to ask you guys about this is your ladder of evidence, because I know that is not easy to come up with. I have had other conversations with people and it’s like you think it’s so straightforward and then you get into, oh, but what if you think about it this way or you know, another way? So can we talk about your ladder of evidence that you settled on?

0:33:57.2 JS: We can definitely do that, but before we do that, we have to walk through Tim and I’s like ideation process on this.

0:34:04.8 VK: I need to hear this origin story.

0:34:07.0 TW: That was intense.

0:34:09.6 JS: It was intense. It was the subject of many like long Zoom FaceTime conversations. And it actually, I think that this section alone single handedly reorganized the book like three times.

0:34:23.2 VK: Wow.

0:34:24.1 JS: Is that accurate?

0:34:25.3 TW: Yeah, very much.

0:34:26.4 JS: But once we nailed it, I think it actually came. It came home. Like, I really do think it did.

0:34:31.8 VK: We need a drum roll.

0:34:32.8 JH: Yeah.

0:34:33.0 TW: Yeah, well. Oh, hold on.

0:34:35.2 VK: Come on, Tim. The ladders of evidence.

0:34:43.9 JS: Wow.

0:34:44.7 TW: I mean, so the funny thing is that there was, I think it was like a Shopify blog post buried somewhere that had this idea of a ladder of evidence that I had thought was really useful. And I’d written a little bit on it. So that’s kind of where it started. I dug in enough to say, like, oh wait, this is not some deeply established way of thinking about things. And where we landed, we’re also calling it a ladder of evidence, and it is conceptually consistent, but it gets to that idea of uncertainty, which also gets to this idea of how strong is the evidence I’m using to make a decision. So the ladder is very simple: there’s anecdotal evidence, which is super, super weak evidence, but it’s evidence. And this is in the context of validating hypotheses. If I want to validate a hypothesis, if it’s low stakes, or if I have no time, or any number of factors, and all I have is a little bit of evidence, you know what, generally speaking, that’s better than no evidence. But we need to recognize that it is anecdotal. Then there is descriptive evidence, which is, I mean, tons and tons of techniques across lots of different types of data.

0:36:03.8 TW: That’s where I think a lot of analytics and a lot of research and insights lives. It is stronger evidence because we generally have more data. I think actually it’s in the book, and this was credit to Joe, that descriptive evidence is when you’ve got a whole bunch of anecdotes kind of gathered together. So it’s kind of a continuum. It’s stronger evidence. And then the third kind, the strongest evidence, is scientific evidence, which is, generally speaking, controlled experimentation in one form or another. And it’s not like these are good versus bad. It is the strength versus weakness of the evidence. It goes to the criticality of the decision, but it goes to understanding that you can’t just say, and it goes back to those misconceptions, just ‘cause I have a billion rows of data and I’m going to run a model on it, that is still almost always not going to be as good as running a controlled experiment if I’m trying to actually find evidence for a causal link between two things. So we spend a whole chapter on descriptive evidence and a whole chapter on scientific, and there are books written on scientific evidence.

0:37:14.9 VK: So what were some of your earlier, like, words you tried to use? Because I also feel like most of the time when I’ve seen some version of this, it’s using words that you see on, like, data maturity curves. You know, it does imply, like, descriptive: good, better, best. Yeah, predictive. It always has to get to predictive or…

0:37:37.3 TW: Predictive. Descriptive, predictive, prescriptive.

0:37:39.8 VK: Yeah. All those terms that are much more like talking to the method itself. And I know these kind of are. But your buckets that you ended up on are so much nicer and broad enough that you can’t really get down in the dirt on, like, nitty gritty. Like someone can’t, I feel like, come in and really be like, oh my gosh, I completely disagree. So I thought it was very artful how you guys landed there. So what were some of the previous tries?

0:38:07.0 JS: Well, number one, somebody can totally come in and disagree, and I fully expect them to do that. I welcome you to comment on my LinkedIn posts. As much as you care to disagree with us, it’ll get the conversation going.

0:38:23.2 TW: If you’re in such disagreement that you wanna buy 100 copies of the book and burn them?

0:38:27.0 JS: Yeah, if you wanted to go get 500 copies.

0:38:29.8 TW: You know what? Screw this disagreement.

0:38:35.5 JS: I mean, the earlier, I think the way we had thought about it before was actually, like, in terms of, like, analysis rather than, like, the weight of the evidence. And this is why I like where we ended up, ‘cause it was starting with, what methodologies can you use to answer your questions? Right? And it was kind of like, well, there’s easy methodologies and there’s hard ones. That was kind of where we had started with it. But I think the picture in my head as we were, like, developing this was actually, and I think we ended up with a cartoon in the book about this, the hand scale. Right? It was kind of the scale, which was like, well, actually there’s this question that has to be answered, and it has to be weighed against some sort of weight of evidence. And if it’s a really heavy question, you need a lot of heavy evidence to come up against it. And that’s what I think started to get us towards this idea of, like, well, if it’s just a light question, you just need a light amount. And what are the usual forms of light evidence? Well, it’s usually just walking down the hallway and talking to your coworkers to see if they’re in a good mood or a bad mood. Right? It could be very simple stuff. And that was my memory. Like, the shift was going from the methodological thought process and mental model to thinking about it more fundamentally. And that’s what I think gave us the elegance of it.

0:39:58.2 TW: It was kind of your… Yeah, it was like historical data analysis, research, primary and secondary, and controlled experimentation. And as we were going around and around trying to kick the tires on what we had, we had a whole debate around secondary research. Joe was like, “that’s anecdotal.” And I was like, “what do you mean? It could be, like, super robust secondary research.” He was like, “no, you could count a study if you got access to the underlying data, and you knew the research question they were trying to answer, and you knew their methodology, and that lined up with what you’re trying to validate. Sure. But that never happens. Even, like, a scientific journal, secondary research, it is always one step removed.” So I was like, “yep, touché. Totally get that.” So that’s one where, like, primary research would fall into descriptive evidence. Unless, you know, what if you do a small usability study? That’s kind of anecdotal. So there’s a little bit of a gray area. But I think that ramp of saying how strong, thinking of it as the strength of evidence, I mean, ever since we kind of hit that point, I am using that word, I’m using that phrase a lot.

0:41:11.9 VK: Yeah. Even after you said it, Joe, like that was a light bulb moment for me when you were like, yeah, it’s not so much the methods, it’s the, the weight of evidence. I was like, “wow.”

0:41:21.2 JH: It’s also just so nicely put for your audience, because it’s incredibly practical too. Because if you’re thinking, I’m, like, a leader inside of an organization, I perhaps have, like, multiple analytics teams that I work with. Maybe some are embedded, maybe there’s a center of excellence, then they’re broken out by their specific functions. There’s, like, the digital analytics team that’s focused on performance. And so if that’s the construct in which you think about all of this, you might not understand how to right-size the evidence for the question or problem at hand. And so I think that this is going to be one of those sections that really connects with your audience. I think it was very nicely done.

0:41:56.7 JS: Thank you. I’ll add one more thing, which is, like, Tim actually deserves credit for the tone and approach of the book as more of a fun, entertaining, interactive, very, like, down-to-earth tone. Like, I think a lot of the scientific approaches can come across as super heavy-handed and super duper, like, this is so full of firepower that you could never deal with it. And it’s meant to be very impressive, right, in its methodological weight. And the way that we’ve had a lot of fun with this book was thinking about, like, little simple examples. I mean, you might go and be the vice president of analytics at Coca-Cola, or you might be the CEO of, like, Merck Pharmaceuticals or something. You could be any of these people. But on a day-to-day basis, you’re not sitting on the top of a mountain with your hands on your hips being like, haha. I mean, what you’re doing is, like, you’re going down the hall to talk to Michael and Catherine, and they’re grumpy because they discovered the snack room is no longer stocking their… It’s a much more down-to-earth experience. And I’m really thankful that Tim enforced that on the book.

0:43:06.5 VK: Well, and on that note, ‘cause you did bring up interactive, I did want to bring up the little, is it quizzes, that you guys are doing at the end of each chapter? Like the performance measurement check-in. I think that that’s super fun. I think you guys should talk about that.

0:43:21.6 TW: Sure. This was, yeah, this was my brainchild, and Joe’s the one who actually built it. I mean, I think I come from pushing performance measurement, and we did an episode of the podcast around goals and KPIs and the two magic questions. And so part of what we were trying to do, what’s actually useful, was we asked the question, how do we measure the performance of a book? And there’s the easy metric, which is, well, how many did you sell? But we’re not writing the book because we’re trying to drive sales. Like, we applied it: Chapter 5 is all about performance measurement. And you guys are both super familiar with the two magic questions, like, what are we trying to achieve with the book? And we actually said, what is our answer to that question? And we write that in the book and say, we wanna arm you with a clear and actionable framework and set of techniques for efficiently and effectively getting more value from your data and analytics. And then, okay, well, how are we going to decide if we’ve done that? And we’re like, there’s lots of ways we could measure that.

0:44:23.3 TW: But we said we should ask people, on a chapter-by-chapter basis, and then for the book overall, we’re going to ask them. So there is analyticstrw.com, as in “analytics the right way,” that’s where the TRW comes from, which has, like, an evaluation form. So at the end of every chapter we say, hey, help us measure the performance. We have a target set, and it’s published: a certain percentage of people saying that they somewhat agree or strongly agree with two statements: the information and ideas presented gave me a new and better way to approach using data, and I expect to apply the information presented to the way I work with data in the next 90 days. So we said we will actually measure, and when you click Submit, you will see how the cumulative respondents to date perform against those targets. Which is kind of terrifying. But it also seems like, well, if we did do a second edition of the book, we should know which the weakest or least impactful chapters were. So there’s a meta point to it: yeah, if you think about it, you really do need to think one level beyond what metrics will be available, to what are we really trying to do and how could we best measure that. So yeah, I’m pretty…

0:45:50.6 VK: And where on the ladder of evidence will that fall?

0:45:54.1 TW: Well, that’s performance measurement. It’s not validating a hypothesis, right? So that’s the… Oh, so that’s… It’s just objectively measuring.

0:46:02.9 JS: Yeah, because we’re just trying to alert ourselves, we need to alert ourselves. The thing that I’m actually worried about, Tim…

0:46:11.6 JH: Oh, here’s a great time to bring it up. I’ve got concerns.

0:46:19.6 JS: I’m like, and I did do some, you can go on this website, right? And it’s like, I’m worried about, like, a botnet coming and just giving us a bunch of poor scores. So look, if you come on the website and you see the scores are really low, a botnet got us.

0:46:42.5 TW: We’ve had some discussion about correcting for that, but, yeah, our assumption is there aren’t going to be foreign actors saying, boy, if we can tank the performance measurement of this book, that’s going to give us a global leg up. But who knows?

0:46:58.8 VK: I have a hypothesis about which chapter is going to score the highest on the actionability over the next 90 days. So maybe we should do our own little back-of-the-napkin target setting, and see how it lines up against real data in the future. Be real meta about it.

0:47:16.2 TW: You could.

0:47:17.4 JS: Actually the other fun thing about the website, if you do go on the website, is we have some merch. We’re actually not trying to make any money off this merch, but it is, I think it’s actually pretty funny. It’s funny stuff. So if you’re a real fan of the book, you can also get merch online. Printed T-shirts, etcetera.

0:47:36.5 VK: I was gonna say I’m getting myself a T-shirt.

0:47:38.7 TW: Yeah, the book-writing process. Joe makes the crack, like, oh, go to this URL. It gets in a footnote. It’s like a wisecrack. So then we’re going through the process, and I’m like, yeah, that’s a nice draft of the site, Joe, but you did put /store in the footnote. And then…

0:47:54.0 TW: That actually is what happened. We never intended to do it. And then we realized it’d be kind of. It’s actually not a bad idea.

0:48:03.8 VK: I love that. Well, we’re gonna have to move to wrap pretty soon, but I guess. Is there any last parting thoughts, Dr. Joe or Tim, that you want to share that you’re really excited about your future readers being able to take away from analytics the right way?

0:48:17.1 JS: Look, I started programming in 1998, okay, in a language called BASIC. I don’t know how many of the readers, or the audience, or the listeners, will even know what BASIC is. But it was a very easy-to-use programming language on Microsoft systems back in the day. And you know, I have seen over the last, how many, 26, 27 years, like, things just seem to get more and more complicated. Maybe I’m just being nostalgic, but everything from the documentation to the methodology, they’ve gotten more complicated. And I think that that’s for no reason in a lot of ways. And I think that really deprives people, like, I think it deprives them of the opportunity to use all these great tools that we have, not because they don’t have access to them. I think that access has improved. But I do think that, like, the self-imposed misunderstanding, or feeling like they don’t understand the complexity of these things, almost is, like, a self-deprivation of all the great tools that we have out there. And so my hope is just that the book kind of reopens that door in really simple and direct terms.

0:49:29.4 TW: And I’m gonna have to go get an advanced degree to self-soothe from the fact that I also started programming in BASIC, on an Apple IIc. But I just did the math. It was in 1985. So Apple…

0:49:47.8 JS: Got me there too. Got me there.

0:49:50.5 TW: Oh my God. Writing with these kids these days, I tell you, was a little rough, because some of the language he was dropping. But I mean, I would die happy if there were people using some of the language in the book and finding it as a way for them to, like, act with more confidence within their organizations. I mean, I fundamentally, deeply believe this stuff is not so complicated that it needs to be treated as a mystical black box that’s so intimidating that I need magical AI to solve it. That there is so much fun and joy and hard creative thinking, and that is, like, the core of using analytics productively. We’re still a few generations away from human creative thought not being kind of at the core of that. So I’m hoping that there are readers who say, I get it now. It’s not a hard thing or a scary thing or a frustrating thing to collaborate with my analyst or to poke around in my dashboard, because I know what I’m trying to do, why I’m trying to do it, and I have ideas, and I can treat those ideas as hypotheses and think about how strong is the evidence I need to validate them. I can feel fine making a decision with very weak evidence, because that’s okay. That’s absolutely okay. What’s not okay is to not realize that’s what you’re doing. So, yeah, I guess I’m passionate about it.

0:51:42.0 VK: I like it. Just a little. All right, so when does the book come out? Where can we find it?

0:51:45.4 JS: It comes out end of January. Was it January 25th?

0:51:49.5 TW: Tomorrow. If somebody’s listening to this… January 22nd. If they’re listening to the podcast the day that it drops, then you can pre-order now and you’re effectively ordering it, because it is available tomorrow on Amazon, Walmart, Target, Barnes and Noble, wherever you get your books.

0:52:09.4 VK: Oh, you fancy.

0:52:11.9 TW: You can go to analyticstrw.com and get links to it. You can go to Wiley.com and order it there. It’ll be out as an ebook a little bit later, and actually it’s coming out as an audiobook in about another month.

0:52:29.4 VK: Huh? What?

0:52:30.5 JS: And so if you want to go and listen to the sweet, sweet stories of data and lull yourself to sleep, or perhaps keep yourself busy in the car, you can do that.

0:52:41.7 TW: And Daniel Craig is reading it. No.

0:52:44.1 VK: Damn.

0:52:47.5 JS: Actually, no. It’s a professionally trained like voice actor. Luckily it was not either of us.

0:52:54.3 TW: Yeah.

0:52:55.1 JS: ‘Cause that would have been difficult.

0:52:57.6 JH: I was hoping it would have been.

0:53:00.9 VK: I would have listened. Well, this has been such a fun little reunion, talking about Analytics the Right Way. Long time coming. Very excited for this. So thank you so much for joining us. Dr. Joe, it’s been a pleasure.

0:53:11.8 JS: My pleasure. Thank you for having me.

0:53:14.1 VK: And Tim, thanks for being here. We’ll throw a thank you to you, too. Even though you’re co-hosting…

0:53:20.2 TW: Somebody had to hit the record button. Yeah.

0:53:24.2 VK: Well, one of the things that we love to do is just to go around the horn and share a last call. Something that we think our listeners might be interested in. So Dr. Joe, you’re our guest. Would you like to share your last call first?

0:53:36.4 JS: So, yeah, related and also unrelated. I went down to, we have a campus, Emory University has a campus in Oxford, Georgia. It’s down in Newton County. And I did a quick presentation to their local chamber, and I asked, how many of you guys feel like you have reasonable facility with artificial intelligence technologies such that you could use them in your business today? And it was a big room. Not one hand went up. And I realized, we’re all talking about it here. This analytics audience, we talk about it all the time. But not everybody has access to these tools. And so we went and raised money and started basically a workforce development outreach tour. And if you’re interested in learning more about it and how to get involved, we also offer certifications in artificial intelligence, etcetera. Just go to aiandyougeorgia.com. I know this is a national audience, but aiandyougeorgia.com.

0:54:32.9 TW: International.

0:54:34.2 VK: Global.

0:54:35.6 JS: Global, sorry, it’s a global audience.

0:54:37.4 TW: This is Georgia, the US Georgia. Not the country, this is the state in the US just to…

0:54:42.8 JH: Clarify.

0:54:45.9 VK: Love it. That’s a good one. All right, Julie, how about you? What’s your last call?

0:54:49.2 JH: My last call is actually a tip I got from our previous, or for some of us current, co-worker Ricky Messick. One of our faves. He was telling me, well, because I was sharing with him that I struggle to make it all the way through listening to self-help books. I was like, sometimes I just want them to get to the freaking point. They say it so many ways to fill up pages. We’ve talked about this before. It’s one of my shortcomings. I just cannot finish them. So he told me that he does this thing where he just puts the playback speed close to 2, like 2x or maybe more, ‘cause he found that when it’s going really fast, you actually have to stay more focused on what they’re saying. And you will, like, retain and take in the information instead of letting your mind wander. And he’s like, and then you get through the book faster. So I have been trying that slowly. I’m not up to as fast as he listens to it, but I think it works.

0:55:45.3 TW: We’ll tell ourselves that for all of our listeners who listen to us on 1.5 or 2x, it’s because they really wanna focus on it. They wanna focus on the content of the show, not ‘cause they just wanna get through it. I’m gonna tell myself that.

0:56:00.6 VK: Okay. But the fact that they would listen to it all still says something.

0:56:04.0 JS: One of our former co-workers, actually, one time I missed a meeting and they recorded it. It was like a four-hour meeting. And, you know, our co-worker goes, no, no, just go back and watch it. I said, oh, should I bill four hours to watch the four-hour meeting? She goes, no, just watch it at double speed. And so then I think, well, if I watch it at half speed, do I get to bill 8?

0:56:27.9 VK: Nice. That’s good, that’s good. All right, Tim, you got a last call for us?

0:56:37.4 TW: I do. It’s trivial, but I’m a sucker for getting a random data set and pursuing it a little too far. This was a while back. I got it out of, I think, Philip Bump’s How to Read This Chart newsletter. But it’s a guy named Colin Morris, and he did this kind of deep dive. It’s called Compound Pejoratives on Reddit, from buttface to wank muffin, wankpuffin. And he basically took compounds, so think like dumbass or scumbag, where you have the two words, and he went and kind of managed to pull, I don’t know, like 20 of the front halves and 20 of the back halves. And then he started with just a little heat map of, like, what’s the most common occurrence, like, dumbasses is the most common occurrence, and you’ve got ones that, you know, are like a lib hat, like, that’s not really used, or a wink sucker. So you start to see ones where you’re like, oh, you could use that, but it almost never shows up. And then you’re like, well, that’s cool. But then he wound up going deeper and deeper as to, like, well, which prefixes have the most suffixes applied to them? Which suffixes have the most prefixes applied to them? So it’s quite a bit of a dive, and it’s really just entertaining. There’s nothing you can do with it other than come up with, like, oh, you’re a buttload. So you wind up, you can’t help it, like, coming up with pejoratives. You’re like, somebody said it. Yeah.

0:58:17.1 VK: This is the perfect last call for the explicit rating.

0:58:20.9 TW: Yes.

0:58:21.5 VK: Analytics podcast. I love it.

0:58:23.0 TW: We had to get it there. What about you, Val? What’s your last call?

0:58:26.0 JS: Yeah, did we just get rated high? Is this like an R rated podcast now?

0:58:30.1 TW: Oh, it’s always.

0:58:30.7 VK: It always was.

0:58:32.4 TW: It always is. Yep. I steered clear of some of the specifically R-plus-rated ones, but they’re there. What about you, Val? What’s your last call?

0:58:46.2 VK: So mine, I wanted to keep it in the family. A Search Discovery alum/some current family. This is actually a podcast from Experiment Nation that Nick Murphy was a guest on in the summer of 2024, and it was all about building a learning library. And I’ve actually been sitting on this last call for a long time, so I’m really excited to share this one. If you don’t know Nick, he’s been a consultant for a couple years at Further, but he was an in-house practitioner before that, and he’s incredibly pragmatic with his approach to consulting and helping organizations think about the power of experimentation, and such a joy to work with him and his beautiful brain. But in this, he kind of walks through, kind of, like, a base model for how you would think about a repository of learnings. Because as we all know, that’s the value, that’s the reason you experiment, right, to get smarter and make better decisions, as we’ve touched upon today. So this is a way to make it something that everyone in your organization can access and query and search, so it doesn’t just live in a PowerPoint presentation on someone’s drive. But yeah, he talks about how CROs are often thought of as the “numbers go up” wizards, which I nearly did a spit take on when he said that. That was so, so funny. But it’s a really great discussion, and I definitely walked away with some good tidbits and some sound bites that I can share with my clients. So definitely recommend that one.

1:00:12.4 TW: Awesome.

1:00:13.1 VK: Woo.

1:00:13.5 JH: Go, Nick.

1:00:17.0 VK: All right, so this has been an awesome discussion. I’m so thankful that we were able to dive into Analytics the Right Way with both authors on our episode today for the groundbreaking launch of the book. But no show would be complete if we didn’t throw a huge shout-out to Josh Crowhurst, our producer, who does a lot of that work behind the scenes. So thank you, Josh. And as always, listeners, we would love to hear from you. You can find us in a couple different places: the Measure Chat Slack group, our LinkedIn page. You can also shoot us an email at contact@analyticshour.io, or, if you’ve been listening the past couple episodes, you will know that you can visit us in the comment section of our YouTube channel, so it’s another place you can grab and listen to this episode. So feel free to reach out. We’d love to hear from you. With that, I know I can speak for all of my co-hosts, Julie and Tim, when I say, no matter what step of the ladder of evidence you are on, keep analyzing.

1:01:18.6 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions. On Twitter at @AnalyticsHour, on the web at AnalyticsHour.io, our LinkedIn group and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

1:01:36.5 Charles Barkley: So smart guys wanted to fit in. So they made up a term called analytics. Analytics don’t work.

1:01:43.1 Speaker 7: Do the analytics say go for it no matter who’s going for it. So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.

1:01:56.9 JS: So I’m officially… I’ve been on now three times. This is my third time on the show.

1:02:05.2 TW: Wait. We talked about natural language processing.

1:02:08.7 JS: That’s right. NLP attribution without cookies. And this is the third one.

1:02:15.1 VK: Ding, ding, ding.

1:02:16.5 JH: I’ll reach out to Katie Bauer. If you do five, you get the jacket, like SNL. So…

1:02:22.0 JS: Well, you know what? You should tell them, the audience, you should say, Joe’s in the running for the SNL jacket.

1:02:31.3 VK: That’d be fun to design a jacket, though. An APH jacket.

1:02:35.8 JS: I would wear it everywhere, to the detriment of my children and wife. I actually was gonna send one to Goose, because Goose, like, sort of brought me and Tim together, like, in life. And so I just feel like, I wanted to just send him one, just be like, you know what? I don’t know, what is the protocol on that? If you sign a book and then gift it to somebody, is it, like, kind of juicy?

1:03:08.0 VK: You have to just make sure you do, like, a little red lipstick kiss by it?

1:03:10.3 JS: XO.

1:03:17.3 VK: Tim’s, like, unamused.

1:03:19.3 TW: Yeah, well, it’s more that, like, as we were writing the acknowledgements, Joe’s the one who thought to call out Goose. That was, I think, under his. We then figured out we could have, like, a joint acknowledgement section. So that was a good catch, good co-author stuff. I mean, now, he gushes a lot about Sarah in his acknowledgements. Julie gets no mention in my acknowledgements but…

1:03:49.8 VK: But you dedicated it to her.

1:03:53.3 JH: Yeah.

1:03:55.8 JS: It’s not a competition, Tim. You don’t have to love your wife more than I love mine.

1:04:07.5 VK: Julie, I’m kind of disappointed that you don’t have a little bit more empathy for me that I have to do this opening. You’re like, yeah, I saw. What about it?

1:04:19.4 TW: Yeah. Joe, this is a first. This is a first for Val.

1:04:22.7 JS: I suffered, now you will suffer. This is Val’s first time doing the intro.

1:04:28.2 JH: Yes. And the closing.

1:04:30.8 JS: Oh, yeah.

1:04:31.4 VK: No, it is big. It really is big. I didn’t give enough appreciation to that, because I would not be mentally prepared for it. So I do greatly feel it. My first thought when I wake up, first thought before I go to bed: still not ready.

1:04:45.1 TW: I’m literally sitting here looking at screens of three people who all rise to the occasion and come across so much more polished and coherent than I do in any situation. I’m feeling great about you opening.

1:05:01.9 VK: Okay, well, I’ll have to borrow some of your confidence, like I said before, but are we feeling ready to start?

1:05:09.4 TW: Let’s do it.

1:05:12.9 VK: Power pose. Power pose. My favorite is when you said that and you leaned away. It was like…

1:05:20.2 TW: Yeah, like the microphone.

1:05:22.0 JH: Power pose. Power pose. Power pose.

1:05:27.1 VK: I was definitely, if Tim had the 5, 4, 3, 2, 1 countdown on, I was just gonna start, like, no matter who was talking. I was gonna be like, hey, everyone, and welcome to the Analytics Power Hour. My name is Val Kroll, and I definitely didn’t just take one, but two nervous dumps before I got on this episode tonight.

1:05:47.7 JS: Josh? It’s out there.

1:06:02.9 JH: Okay. Rock flag and you can’t eliminate uncertainty.
