
It’s the most…won…derful…tiiiiime…of the year! And by that, we mean it’s the time of the year when we sit back, look at each other, and ask, “Where did all the time go?!” We brought back a very special someone for this episode as we collectively reflected on the year—show highlights (and what about those shows have stuck with us), industry reflections, and a little shameless shilling for Tim’s book (are you still short on a few stocking stuffers? Order now…!).
This episode’s Measurement Bite from show sponsor Recast is a brief explanation of Granger causality (and how it’s NOT actually a causal measure!) from Michael Kaminsky!
Photo by Vladyslav Tobolenko on Unsplash
00:00:05.76 [Announcer]: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.
00:00:15.19 [Michael Helbling]: Hey everybody, welcome. It’s the Analytics Power Hour and this is episode 287. Ho, ho, ho, holy shit. Another year is basically over. 2025, I mean, it never even had a chance to slow down and decompress, it feels like. I mean, we’re just racing from one beat to the next, figuring out AI, doing our work, trying to do everything. But regardless, we’re going to try to take a look back and maybe a small peek forward. That’s the Analytics Power Hour year in review episode. And so with no further ado, it’s time to introduce my awesome co-hosts. Moe Kiss, Director of Data Science for Marketing at Canva. How you going?
00:01:00.54 [Moe Kiss]: I’m going pretty good. But yeah, 2025, that was a time.
00:01:04.37 [Michael Helbling]: It felt fast.
00:01:06.67 [Moe Kiss]: Big year.
00:01:07.88 [Michael Helbling]: Big year, I agree. Tim Wilson, Head of Solutions at facts & feelings. Do you agree? Hello. Hello. Hello. Hello. Hello. Quite a year. Yeah. Val Kroll, Head of Delivery at facts & feelings. How’s your year going? Gone.
00:01:27.76 [Val Kroll]: Lots of feelings. There were lots of feelings.
00:01:29.79 [Michael Helbling]: Yeah. I agree. And of course, we are missing Julie Hoyer as she enjoys some time off with her new baby. And so we look forward to her coming back next year.
00:01:42.29 [Tim Wilson]: So her year is going sleep deprived, right? Yeah, that’s right.
00:01:48.33 [Michael Helbling]: And of course, as a special treat, we’ve got Josh Crowhurst, Growth Marketing Director at Immanuel Life as our special guest this episode. Welcome back, Josh.
00:02:01.21 [Josh Crowhurst]: Hey, yes, great to be here.
00:02:02.81 [Michael Helbling]: You know, I don’t know Josh if our listeners actually, many of them know the story of how you became involved with the podcast in the first place. So if you don’t mind, I’d like to take a second and just tell people how that happened.
00:02:20.66 [Tim Wilson]: I thought it was gonna be like the 2025 and how you stormed away. Like, keep it as the year in review.
00:02:24.95 [Michael Helbling]: Well, I mean, it’s part of the year in review that Josh finally had to step back from his role with the podcast. So we’re actually really glad that you did rejoin for this one last episode for year in review, which is our tradition. And, you know, if you’re up for it, come back next year. We don’t care, but yeah, Josh stopped being involved. Nicely put, Michael. No, I’m just saying, it’ll be fun. It’s not pressure, it’s up to you. You’ve got a lot going on in life. But no, early 2019, Tim and I were working out how to make the show better, and we thought we needed some help. And so we put out a call for a producer. It was a poorly written job description, one that we did not fully understand. And then... Did Tim fully understand it, just to be clear? Well, in terms of what it would take to do and what we were looking for and all those things, it was just very much a shot in the dark. To our surprise and delight, we got a response from Josh Crowhurst. And after chatting with him a few months later, because I forgot about the email and didn’t look at it for a while, Josh joined the show as our producer and was with us for, I believe, six years, which is incredible. It’s so amazing. And so now that life has taken Josh in a new direction and he’s growing, he’s obviously stepping into bigger and bigger roles. And it’s so cool to see how your life and career has just flourished. And I like to think maybe... I mean, I don’t think that, I don’t know. Anyways. You hadn’t even done audio production at all, yeah.
00:04:10.21 [Josh Crowhurst]: I couldn’t even… All thanks to you, Helbs. All thanks to you.
00:04:14.12 [Michael Helbling]: No, not me personally. Just that the Analytics Power Hour generally benefited your career in some way. I’d love to think that, but probably it did. Absolutely. Anyway, we appreciate it and we’re happy that you are able to join us for this episode. Okay, what we do on all these year in review episodes is we like to look back at the year that just went past. We did a lot of shows. We did a lot of interesting shows. We like to talk about some of them, highlight some of our favorite episodes, maybe chat about some of the things that happened this year. So who wants to kick us off? What’s an episode that really stands out for you?
00:04:49.04 [Val Kroll]: Well, obviously we started off our year strong. No show would be complete without Tim Wilson kicking off our year with the announcement of Analytics the Right Way, episode 263. So that was a big... we were all so excited to see that come to life. And it was super fun to be a part of that episode, since I had the pleasure of working with Dr. Joe Sutherland. And that was just a really fun, big moment, diving into all of the big themes of the book. But that was the first one of the year, wasn’t it? It feels like it wasn’t, for a second. Yeah, starting strong. That was a good one.
00:05:32.77 [Moe Kiss]: I still have FOMO about missing that one.
00:05:36.19 [Val Kroll]: We did fight over who always got to be on it. Yeah.
00:05:40.08 [Tim Wilson]: Well, as other people have released books this year, I realized what kind of a shit job I’ve done of ongoing, rolling thunder, you know, promotion of the book. But I was all in for the writing of the book, and I figured it was going to be downhill from there. Once I showed up on the Analytics Power Hour as a guest, why would there need to be any other promotion? The old APH bump, we like to call it. Yep. Clearly. Dozens, dozens of books flew off the shelf.
00:06:09.47 [Moe Kiss]: I have bought six alone, so I am definitely helping the supply go out the door.
00:06:15.08 [Val Kroll]: Were those some of your stocking stuffers, Moe, for friends and family?
00:06:21.41 [Tim Wilson]: Folks, it’s not too late. If you’re listening now, you can… That’s right.
00:06:28.07 [Michael Helbling]: for that special someone in your life. The e-book version.
00:06:35.09 [Val Kroll]: Use code APHBump for 10% off.
00:06:39.76 [Michael Helbling]: Don’t say that. Oh, man. Well, I’m glad there’s no other episodes to talk about. Yeah, that was really the one. Let’s talk about that one more.
00:06:54.49 [Michael Helbling]: That was the one. All right. Listen, I have an episode. Here’s the thing. Okay. When we do this podcast, and this is a thing I do with a lot of things, when I interview people, when I work with people, when I talk to people, I’m always looking for where their passion lies, what sparks them, kind of what makes their eyes light up. And one of our episodes that I really enjoyed, and it was a person I’d wanted to get on the show for a long time, was Dan McCarthy, with whom we did episode 272 about calculated and complex metrics. It was a really fun conversation, and Dan is so smart and so amazing in his role as a professor studying these companies and the metrics they produce, especially for public reporting, for stock reporting purposes. But what was amazing was the passion he has for these topics through music. He has a SoundCloud with all these songs on it. And it was sort of after the show was over that he kind of started in on it. But that was where I saw the switch kind of flip into, this is fun, and a little bit of light in his eyes about that kind of thing. And I’m sure obviously he enjoys his other work too. But it was just really cool to connect, in the coolest way possible, with another data nerd about things they loved about their work and about data. Anyway, that was just a moment that kind of stood out to me. As far as being a really educational and fun episode, it was just so cool to watch somebody’s eyes light up about things they were passionate about.
00:08:31.86 [Moe Kiss]: I learned so much on that episode, and even probably a week ago I sent it to someone to have a listen. The number of times I get questions about LTV to CAC and why finance and public companies are so interested in that specific metric and how it’s calculated. I’m just like, here is a show that I prepared earlier. Please peruse at your leisure. And I just loved how he did such a wonderful job of really getting into, I guess, the different perspectives and the complexities that we sometimes face as data folks in a metric that on its surface might seem really simple and obvious, but actually can really change a business decision or a perspective of a business by how it’s calculated and how it’s interpreted. And also just to say, his SoundCloud... the number of data show and tells that I’ve opened with one of those songs, and people are always like, Moe, where do you get these data songs? I’m like, blah. I know people. I know people. So yeah, I definitely had that in my top couple of episodes list as well.
00:09:46.52 [Tim Wilson]: Well, that was my finding him. So I now see more of his stuff. And he made the point on that episode, and then he kind of continues to make it, that when companies stop reporting stuff, it’s not usually for… Sometimes that’s informative. Yeah, that in and of itself. And there’s some kind of hand-waving as to why. And he’s like, but another way to look at it would be, here’s this thing I wrote two years ago that indicated this might be problematic. So yeah, he was a fun one.
00:10:21.38 [Josh Crowhurst]: So on the topic of things that people are passionate about, I think one of the episodes that I absolutely loved, and maybe it’s a bit in line with something that I’m really passionate about, was number 282, Using and Creating Data to Understand Pop Culture with Chris Dalla Riva. For me, this was honestly probably my favorite episode ever, because it’s so right up my alley, it’s in my backyard. He’s talking about looking up writing credits and production credits on songs and tracking that, and this is something that I just do impulsively. I’m always annoying my friends with pointless surprising facts about songs. Like, did you know Bruno Mars co-wrote Forget You, or, I don’t know, Mark Ronson produced and wrote that song from A Star Is Born? Just shit like that. I’m always looking behind and asking, who’s involved in that song? And the idea that there are just people behind the scenes that maybe don’t have mainstream name recognition in a lot of cases, but have really shaped what you’re hearing on the radio or on Spotify, sometimes for decades. And so, yeah, Chris talks about tracking that and having that in a data set. And I wish I could get my hands on that data because I would absolutely just be poring over it. Oh, it’s there. It’s on the show page.
00:11:52.90 [Tim Wilson]: You can. It’s on the show notes page. Oh, my God.
00:11:56.80 [Josh Crowhurst]: Yeah, we found out. I’m out. Yeah. Okay.
00:12:03.33 [Michael Helbling]: I’m diving in. Josh “Liner Notes” Crowhurst.
00:12:10.54 [Tim Wilson]: And Michael, you really enjoyed recording that show. Is that, is that right, Michael?
00:12:20.45 [Michael Helbling]: You know what? Thank you so much, Tim, for bringing up a sore point. I just find it hilarious, after 11 years of you basically being like, I don’t know anything about pop culture, that you record that episode instead of me. Like, come on.
00:12:35.15 [Tim Wilson]: I read his newsletter. No, that was fair, fair. Anyways, it was.
00:12:39.71 [Val Kroll]: We’re gonna have to rename the show, Year in Review and Airing of Grievances.
00:12:44.32 [Michael Helbling]: That’s right. It’s the Festivus Airing of Grievances.
00:12:48.97 [Tim Wilson]: Which, Chris’s book is now out. It was not out when we recorded the episode, but it is now. So also, if you love someone so much that you want to get them Analytics the Right Way and a second book, Uncharted Territory is now available at booksellers near you. Still available by Boxing Day, probably.
00:13:15.89 [Moe Kiss]: Do you guys have Boxing Day?
00:13:17.75 [Michael Helbling]: No, but it’s the day after Christmas, so you have one more day, so maybe it’ll ship in time.
00:13:23.72 [Tim Wilson]: I don’t know. Everybody has some pretentious neighbor who celebrates Boxing Day, so they can explain to you what it is. Boxing Day is awesome.
00:13:30.31 [Moe Kiss]: You have leftover food and none of the pressure of Christmas Day.
00:13:33.83 [Tim Wilson]: Right. Now, imagine that coming out of an American who’s just explaining how sophisticated they are.
00:13:39.56 [Michael Helbling]: Well, obviously, with these book recommendations, I would think you’d be talking about Jólabókaflóð. So maybe that’s the holiday. What? Not familiar. Sorry. It’s an Icelandic holiday where you read books right before Christmas. So there you go.
00:13:56.83 [Moe Kiss]: I was about to say, should I, like, pivot us in a totally different direction and talk about the elephant in the room?
00:14:02.22 [Michael Helbling]: Oh, yeah. I mean… What?
00:14:06.41 [Moe Kiss]: How many episodes do you reckon AI came up in? Oh, damn it. I should have actually been prepared. Hold on. And, like, checked the transcripts or some shit. That would have been a good idea.
00:14:16.95 [Val Kroll]: Yeah, use your librarian thing, Michael.
00:14:20.26 [Michael Helbling]: Yeah, well, we don’t have every episode uploaded yet. So it’s still a work in progress. But thank you, Val, for bringing that up, because it’s an AI project that Tim and I are working on. But I’ve got to say, Moe, it probably came up in probably 75% of our episodes.
00:14:37.03 [Moe Kiss]: You reckon 75%? Everyone put in a guess. I would say maybe higher.
00:14:41.97 [Tim Wilson]: No, I think I’d go 70. I mean, I’m counting.
00:14:48.92 [Val Kroll]: Between whether it was a topic or it just came up. If it just came up, or last calls.
00:14:55.63 [Josh Crowhurst]: Do last calls count? They do in my head.
00:14:57.83 [Val Kroll]: That’s why I got to my number.
00:15:00.86 [Michael Helbling]: I mean, there’s at least 10 episodes that have AI in the title. I’m going to say 90%.
00:15:07.19 [Moe Kiss]: Yeah, it was a lot. Let’s leave everyone hanging and we can report back at a future date.
00:15:12.81 [Michael Helbling]: That’s right. Guess how many jelly beans are in the AI jar?
00:15:17.87 [Tim Wilson]: So I’d like to go on record that I did not commit to it being reported out at some future date. So I think the likelihood of that happening is...
00:15:25.73 [Val Kroll]: If any of our listeners want to figure it out, sound off in the comments.
00:15:30.94 [Michael Helbling]: If only we had a producer who could go back through. You know, Tim, as we clink champagne glasses on another successful year of the podcast, I think our listeners would agree that you and I almost always agree on things.
00:15:50.15 [Tim Wilson]: What? Absolutely not. I spend half or most of my time on this show, I think, just correcting your misguided thinking.
00:15:57.95 [Michael Helbling]: Well, agree to disagree. But there is one thing we both agree on. AI is starting to reshape our industry. And I think we both call bullshit on nonsense like vibe analytics. Absolutely fucking right. But here’s the flip side. Analysts do have to start using AI. Leveraging LLMs to multiply your capabilities isn’t just interesting anymore. It’s going to be table stakes in 2026.
00:16:22.30 [Tim Wilson]: Which is why I’m actually excited about our new sponsor, Ask Why. Yes, it’s an AI tool, but it’s one where analysts can do real work. And critically, Ask Why is smart about data privacy. They do not send your raw data to the LLM. Right.
00:16:38.86 [Michael Helbling]: Ask Why builds a semantic layer on top of your data and then uses that to generate SQL that answers your questions or helps you build reports on your own data set. It’s currently in beta and it’s evolving fast, but you get the upside of AI and the assurance that your data stays secure. You can actually start leveling up into being an AI analyst, starting with Ask Why.
00:17:00.00 [Tim Wilson]: For a limited time, use the code APH when you join the waitlist, and our friends at Ask Why will move you right to the top of that list. The site is ask-y.ai.
00:17:11.86 [Michael Helbling]: That’s ask-y.ai. So go sign up for the waitlist using code APH.
00:17:19.61 [Tim Wilson]: This isn’t Vibe Analytics. This is the rise of the AI analyst.
00:17:23.82 [Michael Helbling]: All right, let’s get back to the show. Yeah, it is interesting because it certainly, I mean, Moe, I think the point you’re making is like AI was everywhere and always here all year long in 2025. And it seemed to grow in speed and pace throughout the year.
00:17:44.46 [Val Kroll]: Yeah, definitely a topic that came up in the listener survey is people wanting it covered, wanting some topics covered there. So I think that crept into our schedule and informed it.
00:17:56.23 [Tim Wilson]: And in my other hat as the fielder of the inbound pitches for show topics, I can certainly say that that percentage was definitely north of 75%. But is it fair to say, and maybe this is my normally optimistic self that you guys are so familiar with, that at the start of the year the ratio of AI hype and excitement to the, wait a minute guys, it’s not going to be everything, specifically in the world of data and analytics, was north of 90%, and that it’s slowly gotten a little bit more in balance, just as the conversation in the zeitgeist around what AI can and can’t do has, as people have gotten their hands on it and realized the limitations? Or is that just me?
00:18:56.78 [Moe Kiss]: I think that’s fair.
00:18:58.12 [Michael Helbling]: It has come back a little. I still think we’re a little out over our skis, though, in terms of AI. I mean, just AI in general. A lot of people think we’re in a bubble. By the time this comes out, hopefully the stock market hasn’t crashed or anything, but that’s always a thing that people are talking about. It’s like, oh, is this all a bubble? Like the dot-com boom and bust, kind of an idea.
00:19:23.90 [Val Kroll]: I think it’s like with any trendy thing. It’s cool to think of all the use cases and all the potential, and then the cool thing is to be like, but it can’t do this, it can’t do that. So I feel like we’re in that phase on LinkedIn. I just get so tired, you know?
00:19:39.51 [Michael Helbling]: It’s like the 50th time you hear “Not Like Us.” And you’re like, no.
00:19:46.52 [Tim Wilson]: I did see a thing where somebody- That was good, Michael.
00:19:49.97 [Val Kroll]: That was good, Michael.
00:19:54.66 [Tim Wilson]: I read a piece that was saying that instead of a bubble, think of it as a forest fire, which actually has a lot of bubble tendencies. But it talks about, even if you go back to the original 2000 internet bubble, it was pointing out that the bubble burst and it’s not like you’re back where you started. There are players that were sufficiently hardy and actually had a plan, and they were like the big trees that managed to weather it. And they’re like, yeah, Google, Apple, Microsoft, they’re not going anywhere if the bubble bursts. And then it talked about the ones that are basically just a thin veneer of crap, and those are just going to disappear. But also, when that correction comes, there will be a smarter universe out there, and there will be little green shoots that crop up once all that sort of gets cleared out. It seemed like a useful metaphor. Involved metaphor. But I also find it’s crazy just having conversations with normies, where the person who doesn’t have some responsibility to figure it out, how much there isn’t real depth of thought. I had a friend say, I just use ChatGPT instead of Google search now. And I was like, I don’t have the energy to say you could just use Google search and it would be Gemini, like, if you just want plain text results. And that’s kind of the extent of what they’re doing. Although I could also go on some rants as well.
00:21:44.55 [Michael Helbling]: You know, it was interesting to me this year when I would go to different events, like conferences or things like that, and see the pace. I remember going to MeasureCamp New York in the spring, and of course everyone was talking about AI this, AI that, and it was all kind of like, wow, look at all this cool stuff. And then literally from then to the fall and MeasureCamp Chicago, I felt like we’d already gone through a maturity curve almost with the way we’re discussing AI and some of its use cases. It just seems like we’re blasting through the cycle really fast, it feels like sometimes. Some places there’s still quite a bit of hype, but I do think some people are getting their feet on the ground and starting to use it for actual things and starting to understand how to leverage it, or how to think through use cases effectively.
00:22:39.09 [Moe Kiss]: So that was literally the thing that has been on my mind when I was looking at the episodes that were my favorite. It’s probably recency bias, but they were definitely the ones towards the end of the year. Well, I suppose they weren’t all at the end of the year, but like the semantic layer episode, I thought the topics on BI with Colin were really good, and then I also loved the one on Bayesian stats with Michael Kaminsky. But part of me wondered... I just felt like there was this return to us discussing, I want to say, quote unquote, the basics, but it’s not basics. It’s the fundamentals of data stuff. And is the reason we were discussing that because everyone’s trying to go so fast on AI? There was this, not reckoning, but acknowledgement, that to do that well... I don’t want to do the usual shit of bad data in, bad data out, blah, blah, blah, that sort of crap. But I feel like I’ve been giving a lot of thought and energy, and I feel like folks in the industry are too, to the quality and how we do things well and how we measure if the output is good. And that, by its nature, means we have to have more sophisticated conversations about fundamental data concepts. And I felt like there was a return to that. And maybe that’s similar to what you’re talking about, Michael, where there was kind of a bit of a rush and then people are having more sophisticated discussions, is probably a good summary.
00:24:09.18 [Michael Helbling]: Yeah. No, I like that framing because I think that’s exactly right. The early thing I saw was, well, your own expertise drives results in AI all the time. But if you go down to some brass tacks about how to conduct analysis, how to think about data lineage, how to think about traceability, all the things that we are taught as analysts to be able to compose an analysis correctly, follow it through correctly, and deliver out the other side, those are all steps we learned as analysts. And so AI is a part of that process now, but we still have to maintain all of those parts along the way, it feels like. Does that... I don’t know. And maybe AI will get so good it can do all those steps for us at some point, but I just don’t think there’s going to be, any time in the near future, an appropriate black-box approach to analysis. Which, don’t get me started on the topic of vibe analytics, which is the stupidest thing I’ve ever heard of in my life.
00:25:16.94 [Moe Kiss]: Well, I think we need to do a spin-off episode on that because I disagree.
00:25:20.29 [Michael Helbling]: Well, it’s probably definitional. Semantically, we’re probably in agreement, but yeah, we can probably do a whole show on it. Well, what other...
00:25:32.08 [Josh Crowhurst]: This is something that I’ve also noticed, and I think it’s kind of related: using AI really drives home to me that you really have to, especially as a manager, you need to have your critical thinking skills switched on. Because things will start to come up produced by, I mean, especially people more junior in their careers who are, I guess, more AI native, who will be using this and might at times skip some of the steps in producing an analysis, and they’ll come up with something that sounds really logical. But maybe, you know, they had a conclusion in mind that they punched into ChatGPT and worked backwards at arriving on some logic to present an idea that maybe hasn’t been fully thought through. So this is something where I think we have to be super, super aware, right? That there’s a lot of, I guess, convincing-sounding bullshit, where if you go one layer deeper, the thinking just isn’t there. So coming back to the idea of having the fundamentals, but also just being aware that, you know, this is around us all the time, and trying to really focus on, is the logic sound?
00:27:06.46 [Tim Wilson]: When it gets used as, this is something that I don’t enjoy doing... AI gets put out there as being, oh, the grunt and tedious work that you do, AI can do that. Now, I think that’s an overinflation. Like, how many people are literally sitting there saying, I do monotonous, tedious, repetitive work day in and day out, and no one has come out with a way to streamline that? So this monotonous, tedious work gets conflated with, this is work that I don’t really enjoy or I have to kind of think about. I hate summarizing meetings that are all over the place. Oh, look, Zoom will just record and summarize for me. And it’s like, well, you may hate doing that, but you’re missing what sort of value you should be adding along the way. And I think the same thing goes for analysis. If you think that the goal is to get a slide deck produced that looks plausible, then you’re missing what analysis is. There is stuff that is supposed to be hard and that you are having to think through with that structure as you go.
00:28:22.21 [Michael Helbling]: Yeah, I want to step aside for a quick second and take a quick break with our friend Michael Kaminsky from Recast, the media mix modeling and geo-lift platform helping teams forecast accurately and make better decisions. Michael’s sharing bite-sized marketing science lessons over the coming months to help you measure smarter. Over to you, Michael.
00:28:45.82 [Michael Kaminsky (Recast)]: Granger causality might be the worst-named concept in analytics. What you need to know is that Granger causality does not demonstrate causality. Just because some variable passes a Granger check does not mean that it causes some other variable. What Granger causality actually shows is predictive ability. Effectively, the check is looking to see if past values of x can predict y better than past values of y alone. As an example, let’s imagine we have two time series. One is the time that a rooster crows every morning, and the second is the time of the sunrise. By just eyeballing the data, we can see that the rooster crows consistently a bit before sunrise. Yet a Granger causality test would conclude that rooster crows Granger-cause the sun to come up every morning. The problem is really in the name. It confuses analysts, and especially business stakeholders, who understandably assume that a Granger causality test actually checks for causality. Here’s what to remember: Granger causality only tests whether one variable precedes and helps predict another. It says nothing about whether one actually causes the other.
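For anyone who wants to poke at that rooster-and-sunrise example themselves, here’s a minimal sketch (not from the episode) using Python and statsmodels’ grangercausalitytests. The simulated series and variable names are our own illustration: neither series causes the other, both just track the season, yet lagged rooster times can still “Granger-cause” sunrise times in the test’s sense of adding predictive power.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 365
day = np.arange(n)

# "True" sunrise time (minutes after midnight) drifts smoothly across the year.
true_sunrise = 360 + 60 * np.sin(2 * np.pi * day / 365)

# Observed sunrise is measured with some noise; the rooster tracks the true
# sunrise closely and crows ~15 minutes earlier. Neither series causes the other.
sunrise = true_sunrise + rng.normal(0, 5, n)
rooster = true_sunrise - 15 + rng.normal(0, 1, n)

df = pd.DataFrame({"sunrise": sunrise, "rooster": rooster})

# grangercausalitytests asks: do lagged values of the SECOND column ("rooster")
# improve prediction of the FIRST column ("sunrise") beyond sunrise's own lags?
# A small p-value only means "rooster is predictive", not "rooster causes sunrise".
results = grangercausalitytests(df[["sunrise", "rooster"]], maxlag=3)
```

The call prints F-test p-values for each lag and returns them in a dict; that printout is the “Granger check” Michael describes, and passing it tells you about lead-lag prediction, nothing more.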
00:29:45.81 [Michael Helbling]: Thanks, Michael. And for those who haven’t heard, our friends at Recast just launched their new incrementality testing platform, GeoLift by Recast. It’s a simple, powerful way for marketing and data teams to measure the true impact of their advertising spend. And even better, you can use it completely free for six months. Just visit getrecast.com slash geolift to start your trial today. Okay, well, let’s talk about shows we liked that maybe didn’t touch, or didn’t touch fully, on AI. What are some topics we liked this year that weren’t necessarily in the AI wheelhouse? And Moe, this is kind of coming off of you talking about this fundamentals idea.
00:30:32.31 [Val Kroll]: One of the ones that I had FOMO for not being on was the ANOVA? I Hardly Know Ya episode with Chelsea. Oh, that was so good. That one was so good. I mean, she’s just a joy. But I don’t know if you guys remember, one of the things that you guys started with on the episode is that she had a poem, a pre-ChatGPT-times Twitter feed poem about ANOVA, which I loved. But she was just so thoughtful in the way that she was describing and getting into all the inner workings and the comparisons with ANCOVA and MANOVA. She’s like, at the end of the day, it’s linear regression all the way down. And I thought you guys did a really nice job probing with some really good, thoughtful questions from real-life experiences that, I thought, made that episode really good. I’ve definitely listened to that one more than once this year, but that was really fun. It was an easy listen, even though it’s a complex topic.
00:31:28.24 [Tim Wilson]: I’ll throw in episode 268, You Get an Insight, and You Get an Insight, with Chris Kocek, which was, I would say, very not AI, because it was so much about a human being pulling things from different directions. And that wasn’t the first. We had Rod Jacka on years ago, Jacka, Jacka, Chaka, to talk about what is an insight. So I feel like that’s a perpetual question in our industry. And there are certainly a million AI-powered tools that are like, it’ll find insights for you. And to me, that episode... Chris is not an analytics person. He is coming from much more of a creative and messaging and branding background, and getting his perspective on the many, many facets and the inherently human nature of trying to get some deeper understanding about something, I thought it was a pretty nice corrective to the AI hype. I really liked how he defined an insight.
00:32:36.58 [Michael Helbling]: You know, another one of my favorite episodes, and Moe, you mentioned this one as well, was the one with Michael Kaminsky about Bayesian statistics. I think throughout my career, I’ve learned things sort of just by arriving at them, not necessarily being officially trained in them, just because of how I started in analytics and how I kind of grew into the field. And it was this really big light bulb moment to realize, wow, the way that I actually approach this stuff is literally what we talked about in that episode. For the first time it kind of slammed together in my mind, like I finally made the connection: oh, that’s Bayesian statistics. It’s just so funny. Yeah, I knew what that was conceptually, like, oh, it’s your priors, blah, blah, blah. But as a model for actually doing stuff in the real world, I hadn’t really said, oh, I’m Bayesian in the way that I think about that.
00:33:32.67 [Moe Kiss]: It’s funny because I think one of my tendencies, and I always say this to my team, is that I oversimplify things. And I think that’s just part of my role, right? I’m often trying to communicate something really complex to a leadership team. But one of the things that I really loved about that episode is that, in my mind, I think I had maybe oversimplified what I understood about Bayesian stats, and Michael brought a level of new depth to the topic that really added a lot of value for me personally.
00:34:04.86 [Michael Helbling]: Yeah. I really liked it. It actually was super applicable. I was literally sitting down with a client not long after we recorded that, and I was able to walk them through a process they could follow. They were in a situation where a frequentist approach would not have worked well in that context, and I was like, well, here are some other alternatives. We could actually do something like this, and it actually worked really well. But it’s funny because I probably would have still suggested that, but now I could actually call it what it was, as opposed to being like, I’ve got an idea. Try this. It probably has a name. I just don’t know it. Anyway, it was just really cool to connect the dots on that for me this year.
00:34:51.33 [Val Kroll]: All right, one of the other ones that I’ll throw out there, another recent one that we did, was 268, Metrics Layers, Data Dictionaries... Maybe It’s All Semantic Layers, with Cindy Hausen. So I have to admit, full transparency, when we were in our planning for that, I’m like, is that really a whole episode? I don’t know. Okay, I was not on it, so... but holy shit. Yes, it was a whole episode, because it was with Cindy and it was really, really well done. I love that one so much.
00:35:23.20 [Michael Helbling]: Val, Tim and I will both tell you, we’ve gone into certain episodes over the years and been like, I don’t know about this, and it turns out to be amazing. So a lot of times a little bit of doubt is almost an indicator that something good might happen here.
00:35:38.02 [Moe Kiss]: But also, I think the fact is that Cindy herself is such an experienced data practitioner and has such a depth of knowledge about the technologies and the topic we’re talking about. I mean, I could talk about semantic layers for hours, which I have done with Cindy from time to time. But I think that episode was really strong. And yeah, semantic layers are a hot topic at the moment. Lots of folks are building things. There’s a dbt product, a Snowflake product, and a bunch of similar products that are built into BI tools. It’s a very timely episode as well, given how quickly things are moving in the industry. Or maybe, I don’t know, maybe not quickly, because we’re trying to catch up. But Cindy was just such a wonderful guest for that specific episode, and it’s probably one of my favorites as well.
00:36:31.43 [Tim Wilson]: And the fact that she made the point that, one, they’re not new, and two, don’t think of it as one monolithic thing. Those were, like, two big ones for me. Because this has gotten the label of, this is the grand new thing, just roll it out. And the fact is, she is very, very politely really fucking annoyed with the cycle of the latest shiny thing being treated as, this is the thing, the answer. So Josh, you were going to say something.
00:37:06.43 [Josh Crowhurst]: Yeah, a recent one that I particularly enjoyed was 281, Analytics: The View from the Corner Office, with Annalie. Yeah, great episode. And I think we were talking about trying to find the right guest for this idea for... years, maybe? It was a long time. Yeah, that was one I think we were trying to put together for a long time. So when I saw that in my Spotify feed, I was like, oh, I have to listen to this right away. And it was worth the wait, for sure. For me, it really resonated, maybe partly due to some perhaps slightly traumatic recent experiences in my previous company, where I had exposure to senior leadership. A few of the things that she talked about were really sharp. Talking about setting a culture of productive curiosity: I love that term, because I did see it first hand. You’d be in a meeting and the CEO would make an offhand comment, and then people would just spend an inordinate amount of time digging into that, whatever the ask was, because, you know, the CEO said it, so I have to do this. It might not be something that’s worth spending hours or days looking into. We would come back in the next meeting and the CEO wouldn’t necessarily even remember making the comment. So I kind of learned to level-set in the meeting before going and saying, hey, we’re going to look into this, this is the amount of time we’re probably going to spend on it, and just get that out there before leaving the room. But what Anna said was, she talks about having a level of precision that’s necessary and sufficient for the importance of the decision that’s being made, and then having the self-awareness as a leader to specify that, and save the team some of the bandwidth. So as an analyst, when you’re in there, if it’s not clear, just state it and get it out in the open and get the alignment. But I love Anna’s perspective on taking ownership as a leader: realizing that what you say, people might just take it and run with it and spend a ton of time, and you didn’t necessarily intend it that way. So I loved that framing. And then I just thought it was a really thoughtful perspective on what a data-driven culture can look like and how it can be established and driven from the executive level. And just one last...
00:39:40.45 [Moe Kiss]: The specific bit that really sang to me was how much responsibility she took as the leader for that culture, versus assuming that your data scientists are responsible for the data culture alone. That was one that really stood out.
00:39:59.30 [Josh Crowhurst]: Yeah. No, it made me... I was like, I want to work there. She has such a great way of framing it and thinking about it and communicating her vision on how data can be used and should be used, and then setting the example. It was really inspiring, honestly. And then one last thing that resonated, again going back to my PTSD. But yeah, brief your analysts, right? If you want them to be set up to succeed the first time they’re presenting to the CEO... I’ll say that maybe didn’t happen for me. I might have been pantsed in front of the whole company group executive committee as a result of that. So please, if you manage analysts, please do that. That’s great advice. Prevent any, uh, any traumatic pantsings of your team when they’re in a room with the big dogs.
00:40:53.50 [Val Kroll]: Poor Josh. Yeah. The thing that also struck me about that conversation was, I don’t think she realizes how novel her perspective is. She was like, of course, that’s what leaders do. And I was like, can you say that in some of your circles? Like, where’s the link to your job postings? I think I even said that, Josh. I was like, hopefully your last call is that you’re hiring. This is awesome. But yeah, no, that was a good one.
00:41:23.59 [Michael Helbling]: That was one of my favorites too, Josh, because it was, in a way, so affirming of a thing I’ve really come to believe more and more: that leadership drives data culture more than the data team does, and that the only way to really drive a data-rich or data-informed culture in a company is if the leadership is doing it. Because even when you take on the role as a data leader in your company, you can’t force people to become data-driven. They either are or they aren’t. But if the CEO is saying it, well, that makes it a different thing altogether. But yeah, that was a great episode. And yeah, it was a long time coming. Every year we’d have that on our list of, yeah, we’ve got to find somebody that could do justice to this topic. And as analytics people, we’re always thinking, yeah, what do they think about? When they’re sitting as the CEO, what’s their perspective on data? Do they care? Do they look at these charts and graphs? That’s a question I think our whole audience thinks about. Anyways, Anna was amazing. That was...
00:42:32.29 [Tim Wilson]: Yeah. Years ago, we had someone who agreed and was ready to come on and then ghosted us, like, completely. So it was like, yeah... Oh, I forgot all about that. Talked to him. Talked to him later. It turned out his company was in the midst of, it was about to get acquired. He was like, yeah, I really needed to go dark. I’m like, I don’t know that an email response of, hey, actually this isn’t a great time, would have, you know, been too problematic, but I don’t know. So yeah, I agree.
00:43:02.61 [Michael Helbling]: Well, what trends are shaping the next year, Moe? God, I don’t know.
00:43:12.78 [Moe Kiss]: I think, jeez… I’ve just obviously gone through lots of 2026 planning and thinking about the year ahead. It sounds so boring, but if I had to boil it down to a couple of key things that I’m really thinking about, it is about consistency, and making sure that we have really solid consistency in metric definitions and how metrics are calculated and all those sorts of things. It just sounds boring, but I feel like it’s becoming more important than ever. I think the other thing that I’m spending a lot of time thinking about is... I don’t know, we’re all using AI just for internal efficiency gains and it just feels shit. If you’re using it to write a better email or a Slack message, it doesn’t feel like that is how we can be getting the best from some of these tools. And so I’m thinking a lot more about, specifically, the data products we make and how we can better automate. I’ll give you a specific example, which is going to sound really lame. It’s going to sound stupid and lame, but this is the exact thing. We used to keep a list of dashboards, like your top company dashboards. When someone onboards, you can be like, you want to know about this topic or this topic or this topic, you go here. It’s a manual list, it’s a pain in the ass, it always ends up outdated and not maintained. I was like, that is a problem we should be solving with technology, right? And I think that’s probably why I’m so hyper-focused on consistency and all the fundamentals. Because if you want to throw technology at it: how do we maintain this list without needing someone to go manually update some spreadsheet or whatever it is? How do you understand which of your dashboards are being used, which are high value, which are going to answer the right question? To do that, the data that you’re using to build a technological solution has to be very good quality. But yeah, those are just the things that are on my mind going into 2026. Oh, Tim looks...
00:45:22.80 [Tim Wilson]: Well, I mean, coming back to the fundamentals, there still is... It is so easy to get caught up in, we’re going to keep measure, measure, measure, measure, measure, and the complexity kind of explodes. And Moe, you’re at a very large, massive-amounts-of-data, digital-native company. Even in the last two weeks I have had an experience with a massive company where their issue was much more around internal alignment on what different teams were trying to accomplish, and not the data. Every time the data people would come in, it was just kind of like puking out charts of stuff, and you could see that that wasn’t serving the business. I mean, there were some kind of comical ways in which the data people were very knowledgeable, the visualizations were fine, they could answer questions about the minutia, and that wasn’t remotely what the organization needed. So, and that’s not a direct response, but I cringe a little bit at, well, let’s look at which dashboards people are looking at and which metrics, like that. That, to me, winds up often coming back to, can AI come up with an engineering solution that’s just going to tell me the insight?
00:46:59.48 [Moe Kiss]: Like it’s kind of like, well, let’s just… No, I don’t agree. I don’t agree. I think, fundamentally, you and I are very aligned that it’s about the business question that you’re trying to answer, right? I would say that that’s... I don’t know, I’m getting like a semi-nod. One of the concerns you have is, are people leveraging AI to answer a question that could be answered very easily with something that’s already built? And then it comes down to, this is more about cost efficiency, right? I don’t want someone continually asking a question every day that’s costing us money to run, that is sitting on a dashboard that can be easily looked at and interrogated if they just know where it is. It’s about discoverability to answer that question. And so there are multiple problems that you’re trying to solve, and it might just be another way to answer that business question.
00:47:59.48 [Tim Wilson]: So I would say it’s not... I wish it was a trend of 2025, but I think the reason I was kind of having that reaction to answering business questions goes back to, momentarily mounting the soapbox, the definition that if somebody in the business asks this question, it’s a business question and therefore I need to answer it, how can I answer that efficiently and most effectively? And it becomes a volume play. Whereas if you instead totally shifted it, I think there’s a crap ton of questions that are kind of fishing, that actually point to a much more fundamental challenge. And this does come to the AI companies that are like, imagine if you could just sit there with ChatGPT and just ask it questions and it would provide responses. And then the pushback winds up being, ah, but the answers have hallucinations, or, ah, without this engineering it can’t provide accurate answers. And to me, I’m like, that is not the goal end state, to have people who aren’t thinking rigorously about what they’re trying to do and are prematurely jumping to the data. I deeply, in my soul, believe that that is heading down a path of just getting more people wandering through more data to have more meaningless arguments to produce more overly lengthy PowerPoint or Canva or Google Slides decks that aren’t actually moving the business forward. So it’s actually putting fuel on the fire of something that is broken in business horribly, horribly, horribly.
00:49:45.74 [Michael Helbling]: Activity without outcome, maybe. So Tim, maybe the AI product you want to see built is the one that forces more rigorous questioning by guiding people through that process. So be like, why are you asking that question? Oh, interesting, refine that. OK, you don’t really want that analysis because you wouldn’t want to mistake this for this. So maybe you want this analysis. Like, something like that would be.
00:50:12.63 [Val Kroll]: And then at the end, does it turn into like an intake for like an intake system that goes… Oh my God! No!
00:50:20.33 [Moe Kiss]: You and me, I was telepathically communicating with you being like, it kind of sounds like a Jira intake ticket.
00:50:27.98 [Michael Helbling]: She’s gaslighting me. That actually led into something I was already going to say, which was, at the end of episode 279, The Process of Analytics: We Have Thoughts, we were talking about that. And at the end of that episode, I was like, now, because of AI, all these processes are going to take on even more importance. And Tim jumped down my throat and said it’s always been important. And he wasn’t wrong. But the reality is, to get to leverage AI, you have to do those precursors. To Moe’s point, that return to some of the fundamentals is the trend. Tim was wrong to do that to me on that episode.
00:51:12.03 [Val Kroll]: That’s really the point I’m making. Here’s another poll. Do we think that AI was mentioned more this year or Tim’s blood pressure raising happened more this year?
00:51:27.22 [Tim Wilson]: Wait, what was the first option? What was the first option?
00:51:30.33 [Val Kroll]: Mentions of AI versus Tim’s blood pressure, right?
00:51:32.51 [Tim Wilson]: Oh, blood pressure, yeah. Well, they’re deeply correlated, and there is causation.
00:51:37.01 [Michael Helbling]: Well, at least for Tim’s blood pressure, there’s medications for that. Oh, brother. Well, that’s one trend that will probably continue is that Tim and I will tangle up a couple of times. No, it’s fine.
00:51:52.69 [Moe Kiss]: So I’m probably going to say something fiery again. Also, just to clarify, answering a business question does not mean that we should answer every question raised by the business, just to caveat the former discussion before I move on to the next question.
00:52:06.60 [Tim Wilson]: But if you’re making it so that they can get to whatever the question is, yeah. Okay.
00:52:11.99 [Moe Kiss]: The next topic that I also think is coming up a lot, which very much ties back to the episode with Anna, is decision velocity. I think that is something that is really, really interesting. And again, Tim makes the point that I work in a very unique position at a company that’s probably not representative of where most data folks are working. But it’s very much, how do you use the right level of rigor for the decision that you’re trying to make as a business? And sure, maybe there’s some AI sprinkled on top of that as well.
00:52:51.89 [Tim Wilson]: So I think making that point of getting the business more sophisticated about what the stakes are behind the decision, and therefore, is getting a bit of a signal very quickly better and more desirable than getting a complete answer way too late? I think there is starting to be some awareness on the business side that if you just are waiting for the inarguable truth, you’ll just be waiting forever. Although I think there is still the tension between the teams: they can’t get answers to me fast enough, why can’t I just have an AI? This was secondhand, but somebody said their CMO was like, I just want to have the AI tell me, you know, give me insights while I’m in the shower in the morning. I just want to get up and have it have sifted through the data. And I’m like, okay, we still have a ways, a ways to go, because that CMO is... but that’s not the same thing as decision velocity.
00:53:56.99 [Michael Helbling]: Because I guarantee you that CMO is not, uh, making decisions effectively and at a good speed, because they’re doing the gathering of information incorrectly to go after decision velocity. One time, somebody told me that a CEO is just a decision engine, which I thought was actually a really cool way to think about that. We think about executive leadership generally as clearing obstacles for your team and all those things. Decision velocity is a huge part, like not getting yourself bogged down. There are lots of frameworks for that, like the old Bezos two-way door versus one-way door decision matrix, that kind of stuff. There are these things that can help, but I do think you could look at AI as an enabler of helping you frame or think about speed to decision. Because one of the things my old boss used to do, he used to force us to write down decision journals. I don’t know if you’ve ever done this before. Very time consuming and very annoying, and I was always super bad at it. It’s probably why I’m not as good a decision maker as I should be. But it helps you then go back and look at previous decisions and what led up to them and go through that. So not everything is all data analysis. Data informs some decisions to a greater or lesser extent. But to the extent that, if I had used this data, I might have made a better decision... if you’re evaluating your decision capabilities, I think AI is really well suited to helping you remember some of those things over time as well. So that could be another way to leverage AI in that context, maybe.
00:55:44.57 [Tim Wilson]: Go faster, be smarter. So I think this is your opportunity, Michael, to make a decision to bring the show to a close. You know,
00:55:56.01 [Michael Helbling]: It’s about that time, Tim. It’s hard because I don’t want to, for two reasons. One, we’ve got Josh on the show and I don’t want it to end. So that’s one part. And then the second part is, it’s the end of 2025. This is our last episode of the year.
00:56:15.13 [Moe Kiss]: Let’s get on with 2026. I am ready for it.
00:56:18.37 [Michael Helbling]: Moe is ready. All right. Let’s shut the door. So we’re done. Thank you all. As you’ve been listening, maybe you have a memory of 2025 you want to share. We would love to hear from you. Or what are you looking forward to in 2026? Same thing. Reach out to us. You can comment to us at our LinkedIn page or on the Measure Slack chat group. or via email at contact at analyticshour.io. We’d love to hear from you. And obviously, thank you, Josh. No show would be complete without thanking you for coming back to be one more special guest one more time. This is fun.
00:56:56.80 [Josh Crowhurst]: Thanks for having me, guys.
00:56:58.98 [Michael Helbling]: Yeah, it’s awesome. It’s awesome. We do. We do. I think you’re still in our Slack. I don’t know if you’ve just abandoned that Slack at all, or you’re still kind of peeking from time to time.
00:57:12.17 [Josh Crowhurst]: Oh, I’m still in the Slack, but I do still get the emails. Oh, okay. The Analytics Hour ones. Oh, gosh. Oh, yeah. I see those new ideas and suggestions coming through.
00:57:22.53 [Michael Helbling]: I’ll remove you from the email list, I guess, so that you don’t keep getting those. Yeah. Well, we didn’t really have a process for that. So under GDPR, you do have a right to be forgotten, but I don’t want to. All right, and if you listen to the show, leave a rating and review. If you’ve been listening throughout 2025, go to your favorite platform, give us a review, rate the show. That helps other people discover it. And we’ve had a lot of audience growth this year, both on our regular channels and on our YouTube channel. So if you’re ever on YouTube, subscribe to us there as well. We put every episode up on our YouTube channel, as well as some awesome shorts that the team puts together for each episode. I don’t know what we’re gonna put together out of this one, but we’ll see. And then of course, for all of my co-hosts, I think I speak for everybody when I say, 2026 is gonna be an amazing year, but no matter what it brings, you know that you can always keep analyzing.
00:58:28.60 [Announcer]: Thanks for listening. Let’s keep the conversation going with your comments, suggestions, and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst. So smart guys wanted to fit in, so they made up a term called analytics. Analytics don’t work.
00:58:53.70 [Charles Barkley]: Do the analytics say go for it, no matter who’s going for it? So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.
00:59:06.71 [Michael Helbling]: I nearly, Josh, on the last episode, did a “no show would be complete without a huge thank you to Josh Crowhurst.”
00:59:13.74 [Michael Helbling]: And I switched it. And I switched it at the last second. “No show would be complete without... keep analyzing.”
00:59:25.73 [Michael Helbling]: Yes, I know how I did it.
00:59:29.63 [Michael Helbling]: I did a “huge thank you to...” Yeah. And it’s just a hard cut. “No show would be complete”... and then, “keep analyzing.”
00:59:43.12 [Josh Crowhurst]: Anyway. Yeah, I still do the music, so I feel like I can stay in the original setup.
00:59:48.84 [Michael Helbling]: Yeah, you’re in there.
00:59:57.93 [Tim Wilson]: Rock flag and... let’s raise a glass with Tim and Moe, with Michael, Julie, Val, five hosts who guide us through the noise and make the numbers ten... oh, for all our Power Hour friends, for all our Power Hours. We’ll toast the laughs and insight shared in all those Power Hours. Voice crack, should have picked a different key on that one. That is awesome. That has to be it. That has to be it. That has to be it. That’s the best one we’ve ever done.