#261: 2024 Year in Review

Ten years ago, on a cold dark night, a podcast was started, ‘neath the pale moonlight. There were few there to see (or listen), but they all agreed that the show that was started looked a lot like we. And here we are a decade later with a diverse group of backgrounds, perspectives, and musical tastes (see the lyrics for “Long Black Veil” if you missed the reference in the opening of this episode description) still nattering on about analytics topics of the day. It’s our annual tradition of looking back on the year, albeit with a bit of a twist in the format for 2024: we took a few swings at identifying some of the best ideas, work, and content that we’d come across over the course of the year. Heated exchanges ensued, but so did some laughs!

Tools and Articles Mentioned in the Show

Photo by Kieran White on Unsplash

Episode Transcript

[music]

0:00:05.8 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

0:00:14.2 Michael Helbling: Hi everybody, welcome. It’s the Analytics Power Hour, and this is episode 261. A long time ago, it was January 3rd, 2015, the first episode of the Analytics Power Hour aired.

0:00:32.6 MH: And we didn’t think much about the future back then. I think we were just trying to share ideas and keep a conversation going with people that we liked from different conferences we’d attended. And we were trying out a new format and wow, 10 years can go by really fast.

0:00:53.2 MH: And over the years, we’ve had some amazing opportunities to take the analytics power hour to places we never would have imagined. 261 episodes, multiple live events across multiple continents, and a chance to interact with so many amazing analytics people who made the choice to listen and interact with us along this journey. And to everyone who’s been a part, both as a co-host, as a guest, and as a listener, I mean, we can only just say thank you. Thanks for an amazing 10 years.

0:01:27.7 Tim Wilson: It’s been a good run and we’re done.

0:01:30.8 MH: That’s it. And this is the final. No, it’s not. But it is the year review episode.

0:01:36.1 MH: All right. So one of my favorite episodes every year. And in our profession, a lot of times in analytics, we get more attention for what’s wrong with everything. In fact, we analytics people, we tend to be good complainers about all the different things that are broken and don’t work right. People don’t listen. And maybe this time around, we’re just gonna do something where we focus on the positive, Tim.

0:02:04.6 Valerie Kroll: Do your best. It’s just an hour.

0:02:11.1 MH: In this episode, we’re just gonna talk about some of the best things we saw this past year. So let’s jump into it. Let me introduce my co-host, Tim Wilson. Welcome.

0:02:21.9 TW: Two continents? Is the third one we do in Antarctica in 2025?

[overlapping conversation]

0:02:26.8 MH: I sure hope so. Live with the penguins. And we can do that in Australia, apparently now.

0:02:31.9 TW: I mean I was right.

0:02:33.2 MH: Speaking of that, Moe Kiss. Welcome.

0:02:37.6 Moe Kiss: Hi, how you doing? I feel like today’s gonna be a real role reversal ’cause I’m feeling particularly pessimistic. And I might have to steal that from Tim today.

0:02:45.8 TW: All right, well you.

0:02:48.2 MK: Freaky Friday.

0:02:50.1 MH: That’s right. And, of course, Josh Crowhurst. Welcome.

0:02:55.7 Josh Crowhurst: Hey, good to be here.

0:02:58.3 MH: Awesome. Thanks for being here.

0:03:01.9 MH: Val Kroll. Welcome.

0:03:02.9 VK: Hello.

0:03:04.6 MH: Holding things down from Chicago.

0:03:05.0 VK: Fourth Floor Productions.

0:03:08.6 MH: That’s right. Fourth Floor Productions. And Julie Hoyer. Welcome.

0:03:13.5 Julie Hoyer: Hey there.

0:03:14.0 MH: Awesome. And I’m Michael Helbling. All right. So it’s worth mentioning as we get started. Like, each of us does kind of very different things in our day-to-day. And so what I’m excited about as we talk through some of these topics, I actually think there’s a lot of different ways we look at the world and a lot of diversity in what we handle. And so I think actually what we’re gonna talk about, it kind of makes me really excited. Because when we go through this stuff, we’ll all have different perspectives and different experiences. And sometimes maybe even differing opinions. But that usually never happens, right? If you’ve listened to this show for 10 years.

0:03:51.2 VK: For 260 episodes, we’ve always been on the same page.

0:03:52.6 MH: That’s right. In lockstep on every idea. All right. Let’s dive into it. So we’re gonna talk first about the best thing we saw this year in data collection and pipelines in our work.

0:04:07.7 MH: So I don’t know who wants to kick us off, but I’ll throw it out there. It’s sort of like one of the things we do in analytics. We collect a lot of data.

0:04:14.2 MH: What was something awesome you saw in that area this year?

0:04:16.2 MK: Okay. Well, full disclosure. I feel like this year has been the year that I have really started to work more closely with Snowflake. And full disclosure, that’s who Canva uses as one of our partners. And I’ve just really been blown away by some of the technology they’ve been building.

0:04:34.8 MK: The Snowflake feature store is particularly cool because I feel like lots of companies try to build, like, a feature store, or a CDP, or whatever you wanna call it, in-house, especially if you have an in-house data warehouse. And Snowflake have made it, like, they’ve really simplified the ability to build a feature store that then particularly can be used for ML models. So that’s been really cool.

0:05:00.0 MK: And Snowpark is the other thing that’s been pretty awesome to see ’cause it lets our analytics engineering team program in multiple languages. I know my team in particular have managed to build a bunch of pipelines this year using Snowpark that have saved, like, days of full-time, like, FTE every week. And so for me, it’s really been kind of leveraging what other companies are doing well and then kind of using that to drive our innovation.

0:05:30.1 MK: But, like I said, I am very much on the Snowflake bandwagon in 2024.
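For anyone who hasn’t touched Snowpark, here is a minimal sketch of the kind of Python transformation Moe is describing. It is an illustration only: the connection parameters, table names, and columns are placeholders, not Canva’s actual pipeline.

```python
# Minimal Snowpark Python sketch: read a raw table, aggregate it, and persist
# the result for downstream use (e.g., a feature store or reporting layer).
# All names and credentials below are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, count, sum as sum_

connection_parameters = {
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

events = session.table("RAW_EVENTS")  # hypothetical source table

daily_summary = (
    events.group_by(col("EVENT_DATE"), col("EVENT_NAME"))
    .agg(
        count(col("USER_ID")).alias("EVENT_COUNT"),
        sum_(col("REVENUE")).alias("REVENUE"),
    )
)

# Persist the aggregate so other jobs (or a feature store) can read it.
daily_summary.write.mode("overwrite").save_as_table("DAILY_EVENT_SUMMARY")
```

Part of the appeal Moe points to is that the transformation runs inside Snowflake rather than shipping data out to a separate compute environment, which is where the time savings on pipeline work tend to come from.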

0:05:31.9 MH: And they did not pay for that endorsement.

0:05:39.2 MK: They did not.

0:05:43.0 VK: Well, speaking of partnership, Moe, I’m glad you brought that up, ’cause one of the greatest things I’ve seen this year in, like, data collection in general, is actually we’ve been working, like, cross groups, skill sets, capabilities, we call them practices, on some clients. And it’s been really great to see we’ve started to partner better around implementations in general. We actually have had a couple clients where we’ve had implementation work and helping run their paid media.

0:06:12.2 VK: And in the past, that hasn’t always happened as often when we’re, like, in an engagement with a client. And so we were able to kind of take a step back and talk about, like, how do we take all of this into context when designing our implementations and better serve the client and kind of the big picture? We talk about that a lot, obviously, here on the podcast of, like, how do you take that bigger context into consideration?

0:06:33.0 VK: And so I was really happy that we were able to start to actually put that within our process for some re-implementations that we’re currently doing for clients. And it was interesting, too, because we had to really break down the understanding for the client and even internally, like the best practices they do in paid media compared to the best practices we do in certain reporting tools like Adobe, for example. And so making that a single connection point was actually something we spent a lot of time on.

0:07:06.2 VK: And I think it was really valuable for, like, us and then obviously the client and the output they’re getting.

0:07:13.5 MH: It’s interesting. In my world, a lot of my clients this year were struggling with various parts of consent management, and that’s been around for a while.

0:07:25.1 MH: But there were some changes to some things this year that a lot of companies had to grapple with. And it was cool ’cause one of the folks here at Stacked Analytics, Charlie Tice, who’s pretty amazing, so not something I did in data collection, but I got to be part of a team that did some amazing things. And just watching him break that down into very simple terms for our clients and create elegant solutions that actually solve the problems that we’re facing to get that right.

0:07:55.0 MH: Because it’s kind of really imperative to get that stuff right now, like you can’t really fake it. And it’s crazy to me how bad we still are as an industry at that aspect of data collection. So that was my highlight for the year: just sort of how we were able to track down and really get some good solutions in place for companies this year. And mostly Charlie did it all, but I was very thankful to be part of it.

[music]

0:08:20.7 Announcer: It’s time to step away from the show for a quick word about Piwik Pro. Tim, tell us about it.

0:08:28.0 TW: Well, Piwik Pro has really exploded in popularity and keeps adding new functionality.

0:08:32.3 MH: They sure have. They’ve got an easy to use interface, a full set of features with capabilities like custom reports, enhanced e-commerce tracking, and a customer data platform.

0:08:44.7 TW: We love running Piwik Pro’s free plan on the podcast website. But they also have a paid plan that adds scale and some additional features.

0:08:49.9 MH: Yeah, head over to piwik.pro and check them out for yourself. You can get started with their free plan. That’s piwik.pro. And now let’s get back to the show.

0:09:05.2 TW: My contrarian… Well, first, I will say I’ve not worked with it, so I’ve just seen kind of the videos around it. But on a data collection front, the nightjar from Moonbird.ai, very much an Adobe Analytics focused thing. Just what I liked about it is they seem very, very focused and not like this is gonna solve everything.

0:09:21.6 TW: They’re like, we have identified a problem that we have lived and felt the pain with. But I haven’t actually used it. I’ve only sort of… I know the people who are doing it and have liked what they’ve been saying it is doing.

0:09:35.8 TW: But for me, there seems to… Picking up more signals around doing less data collection has kind of been what I’ve gotten most excited about this year. And I’d probably point to Matt Gershoff kind of laying out the just enough, just in time mindset as opposed to just in case. And they kind of stack.

0:09:57.3 TW: I mean, Annie Duke just last month came out with something where she was kind of theorizing that we overcollect data partly so that we can talk about all the data that we looked at if something doesn’t turn out as we’d like it to. So, I’ve been… And even Jenn Kunz has kind of been also talking about that. A lot of it’s been under the starting point of privacy and how we need to be very disciplined about what data we’re collecting on users for privacy reasons kind of writ large.

0:10:28.7 TW: But I’ve spent a good chunk of this year thinking that a lot of organizations, they have enough data. It’s clean enough. And maybe they shouldn’t be worrying about doing more or better or more sophisticated. So, that may be… That’s tying up what I’ve seen with kind of the focus of my personal journey this year. But I really liked Gershoff’s framing of just in case is what we default to, but getting just enough, just in time.

0:11:03.4 TW: I love that framing as a way to go about thinking about data collection.

0:11:09.9 MK: Me too. Like that actually is like… I started to do Chef Kiss and say it instead of just having the emoji. But, I mean, it’s a podcast. The emoji wouldn’t work. I guess the only caveat or the catch is we can see that in the data community, I guess, in 2024. But I don’t feel like our stakeholders are there yet. And I think that’s the really difficult thing is like we might say to them, just enough, just in time. Did I get it right?

0:11:39.0 TW: Yeah, just enough, just in time. Yeah.

0:11:42.1 MK: But I still feel like there’s a tension, right, where the business is like, track everything. And you’re like, no, that’s really not a great idea anymore. So, maybe that’s where we get to for 2025.

0:11:55.4 TW: I mean, I think that’s a great point. I think that it’s an easy thing to default to. I mean, I go back probably 15 years ago to being told, hey, this tracking is gonna be really easy. We just want you to track everything. Like it’s easy to articulate. So, there’s definitely a business partner management piece, because it’s… They feel like obviously we need this. Like in the absence… Like clearly we need the data.

0:12:19.0 TW: And I think AI has driven a lot more around the, oh, you’ve got to have the more data the better to feed into the monster machine. And it’s got to be really, really clean. Benn Stancil had a post last month kind of calling that into question, which I thought was really good as well.

0:12:42.5 TW: So, I think there is a need to bring along our business partners with, wait a minute, what are you trying to do? Let’s not be prescriptive that you need this data. There’s no value in the data. There’s only value in what you’re doing with it. So, let’s talk about what you’re gonna do. But that’s a… I agree. I think it’s got to… Maybe it’s got to start with the data community to have that mindset, to not just kind of take it when somebody says we need this data, to just say, well, not having it or having it… Clearly, yeah, we don’t have it. We can default to we need it, instead of defaulting to maybe we don’t need it and we should question it, without burning our relationships.

0:13:24.7 VK: And even if you were to look retrospectively… And, Julie, I’ve heard you talk about this several times, and I love the way that you put it. So, if I bastardize it, please correct me. But the number of times when, like, the just-in-case actually saves you is actually pretty slim.

0:13:40.7 VK: Like, trying to torture data for the purpose it wasn’t intended to be collected for gives you that, like, which-way-is-the-wind-blowing kind of take on things, but it’s actually not always customized to answer the specific question at hand. And so, how much are you really missing out on, right? Like, I’ve heard you… I don’t know, Julie, if you have thoughts. This is definitely coming from your brain, but I totally agree with you.

0:14:05.8 JH: Yeah, no, I would say most of the time people say, oh, well, we’re collecting, data in the vein of this question. We should be good. And then by the time we really get into it, it’s like, oh, actually, sorry, you’re missing, like, a key setup, configuration, the way the data is actually coming in or is implemented. Like, it’s not gonna do what you think it’s gonna do. So, I run into that way more than, oh, cool, you already have the data. It’s perfectly set to answer this question. That, like, never happens. Like, I’ve learned my lesson.

0:14:35.2 VK: Serendipity.

0:14:38.0 MH: Well, and of course, with any topic, we’re gonna find some things to be concerned about. But let’s move on to our next category.

0:14:46.4 MH: So, this year, what was the best thing we saw in the area of experimentation and optimization? Like, what things did we see in that category this year that we liked?

0:14:56.6 VK: Okay. So, I’ll go first on this one. I’m actually excited to talk about this. So, just before I left Further earlier this year, I was working with the optimization team on a cool project for consent banner optimization.

0:15:12.5 VK: So, as we know, consent banners are a little bit Wild Wild West. As you mentioned, Michael, some changes happening this year. And so, I say Wild Wild West ’cause we don’t really know kind of, like, what works in terms of right practices or how do we create the right transparency and allow people to really hold the remote on their preferences. So, the team started with a lot of research on, like, what capabilities are out there. OneTrust offers some A/B split testing. So, we could do a little research on what levers there were to pull.

0:15:40.0 VK: We did a lot of research to develop a POV on what Further wanted to recommend to clients to be testing and what we wanted to ethically support, to make sure that we weren’t injecting any dark patterns and that we were really focused on education and not obfuscation of information, to really increase brand trust. And then we put some of that together to run some tests on their own website. And the first one, which was really just kind of playing around with some of those basic, I’m embarrassed to say it, almost button color principles of contrast, just to really make sure people didn’t think it was a modal, that it wasn’t gonna stop their behavior, but that it allowed them to interact and set their preferences, we saw a huge increase.

0:16:23.9 VK: And so, we were able to kind of bundle that together to put together a nice solution to be able to share with clients, and having that case study already in hand after 60 days of research and work, ’cause it was definitely a labor of love to pull that together quickly. But it was really kind of fun ’cause it kind of felt like something I hadn’t seen before. I was able to engage with lots of different clients across a lot of different industries, but there’s definitely some of the same thematic things that come up time and time again. So, it was fun to kind of engage with this totally new and kind of like total clean blank slate.

0:17:00.5 MH: That’s cool. So, I missed it maybe, but what are you optimizing for in the cookie banner?

0:17:06.4 TW: Yeah, like what’s the actual metric that you’re…

0:17:09.7 VK: So, some of the times when you’re testing, it’s just about interaction with it altogether, so that as maybe new regulation comes in that affects it, they know how to get people to interact with the banner itself to either set your preferences or accept all cookies. But there were some other tests that we hadn’t seen the results of by the time I left.

0:17:26.6 VK: So, definitely reach out to Lucy over at Further if you want more information. But we were trying to figure out how we could assess its impact on brand trust. That was one that we were really interested in, because with some companies, with some sites, the cookies are the cookies, you have to be able to navigate.

0:17:45.7 VK: But because the banner has to be present, if there’s really no other lever than the limited levers that you have to pull, I should say, brand trust is the one that feels like the most meaningful thing to invest in. So, that was one that was kind of in the works.
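To make the kind of comparison behind a banner test concrete, here is a small, purely hypothetical sketch: a two-proportion test on banner interaction rate for a control versus a higher-contrast variant. The numbers are invented and are not Further’s results.

```python
# Hypothetical banner test readout: did the higher-contrast variant change the
# share of visitors who interacted with the consent banner at all?
from statsmodels.stats.proportion import proportions_ztest, proportion_confint

interactions = [4200, 4760]   # visitors who set preferences or accepted, per variant
visitors = [30000, 30000]     # visitors shown each banner variant

z_stat, p_value = proportions_ztest(count=interactions, nobs=visitors)
control_rate, variant_rate = (i / n for i, n in zip(interactions, visitors))
print(f"control={control_rate:.1%}  variant={variant_rate:.1%}  "
      f"z={z_stat:.2f}  p={p_value:.4f}")

# Interval estimates for each variant's interaction rate.
for label, i, n in zip(["control", "variant"], interactions, visitors):
    low, high = proportion_confint(i, n, alpha=0.05)
    print(f"{label}: {i / n:.1%} (95% CI {low:.1%} to {high:.1%})")
```

A brand-trust outcome like the one Val mentions would need a different instrument, such as a survey or a downstream behavioral proxy, rather than a simple on-page rate.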

0:18:03.6 MH: Yeah, very cool.

0:18:03.8 JC: Well, maybe I’ll chime in with one here. It’s a little bit out of left field, but I’ve been becoming really interested in experimentation in the realm of sports science this year. So, yeah, I’ve gone down a lot of rabbit holes on studies that have been done on how athletes should train to do better in endurance events. So, like marathons, ultra marathons, cycling events. And there’s been a lot of RCTs done, a lot of research and experiments into it. And one area that I’m looking at is like the research into just training most of the time in a low intensity, really light. Like if you’re training for a marathon, just doing like 80% plus, long, slow, light running and not just feeling like you need to work as hard as you possibly can like every training session. And like this is personally relevant for me because I do this long distance paddling endurance sport, right? So, I’m kind of the nerd in the team, like going out there and being like, okay, what does the science say?

0:19:12.3 JC: And that’s totally not the culture at all. Like, it’s really like people are very resistant to…

0:19:18.0 MH: Trust the data. We should back off, just, you know, take it easy.

0:19:24.0 JC: Yeah, yeah. This guy. Yeah. So I’m just the one nerd in the corner doing that. And it’s definitely not the culture to train in that way. But the thing is, our rival team this year kind of did that. Like, they put that into practice and just did all this really light training. And even within their team, I know there was a lot of skepticism and doubt and people were questioning it. And then we had our races and like they smoked us, they completely smoked us. Whereas last year they didn’t. So I know that’s like n equals one. But I still thought it was pretty interesting to see, applying some of these learnings from other sports and the exercise science that’s being done and getting a good outcome. So anyways, maybe I could push for next year to listen to the nerds.

0:20:14.8 MH: Well, I was glad to hear this wasn’t associated with like a gambling thing or something, so that’s good.

0:20:22.6 VK: Yeah. Josh, I find this really interesting. When you discovered this, were you really surprised? I feel like this is counterintuitive to what I thought the research said about, like, doing short bursts in intervals, and it seems really… Is it just because it’s an endurance sport that it’s different?

0:20:39.3 JC: Yeah, so it’s… You sort of need to have both. And I’m definitely looking mainly in an endurance context, but I think this also applies to like shorter events. But basically the idea is that you do the long slow as a way to build up your aerobic base, which is gonna help you basically absorb oxygen more efficiently into your muscles and clear the byproducts more efficiently. And then you still need to do that high intensity as a component of your training. Sorry, I’m getting into the weeds a little bit in this.

0:21:14.3 VK: No, I’m here for it.

0:21:17.5 JC: But you only need to do a little bit of it because your system reacts really quickly to high intensity training. So you get those adaptations quickly, but it reacts really slowly to the long endurance training you need to get that base, but it just takes way longer to do it. So you can get away with just maybe leading up to the races, you incorporate more of that high intensity. So that’s sort of my understanding of it.

0:21:41.8 TW: I can say I’ve actually been seeing a lot of articles about that. My challenge is that like, I’m like, the range from my max effort to trying to go slow is really, really small. So if I’m running, I’m like, wait a minute, oh, I’m supposed to slow down. I’m like, I’m already going pretty slow. So what do you do?

0:22:03.2 MK: You already optimized, Tim, is what you’re saying.

0:22:05.7 TW: Like can’t go much. Like you can’t really call it a run if you’re walking.

0:22:13.1 MH: My 80% looks like walking. Oh, that’s a good one.

0:22:19.5 JH: Yeah. I love that. One that I wanted to bring up is actually against the popular opinion as well. I had gone to one of the latest TLC Friday sessions, and it was a Georgi Georgiev event, and he was talking about the difference between observed power and observed MDE compared to, like, what you set when you’re designing your test, and how a lot of people will conclude, because the test ends and then they check their power and they say, oh, it was underpowered, we like can’t use it and its conclusions. And he was pretty much saying, that’s not true and you shouldn’t actually look at your observed power, because it’s different than the power that you use to design your test. And I don’t fully understand it enough to give you a nice synopsis here, but he has blog posts out and some LinkedIn posts about it.

0:23:14.6 JH: But it was really interesting and he just said how it really changed his understanding and way he actually uses power and even like MDE and that a lot of people in practice are still relying on observed power to make conclusions and that that is actually not a best practice. But I know that’s not a popular thing yet and I think a lot of people, it’s gonna take a while to see that change in the actual practitioners. And I was pretty shocked and it was a really great session. So highly recommend and I’ll definitely be rewatching it to try to wrap my head around it.
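Without trying to recap Georgiev’s full argument, one commonly cited piece of it can be shown in a few lines of code (this sketch is ours, not taken from his posts): “observed power,” computed by plugging the observed effect back into the power formula, is just a re-expression of the test statistic, and therefore of the p-value, so it adds nothing once the test has run. The values below are illustrative only.

```python
# Observed ("post hoc") power for a two-sided z-test is a deterministic function
# of the observed z statistic, i.e., of the p-value. Illustrative values only.
from scipy import stats

alpha = 0.05
z_crit = stats.norm.ppf(1 - alpha / 2)  # ~1.96

for z_obs in [0.5, 1.0, 1.96, 2.5, 3.0]:
    p_value = 2 * (1 - stats.norm.cdf(abs(z_obs)))
    # Treat the observed effect as if it were the true effect and recompute power.
    observed_power = (1 - stats.norm.cdf(z_crit - abs(z_obs))) + stats.norm.cdf(-z_crit - abs(z_obs))
    print(f"z={z_obs:.2f}  p={p_value:.3f}  observed power={observed_power:.2f}")

# Note the fixed pairing: p = 0.05 always maps to observed power of about 0.50.
# "The test turned out underpowered" is just another way of saying "the result
# wasn't significant," which is why design-time power (at your chosen MDE) is
# the number that actually matters.
```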

0:23:49.3 TW: Can I share that you and I were slacking each other during that session? I just pulled that back up to look at it. Where I was like, you were like, yeah, this is hard. And I’m like, yeah, if you’re not hanging on, I’m really not hanging on. That’s okay. I’ll ask. I have no shame. I’ll ask a question. And I asked the question and the response was like, yeah, I’m gonna have to take that offline. Which I interpreted as being like, that was the dumbest question. Anybody?

0:24:12.6 JH: No, he said it was a really good question. He just needed time to, like, he wanted to think about his answer.

0:24:17.0 TW: Yeah. I think that was kind of the, oh, that’s a really good question. You’re adorable. Idiot. I’m not gonna, yeah, I lost you clearly at slide two.

0:24:26.9 JC: Did you guys see those recently, sort of tangentially related, but did you guys see those recent, I guess, memes about, like, statistical significance? And in some of the really terrible analysis out there, they’re sort of classifying, like, p equals 0.10 as, like, borderline semi-significant, or, like, extremely statistically significant for, like, a low value. Like some of these really awful descriptions and kind of just the arbitrary nature of that.

0:25:02.7 JH: Oh my God.

0:25:04.6 MH: P equals 0.05 just in general made me laugh.

0:25:08.7 JH: Oh, that’s funny.

0:25:09.9 TW: I feel like I could… I’m not gonna name them, but there are people who’ve already gotten triggered by that. Yeah. There are some very divergent schools of thought as to how useful it is to go down the path of parsing those versus not.

0:25:27.3 MH: Can of worms. Yeah.

0:25:27.4 JH: Deep breaths, Tim.

0:25:30.0 TW: Save us Michael.

0:25:31.0 MH: Alright. Let’s move on to another category. One thing most all analytics people are involved with is getting data out into the hands of people who need to use it, whether it be marketers or operators or things like that. So what did we see this year in reporting? Like what were the best things we saw there? It’s not always the most glamorous part of the job, but it’s necessary and crucial.

0:25:57.6 JH: Ooh, Michael, I’ll start because you’re gonna be really excited about this one.

0:26:01.6 MH: Okay.

0:26:02.4 JH: So if you all remember the great episode we had with Cedric Chin and it was about XMR charts. Well, I have seen them in the wild for clients and they went over really, really well. Lucy, we already name dropped her earlier, she is just a rock star. She did it for one of our large clients that heavily relies on us for reporting. And it did a great job of showing variation like we talked about and showing the client that like, you’re always gonna have variation. Your KPI will move up and down, but like, let’s put a little thought around when we should really pay attention to it. So I know that they did a lot of like education and change management with the client. They worked really hard on the different visualizations to kind of show those bands and how they were setting them and when their KPIs they cared about actually needed to be paid attention to. So I think they said it went really well with the client and it’s been a great tool for them this year.
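For anyone who wants to try what Julie describes, here is a minimal sketch of the XMR (individuals and moving range) math behind those bands. The weekly KPI values are made up and are only there to show the calculation.

```python
# XMR chart limits: center line at the mean, natural process limits at
# +/- 2.66 times the average moving range (standard XmR constants).
import numpy as np

kpi = np.array([102, 98, 105, 99, 101, 110, 97, 103, 100, 96, 104, 130])

moving_range = np.abs(np.diff(kpi))   # |x_t - x_{t-1}|
center = kpi.mean()
mr_bar = moving_range.mean()

upper_limit = center + 2.66 * mr_bar
lower_limit = center - 2.66 * mr_bar
upper_range_limit = 3.268 * mr_bar    # limit for the moving-range chart itself

print(f"center={center:.1f}  limits=({lower_limit:.1f}, {upper_limit:.1f})")
for week, value in enumerate(kpi):
    if value > upper_limit or value < lower_limit:
        print(f"week {week}: {value} falls outside the limits -- a signal worth a look")
# Everything inside the limits is routine variation; the chart's job is to keep
# stakeholders from reacting to every wiggle in the KPI.
```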

0:26:54.0 MH: Awesome. And yes, I am a huge fan of XMR charts now. Thank you.

0:27:02.5 JC: I have another one that’s a shout out to another former guest from this year. So there was an article by Eric Sandosham recently, one of the weekly articles he’s putting out, and he did one called “The Joy of Business Reporting.” So yeah, counteracting that narrative of reporting is just something you have to slog through and get through, it’s a necessary evil, not the most exciting part of the job. But actually he made a few points about how it’s a great way to gain deep subject matter expertise in how your business works and gives you an opportunity to link up with people across the company that you otherwise wouldn’t have a chance to talk to, build your network. And I just thought it was a nice perspective to see, what are actually the benefits that it’ll bring to you as an analyst to do some of this, instead of just sort of seeing it as, okay, monthly reporting again, I gotta go chase the stakeholders, the numbers are late, that whole thing. Like you can kind of look at it in a different way and say, yeah, actually this is kind of a good opportunity. Especially if you’re relatively early in your career, it’s a great way to build some of those relationships and understanding. So I felt that was a nice article.

0:28:16.1 TW: I like that ’cause, I mean, it’s like the double-edged sword. You can go work with your business partner and have them just… I was having coffee with somebody a little while back and she was like, oh my God, the head of marketing is just like, where’s my dashboard? Where’s my dashboard? And she’s like, no matter what the context, whether it’s a… She just perceives that everything that comes out of analytics and data science winds up on a dashboard. And this was driving this fairly senior person who’s managing the analytics and data science teams a little nuts. Because she was having to sort of push back. So the flip side is saying, oh, if the reporting exercise is an opportunity for me to ask you questions about the business and what you really need, then that is an opportunity, being mindful of don’t turn into an order taker where they say, and I wanna look at this and I wanna look at this and I wanna look at this and I need to slice it this way and that way. Like that’s the challenge. But I do like that framing. I mean, anything out of Eric.

0:29:24.7 MK: I was gonna say, I don’t know if we’re gonna do a category for best of matchmaking in 2024, but I will definitely take credit for the love affair that is Eric and Tim.

0:29:36.7 TW: But I also like, I found him first and now I’m like, yeah.

0:29:40.7 MK: Obsessed. As obsessed as we all are.

0:29:45.3 MH: Interestingly enough, in this area for me this year, I think this is where I used AI the most, was in reporting and developing reports. I feel like this…

0:29:55.7 JH: Really?

0:29:58.2 MH: Yeah, like lots of other places I use AI, but in terms of practical application to my work, it had the most impact. A lot of times… I’ve been around a long time, so I’ve forgotten a ton of stuff that I used to know really well. And so going back into the weeds and, like, creating things with data, I’m a little rusty sometimes. And so AI was amazing to be like, okay, I know what I’m trying to do. And it would just be like, oh, you just do this, this, and this. And I’m like, oh, that just saved me like two hours of Googling for an answer and I’m off to the races.

0:30:32.3 MH: And it helped me do things that I’d never done before that actually were really helpful, because sometimes when you’re trying to think through a way to display data in a certain way, you’re limited in different ways by the tools you’ve got, and what you’re trying to show is not well supported by both the underlying data as well as the platform you’re on. And so getting to a couple of really cool solutions and then being able to show that to clients and then being like, wow, how’d you do that? I’d be like, that’s why you pay me, no big deal. Even though it’s like I stayed awake half the night being like, I wonder if that’s possible. And then I could be like, this is what I wanna do. And the AI is like, oh, I think if you did this, this and this, and boom, you’re off and going. So that was the cool thing for me this year. It was sort of like leveraging AI to compress some of the work and help me remember stuff I used to know how to do better.

0:31:32.2 TW: So using it in the context of kind of the execution, the development of the report.

0:31:36.2 MH: Yeah. Building stuff. Yeah.

0:31:40.6 TW: Interesting.

0:31:41.6 MK: I funnily enough had the same thing, but I actually put it under analysis. So one example, I had someone on my team the other day who I’d asked to run a query. It was a bunch of data sources I’m not super familiar with. ’Cause also I haven’t written a line of code in probably two years, and I was overseas, so I didn’t wanna, like, wake this person up ’cause it was Australia time. And I basically was like, I just wanna make sure that I’m like… It’s the logic that I intended when I asked him to pull this data. And so I was like, hey, ChatGPT, here’s the query. Can you tell me what this query’s doing? Is this the calculation it’s doing? And I was like, I’d never thought about using it that way, but fuck, it was a beautiful thing. Like it basically came back and was like, this is how it’s calculating this field and this is what it’s doing. And I was like, great. I didn’t have to bug one of my data scientists to be like, am I definitely interpreting this correctly? Because it was really complicated. And see, I would put that under analysis though, Tim. Do you think that’s under reporting?

0:32:42.4 TW: No, I mean those are both aspects of kind of supporting the development tasks. So to me that feels like something that cuts across like legit cuts across both of them.

0:33:01.9 MK: Yeah, I suppose, ’cause you could do the same thing with your queries for reporting purposes.
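If you wanted to script Moe’s “explain this query back to me” check rather than paste into a chat window, a rough sketch with the OpenAI Python client might look like the following. The model name and the SQL are placeholders, and Moe describes using an enterprise ChatGPT workspace rather than the API.

```python
# Rough sketch: ask a model to restate what a query calculates so you can
# confirm it matches the logic you asked for. Model name and SQL are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

query = """
SELECT user_id, SUM(revenue) AS revenue_28d
FROM orders
WHERE order_date >= DATEADD('day', -28, CURRENT_DATE)
GROUP BY user_id
"""

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use whatever model your workspace allows
    messages=[
        {"role": "system",
         "content": "You explain SQL queries plainly and call out any assumptions in the logic."},
        {"role": "user",
         "content": f"Explain step by step what this query calculates:\n{query}"},
    ],
)
print(response.choices[0].message.content)
```

The same caveat from the conversation applies: only do this in an instance your company has approved for confidential data.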

0:33:09.4 MH: Well, since we’re talking about analysis, what are some of the best things we saw in analysis this year?

0:33:17.1 TW: I’ll start with like, there’s a whole movement, and maybe this cuts across reporting and analysis as well, that was like some of the worst things I saw, which is all of the hype around, you point AI at your data, throw your data at AI and insights will emerge. Which, I mean, being the inbox manager for our inbound pitches, there are certainly plenty of people who would love to fill your ears, dear listener, about how their AI solution is gonna generate insights. So that was very, very off-putting ’cause it feels so wrong. But the flip side, and I’ll credit seeing Jim Sterne at MeasureCamp Austin, seeing Jim again at Marketing Analytics Summit, John Lovett at Adobe Summit. I saw him then again at MeasureCamp Chicago, and both of them really pushing for using generative AI as kind of an ideation companion, to think about hypotheses, to like be a smart companion. It’s kind of like rubber ducking on like super, super steroids, to say I can have an exchange where I am forced to be in conversation and I am encouraged, like using prompt engineering as a way to get to ideas for analysis, to get to hypotheses, to think through, to kind of think a little bit more broadly.

0:34:46.8 TW: That to me seems really, really useful. And even with, like, John sort of building GPTs specifically around training them how to be good ideation companions, that was pretty exciting.

0:35:01.4 MK: Well, I think fundamentally, look, I was saying to someone the other day, I feel like ChatGPT is changing my job, and it’s making me better at my job in some places and worse at my job in some places. In some cases, like yes, it is definitely the ability to come up with a list of hypotheses, or what often happens with me is, like, I will do all the thinking, get 90% of the way there. It’s the last 10% that I really struggle to do. And so I’ve been leveraging that to do the last 10% very, very effectively. But yeah, I don’t know. I feel like I use it across the full stack. So like companion, bounce ideas off, ask stupid questions, that’s probably my favorite one. I’ll be like, explain this thing to me. And previously, like, I probably would’ve struggled to find a simple way to do that.

0:35:51.8 TW: And you keep saying ChatGPT, is that your one go-to or do you use…

0:35:57.0 MK: Yes. Well, I use that because we have enterprise, and so we can put confidential information in there, so it makes a really big difference when you can upload a data set. And be like, hey… The other one that I loved so much, we had this Slack thread the other day and we were, like, doing an investigation into a metric that had declined. And everyone’s like, what’s going on here? And of course we have all the, like, best brains in the business, and you have all these senior data scientists, like, updating the channel with what they’ve looked at, because it’s cross company, there’s all these senior leaders in it. And I caught up on the thread, like, I don’t know, seven hours later, and I was like, what the fuck is going on here?

0:36:39.3 MK: Like, I can’t tell what we’ve looked into, what we haven’t looked into, what’s next, like, where are we at? And I basically just copy pasted the entire chat into ChatGPT and was like, give me a summarized version of what we’ve looked at, what we’ve ruled out and what’s still outstanding and who should follow it up. And it was beautiful. I mean, so succinct.

0:36:56.6 JH: That was so good.

0:37:01.6 MK: And then I pasted it back in the channel and was like, hey guys, here’s where we’re at. And everyone was like, again, Chef Kiss, as this way to just, like, keep everyone on track, almost without anyone having to digest all that information and do the summary. But I think the point that I made that I take for granted is, like, having an instance where you can put confidential company information is a real game changer, right?

0:37:29.6 JH: Absolutely. Yeah. Yeah.

0:37:32.2 MK: I’m just not gonna fucking talk anymore ’cause every time I do it’s like stone cold silence, because I’m making really [0:37:37.4] ____ points and I’m being super annoying.

0:37:39.6 JH: You are not.

0:37:40.0 TW: No, not at all. Not at all. No, mine’s just totally unrelated, so I couldn’t think of a segue, but totally agree with what you said. I’m dying to have our own instance of ChatGPT, like my company still blocks access to all Gen AI applications despite having a Gen AI team and COE in the company. So that’s where we’re at.

0:37:58.3 MK: Do you know what my founder did the other day? My founder the other day was like, I think everyone just needs to, like, brainstorm this. And people were, like, writing on post-its with pens. She’s like, no, no, no. Someone grab some, like, permanent markers. Write in big letters. And ’cause we had like 30 people in this brainstorm, everyone threw it up on a board, and she just took photos on her phone, uploaded it straight away, and it summarized the brainstorm. And I was like, boom. It’s pretty nice.

0:38:26.8 MK: Yeah. Some of those, like, virtual whiteboard tools, like the Murals, they have some of that built in, where you can, like…

0:38:32.1 VK: Like Canva?

0:38:33.2 MK: And like Canva. Didn’t realize that one.

0:38:37.2 TW: That’s the one we use.

0:38:40.9 VK: Yeah. But you can… Like, the summarize is super helpful, but there’s other ones where you’re asking it, like, you can draw a lasso around certain ones and say, like, what’s the theme of these? Or, like, you can have it act as an assistant.

0:38:53.9 TW: That’s awesome.

0:38:54.9 VK: A couple things like that that are pretty manual. That definitely is a time saver, especially if you’re in a facilitation mode or there for a day-long workshop with clients kind of thing.

0:39:03.8 JH: Moe, you gave me some ideas of how to use it without having to put, like, client data in there. ’Cause obviously we can’t do that for privacy stuff. So, but yeah, Moe, you have given me good use cases of how to use it for the ideation and the creative side, but a little more applicable to the day to day. ’Cause I’m like, I don’t get to be that creative in my job, so thank you.

0:39:23.5 MK: I’ll keep throwing them at you ’cause I just keep finding more and more ways. And I honestly was like… I think I’ve mentioned this example before, but like we had another, like, metric deep dive where something had gone down, and one of my stakeholders had put, like, what are the possible hypotheses for this, into ChatGPT. And what came back was very good. And the thing that I really liked about it, it actually reminded me so much of analysis of competing hypotheses, where it kind of, like, came up with 10 hypotheses. It was really structured. And then we actually… I was like, let’s lean into this. And we used it for our analysis, and we’re like, okay, we’ve ruled this one out, this one we’ve ruled out. Like we didn’t take the same process exactly as analysis of competing hypotheses, but it definitely had that mentality of, like, okay, here are all the causes. Like, let’s disprove them. And the thing that I keep coming back to is…

0:40:14.3 TW: But why did you not tell it to take an ACH, an analysis of competing hypotheses, approach in your…

0:40:24.0 MK: Ooh, I don’t… That is interesting, and I will try it. The only caveat is I don’t know if the conclusions it would draw about the different pieces of evidence would be sufficient. Plus you would probably need so many varied pieces of evidence in there. But I’m gonna try it. I’ll come back to you.

0:40:44.2 TW: Okay.

0:40:45.6 JH: Well, it kind of goes back to that would be a little more like asking it to do some fact finding for you, which then you have to check. Like we always say like check it, gut check it. Which I think would be hard to your point, like asking it to do that. And then you’d have to go through the exercise to check it kind of, compared to what you’re doing. You’re doing more of like the brainstorming, creative summarization get you moving in the right direction stuff.
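For anyone curious what the bookkeeping behind analysis of competing hypotheses looks like when done by hand rather than by a chat model, here is a toy sketch; the hypotheses, evidence, and scores are invented for illustration.

```python
# Toy ACH matrix: score each piece of evidence against each hypothesis and rank
# hypotheses by how much evidence is inconsistent with them. ACH emphasizes
# disconfirmation: the surviving hypothesis is the one with the least evidence
# against it, not the one with the most evidence for it.
hypotheses = ["tracking broke", "seasonal dip", "new release hurt conversion"]

# Scores per hypothesis: -1 inconsistent, 0 neutral, +1 consistent.
evidence = {
    "drop appears on only one platform": [-1, -1, +1],
    "no tagging deploys shipped this week": [-1, 0, 0],
    "similar dip at the same time last year": [0, +1, -1],
}

inconsistency = {
    hypothesis: sum(1 for scores in evidence.values() if scores[i] < 0)
    for i, hypothesis in enumerate(hypotheses)
}

for hypothesis, count_against in sorted(inconsistency.items(), key=lambda item: item[1]):
    print(f"{hypothesis}: {count_against} piece(s) of inconsistent evidence")
```

Whether a chat model can fill in that matrix reliably is exactly the open question Moe and Julie land on: the scoring of each piece of evidence still needs a human gut check.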

0:41:08.9 MH: Nice. Alright. Last category. How about we talk about what we saw this year that we love that was in analytic strategy or strategy related?

0:41:20.8 JH: I have to start out with something that Valerie actually sent me at a very timely moment, when I needed to read this. ’Cause you always have existential moments where you’re like, what am I doing? Should I be more hands on? Should I stay more of a generalist? You know, we all have those.

0:41:34.6 MH: And then it’s Julie, get back on the podcast. We’re trying to have a monthly call here.

0:41:42.7 JH: Yeah. Exactly. But Valerie sent me a quote by Adam Grant, and I feel like it perfectly summarizes what I hope we see more of in 2025. Like, people embrace this. The quote is: “The hallmark of expertise is no longer how much you know, it’s how well you synthesize. Information scarcity rewarded knowledge acquisition; information abundance requires pattern recognition. It’s not enough to collect facts. The future belongs to those who connect dots.” And I was like, oh, my heart… It just settled my self-talk where I was kind of spiraling. So I really loved that.

0:42:20.2 MH: So like an analytics translator. Absolutely.

0:42:25.6 TW: Yeah. I mean I’m…

0:42:26.8 MH: I’m sorry.

[overlapping conversation]

0:42:26.9 TW: Like I wasn’t already… Like that was it, the switch was 80% triggered by this. Anyway, yeah.

0:42:33.3 MH: You just…

0:42:33.9 JH: Really, you’re triggered by this?

0:42:35.7 VK: By Adam Grant. That’s all you had to say.

0:42:38.3 TW: No, I mean it’s stating that like that is not new like that is not new. Like there is like kind of. That is…

0:42:43.0 VK: Okay, I was gonna say that, Tim. I was gonna say his perspective is not new, but I think he sometimes has a way of packaging an idea that makes people, I don’t know if I wanna say believe in it, but makes people wanna follow it forward, like, in a really concise way. So like that’s the value add.

0:43:09.6 TW: Okay.

0:43:09.7 MH: Anyway.

0:43:10.7 JH: I love that.

0:43:10.8 MH: I’m glad it was helpful ’cause…

0:43:12.2 JH: I liked it.

0:43:13.9 MH: A lot of people… Like, I don’t think that what you were grappling with, Julie, is uncommon at all as people progress. They’re trying to figure this stuff out, and with everything going on in our industry with what we were just talking about with generative AI, there’s all these things where people are trying to say, like, what is the shape of my career? And so it really is helpful to get guidance or guideposts from people like that.

0:43:38.1 TW: But you know what’s gonna happen is a bunch of people are gonna say, sweet, so I can just take all this stuff and throw it at Generative AI and tell it to connect the dots for me. And I found the shortcut. I mean.

0:43:48.8 VK: Okay. Well then you can swim on past him.

0:43:53.7 MH: So, Tim, none of us here are gonna do that. We are all gonna come to you. Okay.

0:43:58.5 VK: Yeah.

0:44:00.1 MH: So don’t worry about it.

0:44:00.4 VK: But I think it’s nice to show that, like, also there’s been a lot of talk in the… not a lot of talk in the industry, but there’s definitely been a change, right? Of like, especially in being a consultant, being able to deliver a point solution for a very specific pain point is valuable. But there has definitely been a bigger demand from clients of, like, now that’s table stakes, as I know, Tim, you’re gonna say it should have been all along, but there is a bigger push of them wanting bigger pictures painted for them, more guidance, help them connect the dots. Like they are now feeling the pain of, like, I have so much stuff going on, I don’t know how to make it work together. They still wanna grab onto, we’ll switch to the tools that I’m comfortable with, but then they do that and they’re not getting the outcome they were expecting, or the benefit from swapping the tool that they thought.

0:44:51.7 VK: And so, like, this quote to me was also, hopefully to Moe’s point… Like, if more people are seeing this and there’s a critical mass of people understanding that, like, it’s making everything you have work together and seeing the bigger picture. Like, one, the industry’s feeling the pain, I think they’re pushing towards wanting more strategy. And two, I thought this did succinctly say that, that I could pass around to my colleagues at work and say, like, hey, this is us thinking this way. Like, yes, it’s painful and it can be hard sometimes, but we’re on a great learning curve and it is way more valuable than, classically, the great work we’ve done. Like, yes, we’re still gonna do that and need that expertise, but we do have to upskill in this area. And so I think that’s why it was so encouraging.

0:45:33.3 MK: The thing though that I would add, which I was gonna say earlier, is that, like Tim said, I don’t feel like that’s new. I feel that kind of, like, working to connect the dots is kind of a core data scientist skill. Like, that’s what makes a good data practitioner. And I feel like it’s still a work in progress, and what can sometimes be a tension, particularly, is that, like, I feel the industry right now, I don’t know about anyone else, but I feel like our foot is down and we are moving so super fast that it almost feels like a bit of attention has been taken away from that core skill. And we need to, like, refocus. I feel like I still work on this with my team all the time. Like it’s not…

0:46:13.9 VK: That’s a good point.

0:46:15.1 MK: Yeah. I don’t feel that we’re there.

0:46:16.5 TW: I mean, ’cause my concern is that this goes to, say this goes to our business partners, and now they’ve just been given another cudgel to say, well, this is great, you gave me what I asked for, but I need you to really, like, connect the dots, and I really need you to do the pattern recognition. Or they throw it… I mean, I’m not opposed to the idea, but like the data practitioner who is constantly coming up short of the expectations of their business partners, like, the critique… I mean, it is a partnership. To me, communication and partnership and deeply empathizing with and understanding what our business partners really need is way more important than where analysts a lot of times wanna scurry off and say, let me just keep digging into the data and let me just keep finding something. So I think the danger is how somebody passes it. Because if they read too much into synthesizing the information, they’re like, well, I just gotta get more and more data and I’ve gotta synthesize it, and then I’m gonna come back with, look, I found this relationship between these two things that are completely divorced from the actual business context. So I’m not just trying to shit all over it, I promise.

0:47:45.8 MK: No, no, but it’s just interesting the way you’re reading it is definitely in a different light than I was. And I think some of it is like the space I was in, I was primed to read it and take it completely differently than you are taking it. So it’s not to say like, yours is wrong and I disagree. Like I agree with what you say. It’s just interesting. I didn’t have that point of view when I first read it. I guess.

0:48:04.8 MH: I like it.

0:48:05.8 MK: Okay. Controversial question. How important is data strategy right now? Like, I made that comment about things moving as fast. Like, I feel like it’s moving as fast as possible. I feel like there are a lot of documents sitting on shelves somewhere. I don’t know, someone said something to me in 2024 and I had this like…

0:48:33.0 TW: You mean this year?

0:48:35.0 MK: Yes. This year, but I am already in 2025. My mind is in 2025 and has been for the last three months.

0:48:41.6 MH: Okay.

0:48:43.7 MK: But someone said something to me, which was, oh well, the data strategy should just connect up to the overall company goals. And I was like, well, fuck, that’s obvious. I think that’s harder to do in reality sometimes. But I guess I’m just having this, like, existential crisis of, like, what is the purpose? And when I say data strategy, I mean, like, the 12 to 15 page document that probably does sit on a shelf. Like, is it as useful as it used to be with the pace that we’re at? Like, I don’t know. Like, is it more about the behaviors and the ways of working that are important, or the concepts, and less about the how?

0:49:24.3 MH: At times like these Moe, I like to think about the Canadian band, The Arrogant Worms and the song they sang called star trekking across the universe, always going forward. ‘Cause we cannot go in reverse. And I think when we’re going so fast, in my opinion, that is when data strategy is actually even more crucial because we’re going so quickly and we have to respond so quickly. We do have to have a good strategy for where we’re trying to go or else we’ll get pulled 45 different directions and end up nowhere.

0:50:04.8 TW: Counterpoint.

0:50:05.9 MH: Okay.

0:50:05.8 TW: To put me in Moe’s camp.

0:50:05.9 MK: Yes.

0:50:09.1 TW: I think, Moe, if you say data… and I struggle with, I mean, I’ve asked, what is a data strategy? ’Cause I don’t even know. And I’ve seen it defined different ways. One of the more recent ones, in discussion with a company, the data strategy is like, what data are we gonna have? How’s it gonna be hooked together? What are we gonna gather? What are we gonna collect? How are we gonna manage it? And nine out of 10 of those come down to, okay, here’s our strategy: we’ve gotta spend the next 12 months getting all the data hooked into these, we need the whole year to kind of execute it. Which I think is where they default to. And they’ll wave the flag of AI and all the data has to be super clean. And so there is a tendency, I don’t know, Moe, if I’m articulating the same thing you’re saying, there’s a tendency to over index towards collecting, getting all the stuff and getting all the process in a good place, with this belief that you’ve gotta build this strong foundation, and that means the next year, then, we’ll be off to the races and the generative AI will be so much more powerful, and that’s why it winds up that way, as opposed to being more nimble.

0:51:24.8 MH: What’s interesting, Tim, is what you just commented on is actually the execution on the strategy being too slow, not the strategy itself.

0:51:34.8 VK: I think both are true. Both are true.

0:51:37.2 MH: Yeah. But I think that’s the issue you have there, is sort of like, well, how do we execute on the strategy effectively? And your strategy, I think, Moe, to your point, does have to tie back to what the business is trying to do. One of the things we did this year with one of our clients was… As you’re going along helping companies do stuff, you get to an inflection point and you get a chance to actually say, like, hey, is our business better served by altering our strategy? And then what are the steps we can take now with what we’ve got? And then how do we need to adjust out into the future to build that strategy forward? And that’s work we are doing. But I tend to take, personally, a much more iterative approach, which is sort of like, okay, if this is where we see the puck going, don’t be like, I need 12 months to get everything ready to be perfect. The boil-the-ocean work is never right. It’s always sort of like, here’s where we’re at and here’s the five steps we can take in the next 90 days that are gonna push us a little closer. And I think…

0:52:38.6 MK: But sorry, but when you run a strategy, you don’t run a 90 day strategy, it’s typically one to five years minimum.

0:52:45.5 MH: I didn’t say 90 day strategy. I said here’s where we’re at today, and to get to our strategy, here’s what we’re gonna do now, and then across the next, so that we’re not taking forever to get there. ’Cause you’ve gotta be incremental about it, or iterative. I just don’t think it works to try to do the big bang every time. Sometimes you have to do it that way, but I don’t think it’s effective every time.

0:53:08.6 MK: I don’t think we do the big bang. I think the problem is exactly what Tim said, where we said we’re gonna spend the next 12 months and we’re gonna get everything up to scratch and everything’s gonna be perfect. And then 12 months later, a bunch of shit happens in the business. You’ve got a bunch of new tools.

0:53:22.0 MH: Sure.

0:53:22.5 MK: A bunch of other shitty data, and you’re like, we’re gonna spend the next 12 months making everything perfect, because if it’s perfect we can help the company achieve their user and revenue goal. And you’re like, no one gives a shit.

0:53:35.5 MH: Yeah, but that’s not strategy, that’s execution.

0:53:36.9 JH: I was gonna say, I definitely am saying strategy. I just have to give you the context. I definitely was not meaning, like, data collection strategy. I was definitely talking concepts and the other things you mentioned, Moe. Just to put that out there. I can’t go on the record with people thinking I was talking about the collection side. ’Cause I wasn’t.

0:53:54.1 MK: I definitely wasn’t talking about collection. But I think that’s where we landed.

0:53:57.9 MH: Oh, yeah.

[overlapping conversation]

0:53:58.0 TW: But I think I’ve learned this, I mean, I’m finally starting to understand that the generally accepted thing is like data versus analytics. The data tends to be the collection, the piping, the governance, the management. But at the same time, I worked with somebody for years who, when he would do a data strategy, he meant more, Julie, what you meant. So there is… Yeah, it’s exhausting. And this starts to feel brutal.

0:54:22.6 MH: It sounds like a podcast episode. Alright.

0:54:27.6 MK: But I think this is… Sorry, just to round it out though, this is why I think it comes back to… Maybe the direction I need to go in is less about the what are we trying to achieve, and more like what are the behaviors and ways of working we want to use to get there and to support the business. And I don’t know if those are the same things or different things, but that’s kind of what’s been rolling around in my head, is like, what do we wanna stand for as a team? And anyway, yeah, we do need a whole episode.

0:54:58.2 JH: Totally.

0:54:58.7 TW: Agreed.

0:55:00.4 MK: So I’m gonna take us on a little bit of a left turn for my…

0:55:03.4 TW: Good.

0:55:04.7 MK: Observation for 2024, hopefully a little less contentious positive. We shall see.

0:55:08.9 TW: At this point. Don’t count on it.

0:55:11.3 MH: That’s right.

0:55:12.2 MK: I feel like the jumping on happens later and later in the episode. So I am a little nervous. No, I had the opportunity to attend…

[overlapping conversation]

0:55:23.8 MH: Oh, okay. Wow, Tim.

0:55:28.9 MK: Well, you were there, so you know that I was there, and there were a lot of great presentations. I really was impressed overall, but there was one in particular by Noam Lovinsky, who was the CPO of Grammarly, and his presentation was a little provocative. It was “Have LLMs Killed Grammarly?” And so it was just super interesting. And I have the recording and the slides that we’ll definitely link in the show notes if you’re interested. But one thing that really has stuck with me ever since he presented it is thinking about the question, has the problem that you’re working on truly been solved for your customers and your users? So the example that he gave to kind of, like, illustrate this point was back in the day, I guess in the 70s and 80s, there were these things called Thomas Guides, which were essentially these paper, like, little manuals that had maps and places you could go, so that if you were out of town, it was one of the best ways to, like, navigate the city.

0:56:24.4 MK: And Thomas Guides were kind of outrun by MapQuest in the 90s, because now you could just type in where you wanted to go and print out your directions, and that was considered the solve. But he thinks that where we are today with Gen AI is the MapQuest stage, because we all know what comes after MapQuest, which is the smartphone and constant access to Google Maps. And it’s interesting to think about Gen AI just being at MapQuest. But the part that really sticks with me is he was saying, actually, having Google Maps on your smartphone or in your car, always accessible, still isn’t the solve. If you’re really obsessed and focused on whether you’re actually solving the problem, that would be self-driving cars, because that’s what gets rid of the need for navigation altogether. So that’s actually where you get there. He was giving his examples in a product sense, but the point is being really deeply connected with the problems that you’re solving. I’ve found lots of opportunities to share that story and applications of that way of analyzing: just really deeply understanding the problem that your users and customers are facing. So that was definitely a takeaway.

0:57:32.2 TW: So in a data or analytic strategy context, it’s really getting to that perfect dashboard, right? That’s really where.

0:57:37.7 MH: The ones.

[overlapping conversation]

0:57:38.3 MK: There’s the takeaways.

0:57:39.8 MH: Awesome. Alright, well this has been sort of the intent, but we did pretty good, and it’s also nice to see that after 10 years we’re 100% on the same page and aligned on everything analytics related. So, I guess we can’t quit doing the podcast yet. We still got work to do. So, 2025 is looking to be a good year, I think. Thank you, all of you, Moe and Julie and Josh and Valerie and Tim, thank you for being on the podcast with me and doing this show together. It’s always not only enlightening but fun.

0:58:27.0 TW: And thank you too Michael.

0:58:27.1 MK: Thank you too Michael.

0:58:27.8 Announcer: Oh, that’s what I was looking for. Thank you.

0:58:32.1 TW: I’m gonna leave that hook out there for a while to see if anybody was gonna bite.

0:58:36.0 MH: Anybody?

0:58:37.8 TW: No.

0:58:42.6 MH: But I think as you’re listening, maybe you’ve got things you’re thinking about, 2024 learnings that you want to take into 2025, or things in Year 11 of the podcast where you’re like, you haven’t talked about this enough. It sounds like we learned today we’re gonna talk about data strategy just a little more, but there’s probably lots of other things, and we would like to hear from you.

0:59:02.1 TW: To be fair, it’s been on the list for a while.

0:59:03.1 MH: It has.

0:59:03.9 VK: I have had that on the list for like three years, let it be noted.

0:59:07.4 MH: Yep, you have.

0:59:09.9 VK: Yep. Yep.

0:59:10.0 MH: But it’s not important, Moe, ‘cause we’re too fast-paced, so.

0:59:12.4 TW: And I maintain the list that backs up your contention.

0:59:17.4 MH: So anyways, but we would love to hear from you. There’s a bunch of great ways to do that. The Measure Slack chat group is one of them. Obviously you can email us at contact@analyticshour.io, so please feel free to reach out. We actually really enjoy hearing from listeners, and as we hear from you, we do incorporate what we hear into our show topics and our guests and things like that. So, we do appreciate it. No show would be complete without a huge thank you to Josh Crowhurst. I know Josh, you’re here. So.

0:59:49.7 JC: This is awkward.

0:59:50.5 MH: He’s got his work cut out for him on this one too.

0:59:52.7 VK: Josh is gonna say, you’re welcome.

0:59:54.1 JC: You are welcome.

0:59:55.2 MH: Don’t mean to make it awkward, but we really do appreciate you. Thank you very much. And I think this has been… 2024 has been a very interesting year. A lot of learning, a lot of growth, a lot of change. And I think that’s always the case in our industry. And I think I speak for all of my co-hosts when I say, no matter what 2025 or the next 10 years of this podcast bring: remember, keep analyzing.

1:00:25.5 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at, @analyticshour, on the web, @analyticshour.io, our LinkedIn group and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

1:00:40.7 Charles Barkley: So smart guys wanted to fit in. So they made up a term called analytics. Analytics don’t work.

1:00:49.4 Announcer: Do the analytics. Say go for it, no matter who’s going for it. So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.

1:01:03.7 MH: You should try to troll them on Bluesky, Tim.

1:01:06.5 TW: Well, they’re not on Bluesky. Oh, I went to look at where they were.

1:01:07.9 VK: What the fuck is Bluesky?

1:01:09.2 MK: I’m not on…

[overlapping conversation]

1:01:10.3 TW: It’s the new social network for liberals who don’t like Twitter anymore.

1:01:15.8 VK: I don’t want another network. I don’t want any networks.

1:01:19.6 MH: I know.

1:01:19.6 TW: It is like Twitter was in like 2010. It’s delightful. I love it.

1:01:29.5 MH: Like we wanna kind of wing that or like, do we want to just sort of like pick what order we want to go in ahead of time? Sometimes people…

1:01:36.3 MK: Michael, I know which way we’re gonna go. We’re gonna wing it. That wasn’t really a question.

1:01:39.9 MH: I feel like we’ve been doing this for 10 years. One’s always been winging it. One’s always over planning.

1:01:46.5 MK: It is being won already. We are wanging it.

1:01:49.0 TW: And that’s Numberwang.

1:01:53.9 MH: Okay. Suffice it to say it will be wang.

1:01:55.5 MK: It will be wang.

1:01:56.5 VK: Well, let’s put it this way: if we go in order and we don’t get to all of them, I might not have anything to contribute except for, “Are you serious?”

1:02:07.0 TW: Wow.

1:02:09.0 VK: When other people say theirs. So.

1:02:09.2 MH: Well, I’m hoping, Valerie, you can get a few. Do you mean to tell me, Tim, you spent this long and then…

1:02:18.5 MK: People were still confused.

1:02:19.0 TW: That’s right.

1:02:25.3 VK: Well. Fuck guys. I tried. Was it really bad?

1:02:31.3 TW: No.

1:02:32.0 MK: No.

1:02:32.6 TW: It was fine.

1:02:33.4 VK: I thought…

1:02:33.5 TW: That was great.

1:02:33.6 MK: Tim was gonna jump in so I was being polite and waiting. Do you want to go, Tim?

1:02:36.3 TW: No. Like literally I will always die. Like I always have something to say, but I can’t be like, I’m not gonna do it this time.

1:02:46.7 MK: All right, I’ll do it. I’ll do it. I’ll do it. I’m up. Rough start.

1:02:54.1 VK: Poor Josh. He’s gonna be like, this is the worst year edit of my life.

1:02:57.7 JC: That’s fine. Every show that I’m on is the worst show to edit. ‘Cause I have to listen to myself.

1:03:05.3 TW: That’s an outtake right there.

1:03:07.3 VK: That should be an outtake. If it’s not in there.

1:03:10.7 MK: We’ll know who. Yeah.

1:03:12.4 VK: All right.

1:03:13.7 MH: Rock flag and it’s the Data Strategy Power Hour rebrand!

1:03:23.8 MK: Woof. Good one.
