#284: I Used to Think...But Not Any More

As the world turns, a couple of things happen: 1) we grow and learn, and 2) the world changes. On this episode, inspired by a job interview question, the hosts walked through a range of thoughts and beliefs they had at one time that they no longer have today. Analytics intake forms are good…or bad? Analytics centers of excellence are the sign of a mature organization…or they’re just one of many potential options? Privacy concerns are something no one really cares about…or they are something everyone cares deeply about? Voices were raised. Light profanity was employed. Laughter ensued.

This episode’s Measurement Bite from show sponsor Recast is a brief explanation of statistical significance (and why shorthanding it is problematic…and why confidence intervals are often more practically useful in business than p-values) from Michael Kaminsky.
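Kaminsky’s distinction can be sketched numerically. The conversion numbers below are made up purely for illustration; the point is that a confidence interval reports a plausible range for the lift (which a business can act on), while a p-value only says whether the difference is distinguishable from zero:

```python
import math

# Hypothetical A/B test results (made-up numbers for illustration):
conv_a, n_a = 200, 10_000   # control: 2.0% conversion
conv_b, n_b = 230, 10_000   # variant: 2.3% conversion

p_a, p_b = conv_a / n_a, conv_b / n_b
diff = p_b - p_a

# Standard error of the difference in proportions (normal approximation).
se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)

# A 95% confidence interval answers "how big might the lift plausibly be?"
lo, hi = diff - 1.96 * se, diff + 1.96 * se

# A p-value only answers "is the difference distinguishable from zero?"
z = diff / se
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

print(f"lift: {diff:.4f}, 95% CI: [{lo:.4f}, {hi:.4f}], p = {p_value:.3f}")
```

Here the interval spans zero, so the result is "not significant," but the interval also shows the lift could plausibly be anywhere from slightly negative to a meaningful gain, which is a far more useful statement for a decision-maker than a bare p-value.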

Links to Resources Mentioned in the Show

Photo by Isaac Smith on Unsplash

Episode Transcript

00:00:05.73 [Announcer]: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

00:00:14.91 [Michael Helbling]: Hey everybody, welcome. It’s the Analytics Power Hour. This is episode 284. In the data and analytics world, we love the old George Box quote, all models are wrong, but some are useful. And to extend the thought a little bit, sometimes that usefulness degrades over time, making that way of understanding the world something we have to leave behind. Whatever the situation, growth comes from updating your priors, as we say in the biz. So that’s what we’re going to talk about in this episode, a retrospective on some of our retired beliefs. So let me introduce my co-hosts, Val Kroll. Welcome.

00:00:52.75 [Val Kroll]: Hi, Michael. Good to see you.

00:00:54.82 [Michael Helbling]: It’s good to see you too, Tim Wilson. I don’t know how you’re going to do this episode, Tim. I don’t think you have a single belief you’ve had to change because you’re always right. So that hurts.

00:01:06.28 [Val Kroll]: Not even two minutes in. Here they come.

00:01:11.32 [Michael Helbling]: OK, well, I’m sorry. We should let ourselves get started first. And Moe Kiss, how you going?

00:01:16.91 [Moe Kiss]: I’m going wonderfully. Thank you.

00:01:19.07 [Michael Helbling]: Awesome. And I’m Michael Helbling. As we get started in this episode, Val, the inspiration for this comes from an interview question you got one time. And so I wanted to start with you and say, what made that interview question stand out and what kind of made it pop out to you?

00:01:36.15 [Val Kroll]: Yeah, it was an interview question that I received, and it also ended up turning into an interview question that I leveraged when I was at Search Discovery, now Further. And the question was asking, what is a deeply held belief that you have had that you’ve changed your mind on over the past year? And they did have a time-bound element to it in the interview question. Not that we’ll necessarily be using that today. But I thought that was a very thoughtful question. I think that there’s a lot of things that you can intuit by someone’s response to that, just even kind of understanding what their mindset is, or what could cause them to change their mind, and how are they investing in kind of like that continuous learning, who they’re kind of learning from to potentially cause that change of mind, or what experiences are they gathering. But also, you know, if you have someone that says, like, well, I’ve never changed my mind about anything, that could be a good checking the box on does not fit culture here at whatever company.

00:02:39.00 [Tim Wilson]: Do you remember your answer to that question?

00:02:41.20 [Val Kroll]: Oh, no, I completely blacked out, but that’s pretty common for me during an interview. I definitely have things that I’ll be able to share for the purposes of today’s episode, but I have definitely no idea what I said. Were you asked that question, Michael or Tim, when you joined, since three of the four of us went through that process there?

00:03:01.99 [Tim Wilson]: Well, Michael was, I mean, did they even have it?

00:03:04.99 [Val Kroll]: It’s one of the OGs, yeah.

00:03:08.52 [Tim Wilson]: I didn’t have… I was well before, or I did not go through, any sort of standardized interview process. I was soaking wet, sitting in a hotel lobby in New York.

00:03:21.47 [Moe Kiss]: I’m glad you said hotel lobby. I was a bit worried where that was going.

00:03:29.60 [Michael Helbling]: Yeah, I mean, I don’t think, Tim, your interview process was as much of an interview as it was sort of a recruitment. And I think the interview question that stood out to me when I joined Search Discovery was I was asked by the CEO the difference between an s.prop and an eVar. So that was one of the questions. It was a different time. And you said WebTrends. WebTrends doesn’t use s.props and eVars. Well, if you want to set the WT. I knew you were going to say that. We go way back. What are the best campaign parameters? That’s one where we all use UTMs now, but back in the day, we did think there were better ones, you know? Like Coremetrics, the cm_mmc? There you go. So, okay, let’s not go into that, but that’s an elit… Are these things you’ve changed your minds on?
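For listeners who didn’t live through the WT.- and cm_mmc-prefix era: the vendor-specific parameters the hosts are joking about did the same job that the now-standard UTM parameters do, namely labeling inbound links so the analytics tool can group traffic by campaign. A rough sketch of tagging a landing-page link, with a hypothetical helper and hypothetical values:

```python
from urllib.parse import urlencode, urlparse

def tag_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append the standard UTM campaign parameters to a landing-page URL.

    Hypothetical helper for illustration; real tagging workflows usually
    enforce a taxonomy (allowed values, casing rules) on top of this.
    """
    params = {
        "utm_source": source,      # where the traffic came from
        "utm_medium": medium,      # the channel type (email, cpc, social...)
        "utm_campaign": campaign,  # the campaign name
    }
    sep = "&" if urlparse(base_url).query else "?"
    return base_url + sep + urlencode(params)

url = tag_url("https://example.com/landing", "newsletter", "email", "spring_launch")
```

The governance headaches discussed later in the episode come from the free-text nature of these values: nothing in the mechanism stops two agencies from tagging the same campaign two different ways.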

00:04:28.17 [Val Kroll]: Exactly. I used to believe that shit mattered and it was gonna give me better measurement. How meta?

00:04:33.37 [Michael Helbling]: Yeah, exactly. That’s right. All right. All right.

00:04:39.15 [Moe Kiss]: Well, to be fair, the thing that did trigger in me was one of my assumptions I still wrestle with, which is this idea of the single view of the customer, or a unified view: if you collect all these events, these variables, we will have a complete and perfect picture and be able to understand what our users want. And I especially think that was true in the world of attribution.

00:05:06.75 [Tim Wilson]: I literally sat in a meeting today with people I didn’t really know well, and there wound up being a discussion around if we could just get our UTMs kind of in better shape and standardized. And as Val noted, I’m drinking during this episode. I’m just gonna say, are you kidding me? And it’s a 9.9 ABV beer. It’s a stout. Brace yourselves. I mean, there are plenty of people who still believe that, and I feel like I was certainly there. I just got to get the more perfect data, and that’s going to get me to the answer. It was so easy to obsess about, let’s build processes to make that data capture better.

00:05:53.61 [Val Kroll]: So for you guys, was it like a specific event or a moment in time where it was like the before times and the after times on how you thought about this? Or was it like a slow like eroding of your confidence and that that was like the right path? I’m curious.

00:06:10.13 [Moe Kiss]: For me, it was definitely a series of debates at a MeasureCamp about why attribution sucks. And they would always get very, very heated and dramatic. And, like, I would definitely say for me, it was a slow burn. And I would still say that fundamentally, I think you can learn things from attribution or collecting different attributes or events about users. That’s not to say that I don’t believe in it at all. I just think it was the overconfidence we had in the answers it was giving us.

00:06:41.96 [Michael Helbling]: Yeah, with attribution, I think my sort of skeptical nature always sort of was like, what is really going on here? And then a big moment for me was I was looking at a tool that this company was using, and they were kind of showing me, like, here’s how we do it. And it got to this one point where they’re like, and here’s where we just put in the weights we want for each channel. And I was like, so that’s not attribution. That’s just adding things up to 100 based on whatever you feel like. And that’s in your tool. That was in the tool. And so that was when I was like, okay, I’m not for this anymore. Like, whatever’s going on, I now know I’m against it.
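Michael’s objection can be made concrete: if the “model” is a set of hand-entered weights, the output is fully determined by those weights, not by the data. The channels and numbers below are hypothetical:

```python
# If the "attribution model" is hand-entered weights that sum to 100,
# the result is just arithmetic on assumptions (hypothetical numbers).
weights = {"paid_search": 40, "email": 25, "social": 20, "display": 15}
assert sum(weights.values()) == 100

total_conversions = 1_000

# Each channel's "attributed" conversions are predetermined by the weights,
# regardless of what actually drove the conversions.
attributed = {ch: total_conversions * w / 100 for ch, w in weights.items()}

# Whatever split you chose up front is exactly the "answer" the tool reports.
print(attributed)
```

No amount of additional data changes the channel split here, which is why entering the weights yourself is indistinguishable from simply asserting the answer.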

00:07:25.10 [Tim Wilson]: I had two cases. I had built so many spreadsheets, or repurposed spreadsheets, or looked at other people’s spreadsheets, and then looked at platforms that would manage, this is very digital-analytics-specific, campaign tracking parameters in the digital context. And then working with one client that had built this thing, that they were paying for a tool, they had 50 different values they were keying off of this thing. And the guy who owned that tool was just like, we just got to get all these disparate agencies to always use this thing. And then I was like, and then what? Like, these values nobody cares about, and it’s not going to get you anything. So I was like, this is getting so in love with the data collection and totally losing sight. There were fields that were being mapped that nobody in the business even thought about the business in that way. It had been dreamed up somewhere years before. At the same time, and maybe this blends into a second related topic, I fundamentally did not understand that, again, in a digital analytics marketing world, there was no model that was actually going to demonstrate incrementality. That there was so much around, if you get the right algorithm, the right data-driven thing, the right Markov chains, it’s going to tell you what value each channel is contributing, and literally that not addressing incrementality. So I think it was actually probably talking with Joe Sutherland that was when that light bulb went on. And I feel like I’ve watched it go on pretty broadly across the industry. And it kind of stops at marketing all too often.

00:09:18.64 [Moe Kiss]: It’s funny, Tim, because I said, like, for me, it was a slower burn, and Sam Redfern, like, berating me with the word incrementality over many, many months. But I think one of the standout moments for me also was your chip analogy at the grocery store, where someone puts the packet of chips in their trolley. And it was that analogy that really clicked. Like, sorry, why don’t you explain the analogy? Because you have a very great visual that I semi-use.

00:09:49.55 [Tim Wilson]: Well, I feel like that visual came from… I think I saw Rand Fishkin write about it first. It was the pizza shop analogy.

00:09:56.10 [Moe Kiss]: No, the pizza shop analogy is the one I use.

00:10:00.69 [Tim Wilson]: But I feel like that would have been around. And then I remember you struggling with the visual for it. But just that idea of, I don’t know, I think the potato chips came up with, what could I find an image for? And then I retrofitted a story to it. But just that idea: if somebody is buying potato chips, they’re going to go buy potato chips, and they’ve loaded up their basket with potato chips, and they’re rolling towards the checkout aisle, and some advertisement person jumps in front of them and says, you should buy these potato chips, the ones that are already in their cart. Again, in digital analytics, it seems so laughable to anyone who’s at all deep in statistics or causality or causal inference. For all those tools, they’d say, yep, you’re going to get some credit because you were the last touch. Then it spirals off into a, well, yeah, but if you get the right algorithm, last touch shouldn’t get all the… It’s such an easy story to tell. So it had been a slow burn for me for a while. The fact that I was standing on stage just talking about that, to me, indicates I certainly felt like a lot of people were still operating under that confusion. And I mean, I was 15 years into my career where I think I was operating under that lack of understanding. Everything else, though, no changes. I’ve got no changes. That’s all nailed down.
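The chip analogy maps directly onto how last-touch attribution assigns credit. A minimal sketch, with a made-up shopper journey:

```python
# The potato-chip analogy in code: a shopper's touchpoints, in order, where
# the final one happened seconds before an already-certain purchase.
# (Hypothetical journey, for illustration only.)
journey = ["tv_ad", "store_visit", "chips_in_cart", "checkout_aisle_ad"]

def last_touch_credit(touchpoints):
    """Last-touch attribution: 100% of the credit goes to the final touchpoint."""
    return {tp: (1.0 if i == len(touchpoints) - 1 else 0.0)
            for i, tp in enumerate(touchpoints)}

credit = last_touch_credit(journey)
# The checkout-aisle ad gets all the credit even though the purchase was
# already going to happen: full attribution, zero incrementality.
print(credit)
```

Swapping in a fancier algorithm redistributes the credit across the touchpoints, but as Tim notes, none of these rules measures whether any touchpoint actually changed the outcome; only an experiment or other causal design can do that.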

00:11:28.57 [Val Kroll]: I can just sit back and have a drink for the rest of the episode.

00:11:32.83 [Moe Kiss]: What about you, Val? What’s one deeply held belief you had that you’ve changed your perspective on?

00:11:39.22 [Val Kroll]: As I was thinking about this, I think one of the ones that is bigger, because it has a lot of parts to it, it’s kind of systemic in some ways, and I would say a change maybe in the past three years, is that I used to think that there was this, like, optimal model of operating inside of an organization, and that a center of excellence meant that you were the most mature, and that every organization should try to strive to build out a center of excellence as the ideal model for delivery of analytics and experimentation inside of the organization. And I think part of that was just because of the organizations that I had worked inside of, that when there was, like, a significant investment made, there was enough to create a guild-like team of people to kind of support the whole organization. So it was based on my own individual experiences. And also, when I started consulting and I saw kind of pockets of analysts across the organization, it felt very scattered, or they weren’t using the same definitions, and people had very different experiences engaging with those teams, and it felt like, well, that can’t be the most mature. So we should all just try to strive for this org model. But I think the more time I spent in consulting, I started to realize that there were a lot of very mature organizations that worked off of a more decentralized model, or the hub and spoke, or fractional roles, things like that, and that the way that the org is designed isn’t necessarily the signal of maturity, and that those two things should be decoupled. But there were a lot of assessments, I don’t know if you guys remember at the time, like, assess your maturity in whatever practice it was. It did always feel like it gave a lot of credit to orgs that had dedicated governance teams within the COE or people that supported the different pillars of the business.
I don’t know if that’s one you guys have come across too, but that was definitely one for me that was pretty recent.

00:13:50.39 [Tim Wilson]: Do you feel like maturity can be measured or just not measured easily?

00:13:53.62 [Val Kroll]: Yes, I think maturity can be measured, but not easily. I thought that a lot of those assessments that you would take, and a lot of them were lead generation tactics, that the gap analysis that would be provided back would be the list of things that you could go do, and that assumed you were striving towards a center of excellence model. I thought that there were a lot of things, you know, not just me, that were kind of saying, like, hey, this is the right way to do things. But I don’t think that those two things need to be coupled.

00:14:28.10 [Moe Kiss]: Yeah, it’s funny. The thing that has changed for me in this space is that I think previously I more had a deeply held belief that one way or the other was better. And this is probably, like, is it, I don’t know, experience or age, whatever thing you want to call it. But I think over time, you just start to be like, it’s a series of trade-offs, and you either trade this for that, or there are pros to this and cons to that. And it just depends on the specific business. And it does make for kind of an apathetic answer, though, because you feel like you’re always being like, well, it depends. But I feel like I used to have a strong opinion about the structure, especially centralized or decentralized. And I no longer do. I’m just like, we’ll figure it out.

00:15:12.86 [Tim Wilson]: Yeah. I wonder, is there an analogy to chasing the tool? There’s a tendency, I mean, organizations even broader than the center of excellence, we’ve all worked in organizations where things aren’t going well. So it’s a reorg, and you really shuffle things in the reorg, and if you blow everything up, you kind of just hope that stuff lands on the deck in a better, more organized way. And really, it just gives you six months of everything being blown up. And then you’re kind of back. You’re always going to have trade-offs. And, like, making that decision of saying, do we need to fundamentally shift how we do our analytics support? Or do we need to work within what we have and figure out how we should adjust it? That seems like the never-ending question. Michael, you were going to say something, though.

00:16:04.66 [Michael Helbling]: Yeah. On the maturity topic, Val, like you mentioned that, like I kind of came to a conclusion at one point, it was like, why do these never seem like they work right? And it’s like, oh, because these maturity models always assume sort of like this linear function and no organization works that way. So it’s like, oh, first you’ll go from descriptive, then to prescriptive, you know. It’s like, well, no, you don’t. In some areas, you’re already way out in front. In other areas, you’re barely getting started and different functions in the business do different things. It never rang true whenever I would sit down and try to apply a model like that to any real-world examples. Eventually, it started being like, oh, pull back a little bit and realize it’s fun for a PowerPoint slide, but it actually doesn’t work in the real world this way. The real world is many points on a map or, you know, some type of quadrant chart or something, I don’t know.

00:17:03.07 [Tim Wilson]: Well, what do you think of the, and there’s the maturity model, I, boy, I could not plus one that more, the like, you’re moving up some curve to prescriptive. Yes. But what about the maturity models that are giving you like six different lenses through which to evaluate, you know, governance and tooling and

00:17:22.85 [Michael Helbling]: Well, since I was pushing one of those at one point in time at Search Discovery, I love those.

00:17:30.98 [Val Kroll]: I think everyone’s pushed those.

00:17:36.05 [Michael Helbling]: Yeah, but no, I definitely was. So I’m like, yeah, exactly.

00:17:40.63 [Val Kroll]: You’re on one end of pushing.

00:17:42.46 [Michael Helbling]: Somebody tried to sell it to you. I mean, in a certain sense, it does like Tim, to your point, it does try to sort of like evaluate it on more axes to try to get a clearer picture. But even that is sort of a little difficult because it’s really like I don’t know the right way to say it, but I’ve been running into this thing in the AI world lately where people are like, yes, if you just get all your data out there and you have the right quality of data, you could just apply AI to it and go to the moon. And it’s sort of like, at what point do you know you have the right quality of data to be able to do that? Business people don’t know that. So you’re telling people they should be shooting for this crazy AI goal of using their data without having any kind of understanding about how they’ve ever achieved or arrived at that number. And so the same thing with maturity models a little bit, like a lot of times you’re kind of throwing something out there that sort of doesn’t necessarily have like measurables against it that are really gonna work for them. So I think you can apply it, but you have to like go through the weeds more. So like no one model is gonna work exactly the same for everybody, I guess.

00:18:49.32 [Val Kroll]: And I think, like, not every industry is going to be, like, analytics nirvana, like, scoring a 10 out of 10. Does it look the same for every industry or every company? And I think, like, that’s back to Moe’s point that it depends. But I even think that, like, the questions that 10-years-ago Val wrote for a maturity survey would look very different than they would today, right? Like, today it would be things about, like, how are you ensuring that analytics moves at the speed of the business? Whereas 10 years ago, it might have been, like, what’s the governance for those UTMs? How are you making sure that every marketer… What taxonomy are you using?

00:19:26.92 [Tim Wilson]: Now you would be asking, are you using just multiple shades of pink in all of your data visualizations? This is true. That hasn’t changed. That’s still… That’s a constant. Well, some models you don’t need to update.

00:19:41.39 [Michael Helbling]: All right. What about you?

00:19:43.10 [Moe Kiss]: I was going to say, though, just on Val’s point, that that is one area where I feel like I definitely evolved my thinking, which is, I’m not going to say that the answer or the measurement approach needed to be perfect. But I think over time, I definitely index a lot more to the right answer or the right measurement approach for the business question. And that has to be at pace with the business. And I think previously, especially when I was the one doing more of the work, I’d be like, but it’s not ready yet. It’s not ready yet. I still need to look at this. I still need to do that. And then I feel like now I see the other end of the spectrum, which is you work at it so long that the business has made a decision and moved past you. And so I wouldn’t say I’ve completely changed my thoughts. And I wouldn’t say that it’s binary. But on that spectrum, I would say I weight a lot more now towards, let’s use the measurement approach that’s relative to the size of the business decision that we’re making.

00:20:42.80 [Michael Helbling]: All right. Yes. So what’s fun is, Tim and I in the background, just for everybody listening, have been working on creating sort of a custom GPT from all of our episodes in the past. And so I used that partially to come up with some of the ideas. And I found out in episode 10, which goes way back to 2015, I said, nobody actually cares about privacy. At that point in time, that was kind of true, but I think we’ve all been made to care. Certainly, the tide has turned on that topic quite a bit. Even my own thinking on it has changed a lot over the years, because as I’ve seen more and more, at first, I was kind of like, all right, we don’t need a bunch of regulations. We just all know what we need to do. But then you just see bad actor after bad actor after bad actor. And you’re like, OK, well, never mind. We’re not all nice people out here. So we’ve got to have some rules for the road. But I did think it was pretty funny that back in 2015, I was like, yeah, people talk about privacy, but nobody actually cares.

00:21:45.91 [Moe Kiss]: And I was like, yeah. I think I said something similar to be fair, though. And again, it’s not a binary thing. It’s not either that people care or they don’t. I would definitely say there’s more attention and nuance to it now, and folks are partially better informed.

00:22:04.12 [Michael Helbling]: Yeah, well, there’s still a long way to go. But I think because there’s regulations, now people have to pay attention to it and now are starting to ask the question of like, okay, yeah, are we doing things with privacy in mind and consent in mind?

00:22:19.64 [Tim Wilson]: We’re uniquely the non-European set of… I remember a very, very impactful presentation that I saw, a European saying, yeah, you Americans, like, you don’t fucking get it, because you have not been so burned. Like, it’s deeply embedded in our DNA. And this is why the GDPR and, what’s the ePrivacy Act? I should know that. I feel like Europe was out ahead on that front. But related to that, I think I was on board, and it may have been what we were even talking about in episode 10, with the idea that there was a direct relationship between protecting privacy and the value you could get from the data. The more you respect privacy, the less value you’re going to get from your data, because on one end you have total anonymity and no data even collected, therefore no value. And on the other end, you’re following somebody around and tracking them online and offline, no privacy, maximum value. And I have radically changed my belief. I think we fetishized the user-level tracking to the point that we get caught up in things like multi-touch attribution. That’s all this obsession with tracking a single user. And then you’re like, but wait a minute. What if we just have aggregate data? What if we do a mix model? What if we run a controlled experiment where we don’t need any of that? I think there’s still a lot of, it’s an easy sell for vendors to say, we have a solution to basically violate people’s privacy without violating a regulation or a law, and ergo, you will have more valuable data. And that last part is, I mean, it’s all bullshit. It’s bullshit all the way along. So I’ve radically changed. I’m much more on the, let’s do privacy by design. There’s lots of value you can get without violating privacy or even getting close to it. Yeah.

00:24:45.78 [Michael Helbling]: And for reference, episode 10 was when we were reviewing the CMO survey results.

00:24:53.04 [Tim Wilson]: What’s her name, Mary? No. What’s the other survey? That was a CMO survey. I’m thinking of your state of the annual, there’s some other big long report you read every year. I read a few. Yeah. I used to believe that I could remember the things that Michael talks about all the time. And now I clearly have aged out of that. There’s going to be a GPT for that though. Yeah, don’t worry. No problem. Also to clarify, when Michael said that he and I had been working on this, it is like 90% Michael and I just happened to have stumbled across having done something useful to plug into that.

00:25:31.58 [Michael Helbling]: Well, I think, but it is really cool. All the source material came from you, because you rebuilt all of our transcripts. So I have to give you quite a bit of credit. Yeah. All right. Who’s got another one?

00:25:45.37 [Val Kroll]: Well, I actually had, like, a slightly small, just, like, story slash confession that feels like the right time to get this recorded and put it out to the public. I had a neighbor that was a problem. And so I researched this person and found out that they had a leadership role in marketing at their company that sold full column widgets. And so I may or may not have created a link with some campaign parameters, after I found out that they were using the Google stack, that I asked lots of friends and family to hit, and it included, we’ll call his name Chad, that Chad drinks warm milk. Just because I loved envisioning someone on his team being like, what is this campaign that people are coming from? And why are they talking about our boss and how he likes to drink warm milk? But yeah, I don’t know if anything ever happened from that. But just because we’re talking about privacy and parameters.

00:26:47.56 [Tim Wilson]: That’s OK. Most data doesn’t ever get looked at. Yeah.

00:26:51.12 [Moe Kiss]: I think we’ve all done UTM pranks before.

00:26:53.84 [Michael Helbling]: Oh, that’s so funny. I love it.

00:26:58.26 [Val Kroll]: It’s good. Good times.

00:26:59.47 [Michael Helbling]: That’s a time-honored tradition of the analytics community.

00:27:01.99 [Val Kroll]: I’ll never change my opinion on that.

00:27:04.03 [Michael Helbling]: That’s right.

00:27:04.69 [Tim Wilson]: We use a URL shortener every time we post in the Measure Slack, largely to mitigate shenanigans. There’s a step in our production process that is largely around not making our campaign tracking parameters visible. Yeah.

00:27:20.35 [Michael Helbling]: And frankly, there’s quite a few people who might hear this and take that as a challenge, and we just urge you not to. Please. Oh, we don’t like that. We’re not looking at your analytics. Yeah. Don’t worry.

00:27:32.72 [Val Kroll]: It’s going to take a lot for that anomaly to get detected.

00:27:35.07 [Michael Helbling]: That’s true. We don’t care. We’re not running any campaigns.

00:27:38.63 [Tim Wilson]: We had a whole episode, the last episode, that was around small data. And after we finished recording, Julie said, you know, we have a number of small data sets related to the podcast. And it just occurred to me at the very end of the episode. So, fair point.

00:27:57.91 [Moe Kiss]: I don’t know if mine is a confession. It feels a bit like one. Because as I think about it, I sometimes feel like I’m betraying data people when I say it. And I said it to a group of data experts at my work the other day. And one person was like, yeah, that’s fair. And I’m still… anyway. So obviously, I’m going to say it on a podcast where everyone can judge me publicly, and please rate and review us. So I actually give a lot of credit to Kathleen Malley for this, because, you know how sometimes someone gives you an analogy, and when you get the analogy, you’re like, yes? So we were talking about data as part of a meal, and the fact that in some businesses, data is the garnish that goes on at the end, or it’s the salt and the pepper, just to give it the extra nice taste. And in lots of companies, the full other end of the spectrum is they assume data is the meal, that if you make a decision without data, then it’s a stupid decision. And I feel like I definitely fell prey to this idea that data was, like, right, or that if you made good decisions, it was because you were making a, you know, data-informed decision. And what I would say now that I strive for a lot more is, like, data is part of the meal. So it might be, like, the roast potatoes, and then there’s some roast meat and some vegetables that go with it. But it’s one of the things that you use as a business leader to make a decision. And for some people, that might take up a bit more of the plate than others. But I feel like it’s really something that I’ve really changed my opinion on. Because the context of this discussion is some folks were talking about, well, this experiment result said that this thing was bad, and therefore we shouldn’t do it. And I’m like, but that’s one part of the meal. And there are all of these other things. And sometimes it’s history. Sometimes it’s strategic intuition.
Sometimes it’s like, there’s another form of research or there are so many other facets that come into making a decision. And I’m going to get off my fucking soapbox now. Step down.

00:30:09.52 [Val Kroll]: No, it’s good. You could take it further and be like, you know, grandma’s recipe that was passed down through generations as, like, the intuition component of it. But I like that. Yes. For those mashed potatoes or whatever it was, yeah.

00:30:24.65 [Tim Wilson]: You’ve hit on the… I heard this anecdote when I was talking to a past colleague, who said, oh, the CMO wants the AI just to, you know, with their speaker in their shower in the morning, you know, just go through the data and tell them what they should do, right? That’s, again, this idea that if you just have enough data and it’s clean enough and you run it through the right tool or model, like, that is the start and the end of everything. And it never has been, and it never will be. And I feel like that’s, and it’s not even a progression of, like, yes, but if data was only 20% of how we made the decision now, then tomorrow it should be 30%, and the next quarter it should be 40%, and eventually the data will just do everything. It feels like a track people are on that is totally misguided. I think that’s the same thing you’re saying.

00:31:28.71 [Moe Kiss]: So wait, do people agree with me?

00:31:32.61 [Tim Wilson]: 100%.

00:31:34.37 [Moe Kiss]: Huh. I don’t know why I expected that to be controversial.

00:31:38.67 [Val Kroll]: Not with this crew. Or unless Michael’s like biting his tongue over there.

00:31:42.81 [Tim Wilson]: No, no, no. He’s like, let me ask, let me ask our custom GPT what it has to say. Yeah. You know, on episode 14. That’s right.

00:31:51.40 [Michael Helbling]: Moe, you said, yeah, let’s not go digging into the, into our previous comments.

00:31:58.00 [Moe Kiss]: Please do not go digging into shit that I used to say because I said lots of things that were wrong.

00:32:03.09 [Val Kroll]: Well, that’s, that’s the whole perspective of this episode is we evolved.

00:32:06.09 [Tim Wilson]: There you go. I mean, getting into the confessional turn, this is a trivial one, but as somebody who deeply, deeply cares about effective data visualization, I did have a phase, when I first thought I deeply cared about it, where I put a gradient background on every chart that I did. Because I thought that added class and panache and credibility, because I knew how to put a gradient background on a chart in Excel, and that would make my charts stand out. I'm happy to say it was a brief phase. It was a long time ago.

00:32:46.40 [Moe Kiss]: You used to have presentations with lots of moving things, too. Do you remember that?

00:32:50.57 [Tim Wilson]: I remember that.

00:32:51.21 [Val Kroll]: It was the Prezi phase. That was the Prezi phase.

00:32:53.67 [Tim Wilson]: We all had a Prezi phase. John Lovett was right up there on a pedestal. And if John Lovett was doing it, then I was going to do it, too. And if people got motion sick in the room, then so be it.

00:33:05.32 [Val Kroll]: Oh my gosh.

00:33:07.02 [Tim Wilson]: Dramamine? What's the motion sickness medicine? Yeah, there we go.

00:33:14.07 [Michael Helbling]: Yeah, I used to think that you could just sit down with your data and come up with insights, that valuable insights were in the data: just spend time with your data and you'll come up with cool stuff. It's interesting, because I used to be in a role as an analyst where my job was coming up with insights, and I thought that's what we were doing. But really, what we were doing was a little more nuanced. As time has gone on, I've learned that, no, you can't just sit in front of a data set as though the data itself is valuable. First there has to be a perspective and a question, and then there is potentially some information or knowledge the data can bring to you. It was a process of learning all the steps that were happening, not just looking at the data; there were other steps that were part of that process. But at first, I think I did actually believe that I could just sit there. And I do remember sitting at my desk being like, all right, just come up with something cool. Just find something in the data.

00:34:18.78 [Moe Kiss]: Oh, I’ve done that.

00:34:20.02 [Michael Helbling]: Yeah, it’s brutal, honestly. But like, that’s what I thought.

00:34:24.07 [Moe Kiss]: I also remember Tim and I having a huge argument once about monthly reports. I was for them, and he was against them for that reason.

00:34:35.03 [Tim Wilson]: I know. I was well past that, but I think I was at that point at one point, because that was kind of the draw of doing stuff with data: oh, once I learn my way around our tools and our data, I can poke around. But I remember a client telling me something that made my head want to explode. He managed a team where everyone, across different brands, had a bi-weekly meeting with stakeholders, and he was like, look, we have a bi-weekly meeting. We've got to shuffle through the data. We've got to find something new to show them every time. And totally predictably, if you asked who they were presenting to, they were like, every other week we have to sit in a meeting and watch charts be thrown at us, with the analyst looking at us like, is there something useful here? Is there something useful here? Oh look, this number went up because we did X, which is not a surprise to anyone, because we didn't do X two weeks ago and we did it this time. I was so done with that guy at that point. And then he continued to be a client for like another three years. It was rough.

00:35:56.51 [Michael Helbling]: What amazing moments in time, though, when that haphazard analysis somehow interconnected with the interest of the moment, and everyone was so focused on what you were bringing to the table.

00:36:08.48 [Tim Wilson]: Well, that's what gets shouted from the rooftops. Exactly. And then boom, that's expected to happen every week. Yeah.

00:36:17.27 [Moe Kiss]: I still believe in monthly reports.

00:36:22.36 [Tim Wilson]: I can mount my little soapbox very quickly on that, but I'm not opposed to monthly reports. Moe, if you're remembering me saying monthly reports were…

00:36:32.53 [Moe Kiss]: No, I think what we talked about was this idea that every month the data team were going to come and bring insights. I feel like probably where we landed, maturely, obviously, was an agreement that that’s not feasible or possible, but there is still value in meeting regularly with your stakeholders to talk about how things are performing and that sort of stuff and discuss priorities for the next chunk of time. Speaking of priorities, there is one that Val put on here, which I also used to have as a deeply held belief.

00:37:09.27 [Tim Wilson]: Do we have to go to another one that involved Tim having a heated discussion off mic? About ticketing systems.

00:37:17.23 [Val Kroll]: Do you want to say more, Val? Yeah, I believed that having a "ticketing system," I'm using air quotes here, something like a JIRA-like interface for people to enter the question of the moment, was a great front end to the analytics and experimentation team. With the huge caveat, as I'm bracing myself here, that I never thought there would be no conversation before work began once a ticket was entered. I've definitely evolved on this. When this was, I was kind of growing up in digital analytics at the American Medical Association, and we had 11 different lines of business being supported by a small but mighty team of three people. At first we were doing a lot to drum up interest, for people to understand what this capability was and the ways that we could integrate. And then all of a sudden the interest was there, and we were like, oh, holy shit, how do we handle all this demand? And so we're like, I've got a great idea, which I think is something that a lot of people turn to, right? But again, not as a replacement for that discussion. However, Tim and I did go a couple of rounds, as previously mentioned on at least two other episodes, that some people got to witness, because I think Tim had a different deeply held belief.

00:38:42.26 [Tim Wilson]: It was one of those moments like when you're sitting in a restaurant and there's a couple having a heated argument, like the marriage is falling apart, and they are intensely disagreeing and have no awareness that they're out at a nice place. I couldn't even tell you, I could guess who a couple of the other people witnessing it were, but I think everybody got quiet. It was just like everyone's forks were just like…

00:39:09.85 [Moe Kiss]: This was actually at a restaurant, or metaphorically at a restaurant?

00:39:15.50 [Tim Wilson]: Very physically. It was after we'd been with a client all day, and the client had a ticketing system, and I was like, well, we've got to head that misconception off. And Val was like, I think it could work, we just have to make some edits, we should update the intake form. And I'm like, whoa.

00:39:35.00 [Moe Kiss]: So, Tim, just to shorten it for a person on my team who deeply believes in the intake system and is an avid listener, talk us through your reasoning. Why is your deeply held belief that you shouldn’t have a ticketing system?

00:39:48.90 [Tim Wilson]: Well, not to pull back the curtain and say that we do have a document we're kind of referring to, but I want to throw it to Michael, because there was, to me, a hugely related thing in what he used to believe, and we'll see if I can get him to tee that up. That's kind of my main case. As it relates to ticketing systems? I believe it does. I think you could tell by the fact that I've made it 30-point font. Yeah, no, I see the font in the notes. Yes.

00:40:18.20 [Michael Helbling]: The thing I believed early on in my career, and I think there were a couple of unfortunate events that sent me down that path for a while, was that business users knew what they needed from the analytics tool, or what metrics and reporting they needed. So I would just try to answer their question exactly how they asked it. It's like, oh yeah, here is time on site for all these pages, and now you'll know exactly what to do with this, or whatever the terrible request was. It's funny, because really early in my career, what I did day in, day out was a lot of WebTrends implementations. I had a client that I did the implementation for, and when you do the implementation, you go build out a set of initial reports. They had a list of reports that they wanted. I made some extra ones that I thought were nice. And they called up the company and were like, we don't like these reports and we don't know what they are. So I had to go back out to that client and take those away. That was a learning moment. But that was one of those things where it's like, okay, just give people what they ask for, don't ask too many questions. As time went on, I learned more. There's some discovery that you need to do with people when they ask for digital data, especially because there's a pretty big divide, a disconnect, between what people are usually asking for and what they actually need. That was a learning process for me.

00:41:48.31 [Moe Kiss]: I'm a bit scared to poke the bear, but I'm going to do it anyway. That could happen in an email, a Slack message, or a ticketing system. The business user not knowing or not asking for the right thing, or not knowing the question, is irrelevant to how they ask. That's just the process for doing the asking.

00:42:12.87 [Tim Wilson]: When somebody sends an email… When you put a ticketing system up, I know how it happens. We have so many requests coming in, and they're incomplete if they're in an email or in a Slack, so there's a problem. So if we just give people the structure of what we really need and we put it in a ticketing system, then it'll come in cleanly. But the math never works. You've got three analysts, or five analysts, or ten analysts. If you have ten analysts, they're supporting 200 people. Each of those intake requests is going to take two to five minutes to put in and one to 20 hours to resolve. And the business user is doing the best they can. They're trying to be as prescriptive and helpful as possible, and they even believe it, because they filled out your damn form. Like, you had seven things you wanted to know: what strategic goal does it tie to, what area of the site, is this an analysis or a report, is it a dashboard update? They are doing their best, but you've put a structure in front of them that implies that if they fill this in, you will be good. And I don't know why everybody's laughing so hard.
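Tim's "the math never works" point can be sanity-checked with a back-of-envelope sketch. The analyst count, requester count, and 1-to-20-hour resolution range come from his example in the episode; the one-request-per-person-per-week rate and the 40-hour week are assumptions added for illustration:

```python
# Back-of-envelope: intake demand vs. analyst capacity.
# Episode numbers: 10 analysts, 200 requesters, 1-20 hours to resolve a request.
# Assumed for illustration: one request per stakeholder per week, 40-hour weeks.
ANALYSTS = 10
REQUESTERS = 200
AVG_RESOLVE_HOURS = (1 + 20) / 2        # midpoint of the 1-20 hour range
REQUESTS_PER_PERSON_PER_WEEK = 1        # assumption, not from the episode

demand_hours = REQUESTERS * REQUESTS_PER_PERSON_PER_WEEK * AVG_RESOLVE_HOURS
capacity_hours = ANALYSTS * 40          # one standard work week per analyst

print(f"demand:   {demand_hours:.0f} analyst-hours/week")
print(f"capacity: {capacity_hours:.0f} analyst-hours/week")
print(f"backlog grows by {demand_hours - capacity_hours:.0f} hours/week")
```

Even at one request per stakeholder per week, the midpoint resolution time implies roughly 2,100 analyst-hours of weekly demand against 400 hours of capacity, so a cleaner intake form changes nothing about the queue itself.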

00:43:29.51 [Val Kroll]: We're not laughing at you, Tim, at all.

00:43:37.22 [Tim Wilson]: I don't want to break the fourth wall, but I don't know whether Moe or Val is going to pee themselves first. They're laughing so hard.

00:43:45.44 [Val Kroll]: No, it's going to be her. No, I'm pretty close. I've never seen someone take longer.

00:44:07.87 [Michael Helbling]: Listen, we're not under any kind of time pressure to do it.

00:44:09.59 [Moe Kiss]: All we can hear is tick, tick, tick, tick, tick, tick, tick…

00:44:25.67 [Val Kroll]: I tried so hard to ignore it.

00:44:29.37 [Val Kroll]: I almost smiled.

00:44:30.13 [Val Kroll]: I’m like, is she watching the same thing I am?

00:44:33.46 [Val Kroll]: I was watching, guys. I’m enough of a tear.

00:44:38.10 [Moe Kiss]: OK. I really needed that laugh. That was the best moment of my day.

00:44:42.03 [Val Kroll]: I wish we could have played that back.

00:44:47.32 [Tim Wilson]: Yeah, much better.

00:44:48.38 [Val Kroll]: But that was a well-put summary, Tim.

00:44:51.05 [Tim Wilson]: Yeah, I mean, when you put the gate up to the point of saying, after you've put it in, then we'll reach out to you, you've already put a burden on them to, in their mind, structure things. And they're like, I've filled in your fucking form, why are you now coming back and wanting to engage with me? You're actually making it harder on the analyst to say, no, what do you really want? So to me, it's way, way better to…

00:45:17.49 [Val Kroll]: Counterpoint there, though, is like, oh, I don't want you to feel like you need to schedule time with me or whatever. When you're in your flow, in your moment, I just need three critical pieces of information, or whatever I had at the time. And if you can complete these things, we won't lose the thread, and you don't have to carry the mental weight of remembering it. So it was about how to do it, and then, as the manager of the team, I could look across all the things that came in and think about a way to prioritize and triage, because we were working so collaboratively to support the business. So just to say, I understand what you're saying there, but we did always go back, and when we went back, that was always a good, valuable use of time.

00:46:02.17 [Tim Wilson]: But that flow coming in will be something you're never going to be on top of. So you're setting up the organization to always be at a point of saying, this is the stuff we're not going to get to, because it's not a priority. And when they say, I just want these three critical things… I skimmed an article that said these are things you should never say in a work context, and this is related to one of them, which is "this will just take a second." Which to me is the same as saying, hey, I just have a quick question, can you just pull these three critical things? Well, one of them is on our dashboard, I can pull it. The second one is super easy to get to. The third one is damn near impossible; I could get close if I spent 12 hours on it. I can't just pull those three things. Or the flip side: the business users ask for what they know is readily accessible, and there's something else that's readily accessible that would be way more valuable.

00:46:56.69 [Moe Kiss]: But that's the job of a good data scientist, to realize that. I don't know. One of the arguments that I often get is that it does introduce some friction, which is a good thing, right? Because stakeholders ask stupid shit. Sorry, I'll be nice. Sometimes they'll ask questions that are interesting but not a good use of time, or that exist somewhere else or something. And by creating this friction of forcing someone to fill out a form, to give thought to what they're asking for, to not just drop a one-liner, which I do to my team all the time, like, yo, can you find me this thing? And they're like, Moe… It stops that behavior, right? Because you have to fill it out and think, what am I asking for? It does require thought. So, yeah, I don't think there's a right or wrong here, but I just keep poking at it. Oh, there is a wrong?

00:47:49.70 [Val Kroll]: Oh, okay, there is a wrong.

00:47:51.06 [Tim Wilson]: So let me paint what I think is realistic and practical, and it goes a little bit to the center of excellence point. It's much, much better for there to be something of a mind meld, or at least a deep enough relationship, so that there are analysts or data scientists, depending on the role, so deeply collaborating with their business partners that instead they're coming up with: what do we really need to know? What are our ideas? It starts at that level. And I get it, the form is such a knee-jerk, easy thing. I've got presentations on this that are available online. Sometimes people just need a number because they're heading into a meeting with the executives. Sure. There are times where I just need this quick number. Put those aside. But I think almost everything else would be much, much better served by a thoughtful coming together of, what are we really trying to do here?

00:48:52.68 [Moe Kiss]: But Tim, they don’t have time for all those conversations.

00:48:56.16 [Tim Wilson]: Bull. They don't have time not to have those conversations. That wasn't intense at all. Okay, sorry. You were like, I'll push this button and watch. Yeah. You guys are mean.

00:49:06.47 [Michael Helbling]: Seems like what I hear you talking about, Tim, is the need for a data sommelier.

00:49:12.44 [Tim Wilson]: I was afraid you were going to say an analytics translator. That was long ago.

00:49:19.59 [Michael Helbling]: Data sommelier. But sometimes you just need a beer. And I think that's the thing we're getting into: sometimes you just need a Bud Light, and other times you need a Châteauneuf-du-Pape or whatever.

00:49:33.42 [Tim Wilson]: It goes to Moe's point: sometimes giving an imperfect answer quickly is better than giving the perfect answer too late. I just totally paraphrased; I think I might have said it in fewer words than you did earlier, which has never happened in my life. But you can only do that if you've got the trust in the relationship and the understanding of the question behind the question to be able to do that. I don't think it's more time consuming. I think it's less time consuming.

00:50:08.21 [Val Kroll]: When we had originally implemented this, and when I worked with clients who were using it, I thought this was removing friction. It's interesting; I understand why you guys are saying that, but again, I wanted to keep it within their flow. But it was so interesting, the number of times we would get a question like, can you look at all the content in slash-whatever section of the site? What content is popping? Or whatever kind of generic question. And so then we would look at it forensically: I wonder if they're thinking about revamping the content within this section of the site, or I wonder if they're thinking about launching a new campaign. We'd have all these hypotheses about why they even asked the question, just so we could go back and ask, what is this really rooted in? To start the conversation there. Which is funny, because if we had just started there, and not with the interface, to Tim's point, we wouldn't have had to play that little game. But yeah, intent is one thing; it doesn't always show up in execution. Some people do forms because they think it's reducing the friction, because the worst thing is when a request just goes away, off into the ether, right? That was what I was always trying to fix.

00:51:25.14 [Moe Kiss]: I do also think, though, for more junior folk, the reality is that when you have a team of 40, you have to have some way to manage this stuff, because otherwise you have some data scientists who are just getting DMs constantly, and they tend to be the folks who give requests the most thought. But it is also partially about teaching younger data scientists which questions we should put energy and time into. It helps them sometimes say, yeah, I hear that you've got this question, but I'm going to focus on this instead. It does help with prioritization.

00:51:59.57 [Tim Wilson]: I absolutely believe there should be a log of the work that's been done, at the proper level of fidelity, so you build that little data set that says, these are all the requests we fulfilled. That is a nice upside of having a ticketing system: you have a data set being built, and your requesters, your business partners, are helping you build it. But I think that data set can be built in a way that doesn't require an intake form to track it.

00:52:27.36 [Michael Helbling]: As a data sommelier, you keep a log of all your tasting notes. Yeah. This table has exquisite robustness. Anyway, okay, we have to start to wrap up, because we've been going at this for a little while now. Yeah, I know.

00:52:50.08 [Moe Kiss]: There are so many that we haven't discussed.

00:52:53.47 [Michael Helbling]: As is always the case, yeah, we should do a part two.

00:52:57.71 [Moe Kiss]: Okay, one last one. One last one. I’m just going to say it out loud and I want like a quick, quick hot take. You should test everything. Thumbs down.

00:53:07.94 [Michael Helbling]: Yeah. That is a good one because it's so easy to agree with, especially because so many of the big internet companies are like, we run 10,000 tests a year, those kinds of things. And so you're like, oh, well, then testing velocity is the thing we should really be going after. We just need more. Do you know what I think "you should test everything" replaced?

00:53:30.10 [Moe Kiss]: You should track everything.

00:53:31.66 [Michael Helbling]: Well, I think people who think you should track everything also think you should test everything.

00:53:37.19 [Val Kroll]: I mean, there was a book. Wasn't that a Chris Goward book, which I know he's since moved on from, but You Should Test That was the title of the book. It was a pretty big concept at the time. I remember that's when I was really getting into testing, and I was like, yeah, that's a really good one.

00:53:52.76 [Tim Wilson]: I am three weeks away from moving on from Analytics the Right Way. I'll move on from my book and say…

00:53:59.52 [Michael Helbling]: Oh, well, let's save that for part two. All right, we do have to start to wrap up, and that makes me delighted, because now it's time for a quick break with our friend Michael Kaminsky from Recast, the media mix modeling and geo-lift platform that helps teams forecast accurately and make better decisions. As you know, Michael's been sharing some bite-sized marketing science lessons over the last couple of months, and will over the next couple of months, to help you measure smarter. So over to you, Michael.

00:54:28.36 [Michael Kaminsky (Recast)]: Let's talk about one of the most misinterpreted terms in analytics: statistical significance. Many people say that the results of an experiment were stat-sig, but that abbreviation is misleading. Instead, you should say the full sentence: we found strong evidence that the difference between treatment and control was different from zero, assuming all of the model's assumptions are correct and allowing for a 5% false positive rate. And even when communicating statistical significance in this way, you should be aware of its limitations. First, it assumes that the underlying model is the true model. In the case of analyzing a simple two-cell experiment, that might be the case, but with more complex models, we're often operating in a world where we know that our model isn't the true model of the world, just an approximation. So p-values tell us very little. Second, it assumes that zero is the most important comparison: is the effect zero or is it not zero? In business, it's very often the case that we care about something other than zero. We might care about whether or not the difference is sufficient to make A more profitable than B, and simply knowing that the difference is greater than zero doesn't actually help us make the right decision. So in general, I'd say focus on reporting confidence intervals, which give a range of likely outcomes, instead of p-values. And if you're going to share p-values, make sure you communicate your assumptions and what you're comparing against.
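Kaminsky's two-cell example can be made concrete with a quick sketch. This is not from the episode: the conversion counts, the +0.3-percentage-point decision threshold, and the helper name `diff_ci_and_p` are all made up for illustration, using a standard two-proportion z-test with a normal approximation:

```python
import math

def diff_ci_and_p(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (difference, 95% CI, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    diff = p_b - p_a
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z_crit = 1.959963984540054  # ~97.5th percentile of the standard normal
    ci = (diff - z_crit * se, diff + z_crit * se)
    z_stat = diff / se
    # Normal CDF via math.erf; p-value for the two-sided test against zero.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z_stat) / math.sqrt(2))))
    return diff, ci, p_value

# Hypothetical experiment: 2.0% vs. 2.5% conversion on 10,000 users per cell.
diff, (ci_lo, ci_hi), p = diff_ci_and_p(200, 10_000, 250, 10_000)
print(f"lift={diff:.4f}, 95% CI=({ci_lo:.4f}, {ci_hi:.4f}), p={p:.4f}")
# The p-value only answers "is the difference zero?"; the CI also shows
# whether the effect could miss a business threshold, e.g. +0.3pp (0.003).
print("could fall below the +0.3pp threshold:", ci_lo < 0.003)
```

Under these made-up numbers the result is "statistically significant" (p is below 0.05 and the interval excludes zero), yet the lower end of the interval sits well below a +0.3-percentage-point threshold, which is exactly the kind of decision-relevant nuance the p-value alone hides.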

00:55:40.91 [Michael Helbling]: Thanks, Michael. And for those who haven't heard, our friends at Recast just launched their new incrementality testing platform, GeoLift by Recast. It's a simple, powerful way for marketing and data teams to measure the true impact of their advertising spend. And even better, you can use it completely free for six months. Just visit www.getrecast.com slash geolift, G-E-O-L-I-F-T. That's getrecast.com slash geolift, and you can start your trial today. So go check that out; that's actually really cool. All right, now we want to jump into last calls, something that might be of interest to our listeners that we go around the horn and share. Moe, why don't we start with you? What's your last call?

00:56:24.90 [Moe Kiss]: Well, we know that I binge-listen to things in quick succession. I'll go through a podcast or a series or an author; that's just kind of how I work. And I have been binging WorkLife by Adam Grant. There are two episodes in particular that I thought were really interesting. One was called The Truth About the Attention Crisis. I don't know if that's the exact name, but that was the topic, and it was with a historian, Daniel Immerwahr. I'm going to butcher someone else's name on a podcast again; this is what I do. He wrote a really interesting article about the fact that he thinks it's a myth that we can't pay attention to anything, and he talks about examples like video games, and opera back when opera was the cultural pastime, and books. It was just one that I thought was a bit of a different perspective on the usual phones-are-killing-our-brains type of thing. There was also a really interesting one on the case against personal branding. But the one I probably most enjoyed was with Linda Babcock, about protecting your time. She specifically talks about a "no club" that she's in, how women take on more non-promotable tasks at work, and what strategies you can employ to guard your time. So I highly recommend checking out the podcast.

00:57:45.90 [Michael Helbling]: That sounds good. Awesome. All right, Tim, what about you?

00:57:51.57 [Tim Wilson]: So I'm going to start off by acknowledging, to paraphrase Mike Birbiglia on his latest special: most of my last calls are for you, but this last call is for me. So I'm going to last-call a podcast that is no more. I don't think I realized exactly how much I was deeply into this podcast until it actually went off the air, with the last episode being with Barack Obama. This is really more for that handful of listeners who had slowly become WTF with Marc Maron fans, like I had over the years. He'd already been doing it for a long time when he had Barack Obama on the first time. It's been a slow burn, and his last episode was almost exactly a month ago. I've listened to so many. I feel like I've learned about storytelling. I've learned about addiction. I've learned about thinking deeply. And I will warn you now that at the very end of our outtakes, there's probably going to be my final homage to Marc Maron. For all of you who are like, what the hell are you talking about? That's fine, just skip it. For those handful of you who are like, I fucking love WTF with Marc Maron and I too am grappling with his departure: this is for you, and it's for me. So that's my last call.

00:59:28.02 [Michael Helbling]: Very nice. Very nice. All right, Val. What about you? What’s your last call?

00:59:33.53 [Val Kroll]: Yeah, so this is another UX Collective Medium post, from that design world. I love that publication a lot; it's a good follow. But this was a specific article by Fabriza Osiello called Building Trust in Opaque Systems: why the better AI gets at conversation, the worse we get at questioning it. Which is something we've heard and talked about before, but this take, I'm telling you, is really good. About halfway down, there's this concept where they're talking about scaffolding over crutches, and I'll just read this one sentence out: this reveals the fundamental flaw in the current approach; they're asking us to bear the weight of responsible use while providing tools designed to discourage the very skepticism that they require. And there's also this whole section at the bottom reimagining a world where the tool asks you to have curiosity about the answers it provides, when that's appropriate. I thought it was really interesting and well thought out, and I was like, yeah, they should absolutely do some of this. Because you think, oh, you just can't trust it, but what does that actually mean in practice? And then my 0.5 additional last call, because it made me think of it, is a podcast I've been addicted to for years, The Daily. There was an episode in mid-September called Trapped in a ChatGPT Spiral, a look at just how close and dangerous our relationships with AI can be. Everyone's heard about getting too close, or using it as a therapist, or as a crutch, or having these relationships.
And you think, oh, that's just for X type of people, whatever your brain assumes is the type of person who would be susceptible to that. But I thought this episode was really interesting because it actually interviews some of the people who have fallen victim to it, and they express, I'm just not the kind of person who could get duped, but I still got caught up in it. So anyway, that's a fun, quick, interesting listen as well. That's my 1.5 last calls for today.

01:01:46.92 [Michael Helbling]: How about you, Michael? Well, you know, if you're like me, you're constantly reading articles and things online, or somebody sends you something and you're like, oh yeah, I'm definitely going to get to that and read it at some point. And then it stays in an open tab for about three weeks, and you never actually get to it. Or you do read a great article, and later you're like, where was that article? I can't remember what it was. So I recently stumbled across something that I've been looking for for a long time. Somebody built a personal data lake: a little bookmarking tool where you just go in and say, oh yeah, I'm reading this, you click it, and it saves the entire page into your own little personal data store, which you can then search later. It also connects with MCP, so you could even do some AI stuff with it down the road if you want. Anyway, it's called IDEXA. We'll share a link to it, because I just started using it and started saving some pages. The first page I saved was the Recast GeoLift page at getrecast.com slash geolift. No, that was just accidental, but I did save that page. A little plug for our sponsor there. Anyway, it's super helpful for me, because now I can close like 30 tabs and know that they're still saved somewhere I can get back to them. So that's nice. And as you've been listening, maybe you've got thoughts on stuff you used to believe in the data world and later found a new way of looking at. We'd love to hear from you. The best way to do that is to reach out to us on LinkedIn, or in the Measure Chat Slack group, or through email at contact at analyticshour.io. And if you like the show, leave us a review or a rating; we really appreciate it. You can also request stickers for the show, because Tim loves mailing them out, and you can do that on analyticshour.io.
There’s a form you can fill out and we can send you stickers that you can put on your laptop or on your phone case or wherever you put stickers for your favorite podcasts. So yeah. All right. Well, thank you all for doing this show. We didn’t have a specific guest; you’re all the guests today. So congratulations. But I think this was fun. I had a lot of fun. So thank you. And I think I speak for all of my co-hosts, Moe, Tim, and Val, when I say: no matter what your priors are and how they’re changing over time, keep analyzing.

01:04:19.51 [Announcer]: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

01:04:37.53 [Charles Barkley]: So smart guys want to fit in. So they made up a term called analytics. Analytics don’t work. Do the analytics say go for it, no matter who’s going for it? So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.

01:04:57.09 [Val Kroll]: Tony, let us hit the cutting room floor.

01:05:01.94 [Tim Wilson]: Let it hit the cutting room floor.

01:05:05.08 [Michael Helbling]: When Michael’s the guest. Well, I mean, we can be like, you’ve got 60 seconds to teach us something.

01:05:12.01 [Val Kroll]: No pressure.

01:05:16.06 [Michael Helbling]: I think he’d be like, no problem. Yeah. Yes, Moe. You can. I’ll start the show. When you get back, it’ll be your turn to say, how you going? I’m just kidding.

01:05:27.54 [Tim Wilson]: Now there’s a loss in that, right? Nope. That’s not the… Nothing at all. It’s perfect. The act of retweeting. Some people thinking I was just on my computer the whole time, but I was like, well, I’m the one who has the best network.

01:05:43.24 [Michael Helbling]: Just our, our buddies. Yucking it up. It’s a thing we do. It’s called the back channel. We just make fun of anybody who’s speaking.

01:05:51.84 [Val Kroll]: If I ever see the two owners of who is the company that shut this down. I was like, I will literally throat punch them. That’s the level of aggression that I feel. And she was like, oh, never expected to hear something like that come out of your mouth. And I’m like, and I’m not joking.

01:06:08.81 [Tim Wilson]: That’s the Chicago way. So then there’s the question, can that go in the outtakes? Yes. Okay. So, you know, Tony, clip it down to whatever you think is the funniest, but the throat punch definitely needs to be in there. Perfect. So yeah, definitely going to throw it to you. Your intros are getting like more concise and mine are just getting longer.

01:06:35.20 [Moe Kiss]: I was thinking that too. I observed that.

01:06:41.35 [Michael Helbling]: I don’t know, is this an idea?

01:06:42.17 [Val Kroll]: So you’re saying that Tim sounds like AI?

01:06:45.32 [Michael Helbling]: No, no. I’m saying that I use em dashes and I am like barfing shit out. What I’m saying is, Tim is basically an AI in terms of a super intelligence, okay?

01:06:59.60 [Tim Wilson]: Mm-hmm. Content for like one medium-length blog post, and it became an entire book. That’s what you’re saying. Well, that’s not encouraging me to start reading Analytics the Right Way. At least I’m honest. Rock flag and cat angels everywhere!
