#296: Avoiding Major Oopsies: Twyman's Law, Intuition, and Valuing Accuracy Over Precision

What do diamond ring shopping, Uber pricing psychology, and active user metrics gone wrong have in common? They all highlight our complicated relationship with precision versus accuracy—and how that relationship can either build or destroy trust in our data. Arik Friedman from Atlassian joins us to unpack why being “about right” often beats being “exactly wrong,” and why your nagging feeling that something’s off might be a useful insight in and of itself. From the discipline of documenting assumptions to the art of knowing when to round your numbers, we tackle the very human challenge of working with data that’s supposed to be objective but rarely is. Plus, we explore Twyman’s Law (if data looks too good to be true, it probably is) and why sometimes your intuition is your last line of defense against embarrassing mistakes.

This episode is brought to you by Prism from Ask-Y—your agentic analytics platform for automating analytics, exploring data, creating repeatable workflows, and delivering accurate insights—all without the need for manual query writing.

Links to Resources Mentioned in the Show

Photo by Sarah Kilian on Unsplash

Episode Transcript

00:00:00.00 [Tim Wilson]: Welcome to the Analytics Power Hour.

00:00:08.92 [Announcer]: Analytics topics covered conversationally and sometimes with explicit language.

00:00:15.20 [Tim Wilson]: Hi everyone, welcome to the Analytics Power Hour. This is episode number 290, oh wait, no, wait, let me check, this is episode number 296.

00:00:26.04 [Tim Wilson]: That was a close one.

00:00:28.32 [Tim Wilson]: I mean, how embarrassing would that have been? We’re all set to have a discussion about data accuracy and getting business partners to trust the data, and I almost whiffed on something as simple as the episode number. Already, though, perhaps I’ve undermined your trust in me, Tim Wilson, sitting in the host chair that is usually occupied by Michael Helbling. Perhaps I have. Let’s ask my co-hosts. Moe Kiss from Canva, welcome to the show. What do you think? Did my little gaffe destroy my credibility?

00:00:57.68 [Moe Kiss]: Yeah, trust is completely lost. No, I’m joking, you nailed it.

00:01:02.44 [Tim Wilson]: It’s been lost; it was lost years ago with you. And Julie Hoyer from Further, have I already destroyed our audience’s confidence in me?

00:01:14.16 [Julie Hoyer]: What’s a mistake among friends, right? I’m just your pre-read.

00:01:17.20 [Tim Wilson]: Okay, I like it. Well, that’s the topic for this episode. I mean, sort of. All data can be digitized, and an overwhelming amount of it is, and digitized data breaks down into discrete little ones and zeros somewhere down the chain. It should be cold and objective, and to use it effectively, we need to get it to be as accurate and precise as possible, right? I mean, well, maybe. So joining us for a discussion on just that topic is Arik Friedman.

00:01:48.88 [Tim Wilson]: Arik started his career as a software engineer, then took a turn and did a PhD in computer

00:01:54.28 [Tim Wilson]: science focused on privacy-preserving data mining, which is a tongue twister, then went on to work a few years as a program manager for Microsoft R&D, then he popped back over to research for a while, and now he’s been at Atlassian for over 10 years where he is currently a senior principal data scientist focused on product analytics. And today he is our guest.

00:02:18.40 [Tim Wilson]: So welcome to the show, Arik.

00:02:21.52 [Arik Friedman]: Thank you very much. Long-time listener. Happy to be here. And I assure you, all of these career moves made a lot of sense. At the time, each was a logical progression. Absolutely.

00:02:36.34 [Tim Wilson]: I went architecture, technical writing, marcom analytics. So I, I sympathize and I actually, I don’t know that any of those made any sense.

00:02:45.84 [Tim Wilson]: But that’ll be something for me to ponder on my deathbed, but so Arik, you presented

00:02:53.40 [Tim Wilson]: at the last MeasureCamp in Sydney, and in that presentation, I think you had a fun, or maybe relatable at least, or maybe tragic story about an active users metric that looked almost too good. Like, the results were brilliant. Everyone was thrilled, but you had this nagging feeling that something was off, and you couldn’t really nail down the specific problem. So maybe that’s a good way to start the show, ’cause I think it gets to something really human about our relationship with data. So maybe we’ll kick things off by having you walk us through kind of what happened

00:03:28.88 [Tim Wilson]: there.

00:03:29.88 [Arik Friedman]: Sure. So this is a story from a while back, but, you know, at the time the product team was working on a product feature, a big change in the user interface, and a lot of investment went into that. So we put a lot of work into actually testing things, you know, A/B testing. I worked with another data scientist to make sure that we got things right. And I remember back then, you know, us standing in the boardroom with all the PMs, and it looked like really a big change. Like, it moved the needle quite significantly. So that was a big deal.

00:04:11.24 [Arik Friedman]: Everybody was happy. And then as we went on and rolled out this feature, we saw what we expected to see. Active users went up.

00:04:22.96 [Arik Friedman]: But then, you know, over time, I’m starting to develop this, you know, odd feeling. You look at data over time, you get a sense of how it looks, what growth looks like.

00:04:36.04 [Arik Friedman]: And I was looking at the curves and they looked a bit suspicious, right? Like, it’s not supposed to go like that. And I started looking into that, okay? Because it felt a bit odd. And I remember, you know, trying to go down to specific trace logs, trying to find what went wrong, and everything checked out. So, you know, after putting some time into that, I said everything was good. We actually have a reason to believe that active users go up, all is fine, right? I cannot say anything about this. I don’t have a smoking gun. Fast forward, sometime later, another data scientist and a software engineer kept looking into this, and they actually found a bug.

00:05:23.32 [Arik Friedman]: It turned out that a bug was causing inflation of the active usage measurement.

00:05:30.00 [Arik Friedman]: And that was an unpleasant surprise for the product team. So for me, it definitely caused a lot of thinking, you know, like, how come I didn’t find that?

00:05:40.24 [Arik Friedman]: What could I have done differently?

00:05:43.00 [Arik Friedman]: And that, you know, that causes a lot of reflection.

00:05:47.60 [Tim Wilson]: Well, did you go tell them, like, immediately, abruptly? Did you ease them into it? Like, how did you actually communicate it? What’s the rest of the story on that?

00:06:01.08 [Arik Friedman]: Yeah, so I think that’s probably the biggest mistake I made at the time because, you know,

00:06:08.36 [Arik Friedman]: I didn’t find an issue. And, you know, at least back then, I thought: we are the data people, we are the evidence people. And if we don’t have the data to back up what we say, we should shut up. And what I learned from that is, actually, our intuition is part of our expert opinion.

00:06:31.08 [Arik Friedman]: And we should sometimes just go with it.

00:06:36.12 [Arik Friedman]: And I think there are a lot of things we can do ahead of time, you know, to prevent mistakes or to check things. But at least for me, for this story, when I look at what I could have done differently, I actually did know to do all the checks. Eventually, your intuition is your last line of defense, and sometimes you just have to go with that.

00:06:58.04 [Moe Kiss]: So sometimes, I don’t know, I, firstly, I just want to say thank you. I’m really grateful that you’re sharing, you know, straight off the bat, an experience of where you made a mistake that’s incredibly humble and wonderful for folks to be able to learn from. So a very big thank you. And I suppose I just, I’m curious to understand this intuition piece, right? Like I feel we all have it. And I know you and I have had conversations previously about like when we’re making decisions, you know, we have data, we have intuition, we have, you know, maybe previous experience, we have all these different factors and part of our role is helping pull those things together. But I’m curious to understand. So how has this changed how you would show up? Like, let’s say this happens next time, you can’t find a smoking gun, you can’t find a data point, but you had your, you know, your intuition in your gut that’s like something’s not right here. Like, do you think this time you would say something and how would you frame it?

00:07:49.92 [Arik Friedman]: Yeah.

00:07:50.92 [Arik Friedman]: I think, first of all, it’s about being more opinionated.

00:07:56.88 [Arik Friedman]: And over time, that’s been part of my growth journey. I think in the past I tended to be a bit more, you know, impartial, right? We let the data speak. Like, we’re just there to give the data its voice. And I think over time I feel a bit more confident, you know, to be opinionated. I have my own opinion of things. And I think we’re actually expected to be opinionated. So that’s one thing: just be more confident in our own expertise. And then, I mean, the opinion on its own is not enough, right? Even if we have just this intuition, we will still be expected to dig into it: okay, can you actually find the evidence? But I think it’s the start of a conversation, right?

00:08:50.52 [Arik Friedman]: Like, this is what I think. Maybe the data looks a bit odd, and maybe we want to dig a bit more into that.

00:08:59.16 [Arik Friedman]: And then the business can decide, you know what, no, like we did all our checks and balances. It’s good. We did our due diligence.

00:09:05.20 [Arik Friedman]: Let’s go on.

00:09:06.36 [Arik Friedman]: Or they could say, you know, let’s, let’s take more time and actually give it more space

00:09:10.24 [Arik Friedman]: to dig into that.

00:09:13.00 [Arik Friedman]: And at the time I basically made my own decision and said, you know, I looked enough into that, and I kept it to myself. And this is something where we need to be more open and just share what we think with our business partners.

00:09:28.08 [Tim Wilson]: So I will sometimes find myself, not necessarily with a ton of structure and rigor, trying to think about what I expect to see before I actually look at the data. Or sometimes asking, if I really don’t think I have context and my business partner says, “I think it’ll go up,” I’m like, what does “up” mean for this metric? What would that mean? I use it as an opportunity to try to get a little more context about their intuition about their business. There’s some psychological trick where you force yourself to set what you think you’re going to see, as opposed to waiting until you see it and then immediately rationalizing why it makes sense. Like, I had an example from years ago where a product marketing manager came racing in because he had already told the whole company that this tiny little change he’d made to a webpage had drastically driven the traffic up, which made no sense. And that was one where I dug in and found that it was a bot: there was a company that was selling us software, and they had pointed it literally at his page to gather some data for their pitch a few days later. But he saw it, it was good news, kind of like the active users. If I’d known he was making that change, I would have said, what do you

00:10:59.28 [Tim Wilson]: realistically think this is going to do?

00:11:03.24 [Tim Wilson]: Because if it goes way past that, maybe we want to, you know, apply a little bit of Twyman’s law or, you know, something to it. But I think there’s, I don’t know, I feel like that’s a challenge, especially for people just entering the field, to have that intuition and to have the confidence on top of it. Like you said a few times, as you get more experience, the more faith you have in your intuition and your conviction. How do you develop that, other than letting time pass?

00:11:42.00 [Arik Friedman]: I think a lot of this is getting to live a bit in your business partner’s world, like getting to speak the language and, you know, getting to know what they care about, what they’re after. I think that adds a lot of context. And for example, I remember at some point, you know, I started hearing PMs talk all

00:12:01.48 [Arik Friedman]: the time about, like, JTBD, JTBD this, JTBD that, like, jobs to be done.

00:12:06.72 [Arik Friedman]: I had no idea what it was all about. And, you know, I started digging into that, reading a bit about it. And what I found, like, didn’t really match what they were talking about. So getting into their world, learning about that, I think helps create this common language and thinking about it from their perspective. So I think that definitely helps us, so to speak, get in their head.

00:12:34.20 [Tim Wilson]: That was one where you actually... I had this happen with OKRs once. I was working with someone who was really into them, and I knew what OKRs were, I’d lived through kind of an OKRs training, but it was like, oh, this client is really into it.

00:12:49.12 [Tim Wilson]: I’m going to go get a book. And didn’t you do that? Didn’t you wind up going and like read, like whatever the JTBD Bible is? Like, OK, if we’re going to use this, I want to understand it. I mean, you probably found out that they actually weren’t using it correctly. It’s like Agile or any other number of things that use the acronym, but maybe aren’t applying the process.

00:13:10.36 [Arik Friedman]: Yeah. So I went on and I actually read about jobs to be done. And it was actually weird, because what I was reading and what I was hearing from the PMs were not exactly the same things. And at the time, in our Confluence, our internal wiki, I wrote a blog post: hey, are we doing JTBD wrong?

00:13:32.04 [Arik Friedman]: And it was a bit of clickbait, but it started a conversation

00:13:36.96 [Arik Friedman]: because, I mean, then through the conversation with them, I started to find out, oh, actually there are different kinds of definitions of JTBD out there. And having this conversation and trying to understand, you know, what they really mean when they talk about that, that really helped. And I did it from the perspective of, you know, how I can be more useful as a data scientist and what it is that they’re actually after when they talk about understanding the JTBD. So I think it was first, you know, just to catch up with them, but also to see, you know, how can I answer their questions, and how can I better understand the question and, you know, improve myself as a data scientist that helps them.

00:14:18.68 [Tim Wilson]: Michael, I have news.

00:14:23.48 [Tim Wilson]: The AI analyst is emerging.

00:14:27.08 [Tim Wilson]: Oh, that’s big, coming from the quintessential analyst. What do you mean, like a cryptid? Well, I mean, more like a job promotion, but, you know,

00:14:35.32 [Tim Wilson]: with more existential dread. You know how foundation models created the AI engineer role? Yeah. Developers got all these cool titles, and analysts got... “Can you pull this by end of day?” Exactly. But now we’re watching the birth of the AI analyst, someone who uses LLMs to multiply their capabilities without, you know, multiplying their stress rash. Nice. So an analyst, but with superpowers and fewer open tabs. Exactly. And the tool for that is Prism by ask-y.ai.

00:15:07.92 [Tim Wilson]: Yeah. Prism is basically the interface between what I mean and the 900 steps I don’t want to do. You ask in plain English and it helps you get from question to analysis really fast.

00:15:19.72 [Tim Wilson]: And it doesn’t forget your world. Prism’s JAM memory (that’s J.A.M.) remembers your definitions: like what your team means by “conversion,” which table is the source of truth, and that July data, don’t forget, it’s, you know, cursed.

00:15:37.64 [Tim Wilson]: Yes. Thank you. I’m tired of explaining our metrics like it’s the extended

00:15:43.92 [Tim Wilson]: MCU universe. Plus, you can capture repeatable workflows as skills, portable expertise, like, you know, clean UTMs, fix GA4 channel grouping, or standardize campaign naming, and reuse those across different data

00:16:01.84 [Tim Wilson]: sets. I like that, because I do a lot of copy-paste, “this can’t continue” type of feeling. And I feel like it would be nice to be like: run skill, look smart, drink water.

00:16:15.64 [Tim Wilson]: Nice. Want to become the AI analyst before your coworker does? Go to ask-y.ai and join the wait list.

00:16:24.16 [Tim Wilson]: Yeah. And use code APH, and Ask-Y pushes you to the top of the wait list. That’s ask-y, the letter Y, dot ai, and use code APH. Yeah. The AI analyst is here. This product is in beta, but you can get in on the ground floor. And it’s coming for your busy work, not your job.

00:16:44.64 [Moe Kiss]: So, you know, relax, chill out. So, Arik, you and I have spent a lot of time talking about this whole accuracy versus precision trade-off. And it’s something that has really resonated with me, because I would say I always lean toward kind of the best possible answer with the time that we have in the business to help make a decision. Like, I suppose I would say I’m quite pragmatic. But a lot of what I hear coming from data scientists is: this number is wrong, this one’s right, we can’t do it this way, we have to do it this way because this is the right way. And I guess I just wanted to hear your framing of this trade-off between accuracy and precision. And, like, I don’t know, you have such an elevated way of thinking about it.

00:17:31.40 [Arik Friedman]: Yes. So I think, straight from primary school, the way we’re taught is that being accurate and being precise are the same thing, right? So at school, you get like this big equation. And in the data science world, you can think about a business question,

00:17:51.20 [Arik Friedman]: like, you know, what kind of features correlate with user satisfaction

00:17:55.96 [Arik Friedman]: or something like that? Or how can we predict those kind of things in parallel?

00:18:00.60 [Arik Friedman]: Like, at school, you might get a question like, you know,

00:18:06.24 [Arik Friedman]: how much is 2,124 times 3,926? You get this big equation. And what you’re taught is that you need to go through the method, right? You have to multiply this digit by that digit, carry over. If any single digit is off, you lose all your marks. So if your answer is not precise, it’s not accurate. You lose all your points. And I think we kind of carry this mindset with us into our jobs. And in the business domain, precision and accuracy are actually not the same thing, because, you know, these big numbers: it’s about 2k times 4k, it’s about 8 million. And “about 8 million” is accurate. It’s not precise, but it’s accurate. And I think that also, when we get business questions, there are many ways we can approach and solve them. You know, we can throw at them, I don’t know, hidden Markov models or, you know, clustering algorithms. There’s this whole arsenal of methodologies that we learned. And, you know, that’s kind of even part of our pride in the craft, right? We want to show that we know how to do all those things. But sometimes you can get quite a good answer just by writing a very simple SQL query. And, you know, it’s maybe not the best answer, but it’s good enough, and it’s faster. And there is this quote from John Tukey, from his paper about the future of data analysis.
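
To make Arik’s arithmetic concrete, here’s a quick illustrative sketch (not from the show; it just re-runs the numbers mentioned above):

```python
# Arik's example: a precise answer vs. an accurate-but-rounded estimate.
exact = 2124 * 3926        # the "schoolwork" answer, where every digit matters
estimate = 2000 * 4000     # the "business" answer: round each factor, then multiply

print(exact)               # 8338824
print(estimate)            # 8000000

# "About 8 million" is off by roughly 4%: not precise, but accurate
# enough for most directional business questions.
relative_error = abs(exact - estimate) / exact
print(f"{relative_error:.1%}")   # 4.1%
```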

00:19:43.44 [Arik Friedman]: And this is from, like, the sixties.

00:19:46.44 [Arik Friedman]: And he says: “Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise.”

00:19:56.92 [Arik Friedman]: And I think it really captures well that, you know, first of all,

00:20:02.28 [Arik Friedman]: it’s better to answer the right questions. And I think that part of our job is to help get to those right questions. But also when we answer, it’s not just about precision. It’s, you know, the answer can be accurate without being precise.

00:20:18.24 [Arik Friedman]: And I think that’s a way for us to be faster and more effective

00:20:23.24 [Arik Friedman]: and be focused on business impact.

00:20:24.84 [Tim Wilson]: But there’s a challenge, right?

00:20:27.56 [Tim Wilson]: Our business partners, accessing tools and data, they’re working with spreadsheets, they’re working with dashboards, because they’re in this digital world, and processing and multiplication is cheap. They are conditioned to see things that are precise. I mean, I love the accuracy versus precision thing, like the dartboard or archery diagram that shows the two-by-two grid of accurate and precise: accurate but not precise, precise but not accurate, neither accurate nor precise. And it shows the spread around the target.

00:21:07.80 [Tim Wilson]: And I think that’s good for us to be aware of. And part of it is, I mean, can that be handled a little bit with the language

00:21:17.72 [Tim Wilson]: of saying: try to get yourself out of the habit of providing precise answers. Get comfortable with, even if you have a precise answer, backing off the precision of it a little bit, so that you’re not arming somebody with ways to compare it to another source that may be

00:21:44.64 [Tim Wilson]: reasonably accurate too, but where the precision means that the first three digits kind of differ.

00:21:49.92 [Tim Wilson]: Like, there’s a challenge with us understanding that and our business partners thinking that way, and the levers we can pull to try to help them think in the language of: this is accurate, you know, it’s about eight million,

00:22:08.88 [Tim Wilson]: you know, or whatever the answer is.

00:22:11.92 [Tim Wilson]: And they’re probably not going to object to that. They’re not going to come back and say, what do you mean, “about eight million”?

00:22:16.08 [Tim Wilson]: I need that down to the first digit, right? I mean, it’s a fine line to walk.

00:22:22.76 [Arik Friedman]: Yeah, and I think it really depends on the context and the question being asked, because in some contexts, like if you deal with financial data, you definitely want to be precise. It’s important. But a lot of business questions are more directional in nature. And when you deal with directional questions, it doesn’t matter as much if you’re precise, and actually precision can kind of draw the attention to the wrong thing, because you don’t care about the precision necessarily in these kinds of questions. So this is about getting to the bottom of what are the decisions that they are about to take

00:22:58.80 [Arik Friedman]: and what really matters for making those decisions.

00:23:02.36 [Moe Kiss]: So one of the things I wanted to challenge and have a discussion on. So I’m trying to find a specific article and, of course, I can’t. But there was some research that was done on pricing psychology at Uber. And I often quote it when I’m counselling people on buying engagement rings. Buying a two-carat diamond ring is more expensive than a 1.999-carat ring, right? Like, we have this bias and heuristic to think that the two-carat, the rounded number, is better. So hold on.

00:23:30.16 [Tim Wilson]: Just wait, wait a minute. How many people are you counselling on buying rings? Like, what is your world?

00:23:37.52 [Moe Kiss]: I my team are quite young. And so a lot of them are going through that life stage.

00:23:43.76 [Tim Wilson]: Those are just one-on-ones. They’re like, hey, what’s going on? I could really use some help.

00:23:48.36 [Moe Kiss]: Anyway, OK, carry on. So also, I don’t know who’s buying your two-carat diamond ring.

00:23:54.52 [Tim Wilson]: It’s a... it’s a... anyway.

00:23:59.32 [Moe Kiss]: Is that... that’s a big one? Three carats is quite big. Yes, not for America, but for Australia. Americans tend to buy much bigger rings. Anyway, back to the point of the story. Well, and now with lab-grown diamonds, you should see these rings

00:24:12.16 [Julie Hoyer]: people are throwing around.

00:24:13.76 [Tim Wilson]: Sorry. You guys are living in a different world.

00:24:18.44 [Moe Kiss]: But to get back to my point, one of my concerns is, like, I hear you and I agree with everything that you’re saying, right? But the same thing happens when you’re trying to counsel people on, like, a day rate. Sometimes people will intentionally make a number not a round number, because people are more inclined to believe it, to believe that you’ve put more work into it. We have this bias where, if you say $1,600 a day, it looks like you’ve just picked a number from the sky, versus having put thought into it. So I’m just trying to think about how that plays into our discussion here about not necessarily always wanting that precision. So you’re saying, if we don’t go precise enough,

00:25:04.72 [Julie Hoyer]: they won’t trust the number.

00:25:06.48 [Moe Kiss]: Yes. So if we say, oh, the number is eight million versus eight million two thousand and ninety seven, will people just be like, is that bias going to come to life? Where they’re like, oh, maybe that number is not right. And like I can kind of disregard it because it’s come from nowhere. Or like, do you see what I’m saying about like, does the precision give any extra credibility or build more trust?

00:25:29.84 [Arik Friedman]: Right. So in those cases, like precision sometimes is kind of like the signal that you did the work. Right. And again, I think these kind of things are context dependent because

00:25:42.84 [Arik Friedman]: ideally, if you have established a good, like,

00:25:47.72 [Arik Friedman]: trust relationship with your business partner, then they trust your judgment and that you made the right call about what’s a good methodology to approach this. And in some cases, yeah, maybe precision does matter. Maybe it’s a signal that’s important to them. And in that case, yeah, maybe you do need to put a bit more work in

00:26:11.44 [Arik Friedman]: and give that kind of signal.

00:26:15.60 [Arik Friedman]: So that definitely is context dependent.

00:26:18.28 [Julie Hoyer]: And part of the context, I’m just thinking, like, Tim, back to some of our discussions we’ve had with clients when they have really low volume, right? The difference of one is greater than if you have a large volume, where the difference of 10 is, like, minuscule, or, you know, you can kind of ignore it. So I do feel like, too, it’s like: are you speaking of a really small n? Because then you don’t want to round, compared to when you’re speaking of large volumes, where you can round to eight million and it’s probably OK. The other thing with precision that I think is interesting, that we get sucked into a lot, is when the stakeholders want to compare numbers from systems, which we know are not going to give you the exact same number. Like, directionally, they should be close enough. But I do feel like that is a scenario where they really get stuck. Like, why is this one percent different? And it’s like, well, it’s OK.

00:27:16.12 [Julie Hoyer]: It’s like eight million.

00:27:18.56 [Julie Hoyer]: It’s all right. And we’re not talking dollars and you know, which one’s your source of truth. So we should be OK to continue to use, you know, the other number in our decision making. But that’s really hard, I think, for stakeholders and even for analysts

00:27:33.68 [Tim Wilson]: But I think we tend to, I mean, if you’re going to err on one side or the other... I mean, I look at somebody who makes a line chart to show the results, and they label every single data point down to the first decimal. And there are choices made in the presentation of the information where I think you can say there’s precision there. I’m showing a dashboard, but this is rounded to the nearest million or the nearest thousand. I’m not giving you eight-zero-zero-zero-zero-zero-zero. I’m saying eight million, you know, or eight point one million. You know, give them a decimal place. So I think there are choices. I mean, certainly, to your point, and actually Julie’s point as well, the context does matter: when does it matter? But I feel like the trap we can fall into on the analytics side is: we have the precision, might as well show it. You know, our p-value is not only less than .05, it’s whatever the model spits out: the p-value is .00013472. Like, please don’t put that, don’t just paste it. Yeah: p-value less than .05.
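
One way to bake Tim’s rounding habit into reporting code is a small display helper. This is purely an illustrative sketch; the function names and the two-significant-figure default are our own choices, not anything from the show or a specific tool:

```python
import math

def round_sig(x, sig=2):
    """Round x to `sig` significant figures for display, e.g. 8,338,824 -> 8,300,000."""
    if x == 0:
        return 0
    return round(x, -int(math.floor(math.log10(abs(x)))) + (sig - 1))

def format_p(p, alpha=0.05):
    """Report a p-value against the threshold instead of pasting every digit."""
    return f"p < {alpha}" if p < alpha else f"p = {p:.2f}"

print(round_sig(8_338_824))     # 8300000
print(round_sig(8_002_097, 1))  # 8000000
print(format_p(0.00013472))     # p < 0.05
```

The idea is simply that the rounding happens once, consistently, at the presentation layer, rather than ad hoc in every chart or table.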

00:28:51.64 [Arik Friedman]: And Julie, I think, by the way, it’s a great point, because it matters a lot why the numbers are different. You know, scenario one: the numbers are different, and we have no idea why. OK, that’s not a good place to be. And that’s a place where you probably do want to ask questions: why don’t these things match? Versus a different scenario where, you know, we actually expect those numbers not to be perfectly the same, because maybe we measure slightly different things, or, you know, there are known reasons why they should mismatch. So I think usually it’s more about the confidence: we know what’s going on here. And Tim, to your point, yeah, if we think that this will just distract and raise questions that are not relevant, then maybe it’s better off to reduce the precision and just avoid that altogether.

00:29:44.04 [Tim Wilson]: So I think business partners and data teams get sucked into it when the different systems aren’t going to reconcile. And I think, to your point, you should understand — not down to the perfect we-can-net-out-everything level, but say, these are the big movers of the differences. A lot of times there’s the ability to say, over time, look, these move in the same direction. I’ve watched companies say, well, we just need to pick the system of record for that metric, which makes the hairs on the back of my neck go up. Like, that’s a cool idea that somebody in the C-suite or one level down said: we have the solution, we just pick our system of record. But the reality is all those systems exist for different reasons, and it gets back to context. So I think there’s a part that says we need to put this to rest at some point by doing a little bit of an analysis saying: these differ, these are the main reasons they differ, let us show you that they generally move in the same direction over time. And now we’re just going to have our standard little footnote that these, you know, move differently. But I think also we’re going to go look at the system that gives us the most of what we need, because it’s often that some upstream system in a process has this data, hands off to another system, which goes downstream from that. So you have to pick the system where you can get the slice. I mean, I’m thinking in a digital analytics / CRM / sales world, you can’t necessarily track the marketing channel all the way through to a sale. And in the middle, you’ve got something like a lead, where the downstream ability to follow it comes out of one system and the upstream ability to follow it comes from another. And maybe this gets back to your point earlier: you really need to be figuring out, is the question being asked very clear and precise? And let that drive everything as to where you look, what level of precision, how you pull it, what the intuition is. And now I just went on a bit of a tirade.
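Tim’s “show they generally move in the same direction over time” check can be made concrete. A minimal sketch, assuming two systems report the same metric per period — the series names and numbers here are invented for illustration, not taken from the episode:

```python
from statistics import mean

def sign(x):
    """Return -1, 0, or 1 for the direction of a change."""
    return (x > 0) - (x < 0)

def directional_agreement(series_a, series_b):
    """Fraction of periods where two systems' metrics moved the same way."""
    deltas_a = [b - a for a, b in zip(series_a, series_a[1:])]
    deltas_b = [b - a for a, b in zip(series_b, series_b[1:])]
    matches = [sign(da) == sign(db) for da, db in zip(deltas_a, deltas_b)]
    return sum(matches) / len(matches)

def mean_relative_gap(series_a, series_b):
    """Average relative difference between the systems -- the 'standard footnote' number."""
    return mean(abs(a - b) / a for a, b in zip(series_a, series_b) if a)

# Hypothetical monthly counts of the same metric from a CRM and a web analytics tool:
crm = [100, 110, 105, 120, 130, 128]
web = [97, 106, 103, 118, 126, 125]
print(directional_agreement(crm, web))        # 1.0 -- the two always moved together
print(round(mean_relative_gap(crm, web), 3))  # 0.026 -- roughly a 2-3% steady gap
```

The pair of numbers is the footnote Tim describes: the systems disagree by a few percent, but they tell the same directional story, so either one supports the decision.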

00:32:05.88 [Julie Hoyer]: Well, no, but you bring up a point that, you know, maybe this is me still, like, wrapping my head totally around the accuracy versus precision. But I do keep thinking of what you were calling out, like, the target. So we were talking about people being most comfortable with precision, with systems matching, but we just discussed thoroughly why we know that doesn’t happen. But to your point, Tim, if we can explain in at least broad strokes, Arik, as you said, why these things are differing, and if we then can say, well, as long as we’re pointing this at the right target — the right question, the right problem to be focused on — then can we be comfortable with accuracy without precision between the tools? Like, am I taking this way off? But you’re saying, you know, like, if they’re always about 1% off, they’re all hitting the right target, maybe not closely bunched together. That would make people happy with accuracy and precision.

00:33:14.08 [Arik Friedman]: Yeah. So I think that data science teams usually have a good opportunity to collaborate and aim to standardize measurements. One example I’ve seen in the past was where growth teams and product teams had different measurements of impact. And that can be very confusing, because when one team says, oh, we had that impact, then the other team kind of interprets it in their own language. So I think, definitely for data scientists, we can be in a role where we coordinate between teams to see, can we actually standardize and measure things the same way and ensure that we actually get the same numbers all over the place? But as you said, sometimes it’s not possible, right? Sometimes there are actually very good reasons why things should be different.

00:34:08.60 [Tim Wilson]: And maybe then we should just use different names for them.

00:34:11.96 [Arik Friedman]: So it’s very clear that, you know, when this team talks about something, it’s not the same thing that the other team talks about. So I think we definitely have a role to play with this.

00:34:20.88 [Tim Wilson]: So can we shift a little bit? Because some of this is talking about getting trust in the data, and it feels like there’s a big one we covered, which is, we’ll get people to trust in the accuracy. We’ve covered two things, if I’m going to do a mid-show recap — which we don’t really do recaps, and I don’t know what the hell I’m doing. We’ve got: develop and trust your intuition. We’ve got: think about accuracy and precision differently. We haven’t actually covered, like, how do you not push bad info out? And the intuition is kind of one hook into that. But, like, the frenetic pace everybody in modern business is working at — there is a drive to get the task completed, off my desk, over the wall, and into someone’s hands, which just begs for mistakes to be made. And the cost of making a mistake with the query, with the report, is damaging trust. So what other ways do we have? We briefly referenced Twyman’s Law, which really goes to the intuition and doing checks, but what are other ways to maintain trust in the data from our business partners? How do we prevent ourselves from undermining trust? That’s a long ramble with a broad question.

00:36:01.00 [Arik Friedman]: Yeah. I think the first thing, and something that I definitely would like to see people do more, is just stop and ask: does this make sense? And I don’t think people do that enough. And it’s at every stage, right? You look at the raw data — does it make sense? Is there anything off there? If you apply a methodology — does it make sense in this business context? You get a certain outcome — does this outcome make sense? So I think this is kind of like a first line of defense. You know, does it make sense?

00:36:38.60 [Tim Wilson]: But there are times where it makes sense, but it’s actually wrong. Like, if you’re like, oh, it made sense, therefore I’m going to go full steam ahead — and then you find out later that… oops. Just because it passed that hurdle, it was plausible, but.

00:36:57.44 [Moe Kiss]: But you’re making the best decision you can with the information available at the time. I feel like the “does this make sense?” — actually, it’s just, like, a spot check, right? I actually think the thing that we need to do better at is: does it make sense with our stakeholders? Like, actually bringing them into some of those checkpoints, versus kind of, does it make sense to me individually?

00:37:21.44 [Arik Friedman]: Absolutely. I mean, I guess this is the first check that you need to do with yourself before you engage with others, and I think that’s already a checkpoint that can probably spot some of the issues. Beyond that, yeah — involving other people in this thought process. In general, you want to do a peer review process for everything. And by the way, I’m coming from a computer science background; my brain is wired as a software engineer. So basically, any time that I do anything that involves statistical modeling or math, I will always pull in someone else that has, like, strong math chops and statistical modeling: hey, check my work. Did I actually apply the methodology correctly? Because I know that they think about this in a different way than I do, and they can spot things that I won’t.

00:38:17.40 [Tim Wilson]: Will the question come up in that context? Because you said like, does the raw data make sense? And from like the EDA of saying, I’ve made my initial query.

00:38:30.12 [Tim Wilson]: Now I’ve got a table of data that has say 25 columns and 150,000 rows.

00:38:40.72 [Tim Wilson]: Have I gone through and checked that, one, 150,000 rows seems like the right number of rows? Like, does the size of the data make sense? But also going and checking for gaps, distributions, the values — like, checking the columns. Doing that step of saying, this should be the right data, but have I actually checked all the values, all the variables, to see if the distributions and values are reasonable? Like, not just trusting the query — because that adds time, and it’s probably a judgment call.

00:39:25.36 [Tim Wilson]: It depends on how solid and simple the SQL is.

00:39:29.24 [Tim Wilson]: Does that make sense? And if you’re running it by somebody to ask, is this the right model, how likely are they to say: did you double-check that the data that you’re feeding into this model is the data that you think it is?

00:39:43.80 [Arik Friedman]: Yeah. And I think it’s probably both those things. And it depends also — is this the first time I’m answering this kind of question or not? And I’ll give you an example. There were cases, you know, at some point we worked on, say, internal developer effectiveness metrics, and we tried to understand flow of work and things like that. And this was the first time we went to calculate these kinds of things. So I worked on this, and there was another data scientist calculating the same metric in parallel, and we got completely different results. And the thing is that none of us was wrong. I mean, technically, we didn’t make any mistakes in the process, but we started working through the calculations step by step, and really, that’s when we realized that, oh, we actually made different assumptions about what the data meant, and we made different assumptions about what we wanted to measure. So just by working through this process until the numbers matched, it allowed us to align on the assumptions, and it gave us the confidence that we were actually measuring the right thing. It’s definitely not something you will do with any metric or any calculation, but definitely, the first time that you do something, it’s good to have this cross-check. And once you’ve got it right once, OK — the next time you already know what you’re doing, and you don’t need the same level of due diligence.
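The “does the raw data make sense?” pass Tim describes — row counts, nulls, value ranges — can be automated as a cheap pre-flight check before any modeling. A sketch, assuming a query result arrives as a list of dicts; the thresholds and column names are made up for the example and would need tuning to real data:

```python
def sanity_check(rows, expected_rows=(1000, 200000), required_cols=("user_id", "revenue")):
    """Cheap 'does this make sense?' pass over a query result.

    rows: list of dicts, one per record. Returns a list of human-readable
    problems; an empty list means the basic checks passed.
    """
    problems = []
    lo, hi = expected_rows
    # Does the size of the data make sense?
    if not lo <= len(rows) <= hi:
        problems.append(f"row count {len(rows)} outside expected {lo}-{hi}")
    # Are there gaps (nulls) in columns we depend on?
    for col in required_cols:
        missing = sum(1 for r in rows if r.get(col) is None)
        if missing:
            problems.append(f"{col}: {missing} null values")
    # Are the values in a plausible range?
    revenues = [r["revenue"] for r in rows if r.get("revenue") is not None]
    if revenues and min(revenues) < 0:
        problems.append("negative revenue values present")
    return problems

rows = [{"user_id": i, "revenue": 10.0} for i in range(5000)]
rows[0]["revenue"] = None
print(sanity_check(rows))  # ['revenue: 1 null values']
```

Running a check like this on every fresh pull costs seconds and catches exactly the “the query ran, but is this the data I think it is?” class of mistakes.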

00:41:18.56 [Tim Wilson]: So Julie’s, like, lighting up on the — oh, yeah, yeah, assumptions. That’s another thing we’ve talked about in the past: the discipline of documenting the assumptions that you made. And it may be, like, the list of assumptions I made were these 20. If I’m having another analyst review it, I’m going to show them all 20. When I present this to my business partner, maybe I even do a pre-read: these are the three or four that I want to make sure they understood I made. We’ve talked about that as a practice of analytics — you are making these decisions, and just making an assumption and moving ahead is pretty dangerous. Actually having in the process a place to write down those assumptions: one, for repeatability and somebody checking your work, but also to actually go back to the business and include that in the results. Not the exhaustive here’s-how-we-made-the-sausage, but, hey, I just want you to know, going into this. And if that’s your opening slide and they say, well, that’s a bad assumption, then you actually screwed up, because you should have done a pre-read: hey, before we present this, these are the assumptions we made and here’s why we made them. But how much do you think businesses do that?

00:42:41.20 [Moe Kiss]: Do they actually, though? Like, I feel like assumptions are included, but they’re often, like, at the bottom of the slide, in the text somewhere, and the business just glosses over them. Like, I would love a business stakeholder to engage — and I’ve had that at previous companies, where I’ve had very senior folks, like the founders or CEO, be like, I don’t agree with this assumption, let’s debate this, let’s talk it out. But I don’t see it much, and I would love to see more of that.

00:43:06.72 [Tim Wilson]: I mean, on the building-trust front, I find a way to not just have that in the delivery. Say there’s an executive you’re presenting to, but there’s this business partner that you’re actually collaborating with — it’s a great way to throw it in Slack and say, hey, is it safe to assume this? And have them say, yes, it’s safe to assume that.

00:43:29.20 [Tim Wilson]: And then I sometimes will put those assumptions, like, front and center. Like, I want to be confident that — but I probably wouldn’t say I made the assumption, I would say we made the assumptions. And, hey, just so you know, we assumed that that holiday has no impact on our bottom line. Yeah? No? That would be a bad assumption to make. But I mean, it’s not just a CYA of, like, throw this in the footnotes. I think it’s as much a show that you’re trying to understand the context and the operating environment, which is building trust upstream, which also means that there will be more trust in the output. No?

00:44:14.60 [Julie Hoyer]: Honestly, though, I feel like sometimes that type of documentation and assumptions are, like, definition stuff too. I’m with Moe — it’s refreshing and encouraging when your stakeholders key in on it and understand the value of it, or want to debate it and can call it out if it’s wrong. But, like, honestly, I think a lot of times it is more useful for yourself or others to replicate the work later on, or if you have to build on the current work.

00:44:42.92 [Julie Hoyer]: Like, I think sometimes, if you’re not kind of obsessed with, like, documenting all of those things — I mean, I’ve been the analyst that’s been given a spreadsheet with random numbers and, supposedly, where they pulled them from. And I have no other definitions or assumptions. And they’re like, oh, recreate this and do it again for the new time period. And you’re like, F my life, I’m never going to get this — and the person that made it originally is gone. So all that to say: sometimes, if it’s not building trust directly with stakeholders, I do think it helps, like, maybe the analytics team broadly, or the data science team broadly, more for the replicability. That’s how I see it.

00:45:26.32 [Arik Friedman]: Yeah, I think definitely, like, documenting the analysis or the assumptions. And that’s probably not the document that you’re going to eventually show as your final result, but it’s probably good to always have, like, here’s the technical document with all the details, all the assumptions — so it’s available there and you can reproduce this. And, yeah, referring to Moe’s point, I think having this conversation with your business partner is part of the process, right? You get a question: here is how I understand it, this is my interpretation, this is how it translates to the methodology. So you definitely want to be sure that you’re on the same page and you’re actually answering the same question.

00:46:10.32 [Julie Hoyer]: The other thing I think of too: if it does get put in the footnotes of the document or the presentation that’s sent to your business stakeholders — Moe, like, I have spent so much time talking to people on my team, or being so worried about it myself: what happens when I give this deck to my stakeholders, and then they pass it to their stakeholders, and they pass it to their stakeholders, and then they start asking follow-up questions and nobody knows to ask me for the clarification? So, honestly, I think my safeguard and anxiety is what makes me put a lot in the final documentation, in the appendix or in the footnotes. Because I’m like, hey, this might get passed, like, three degrees away from me — which, again, that’s a win if your work gets passed that far along.

00:46:54.84 [Moe Kiss]: I see this happen all the time with experiment results, where, you know, a data scientist has pulled something together, and then someone summarizes it, and then someone summarizes it for a Slack message, and, like, it steps further and further away. Which — look, do I want stakeholders to be able to interpret experiments and communicate them? Absolutely. But I think sometimes the understanding of, like, the core tradeoffs kind of gets watered down, or there are different incentives, and it just gets really tricky. Arik, I know you’ve done a lot of thinking about this, about once the analysis is out, the experiment result is out, and you kind of lose control of the narrative, so to speak — like, how do you approach that?

00:47:41.76 [Arik Friedman]: Yeah, so it definitely happens. Sometimes you get a chart or something that gets copy-pasted, and then someone puts it on a slide deck and it’s completely out of context, and you’ve lost that. And one thing that I try: again, if you have documentation of your analysis, you should definitely have a link to that. So if it’s copy-pasted, at least the link to the document is still there. And, you know, there’s the footnote in the chart where you can highlight the main details. So it’s definitely a question of balance, right? You don’t want to overload your visuals with all the assumptions and all the details — it’s distracting — but at least have this kind of selective context, or a pointer to where the information is, so it travels together with the visuals. And sometimes you just lose control and there’s nothing you can do about that. But at least you can take some mitigating steps so the evidence is there. If anyone really wants to find this information, there is a way for them to get there. So that’s the least we can do to at least control that.

00:48:59.24 [Tim Wilson]: Yeah. I mean, pointing to having the footnote of, like, what was the source — as you well stated, you don’t want to overload it with all the assumptions, but you want to provide the breadcrumb, and the more you can put that proximate to the chart, the better. I mean, it’s pretty gross misbehavior if somebody says, oh, there are footnotes on this slide, but I’m going to pull just the chart and drop it in an email. There’s a little bit of, that’s on them if it gets its own legs. If you’re giving it, like, hey, this is probably important reference information that should go along with it — I think that’s good. But I feel like we could go on for multiple hours. Unfortunately, we need to head to the wrap, or we will lose trust with our audience by having a two-hour episode, which, when you’re called the Analytics Power Hour, would be a disconnect between data sources. No, that’s a stretch. But before we leave, the last thing we like to do on the show, always, is go around the horn and share a last call, something that might be of interest to our listeners. And Arik, as our guest, you’re up first. Do you have a last call, or maybe a couple of last calls, you’d like to share?

00:50:26.00 [Arik Friedman]: The first one is, I guess, a classic: ISLR, the Introduction to Statistical Learning book, which is actually a free resource. And today there’s also a version for Python, An Introduction to Statistical Learning with Python. And, you know, at times when AI sucks all the attention, I think that actually going back to the basics, to the foundations, is just as important as ever, if not more so. And I know that, at least from my experience, even just going over the first chapters of the book — you know, linear regression — it’s a very practically oriented book. So, yeah, that’s a good recommendation that I usually give. Like, it landed for me.

00:51:12.96 [Tim Wilson]: They have an R version and a Python version with, like, examples in it. Is that right?

00:51:16.40 [Arik Friedman]: Yeah. So there are two versions of the book. The original one was with R — it’s ISLR and ISLP.

00:51:22.44 [Tim Wilson]: And ISLP. Introduction to Statistical… OK, gotcha.

00:51:24.68 [Arik Friedman]: And both are available at www.statlearning.com. And, like I noted, at least for me, it landed a lot of concepts that I didn’t really get before. So I really recommend it.

00:51:36.80 [Arik Friedman]: And if I’m allowed a second — a second last call, which is maybe more related. I saw recently an article from Hamel Husain called “Revenge of the Data Scientist,” and it’s actually available both as a YouTube talk — it’s a Pi AI talk from March — and he also posted it as a Twitter article. And he actually talks, I think, specifically about mindsets: you know, the mindset that you go with into this agent-first world. And I think his point is that the data scientist’s approach — their mindset, their core skills, like exploratory data analysis, metric design, model evaluation — all these things are as critical as ever and translate really well to this world. So I definitely recommend giving it a read. And also, Hamel Husain and Shreya Shankar had a terrific episode about AI evals on Lenny’s Podcast. So that’s a great resource as well.

00:52:41.64 [Tim Wilson]: Awesome. So that was three, really, sort of. But those all sound amazing. I feel like I’ve read the first three chapters of, like, so many of those. Those are the books I’m most likely to abandon but still get a lot of value from, because it’s, like, chapter four where I start to — I’m like, what, we’re at the area under the curve? I’m like, oh boy, oh boy, the equations are getting pretty in-depth here. So I do want to check those out. Julie, what’s your last call?

00:53:12.40 [Julie Hoyer]: Well, my last call is accurate and precise. And I’m feeling like maybe I have called this one out before, but you know what? It’s such a good one that I’m just going to say it again. It’s called the Monarch app. And I’ve actually been using it now for quite a while for my own family budgeting, as a financial tool for the family’s spending. And it’s really nice because you can connect everything directly into it. So on top of having all your different credit cards and bank accounts, you get really nice cash flow visuals. You can recategorize any of your expenses that come through, and it will notify you: hey, this is a recurring one, or hey, this is one that’s not categorized — do you want to come in and quickly do it? But budgeting is not easy to do on your own, and I feel like this is the first app, of the different things I’ve tried, that I can actually keep up with. I can quickly get to it. You know, there’s no delay. Like, I spend, it shows up there. My paycheck hits, it shows up there. So it’s been awesome. You can also set a lot of your budgets and goals in the app as well, for all your different categories — you can customize categories or not — and then you can say, you know, I’m looking to spend X amount each month in each category. And I feel like it makes it a lot easier to actually live out the financial plan and budget that we’re going for. So check it out if you’re needing a tool.

00:54:44.20 [Tim Wilson]: Have you figured out a way to turn off the net worth? Because there are periods where I want to not look at that. Yeah, I’m a Monarch user. So I’m a fan.

00:54:57.48 [Julie Hoyer]: But you are. So you guys, it must be good if Tim’s into it. If he approves of the visuals and the tools.

00:55:03.24 [Tim Wilson]: Yeah, judge, judge your trust in the source. Moe, what’s your last call?

00:55:11.20 [Moe Kiss]: I know we talk about her a lot, but Cassie — Cassie Kozyrkov — has a really great new… I mean, it’s just her regular newsletter, but she’s doing a couple of back-to-back newsletters on why vibe coding will bite you, and here is exactly where. And she’s talked through a few, like, tech scenarios of where things have gone really wrong — like prod systems being completely wiped, things like that. So definitely go have a listen to that. But I think the real takeaway is that the stories are all about the same thing, which is just, like, misplaced trust and the speed at which it computes. And so it’s not like anybody got hacked, AI didn’t go rogue — it’s just, like, people let their guard down. And the thing that’s been on my mind the most — she’s just so damn articulate — is: expertise won’t save you; guardrails might. And guardrails is the topic that just keeps looping in my brain at the moment. So, yeah, I’ve really enjoyed that newsletter, and she’s got a couple more coming out on the same topic. So check it out.

00:56:17.64 [Tim Wilson]: I think that series, like, motivated me to dive into some new vibe coding projects, specifically. So far, I’m safe — I haven’t crashed the podcast, because that’s where most of them happen. Starting to wonder if Riverside, our podcast recording app, might actually be leaning a little too much on vibe coding, given some of the joys it’s been bringing us of late. But I’m going to pander to Moe here a little bit, because with the new season of Choiceology that Katie Milkman came out with — one of the things she has is this checklist, which is this mapping of… now, she said she had a couple of undergraduate students do it, and that means it’s, like, in a PDF for some insane reason. But she’s basically gone through and looked at, like, what are the different topics — like attribution bias or Dunning-Kruger or left-digit bias, the stuff that her episodes cover. This kind of reverses it: it’s a guide broken down by all the topics of, kind of, cognitive biases, and then which are the episodes you can actually listen to. So if you’re not a Katie Milkman Choiceology completionist like I am, but you’re like, I wonder if she’s ever had an episode about, um, you know, mean reversion neglect, then this little guide will pop you to it. It is comically in a PDF, which I’m like — this is great, you guys really worked to format this thing to one page. And she’s about to start a new season, which means this, of all things, should be a vibe-coded website where it gets updated and maintained. But, you know, the undergrads, they’ll learn — that’s how it was scoped. Academia: they’re going to do their little thing. So, with that, Arik, thanks so much for coming on. I feel like this is a case where we actually have, like, the show prep documents that have even more gold in them that we were not able to get to. So we will have a lot of fun with that content ourselves, and we may figure out how to bring you back for more of it. So thanks so much for coming on.

00:58:31.64 [Arik Friedman]: Thank you very much. Awesome.

00:58:33.76 [Tim Wilson]: So, uh, listeners, we love to hear from you. We’d love to have you leave a review or rating on whatever platform you listen to us on. If you’d like a free sticker of the podcast, uh, you can go to analyticshour.io and request one. And if you’ve gotten this far in the episode and you think, wow, that was a smooth conversation and these guys are professional, I will just call out now that we have dealt with, uh, a tornado warning that led to a power outage and two young children and a dog sheltering in place with one co-host; we’ve dealt with a busted internet, um, that has been busted for the entire episode — but of course the repair team showed up for that during the episode; um, and we’ve dealt with various cases of people dropping off and returning and not even realizing that we were still recording the show. So I encourage you to stick around for the outtakes, because there might be some real doozies in those. Um, and I would like to say: Tony, please leave this in. Thank you so much.

00:59:39.92 [Tim Wilson]: ’Cause if you pulled this thing together, good on ya, mate.

00:59:45.28 [Tim Wilson]: Uh, so it’s been a fun discussion. Uh, we’ve been at this for four hours to get this one hour pulled together — no, it hasn’t been quite that bad. But we would love to hear from you. We’d love to hear if you thought that the edit was pretty smooth, or if you’ve got your own thoughts on how to build, maintain, and recover trust. Uh, you can reach out to us on LinkedIn, you can reach out to us on the Measure Slack, or you can just send us an email at contact@analyticshour.io. So for Julie, for Moe, and for all of the conspiring mother nature and construction projects that tried to not allow us to record this show about trust and accuracy and precision: keep analyzing.

01:00:34.80 [Announcer]: Thanks for listening. Let’s keep the conversation going with your comments, suggestions, and questions — on Twitter at @AnalyticsHour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

01:00:52.36 [Announcer]: Those smart guys wanted to fit in. So they made up a term called analytics. Analytics don’t work.

01:00:59.04 [Charles Barkley]: Do the analytics say go for it? No matter who’s going for it. So if you and I want to feel the analytics, they go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition. Fuck my life.

01:01:15.48 [Moe Kiss]: So the work that’s being done out the front of my house. Knocked the NBN cable out.

01:01:24.48 [Tim Wilson]: I’m gonna wrap now.

01:01:26.12 [Moe Kiss]: Oh, shit. Were we still recording? Oh, fuck my life.

01:01:33.84 [Tim Wilson]: I had to smoke this, but I was, I was trying to make so many notes that I, uh, on other aspects that I doubted.

01:01:40.56 [Moe Kiss]: There’s a lot going on today. Oh, my God. Well, you gave me so much.

01:01:46.56 [Tim Wilson]: I just, there was no writing.

01:01:48.88 [Julie Hoyer]: Nobody has the heart to tell you if I’m like, well, then the second one, one is like, so can I wrap? Oh, guys. All right, guys.

01:02:03.32 [Arik Friedman]: I definitely want to thank Tony in advance, because, like, yeah — there’s probably some work to do here. Yeah.

01:02:16.12 [Julie Hoyer]: My hot mic was the least of our concerns. I don’t know if you would have thought, Jesus.

01:02:21.08 [Tim Wilson]: Yeah, but I mean, I think it’s, I think it’s going to come together well.

01:02:26.04 [Tim Wilson]: And my mid-show recap was like me organizing my thoughts because I was like,

01:02:29.76 [Moe Kiss]: I think we’re actually hitting on some, I actually feel like Arik, we need like another two hours with you because there’s so much stuff here. It’s such gold.

01:02:39.40 [Julie Hoyer]: Seriously. Yeah. You had so much in the show prep doc that I was like, oh, I want to talk about that.

01:02:45.32 [Moe Kiss]: Oh, and this is why I was like, I knew that the two of you would really like Arik — because, like, you’re the same with being very good at prep and organization and all those things. I literally bought a new microphone and it’s still… Why have I still got this shitty one that doesn’t even have a proper stand? And it still works great, and everyone keeps buying all these fancy ones. It sucks.

01:03:08.16 [Julie Hoyer]: I got to return my hundred dollar one, I guess, and try a twenty dollar one. Maybe I need it to be less sensitive.

01:03:14.64 [Tim Wilson]: I think the show might need to buy you an audio interface. I think it’s — don’t, don’t, don’t make any moves. Don’t do anything drastic.

01:03:25.12 [Moe Kiss]: Yeah, the audio interfaces and changing microphones.

01:03:33.04 [Tim Wilson]: It is, because it’s moving where the preamp is. It moves it from a little crappy thing in the microphone, takes it out, and puts it in a dedicated, separate box.

01:03:44.40 [Moe Kiss]: OK. No, this is so above my head.

01:03:48.20 [Julie Hoyer]: I just want to buy a microphone that works.

01:03:59.24 [Tim Wilson]: Rock flag and accuracy versus precision.
