#172: Data Translator? How About a Data Detective? With Tim Harford

Data is everywhere, and it's simply not going away. Plenty of people do seem to ignore it to their peril, but if we are trying to make sense of the world, making good sense of data is absolutely critical. In business we call it data literacy, and, truthfully, it is a mandatory skill set for almost anyone. Data and understanding data might have a set of rules, but it seems like not everyone is committed to playing by those rules. Sometimes even our own brains get in on the act of hiding what the data actually means from us. That's the subject of this episode with Financial Times columnist, BBC presenter, and The Data Detective / How to Make the World Add Up author Tim Harford.

Books, Podcasts, and Surreal Game Shows Mentioned in the Show


Episode Transcript

[music]

0:00:05.7 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language. Here are your hosts, Moe, Michael, and Tim.

0:00:21.8 Michael Helbling: It’s Episode 172 of the Analytics Power Hour. Your hosts are Moe, Tim and me, Michael. Hi, Moe and Tim.

0:00:31.4 Moe Kiss: Hi.

0:00:31.8 Tim Wilson: Hello.

0:00:32.5 MH: Hey, do you like this? I’m still trying new things with the intro, can you tell? [chuckle]

0:00:37.0 TW: Keep trying.

0:00:37.6 MK: Very creative.

0:00:39.2 MH: Well, I'm, you know… Just figured it's time to tweak. Okay, listen, data is everywhere, it's simply not going away. I mean, plenty of people do seem to ignore it to their peril, but if we're trying to make sense of the world, making good sense of data is absolutely critical. In business, we call it data literacy. Truthfully, it's a mandatory skill set for almost anyone. Data and understanding data might have a set of rules, and it seems like maybe not everyone is committed to playing by those rules, and so sometimes we've gotta pay close attention. In fact, sometimes our own brains get in on the act of hiding what the data might actually mean from us. Anyway, I know the three of us have been looking for someone to bring on to the show who spends a good amount of time working in this area, and we did find someone. It's our guest, Tim Harford. Tim is an economist, journalist, and broadcaster. He's the author of “How to Make the World Add Up,” or “The Data Detective,” “Messy,” and the million-selling “The Undercover Economist.” Tim is a senior columnist at The Financial Times and the presenter of BBC Radio's More or Less, How to Vaccinate the World, and 50 Things That Made the Modern Economy. Moe, I know you had a personal connection to that one. Your dad, you mentioned.

0:01:56.7 MK: I know, my dad is so excited that you’re on.

0:02:00.8 MH: Yeah, It’s pretty cool.

0:02:00.9 MK: I feel like for the first time ever, my dad is like, Oh, I get what you do now when you talk about data.

0:02:07.1 MH: And also, he's the host of the podcast Cautionary Tales. Tim has spoken at TED, PopTech, and the Sydney Opera House. He is an associate member of Nuffield College, Oxford, and an honorary fellow of the Royal Statistical Society. Tim was made an OBE for services to improving economic understanding in the New Year's Honours of 2019, and today he is our guest, coming from the country that brought us Numberwang. Welcome, Tim Harford.

0:02:36.7 Tim Harford: Oh, there's no better introduction than that. It's great to be on the show. Heya!

0:02:42.2 MH: Hey, there you go. I hoped you would get the reference. That's us…

0:02:45.4 TH: Oh yeah, oh yeah, yeah.

0:02:47.1 MH: Okay, great. Well, we're delighted to have you, and specifically, we're talking a lot today about your latest book, The Data Detective, right? And I think it was released under two different names, sort of doing that cool thing like Harry Potter and the Philosopher's Stone versus…

0:03:02.8 TH: Yes.

0:03:03.0 MH: Yeah. So, I see what you did there, nicely done.

0:03:05.5 TH: Yeah, not my choice, but yes. So it's The Data Detective in the US and Canada and Guam, for reasons, [chuckle] for reasons, and it's How to Make the World Add Up in the UK, Australia, India, South Africa, basically the rest of the world.

0:03:22.2 MH: The rest of the world. Well, in any case, we're delighted, and our listeners do hail from all of those places, so do look for that book. But let's talk a little bit about it. Just out of the gate, we're gonna spend a lot of time on chapter one. Tim Wilson, you know why?

[laughter]

0:03:36.2 MH: Because…

0:03:37.8 TW: Because that’s how far you made it.

[laughter]

0:03:39.8 MH: No, because it’s about your feelings. You’re very… That’s very cute.

0:03:44.6 TW: Okay.

0:03:44.7 MH: No, because the first thing you cover in the book is your emotions as they pertain to relating to data, and I've never seen a book about data lead off with that, so that was just really compelling as I started digging into the book. That was actually a pretty good start. So maybe we could start there: why did you lead the book off with that?

0:04:03.9 TH: Sure. Well, I actually… Strictly speaking, I lead the book off with a takedown of How to Lie with Statistics, so we could talk about that as well.

0:04:11.5 MH: Oh, that's true, that's true. I have a little note about Dr. Doll, I think… Dr. Hill, yeah.

0:04:17.3 TH: Yeah. That's a very interesting story; I didn't know of it myself. But so the first full chapter… Yeah, there's the full Star Wars reference: search your feelings. The book's got 10 rules, 10 rules for thinking more clearly about data and maybe more clearly about the world, and the first rule is to search your feelings. The reason that I start with that, and in fact with a story that I love about an art forgery that has nothing whatsoever directly to do with data, is because our feelings are clearly influential on the way we think. We have preconceptions, we have things that we want to be true, things that we expect to be true, things that we're told are true by people that we admire and respect, things that we're told are true by our political enemies, by outsiders, by the… You know, the other side of the argument. And in all of these things, we are processing this information not just with our logic circuits but with our gut feelings, and there's a lot of high-quality evidence to show that people are very, very good at reaching conclusions that they want to reach, either subconsciously or consciously. The basic argument of the first chapter of the book is: if you don't recognize that, and if you don't notice your emotional reaction to the facts, you haven't got a chance; you've fooled yourself before you've even begun.

0:05:51.7 MK: And so when you're working through, say, an article or a stat… Or it sounds like you and your wife have some very interesting discussions sitting around the kitchen whenever the news comes on. What goes through your mind in that… 'Cause I think it's one of those things that's really easy to say, like, “Oh, you should check your emotions before you proceed along the path of analysis.” But what does your process actually look like? Do you have a particular set of checks that you put in place for your own mind?

0:06:22.2 TH: I should… That's a good idea. I should actually have a checklist on the wall. That would probably be a good idea, make up my 10 rules as a checklist, or even, actually, I think I've boiled them down to three. We can talk about what the three are if you like. But fundamentally it's a habit, not a process, not a checklist; it's a habit. I've just tried to get into the habit of noticing what my instantaneous response is to a claim, and it's surprising how often there is an emotional response. Maybe we shouldn't be surprised. When you think about it, most stuff on social media goes viral because it's emotionally resonant one way or another: makes us laugh, makes us afraid, makes us feel vindicated, makes us feel part of a community. Most headlines are chosen for their emotional punch. That's what newspaper editors are trying to do, TV news editors.

0:07:19.2 TH: So maybe we shouldn't be surprised that so much of this stuff packs this emotional weight. And I can't prove it, this is an untested hypothesis, I have to say, but I just feel fairly confident that simply noticing that emotional response, being mindful of it… To sound like a yoga instructor with a calculator for a moment, [chuckle] being mindful of that emotional response immediately drains it of some of its power. You can just look at yourself and go, “Oh, I'm… I'm having the feels about this headline.” And then you go back and you look at the headline and you just see it slightly differently.

0:07:57.4 MH: That’s so great. I… Yes… 100%.

0:08:00.8 TW: And that is… That does… I've heard that before: when I'm having an emotion, stop and examine it, and then try to get a little more Vulcan about it, I guess, to stick with the Star Trek side of it.

0:08:13.3 MH: Sort of asking, why is it making me feel this way? Like, if this were true or whatever… Yeah.

0:08:19.9 TH: Yeah. Although on the subject of being more Vulcan, everybody should read Julia Galef's book, The Scout Mindset, which is about just sort of being more curious about the world, and there are some interesting overlaps with The Data Detective, or How to Make the World Add Up. But one of the points she makes is that Spock, who I recognize is only half Vulcan, Spock's actually really terrible at logic. [chuckle] He makes bad decisions all the time, really misjudges other people, makes terrible forecasts, and yet constantly just talks about how incredibly logical he is. So let's not hold up the Vulcan as [laughter]… as emblematic of perfect rationality, because, yeah, he's not. He's not. He's much more fun than that.

0:09:01.8 TW: Well, so can I ask, 'cause… As I was reading through the book and thinking, there was a lot of kind of, sort of fist pumping, and that led me into tracking down More or Less, which is available as a podcast, and I've kind of… I guess I sort of came to you through Cautionary Tales, which has some overlap with the book, which is to be expected. So if anyone doesn't wanna read the book, the art forgery story is in a Cautionary Tales episode as well. We do, as analysts, we want our audience to be more kind of, as Michael framed it, data literate. So do you have thoughts on how to… I wanna run out and get stakeholders to, like… Get my business partners like, “Oh, read this book. It will make you think more rationally.” But we don't have… Or not. Now I'm feeling like I'm second-guessing every word that I'm using. I think recognizing that that happens is still a really, really powerful tool for the analyst, but it's a little tougher to be prescriptive with others.

0:10:12.0 MK: Tim, I love it when you ask the same question as me, but you say it much more eloquently. It’s like… [chuckle]

0:10:20.9 TH: You're right. And a lot of people ask me, in a nutshell, how can I stop my dumb friend being dumb, or how can I stop [laughter] my dumb uncle or my dumb client or my dumb boss or whatever, how can I stop them being so stupid? To which my answer is, well, don't worry about them, maybe try and get your own head straight first. Stop trying to be so stupid yourself, and if you can fix your own thinking, 'cause we all make mistakes in the way we think… If you can fix your own thinking, then isn't that enough? But it is true that we are always wanting to fix other people. There's a huge appetite for that. And I'm never quite sure what to advise in terms of persuading other people to be rational, but I always try and start with: start with yourself.

0:11:13.5 MK: Do you think that there is ever a degree of damage that has been done by books like Lies, Damned Lies and Statistics, or, you know, the fake news saga, or like… There are all these other schools of thought that are constantly telling people not to trust numbers, and… I mean, particularly in our industry, working in data and analytics, part of our job is trying to convince people that, actually, we should be making decisions using these numbers, for whatever reason, in whatever context. Is it worth the fight?

0:11:45.3 TW: But the numbers aren’t perfect and they’re not simple and it’s not clear. I mean that’s…

0:11:48.8 MK: Yeah. The numbers aren't perfect. Yeah. But is it still worth… Like, how do we still then make the case for, this is how we should make our decisions?

0:11:57.0 TH: Yeah. So I mean, I think that we… I talk towards the end of the book about the work of Baroness Onora O'Neill, a philosopher who's become something of an authority on the question of trust. And there are a couple of points that she makes that I find really interesting. First of all, there's no point just calling for people to be more trusting: oh, trust is eroded, everyone should just trust more. We don't want people to trust more in some generic way. We want people to trust with good reason. We want people to trust things that are trustworthy, institutions that are trustworthy, methods that are trustworthy, and we want them to distrust people and institutions and methods that are not trustworthy. And the second point she makes, which I find very useful, is just, trust itself is not a generic thing. I've got a friend who I absolutely would trust to take care of my children, but I would not trust him to post a letter, [chuckle] and vice versa: people I would definitely trust to post a letter, and there's no way I'd trust my children with them. People have different skills, have different abilities. So when we're talking about trusting a process, trusting an institution, trusting a person, we're trusting them to do what?

0:13:22.4 TH: And so when we come back to this question of trusting numbers, one of the things that I'm trying to get people to do is to take a bit more responsibility for exercising their own informed judgment, to ask the right kinds of questions of the data that will empower them either to trust or to distrust the numbers as appropriate. Basically, it's just to get a lot smarter about thinking critically. And I think once people have a bit more self-confidence, they're asking these smarter questions, they start to feel, “Oh, actually, yeah, I can tell the difference between something that's probably true, something that's probably not true, and something that we don't know and need to ask more questions about. I can do that.” It's harder than we think in some ways, because we are surrounded by this sort of blaze of emotions and tribal reasoning and so on. But in other ways, it's easier than we think. Statistics and data science can be incredibly technical, but most of the questions the layperson needs to ask are not in fact that technical. They're fairly straightforward.

0:14:28.5 TW: But is it… There's kind of, in the business or the marketing context, it does seem like there's… And I don't wanna say… Again, it's an over-simplification to say healthy skepticism. I have memories, as I was entering the analytics world, of being frustrated with statisticians who couldn't give me a black and white answer, and now I am, and have been for a number of years, that guy who can't give a black and white answer. And we run into that all the time as we're trying to present… Somebody says, “How much traffic came to the website?” And you get to, “It's not sampled, 'cause I know everybody who came to the website.” But then you talk to the analyst who says, “Well, there are these seven or eight caveats when we say this is how much traffic came to the website.” And it's a simple, I'm-just-trying-to-get-a-sense-of-the-scale question, and it is still giving you the right scale. So trust it to the point that this is giving you an order of magnitude. Even though we've given it to you down to the nth digit, it's not that precise. And trying to walk that line between saying, “Let me make sure you understand that none of this stuff is perfect, but I also don't wanna undermine the trust that you have in me, that I know how to get the data that you're looking for.”

0:15:50.6 TH: Yeah, it's a good point. I think one of the things that we need to get better at doing, both as suppliers of data or suppliers of analysis and as consumers of data and analysis, is asking or explaining what's basically going on here. When we look at, for example, traffic to a website, that metaphor gives us something that we can understand: well, if there's a thousand vehicles an hour on a road, that's a lot more than 100 vehicles an hour on a road, and each vehicle is a discrete thing. But actually, traffic on a website… I don't know really what traffic on a website actually means. I don't know. So if the Google bot comes and looks at the website just to index it for Google, is that traffic? And how many… Is it like every time you download an individual GIF or JPEG, is that traffic? I don't know, and I'm not sure you do. So just trying to explain to people the difference between hits and unique visitors, for example: here's what we did, here's what we looked at, here's how it might not be telling you quite the story you think it's telling you, here's what we know, and here's what we don't know. Awakening that curiosity about the process of actually putting this data together, I think it helps… It brings people together. People like understanding what's going on under the surface, and it also helps people make, I think, a slightly more informed decision.
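To make the hits-versus-unique-visitors point concrete, here is a minimal, hypothetical Python sketch (not from the episode; the tiny event log and the bot filter are assumptions for illustration) showing how one small log yields three different “traffic” numbers depending on the definition used.

```python
# Hypothetical event log: three different "traffic" numbers from the same data,
# depending on whether we count raw hits, bot-filtered hits, or unique visitors.
events = [
    {"visitor": "v1", "user_agent": "Mozilla/5.0"},
    {"visitor": "v1", "user_agent": "Mozilla/5.0"},
    {"visitor": "v2", "user_agent": "Mozilla/5.0"},
    {"visitor": "crawler", "user_agent": "Googlebot/2.1"},  # an indexing bot
]

human_events = [e for e in events if "Googlebot" not in e["user_agent"]]

hits = len(events)                                   # every request, bots included
human_hits = len(human_events)                       # bot requests excluded
unique_visitors = len({e["visitor"] for e in human_events})

print(hits, human_hits, unique_visitors)             # 4, 3, 2
```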

0:17:28.3 TW: Just from a quick little personality standpoint, there's the analyst who will happily dive way down and explain all the nits and details of what that means. I think that's, again, the analyst's challenge: to figure out how much appetite the person has for learning and understanding, how can I use metaphors, how can I use things that will help them get that deeper understanding without coming across as, “Well, I am an expert on stuff that you don't really care about, but I'm gonna talk your ear off.” I think a big takeaway from having dived deep into the world of Tim Harford of late is the power of metaphor and example, and making things kind of engaging and interesting so that somebody learns along the way. Clearly, I think that's probably one of the reasons that you have so many different outlets: because you are kind of a master of that, and an analyst could learn a lot just from trying to observe how to be engaging and interesting while educating. And now I'm just pandering to the guest.

0:18:41.0 MK: Yeah, I think you're just saying nice things. That's nice. Thank you.

0:18:44.3 MH: What’s your response to that, Tim? How awesome are you?

[laughter]

0:18:48.3 MK: So Tim, maybe you could talk us through a little bit of… One of the suggestions you had in your book about handy comparisons, because that was something that I actually really took away as something really tangible, like, “Oh shit, I'm gonna go and memorize a bunch of these to have in my back pocket,” because it's a really tactical thing that, as an analyst, helps give clarity, or make sense of the data in a way that someone can understand it.

0:19:14.9 TH: Absolutely. Okay. So we start with a number like, oh, US government debt is $18 trillion… I'm actually not sure exactly what government debt is, but it's probably around the $20 trillion mark. So there are a couple of questions you might want to ask about that, like, what actually is debt? People confuse the debt and the deficit a lot. You might say, does it include things like an obligation to pay military pensions? 'Cause they haven't issued a bond, they've just kind of promised to pay. You can ask all these basic questions about what's going on there. What about the debts of New York City or the debts of the State of California? Are they included, or is it just the federal government? But even once you go beyond that sort of question, which I think is not super technical stuff, it's just basic curiosity, trying to understand what's happening here. Once you go beyond that, you've got this number; let's say it's $20 trillion.

0:20:16.0 TH: Then you need to say, “Well, is $20 trillion a lot of money?” Sounds like a lot of money. It sounds like it must be a lot of money. So then you need to find a comparison, and the kind of comparison that has tended to be made, and this I think goes back to a speechwriter for Ronald Reagan in the early 1980s, is to say, “Well, let's have a stack of dollar bills, and how high is the stack of dollar bills?” And that feels like a good comparison. It's vivid, and you can picture a big old stack and… Yeah, yeah, I understand. It's no use at all. You're getting stupider listening to that comparison, even though it's very, very popular. Because, well, here's one question: how many dollar bills to the yard? I know because I looked it up, but is it like… Is it 100, is it…

0:21:04.3 TW: It’s an interview question. Yeah.

0:21:05.3 TH: Is it a million? It's probably like 1,000 or more. It's probably… It's less, it's gotta be less than 100,000. I don't know. You realize you're converting a number that at least is clear into an image that is completely unclear. Turns out it's about $8,000 to the yard, for what it's worth. And if you then say, “Okay, well, it stretches into space, or it stretches to the moon, or it stretches to the sun”… Well, actually, space and the moon and the sun, those are three quantities that are totally different. And again, most people don't know how far away space is. I think the official definition is often 100 kilometers, about 70 miles; I may have misremembered that. The moon is about a quarter of a million miles, I think. I mean, none of this is helping. None of this is helping. What does help is to compare numbers to quantities that we do understand. So for example, to compare US debt, you can just quickly say, well, hang on. There are… And I actually don't know this number off the top of my head.

0:22:15.5 TH: So let's do it right now, 'cause it's not that hard. There are 325-ish million people in the US. So that's a third of a billion. So if you had a billion dollars of debt, that would be $3 each. So a trillion dollars of debt is $3,000 each. So $20 trillion of debt is $60,000 each. And the precise numbers are not exactly that, but that's about right. And then you go, “Okay, right. So federal government debt is about $60,000 for everyone in the country.” That just gives you a sense, like, that's quite a lot, but it's not as much as a mortgage, though it's pretty big compared to a credit card debt. It really gives you a sense of what is actually happening. And I call these landmark numbers.
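As a quick worked version of that landmark-number arithmetic, here is a short Python sketch using the rounded figures from the conversation; the everyday comparison values at the end are illustrative assumptions, not official statistics.

```python
# Landmark-number arithmetic with the rounded figures used above.
total_debt = 20e12    # roughly $20 trillion of federal debt (rounded)
population = 325e6    # roughly 325 million people in the US (rounded)

per_person = total_debt / population
print(f"Debt per person: ${per_person:,.0f}")   # about $61,500, i.e. roughly $60,000 each

# Compare against quantities we already understand (assumed, illustrative values).
landmarks = {"a typical credit card balance": 6_000, "a typical mortgage balance": 200_000}
for name, value in landmarks.items():
    print(f"That's about {per_person / value:.1f} times {name}.")
```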

0:23:08.3 MH: Let's step aside for a second to talk about something that every data professional at every level has a vested interest in, and that is data quality, 'cause if you don't have the right quality of data, none of your analyses or insights is actually gonna be meaningful or move the needle. And so our friends at ObservePoint have been working for many, many years to give data professionals confidence in their data and insights. They do this by providing tools that automatically audit data collection for errors across all your sites. You can actually test the most important pages and user paths, so that way you've got accurate data collection where you need it the most. There are built-in alerts so you know when something goes wrong, so that way, when the monthly report comes out, you're not getting a big shock or surprise, and you can track the quality of your data over time. So if you haven't looked at ObservePoint recently, or you've never heard of them, we recommend going over to their website and requesting a demo. You can do that at observepoint.com/analyticspowerhour, all one word, to learn more about ObservePoint's full set of data governance capabilities. Alright. Let's get back to the show.

0:24:17.9 MK: Okay, Tim. So we've started to touch a little bit on a concept that you introduced us to in your book, which… I was actually walking my new baby around and I had to stop the pram 'cause I was like, “Oh, that's what this is called.” Premature enumeration was this [laughter] concept in your book that I just suddenly… I felt like everything that had gone wrong in every meeting ever with someone, I was like, “This is what happened…”

0:24:48.6 TW: And you kind of made it up, right?

0:24:51.1 TH: So yeah, I mean, premature enumeration is something that I end up discussing a lot with my wife, because we'll be listening to the radio in the morning and there'll be a headline, and she's got this idea of me as a professional nerd, so I've just got this big spreadsheet of every fact in the world in my head. And so somebody will make this numerical claim, and she'll turn to me and she'll say, “Well, is that right?” And generally the answer is, “Well, I don't know. And the reason I don't know is 'cause I don't actually know what they're talking about; I don't actually know what they're measuring.”

0:25:29.2 TH: Just thinking back to the point we were discussing before about US government debt: what is it? Is it the federal government, is it the central government and all the states and all the cities, the municipalities, or what? What is explicit debt, and what's kind of a bit fuzzier, like pension obligations? All of this stuff. It's so, so common to hear a claim that is true and completely defensible and based on solid statistics, and yet will utterly mislead you, because you have leapt to some conclusion about what is actually being measured or counted, and in fact something completely different is being measured or counted. And this habit that I think we all have, including, I think, perhaps especially data practitioners, this habit of starting to slice and dice and analyze the number before you actually know what the number means or refers to, what the definition is: that trap, or that habit, is what I call premature enumeration.

0:26:38.1 MK: So just one example from my work: we have a term called annual recurring revenue, so given that it's recurring, it's basically subscription money, right? And then people will also have annual revenue, which is subscription money plus other money, so print orders and all sorts of other stuff. And it happens constantly in my work where people will interchange the two like they're the exact same thing, and I sometimes feel like I'm banging my head up against a wall. But you seem to have this very, like, chilled, calm way of just asking clarifying questions, so in that situation, do you just kinda stop and say, “Okay, well, are we talking about recurring revenue, or are we talking just about annualized revenue?” How do you tackle that?

0:27:28.4 TH: Yeah, yeah, absolutely. Well, I think annualized is different from annual, if I wanna get really nerdy, am I right?

0:27:35.3 MH: That’s right.

0:27:36.1 MK: Yes, that’s true. That’s true.

0:27:37.8 TH: Just saying. But yeah, it does help, I think, to try to be calm. Somebody reviewed the radio show More or Less and accused me of having been chemically altered in order to keep my cool. I took it, I took it as a compliment. I think it was meant as a compliment; I certainly took it as a compliment, because you don't get any smarter by getting angry. It doesn't help. So yeah, I'm just trying to sort of ask these questions that clarify the situation, that clarify what's going on. It's absolutely essential just to keep your own head straight, and then… And of course for others as well. And people don't like the idea of these gotcha questions, they don't like being tripped up, but I think it's possible to ask these questions in a friendly way, to just clarify that everyone understands what's being talked about.

0:28:28.8 TW: It's kind of the cousin of the curse of knowledge, right? Like, Moe, when you hear somebody say annualized revenue, and you are immediately… Because you are so deep in annualized revenue and annual recurring revenue, you're like, “Oh, they're being loose and casual, and they're making assumptions, they don't understand it.” And yet the tendency, or I will say this as an analyst… There's the data that I know really well, that I've been really deep in, and I know it better than the business users who are maybe even generating it, but then I need to supplement it with some other piece of data, and I immediately assume that that data is simple and clean and is exactly what I would expect it to be. So I find myself sometimes catching myself, trying to remember when I'm looking at, like… We've got a pharma client, and I'm working with data on emails we've sent out. That data I know really well, but then I wanna get the number of prescriptions that were written.

0:29:25.4 TW: And I don't know that data that well, but I make these premature enumerations, and I'll totally be like, “Well, just give me those numbers,” and then there's some other person in the organization saying, “Well, but do you mean new prescriptions, do you mean the number of doses, do you mean the number of refills?” And of course, I had already started trying to slice, 'cause I assumed it was one number, and they're like, “Well, no, there are seven different ways that can be defined.” And I feel like that is the sort of thing that comes with experience, recognizing… When I've got depth of knowledge and the people I'm working with don't, it behooves me to say, “Let me do a little bit of education,” in a hopefully engaging and maybe useful, analogy-filled way. But at the same time, when I get those data sources… So when I read about premature enumeration, I'm like, I'm guilty of that. Others are guilty of that, and if others are guilty of that, when is that my fault for assuming that they have the depth of knowledge about the data that I'm providing… that I have…

0:30:26.8 MH: And so often you're viewed in that situation as the person bringing unneeded complexity to this, or why are you asking so many questions, or why are you trying… And it's like, it actually matters.

0:30:38.1 TH: Yeah. Yeah, but I think that by trying to phrase it in a sort of accessible, everyday way and explain, or try to understand, what is basically happening here… Sometimes it is very complicated, but often it's all common-sense stuff. I mean, an example from the book, a very serious example: we found, and this is from the radio show, that infant mortality in the UK was increasing after decades of falling, and so people were trying to figure out what was going on, what was causing this terrible tragedy to become more common. And in a nutshell, it wasn't becoming more common, at least we don't think so. What instead had happened is that medical practitioners, for reasons that we still don't fully understand, although we can speculate about them, had started to change the definition, basically draw the line in a different place between a miscarriage and an infant death.

0:31:50.7 TH: And of course, you realize that this is one of the most contentious dividing lines in American politics, right? Of course, it's hugely emotionally and politically and morally and legally significant where you draw that line, but at a certain point, someone just has to call it. And what was happening was that the doctors and nurses were saying, “Actually, we're no longer gonna call this particular tragedy a late miscarriage; instead, we are going to say this was a premature birth and the baby died.” And I think that was probably partly to reflect the desires of the parents, who felt that a miscarriage didn't fully describe their loss. But everyone was running around trying to figure out, rightly, trying to figure out what the cause was. Was it some mistake, was it a funding issue, was it a lifestyle issue, was it something with obesity or smoking or something? And it was a statistical recording issue. I think we owe it to ourselves to ask those questions and to dig a little deeper.

0:32:56.7 TH: There's another example, which is hugely important but less traumatic, from the financial crisis, where people running these quantitative models in the banks and the other parts of the financial sector were trying to measure risk, and they got very, very sophisticated in slicing and dicing and repackaging risk, then re-forming it in a way where you could on-sell risk and offset it. And yet, when it comes down to it, what is risk? Not actually that easy to define. And instead of measuring risk, what they were actually basically measuring was historical variance and covariance, which is not a bad measure of risk, but not perfect either. And the more sophisticated you get in the way you slice it up and re-package it and start making trillion-dollar bets on it, the more the subtle difference between risk and historical variance starts to… Yeah, that subtle difference starts to become incredibly consequential, and that's one of many reasons why the entire global financial system melted down in 2008. It was premature enumeration all over again.
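To illustrate the gap Harford is describing between risk and historical variance, here is a small, assumed simulation (a sketch for this write-up, not anything from the book): a volatility estimate fitted to a calm period badly understates what a later, more turbulent period actually delivers.

```python
# Assumed illustration: historical variance is a backward-looking risk proxy,
# so it breaks down when the future stops behaving like the observed past.
import numpy as np

rng = np.random.default_rng(7)

calm_history = rng.normal(loc=0.0, scale=0.01, size=1000)       # daily returns we observed
turbulent_future = rng.normal(loc=0.0, scale=0.05, size=250)    # daily returns still to come

historical_vol = calm_history.std()
realised_vol = turbulent_future.std()

print(f"Volatility estimated from history: {historical_vol:.3f}")
print(f"Volatility actually realised later: {realised_vol:.3f}")
# A model calibrated on the calm period understates the later swings roughly fivefold.
```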

0:34:15.5 MH: Yeah, this is great. I love the conversation so far, but there's so much more in the book, so there are a couple of chapters specifically I wanna make sure we touch on. A little bit deeper in the book, you lay out some rules for how to think about or look at data in terms of where did this data come from, that was sort of rule number five, and then rule six is sort of what data might be missing, and I just found both of those chapters very, very cool to read and also very timely. So I'd love to talk a little bit more about that. Like, you mentioned in the chapter… It's not chapter six, but I would call it chapter six, rule six, which is what's missing from the data.

0:34:55.3 TH: You can call it chapter six, it’s fine.

0:34:56.8 MH: Yeah, it's fine… Yeah, everybody just go with me on this. I'm re-doing the chapters right now, it's not a big deal, just go with it. So many studies that we all quote in these kinds of things from history are just massively missing a whole swath of the population from any kind of representation at all, including, like you made mention of, the famous Yale shock study where only men were involved.

0:35:25.7 TH: Milgram's electric shock experiment, yeah.

0:35:28.5 MH: Yeah.

0:35:28.8 TH: Well, only men, for no obvious reason, yeah.

0:35:31.2 MH: Yeah, and so talk a little bit about your exploration of that and sort of some of the things you've come to. I've certainly got some opinions about this, but it's, I think, important for data and people who use data.

0:35:42.7 TW: As long as you don’t have an emotional reaction.

0:35:44.5 MH: I have a very emotional reaction because…

0:35:47.6 MK: You're allowed to have an emotional reaction, you just need to notice it, that's fine.

0:35:50.7 MH: Yeah, I am very in touch with it. Tim, well said. So… But anyway, I'm not really forming a question as much as, look, I wanna just put us in that part of the book for a minute and start talking a little bit about that, 'cause I think our listeners would benefit a great deal from hearing a little bit from you on this.

0:36:07.2 TH: Absolutely. So the basic idea in statistics, the very, very basic principle, is that you're analyzing a sample and you want the sample to be representative of the whole. But which whole are you actually representing, and who's missing? So there we could talk about the… I'm sure most listeners will know the very famous example of the 1936 US presidential election and the Literary Digest trying to forecast this election and catastrophically mis-forecasting, despite an absolutely huge sample. It's a sample of over two and a half million people, which is extraordinary in 1936, but it just turned out to be a very, very unrepresentative 2.6 million people that they sampled, so they got it wrong. And there are issues here of inclusion and representation in the more sort of political sense. So my favorite example of this comes from Caroline Criado Perez's book, Invisible Women.

0:37:09.1 MH: Oh, I love her book. Oh so nice.

0:37:11.9 TH: Yeah, it's terrific. It's absolutely terrific. She talks about the original trials of sildenafil. So sildenafil is this drug that maybe is gonna treat heart pain, and it's tried out on a whole bunch of people, and they discovered this interesting side effect, which is magnificent erections, and so sildenafil then gets sort of re-marketed and repackaged as Viagra. And the rest is history, except when you hear that, you go, “Well, hang on. But presumably the women in that study weren't getting magnificent erections.” And the answer is, what women in the study? They didn't exist; there were no women in that trial, it was just tried on men, 'cause it's just easier, because we don't wanna be trying new drugs on pregnant women, and women might be pregnant, so we don't want… It's just easier, 'cause we know men aren't pregnant. So that's fine. So it was all men, and so they found this side effect in men that turned out to be really useful.

0:38:10.8 TH: More recently, there has been some work, which I think is far too small and preliminary to be confident of, suggesting that sildenafil might actually be a very effective treatment for menstrual cramps, for severe period pain, which is a problem for a lot of women. Now, it could be that if they had had some women in the initial trial, they would have found that 25 or 30 years ago, but they didn't, so we're still wondering. And meanwhile, Viagra has been around so long, it's now off patent. So that is just an example of the way in which you can exclude a whole class of people without even realizing that you've excluded them. And it's bad statistical practice, but it's also extremely unfair.

0:38:52.3 MK: One of the things that you touched on was academic research and the publishing bias, and you mentioned that basically if a study has been shared previously, journals are not likely to publish a second story, a second article that has similar results or is confirming the result. Do you know if that's true even if it's tested on a completely different group? Like, let's say we did have a medication that was only tested on men, but now they have results on women, would that also still be excluded, or would that…

0:39:22.5 TH: Well, that's a very interesting question to which I don't know the answer. And I think that for medical trials, they tend to be much more inclusive now in general. So for each additional medical trial, we've got institutions such as the Cochrane Collaboration, whose whole reason for existing is to make sure all of the medical trials that are performed get included in a big meta-analysis. So I think, and I'm not an expert on this, that if you manage to get the funding to run a medical trial… It's not a trivial thing to do, by the way; it's much more expensive than, say, testing a hypothesis on 96 psychology undergraduates, which is a more standard practice. If you manage to put together a proper clinical trial, that's gonna be published somewhere, and if it's of a decent standard, it's going to be reflected in the meta-analyses published by the Cochrane Collaboration.

0:40:20.9 TH: More of an issue is that you've got these often quite small studies that produce results that are highly questionable. And somebody comes along with a bigger study and arguably a better study, and quite often the result goes away, and it's much harder to publish the second study, because people are saying, “Well, we kind of already knew this.” But, well, this is a different result. “Well, yeah, but you know, this has been studied already.” I think this situation is improving, but slowly. And journals are still… Fundamentally, journals are still in the entertainment business, even if it's a very niche, very nerdy bit of entertainment. They want to engage their audience and excite people with something new, rather than say, “Well, that thing that everyone kind of thought was true all along, we thought we'd go and check, and it's sort of still true, but maybe not as true as we thought. The effect size is smaller,” or whatever. People aren't so interested in that.

0:41:27.8 MK: It actually solidified, I guess, in my mind that that's a really good practice for us to get into in a work context, because we often do run tests and we see a result, and then we're like, great, let's now roll this thing out to 100% of our users, and let's assume that we're gonna see this effect forever. And the truth is, it's good practice to go back and re-test and make sure that that hypothesis still stands up, and it's something…

0:41:51.9 TH: Absolutely.

0:41:53.0 MK: I do sometimes think it's a luxury when you're super busy at work, but your book actually really solidified for me that that's something we should all be doing as analysts: going back and re-testing.

0:42:04.0 TH: Yeah, and for several reasons. One is because the world changes; one is because the original result might have been a fluke. It is surprising how often flukes can happen. You think that the standard statistical controls are ruling them out, but they aren't really. And also, just, you never know. You never know. Go and check again. It's astonishing how much stuff evaporates under scrutiny. And maybe when you're asking some new question, a fresh question, it's worth just checking the original question first. There's an example of this in the book: I talk about this very famous study of jams in a fancy supermarket in California, and the famous result is, if you try and get people to try these free samples and then maybe to buy some jam, if there are loads and loads of different options, they'll try a free sample, but they won't buy jam; but if there are just a few options, they'll try your free sample and then they'll buy jam. This is a very famous result because it's kind of… It's a bit surprising. It's counter-intuitive, but it's not crazy-counterintuitive, and it hits this real sort of sweet spot.

0:43:16.7 TH: So the reason that we now think this is quite possibly not true is somebody trying to re-run the experiment in order to then amplify, clarify, analyze, break it down and ask, when does this happen and when does this not happen, just trying to take the field forward. But before he took the field forward, he's a guy called Benjamin Scheibehenne, he first said, “Well, let's just sort of establish a baseline and redo the original experiment.” Effect's gone. It's astonishing how often this happens. It's unnerving how often this happens. But as I describe in the book, everything we see goes through what I call the interestingness filter. The more interesting it is, the more likely you are to see it, no matter what way you're searching for information; but also, the more interesting it is, the more likely it is to be some kind of outlier or fluke one way or another.

0:44:09.7 TW: But it gets back to having… Understanding a degree of the data. Sometimes digital analytics people will run an A/A test, where they say, let's split our traffic and give both groups the same experience, and then they'll see a difference and they'll say, “Ah, this entire thing is flawed.” As opposed to saying, no, you need to understand variability. So also, when you run an A/B test and you're seeing an effect, there's variability within that effect; if you repeat it, you're not gonna get the exact same result, because you're running at a different point in time with a different sample from a different population, 'cause it's a different population in time. That also kinda harks back to what you talk about… Even with the 1936 presidential election, with the volume of data… I think in digital marketing there's this idea that we have all of the data and therefore we're set, except we don't have any data about our future population. So when we look at these things and then want to replicate them, there are things moving in the world; like you said, the world changes, so we wouldn't expect to see the exact result, but at the same time, if it's real, we should be able to reasonably replicate it. Yeah.
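As a hedged, illustrative aside (a sketch for this write-up, not anything from the show): a simulated A/A test makes the variability point concrete, because two groups served the identical experience still report slightly different conversion rates purely from sampling noise.

```python
# Simulated A/A test: one underlying conversion rate, two samples,
# and the observed rates still differ by chance alone.
import random

random.seed(42)
true_rate = 0.05   # identical conversion rate for both groups
n = 10_000         # visitors per group

def conversions(rate, visitors):
    """Count how many simulated visitors convert at the given rate."""
    return sum(random.random() < rate for _ in range(visitors))

rate_a = conversions(true_rate, n) / n
rate_b = conversions(true_rate, n) / n
print(f"Group A: {rate_a:.3%}  Group B: {rate_b:.3%}")
# Any gap here is noise, not evidence that one (identical) experience is better.
```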

0:45:31.6 TH: Absolutely. David Hand has an interesting book called Dark Data, where he talks about all the data we want to have but we don't have. And he starts with the obvious stuff, like, well, you're sampling from a population, but you've only got the sample, you haven't got the whole population. Then it gets into more complicated stuff, like, it turns out you only ran experiments on men, you don't have any women in the sample; we are analyzing Twitter, but most people don't have a Twitter account. And then he starts to get really deep, like, we only have data from the past, not data from the future. And you think, well, of course we don't have data from the future, but it does matter. Going back to this idea of risk in the banks: we have data on how these prices have moved in the past, but that doesn't tell us how they'll move in the future. So, yeah, there's a lot of stuff we don't know. And even though one of the things I'm trying to do in the book is to make a case for the use of data, wisely used… I say it's like having a telescope, it's like having an X-ray, it's like having radar; it's showing you things you can't see in any other way. At the same time, there's a lot that it can't show us, and we need to be very honest and humble about that.

0:46:47.4 MH: Well, and it's a great set of rules. As I read through the book, I started thinking to myself, this book is kind of like the everyman's Thinking, Fast and Slow. That's sort of the graduate level…

0:47:00.5 MK: Every person's…

0:47:02.2 MH: Yeah, every…

0:47:03.2 MK: Every person's.

0:47:03.2 TW: Every person's.

0:47:03.8 MH: Every person, sorry.

0:47:06.1 MK: It’s all good. It’s all good.

0:47:07.6 MH: No, it’s good, I appreciate it, ’cause that’s… Yeah, you just stumble right into it.

0:47:12.1 TH: Now, can I use that quote? There's a really nice… I like that quote.

0:47:16.0 MH: Let me read you the quote so you can use it, so I'll do it correctly this time: it's sort of the every person's Thinking, Fast and Slow, in a certain sense, and that sort of creates that accessibility. We can just pretend I never screwed that up in the first place.

0:47:31.2 MH: Alright, we're bringing back the new segment of the show brought to you by Conductrics: this is the Conductrics Quick Quiz, the query that is critical and a conundrum. So Moe and Tim, once again you get to host a listener as a contestant; your answers will determine who wins fabulous prizes. So let's say a word really quick about Conductrics. They're an A/B testing company that's been helping the world's largest companies for over a decade, conducting API-first A/B testing, contextual bandits, and predictive targeting. They empower marketing and IT professionals, they also respect customer privacy, and they offer best-in-class technology, but really what sets them apart is their willingness to go above and beyond for their clients and exceed expectations.

0:48:23.7 TW: And make your co-host uncomfortable on a bi-weekly basis.

0:48:28.8 MH: This is like kind of the…

0:48:30.8 TW: That’s just the icing on top of the Conductrics cake.

0:48:34.3 MH: Alright, do you wanna know who you’re playing for?

0:48:36.7 TW: I do.

0:48:39.2 MH: Alright, so we’ve got a listener, Robert M is the listener, and Moe, you are representing Robert.

0:48:48.8 MK: Okay, Robert, you poor mite…

0:48:55.4 MH: He's used to disappointment. No, that's not true, but last time it was… Last time it was a complete 50-50 chance either way on the answer, it was just a guess, so you've got just as good a chance of winning.

0:49:07.6 TW: There's gonna be a central limit theorem thing there if we repeat the sampling process here again and again.

0:49:13.0 MH: That's right. We do have options. And Tim, you are representing Morgan P. So Morgan is who you are representing. All right, here goes the question. Buzz in if you know the… No, I'm just kidding, there are no buzzers. We should get you buzzers.

0:49:31.8 MH: George Box is often credited with coining the phrase, “All models are wrong, but some are useful.” In truth, what he said in his 1976 paper is, “Since all models are wrong, the scientist must be alert to what is importantly wrong. It is inappropriate to be concerned about mice when there are tigers abroad.” Along with giving us his famous quote, which of the following is not one of George Box's many contributions to statistics? Is it 1, the Box-Jenkins method? 2, the Box-Cox transformation? 3, the Ljung… I don't know how to pronounce that… the Ljung-Box test? Or 4, the Box-Neyman method?

0:50:19.9 MK: Can I phone a friend?

0:50:21.4 MH: A friend? Do you have Matt on speed dial?

[overlapping conversation]

0:50:31.6 MH: This is a tough one. There’s no getting around. This is a really tough one.

0:50:35.5 TW: What was B? What was B again?

0:50:38.8 MH: The second one is the Box-Cox transformation.

0:50:43.7 TW: I’m gonna buzz in and say… I’m gonna go with B?

0:50:47.2 MH: Moe do you have a thought?

0:50:48.8 MK: I’m gonna go with C, because I figure if it’s the right answer, hopefully, you could pronounce it, but you know.

0:50:52.2 MH: The Ljung-Box test.

0:50:56.1 TW: But, the right answer is the wrong answer, so…

0:50:58.5 MH: Yeah, it's what he didn't contribute. So if I can't pronounce it, doesn't that seem like it's maybe more right?

0:51:05.6 MK: Well, now you're reverse-psychologizing me. I'm doubting my own guess.

[chuckle]

0:51:08.8 TW: Kinda like Peter Sagal coming in here with, like, the… Giving the answer to the celebrity.

0:51:11.3 MH: I'm not trying to put my finger on the scale at all. Okay, here we go. The answer is: you're both incorrect. [laughter]
[overlapping conversation]

0:51:24.1 MH: This is not a pre-recorded portion of the show, [laughter] there is no advance knowledge of what's happening, this is just how it works. So now you're choosing between the Box-Jenkins method and the Box-Neyman method. Let's talk quickly about the other two: the Box-Cox transformation is used to transform data from a non-normal distribution into an approximately normal distribution, and the Ljung-Box test is a statistical test of whether any of a group of autocorrelations of a time series are different from zero. So those two do exist.
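For anyone curious what those two real Box methods look like in practice, here is a short, assumed example using SciPy and statsmodels on synthetic data (an editorial illustration, not part of the quiz).

```python
# The Box-Cox transformation and the Ljung-Box test, applied to synthetic data.
import numpy as np
from scipy import stats
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(0)

# Box-Cox: transform positive, right-skewed data toward an approximately normal shape.
skewed = rng.lognormal(mean=0.0, sigma=1.0, size=500)
transformed, fitted_lambda = stats.boxcox(skewed)
print(f"Fitted Box-Cox lambda: {fitted_lambda:.2f}")

# Ljung-Box: test whether a time series' autocorrelations differ from zero.
white_noise = rng.normal(size=500)
print(acorr_ljungbox(white_noise, lags=[10]))  # expect a large p-value for white noise
```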

0:52:00.2 TW: So now it's the Box-Jenkins…

0:52:01.8 MH: Box-Jenkins or Box-Neyman.

0:52:03.1 TW: Okay, I’ll let you go first and I’ll take the other one, but I know which one I want to say is not here.

0:52:11.7 MK: I just have a gut vibe, so I feel like maybe you should go with your actually informed opinion.

0:52:15.0 TW: No, go with your gut, I'm pretty sure… Oh, mine is not informed.

0:52:18.0 MK: Oh, I’m going with A.

0:52:20.1 MH: Going with A, Box-Jenkins.

0:52:22.4 TW: And that's good, 'cause I was leaning towards D, definitely, definitely…

0:52:26.3 MH: Box-Neyman?

0:52:28.6 TW: Did not do Box-Neyman.

0:52:30.2 MH: He definitely did not do the fourth one, D, Box-Neyman.

0:52:35.3 TW: Yes.

0:52:35.7 MK: Which means, Morgan, you're the winner. And here we go, [chuckle] it always comes down to a 50-50, and Tim still pulls out the win.

0:52:47.1 TW: But Neyman, there's another Neyman, 'cause Neyman is a… something else…

0:52:54.8 MH: That's kind of the usefulness of this whole thing: you can kind of iterate through all the answers and finally get to the right one, [laughter] just like you can with Conductrics and A/B testing. [laughter] Alright, let's get back into the rest of the show, and thanks for doing the quiz.

0:53:13.7 MH: All right, we do have to start to wrap up, even though this has been such a tremendous conversation. Tim, thanks so much for being here. One thing we love to do on the show is go around and talk about something we've seen recently that's been of interest and that we think might be of interest to the audience. So Tim, you're our guest, do you have a last call you'd like to share?

0:53:37.0 TH: Sure. Well, I think I mentioned it briefly already, but Julia Galef's book, The Scout Mindset, is a really fun read, published a few weeks ago. She's helping people think in an open-minded and curious way about the world, and her basic argument is, we should be thinking like scouts: there's a lot of stuff we don't know, we're trying to map out the territory, we're trying to figure out what's out there; rather than thinking like soldiers, where we've got a position and we need to attack the enemy and we need to defend our position. What surprised me about the book, apart from the really cool riffs on Spock, which are great… What surprised me about the book is how much energy she spends, in a good way, just making a case for, like, this is something you should want to do. So as well as giving you all kinds of tips on thinking more clearly and seeing the world more clearly, she's actually saying, “This is the right thing to do, this will benefit you.” The supposed benefits of being very sort of defensive and aggressive in the way you think about data are often illusory. So I love the book and I recommend it a lot: The Scout Mindset by Julia Galef.

0:54:52.4 MH: Nice, that’s great, awesome. Alright, Moe, what about you?

0:54:56.6 MK: Well, since I have been on a bit of a hiatus, I have a twofer. So one is actually a tip from Tim, which completely got me thinking; it was kind of about how you zoom out on a particular issue and then zoom in, and that there's a big difference between what you read in the paper every day versus what you might read weekly, monthly, quarterly. And I'm a big person for, like… I do sometimes like weekly magazines or quarterly essays and that sort of thing, and now I feel like I have an emotional response about why it's an okay thing to read those instead of checking the news every day. But I really like that perspective: if you take time to read, say, quarterly articles or books, you're gonna read something more in-depth than just the daily article, and I thought it was, I don't know, kind of a nice perspective to remind me to make time to read those longer articles. And the other thing that I've just been weirdly obsessed with, which I thought I'd share: I've been listening to the Land of the Giants podcast, and the first season is about Amazon, the second is about Netflix, and the third is about Google, and I just, like, churn through each season in two days. So if anyone wants to hear about some of those tech giants and some of the weird and wonderful things that are going on with them: Land of the Giants.

0:56:18.6 MH: Very nice. Alright Tim, other Tim. Our Tim.

0:56:22.9 TW: Other Tim. [chuckle] This will be a twofer on two levels; I'm gonna pander again. So I am gonna actually… Well, one thing we didn't talk about was the… And I just stumbled across it and made the connection, and Tim confirmed that I was making the right connection, but Cautionary Tales, one of Tim's podcasts, in season two there's a “Fritterin' Away Genius” episode that starts off talking about Claude Shannon, who anybody who's been around Matt Gershoff very much will have heard him reference: Claude Shannon, father of information theory. But this is really kind of exploring how Claude Shannon did some of his most seminal work when he was quite young, and then he kind of moved on and was kind of famous for being all over the place.

0:57:07.3 TW: And that episode, along with what puts a little bit more of a formal label to it, I think, a TED talk that Tim did called “Slow-Motion Multitasking,” talks through how there's value in not just being mono-focused on one thing. Which… I was making the leap because, for a while, it seemed like it was very much in vogue that the generalists were always going to get their clocks cleaned by specialists, and that you really, really need to specialize. And so I was trying to take the leap, because I've got some other interests outside of this analytics stuff, [chuckle] so it was validating. So there was definitely some confirmation bias going on. But the idea that activating different parts of your brain and taking breaks, and all the things we kind of hear, like, “Oh, if you're really struggling with a problem, step away, because that may be when it comes to you”… They're both very, very engaging things to listen to. I'm not counting that as my twofer.

0:58:03.3 MK: What?

0:58:03.3 TW: The other thing I was gonna do was put a plug in for the DAA's OneConference coming up in October, which is gonna be a hybrid in-person and virtual event in Chicago. They're getting their speakers lined up, but if you're interested in getting back into the world of conferences after a year or two off, the Digital Analytics Association actually has their conference coming back, partially in person.

0:58:29.3 MH: I appreciate that, Tim, thank you.

0:58:31.0 TW: I was just taking care of that for you.

0:58:32.4 MH: Yeah. I know, that helps me out quite a bit.

0:58:34.2 MK: And now he’s like, shoot, what am I gonna say for my last call?

0:58:37.6 TW: What’s your last call, Michael?

0:58:38.9 MH: No, no, I have one. So here in the United States, we've definitely been struggling with and coming to terms with something that's pretty important around police brutality in our country and how it affects different communities and races. So one of the things that really struck me over the last year, as I tried to dig deeper into this issue, was sort of a lack of data around this in any kind of comprehensive fashion. Well, someone who I've mentioned on the show before, Samuel Sinyangwe, and a team of other data scientists and others have created a new site called policescorecard.org, and they've actually started putting together public data on outcomes, police violence, accountability, bias, all these things, across 16,000 different municipalities all over the United States. They've even built APIs into this data so it can be leveraged by other groups. It's just really amazing, this is exactly…

0:59:35.5 TW: Did you start pulling the data and slicing it immediately without actually…

0:59:39.6 MH: No, I just… No… But what they're doing is exactly what the book recommends, which is going and trying to build a stable data source, so that there can be the kind of analysis and accountability brought to some of these conversations. It's been very difficult because it gets very emotional and heated, and the data, when you start to look through it, supports some of the findings that have been pretty obvious to some communities for quite some time. We've got things we need to do, and it very much shows up in the data. So anyways, policescorecard.org is the name of the website, and it's quite impressive, and they're continuing to add to it, and they have a way for people to bring data to the program and to the project, so I encourage you to check it out. Alright, well, what a great episode. This book was so fun to read because, A, the stories: I learned so many things about stories I knew just a little bit about, but then gained so much deeper knowledge… You know, Florence Nightingale, and… I'd never heard of Alice Rivlin before and learned a little bit about her, and also some of the other people in that story, so you have to read the book to find out more about who she is. It's fascinating, and I would recommend it. It's called The Data Detective, or also How to Make the World Add Up, depending on where you live in the world.

1:01:04.5 TW: Or More or Less is actually available in podcast form.

1:01:07.9 MH: And if you are a podcast listener, like Tim Wilson, some of these stories are covered in Cautionary Tales and other podcasts. Tim Harford, what a delight to have you. Thank you so much for coming on the show.

1:01:20.5 TH: It’s my pleasure, thank you.

1:01:22.6 MH: And of course, no show would be complete without thanking our producer, Josh Crowhurst. Love you, Josh, and I know that I speak for both of my co-hosts… And incidentally, at the end of this book, the last chapter is about being curious, so I think I speak for our guest too, when I tell you all: no matter how much data you're dealing with out there, keep analyzing.

1:01:50.9 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

1:02:09.0 Charles Barkley: So smart guys want to fit in, so they made up a term called analytics. Analytics don’t work.

1:02:15.6 Thom Hammerschmidt: Analytics. Oh my God, what the fuck does that even mean?

1:02:22.5 TH: Okay, so some of you are drinking coffee, some of you are drinking wine. I’m kind of hoping that the wine is for 7:00 PM and the coffee is for 7:00 AM and not the other way around, ’cause otherwise we have a problem…

[laughter]

1:02:35.2 MK: In honor of the podcast…

1:02:36.6 TW: I’m concealing what’s in here, so…

1:02:38.7 MH: Yeah, I’m actually drinking tea…

1:02:38.9 TH: Really we’re going to go there.

1:02:41.4 MH: In honor of our guest, I’m drinking tea.

1:02:44.9 TH: Okay.

1:02:45.6 MH: So I’m trying to get into the vibe. I ate an English muffin for breakfast, so yeah…

1:02:49.8 MK: Oh wow.

1:02:50.2 TH: Yeah, that’s great.

1:02:51.1 MH: Feeling pretty British. No.

1:02:57.1 TH: Yeah, 45 minutes into the podcast we'll be like, “I just love you all.”

[laughter]

1:03:02.9 MH: That’s right.

1:03:04.2 TW: We had a guest who…

1:03:05.4 MK: That has happened once before.

1:03:06.6 MH: Oh, that’s true.

[laughter]

1:03:09.9 MK: That’s true.

1:03:11.7 MH: It was like, we can’t fix that slurred speech, can we? [chuckle] Okay.

1:03:16.3 TW: It was the early days.

1:03:22.1 TH: I’m gonna stop ’cause someone’s knocked at the door. Talk amongst yourselves, be back in 2 minutes.

[laughter]

1:03:29.5 MH: We've got it figured out, though, because the median income is about $59,984, so every person just gives up one year of income and we've got the debt covered.

1:03:44.2 TW: Oh yeah.

1:03:44.4 MK: I’m so nervous.

1:03:46.8 MH: Why?

1:03:51.5 MK: Because he’s so cool. He’s so smart.

1:03:52.9 MH: No, he’s just like, easy going.

1:03:56.5 TW: Yes he’s really smart.

[laughter]

1:03:57.3 MK: So scared.

1:04:06.4 TW: Rock flag and premature enumeration.

[laughter]

1:04:08.0 MH: That had to be the one…

[music]
