#126: When the Data Contradicts Conventional Wisdom with Emily Oster

Did you hear the one about the Harvard-educated economist who embraced her inner wiring as a lateral thinker to explore topics ranging from HIV/AIDS in Africa to the impact of Hepatitis B on male-biased sex ratios in China to the range of advice and dicta doled out by doctors and parents and in-laws and friends about what to do (and not do!) during pregnancy? It’s a data-driven tale if ever there was one! Emily Oster, economics professor at Brown University and bestselling author of Expecting Better and Cribsheet, joined the show to chat about what happens when the evidence (the data!) doesn’t match conventional wisdom, and strategies for presenting and discussing topics where that’s the case. Plus causal inference!

Episode Transcript

[music]

00:04 Announcer: Welcome to the Digital Analytics Power Hour, Tim, Michael, Moe, and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at facebook.com/analyticshour, and their website analyticshour.io. And now, the Digital Analytics Power Hour.

00:27 Michael: Hi, everyone! Welcome to the Digital Analytics Power Hour. This is episode 126. Ah, ye olde conventional wisdom. It seems like there’s stuff out there that everybody knows. In the world of business, a lot of times, they’re called ‘best practices’, something that is generally accepted by most and… That’s fine. Well, right up until it isn’t. As analysts, it’s our job to detect the truth of what is happening in the data, to correctly diagnose the issue, and make the right recommendations. But if we’re already convinced, because we know something, well, it’s time to put a critical lens on what it is, and what we think we know. Hey Tim, you’re kind of a skeptic.

[chuckle]

01:11 Michael: You… You go with the best practices, or conventional wisdom?

01:16 Tim: No. I tend to try to, just out of principle, object to whatever best practices there are.

[laughter]

01:21 Tim: Even if the data doesn’t support me.

01:24 Michael: So quintessential. Thanks, Tim. Hey, Moe? Are you excited to do this show?

01:29 Moe: Oh, this is… Yeah, I can’t even, I was gonna burst out of my seat, I’m so excited.

[laughter]

01:34 Michael: I’m super excited too. I’m Michael. So, obviously, we needed a guest, because all the best practices say so, and we always adhere to best practices. And the conventional wisdom dictated that we seek out a person with some real credibility. You may have seen her TED Talk, challenging what you think you know about the AIDS epidemic in Africa. She’s also written two books: Expecting Better, and Cribsheet: A Data-Driven Guide to Better, More Relaxed Parenting, from Birth to Preschool. And they’re kind of a big deal. She’s also a Professor of Economics at Brown University, and she received her undergraduate and doctorate degrees from Harvard. But today, she’s our guest. Emily Oster, welcome to the show!

02:13 Emily Oster: Thank you for having me.

02:14 Michael: Oh, it’s our pleasure. We’re super excited to have you on the show. I think, to get things started right away: how did your parents feel about you going to Harvard, given they’re both professors at Yale?

[laughter]

02:26 EO: I think that a lot of people ask this…

02:28 Michael: Oh, okay.

02:29 EO: I don’t know, I… They didn’t seem to be… My younger brother went to Yale, so I guess they got one.

02:35 Michael: Okay, so… Sorry, that was not really my opening question. Actually, what I thought would be a great kick off to the show is… Your background is in economics. So talk a little bit about your journey to delving into correcting conventional wisdom in so many areas, ’cause it seems like it’s become a trademark of your research and career.

02:56 EO: Yeah. So I am an economist, and I sort of think of myself… That is my identity, and that is… If you ask me to describe myself, that is what I would say. But as I have progressed through my life, I guess, I feel like Economics is really useful for a lot of areas. And so, in particular, when I got pregnant, I ended up doing a lot of what I perceived to be just my job in the service of pregnancy, and then parenting. And so, I think a lot of the books and my approach to life comes from really feeling like Economics is the center of all the ways you should do everything, and that the problems that people have are they just don’t do enough economics in their everyday life. And I’m here to change that.

03:40 Moe: So one of the things, though, that’s really remarkable about your work is, whether it’s to do with pregnancy, or how you raise your children, or even HIV, these are all very emotive topics, where people often hold really deeply held views. And some of that can be just, whether it’s their grandparent, or their parents-in-law, who’ve all been there before… Yeah, it must be a really challenging space to work in, given… Yeah. Everyone really does have this really strong opinion.

04:10 EO: Yeah. I think part of the message in some of the books… Part of the message in both of the books is that you have to make the choices that work for you, and that those are not necessarily going to be the things that other people are telling you to do, and the recognition that you should just say thank you for your comments, and move on. And I feel like there’s such a temptation to wanna convince people that you’re right in these spaces. Like people are like, “Oh, you shouldn’t have a caffeinated coffee, ’cause you’re pregnant.” The lady in the Starbucks line wants to tell you that. There’s such an instinct to be like, “Actually, let me explain some things about the studies to you. You weren’t really paying attention to this.” And maybe it’s better to be like, “Okay, thanks. Thanks.” Or just to be like, “Yeah. It’s a decaf.” And just lie to her. Because what does it matter? [chuckle] You can’t… You kind of can’t always convince people, and you can’t always meet these sorts of beliefs with evidence.

05:03 Tim: But it’s so easy to get sucked into, when you have all this evidence and you realize that… This happened to me because, in reading Expecting Better before the show, I wound up in a discussion over a Guinness with a neighbor, and I was really just making conversation. And man, I hit some nerves with him. And I kept coming back to saying that the point of the book is just to be informed, and then to make your own decisions, which seems like, even in other areas of life, that’s not bad to say, “Why don’t you actually get the data, and then if you still have a belief about X or Y, at least you’re making it informed.” But even that is trying to change people from the religion of “I know what I believe is right,” right?

05:48 EO: Yeah. And I think, particularly with some of the sensitive things around pregnancy and parenting, these feel really hard. People can get really upset about thinking that they… That we have to be as careful as possible, and any amount of risk is too much risk, and you should never do anything that anyone has ever suggested might be bad for your kid, or bad for your fetus. And I think that’s a pretty complicated… It’s a pretty complicated set of things to interact with somebody about.

06:18 Moe: It does seem like with these particular topics, a lot of it is that “why risk it?” attitude. So whether it comes to caffeine or alcohol, and particularly because we can’t test on pregnant women, it’s not like you have a whole series of studies where you can compare one woman that did, and one woman that didn’t. So how do you challenge and manage that attitude of, well, we just shouldn’t take any risks? And one of the topics actually, that I’ve been talking a lot about with friends of mine, is mental health issues, and people are like, “Oh, we’ve never done tests on mental health medications, so therefore you just shouldn’t risk it.” It’s like, yeah, and if a woman had any other health issue during pregnancy, we would wanna treat that. And I don’t know if it’s just because it’s so… Is it ’cause you can’t test it, or is it the emotion, or is it both combined?

07:11 EO: Yeah, I mean, so I have many things to say about this. So one is that there’s actually somebody whose name I can’t remember, which is not super helpful, but who’s at Virginia, I think, who is an ethicist, who has been thinking a lot about the idea that it may be unethical not to experiment on pregnant women. We have this idea that we can’t experiment on pregnant women, so we can’t see what kinds of antidepressants are safe, but I think her point is like, “Well, no, actually, it is unethical not to do that, because then we don’t have the right kind of evidence to guide people in that space.” And I think that’s actually a pretty interesting point, which is, for me, pretty resonant. And so, I don’t know how we can sort of societally deal with this idea that we have to be as safe as possible, without recognizing that there are trade-offs.

07:57 EO: I mean, mental health is a good and sort of extreme example of that… To say, well, let’s not put people on an SSRI because you never know, what if there was some bad effect on the baby, but not to be like, well, actually, don’t you think that depression during pregnancy and postpartum could also be bad for the baby? And this comes up again in breastfeeding. People are like, “Well, I don’t wanna put women on an SSRI ’cause then they have to quit breastfeeding.” Or women who say, “Well, I didn’t treat my postpartum depression because I was afraid I would have to quit breastfeeding.” And doctors told me, well, the safest thing is not to combine these things, but of course, then there’s like nine different things. You’re supposed to breastfeed, you’re not supposed to be depressed, you’re not supposed to take SSRIs. It may be that you literally cannot do all those things. You can not be depressed by taking an SSRI, but then… You just can’t do all three of them at the same time.

08:44 Michael: There’s no set of pairs that don’t… They’re all dead ends.

08:48 EO: There’s no path, there’s like no way to do those things. And so then we’re like, “What are we telling people to do? Just, like, do everything.” Well, actually, those things literally cannot all happen. So, I don’t know.

09:00 Tim: Well, that starts to… But the testing, well, one, we have a history here: there has been human testing in Tuskegee or Puerto Rico or Nazi Germany, right? So there’s all that baggage of it not being done transparently and ethically. It also starts to sound like the trolley problem as well, like, collect the data… I’m starting to feel really good about being in digital, where we can just A/B test web experiences. And people… The worst that gets talked about is that they got a different global navigation on the website than somebody else.

09:29 EO: Yeah, and I will say, I mean, this is one of the spaces where there is some evidence, it’s just… It’s typically not randomized, right? So it’s not that it’s impossible to do studies that involve data on pregnant women, or on people’s kids, it’s just that they’re typically not randomized. And I think that we have to be a little bit creative if we want to think about randomization, and some of the best evidence on this does come from creativity. So something like, how do we know that smoking during pregnancy is really bad, is bad for babies’ birth weight? We know that because we’ve tried to help people quit smoking, right? So it’s not like you randomly tell some women to smoke, but you randomly help some people quit smoking. If you can actually help them quit smoking, then you can use that both to get people to quit smoking, but then, as a side note, to figure out the impacts on birth weight. So some people have done clever stuff and we can learn things from data, but not as much as I think we often wish.
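For readers who want to see the shape of the “encouragement design” Emily describes, here is a minimal, illustrative simulation (not code from the episode or her research): randomize help to quit smoking, then scale the encouragement’s effect on birth weight by its effect on smoking. Every name and number below is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Z: randomly assigned to a (hypothetical) smoking-cessation program
Z = rng.integers(0, 2, n).astype(bool)

# D: still smoking during pregnancy; the program gets some smokers to quit
baseline_smoker = rng.random(n) < 0.30
quits_if_helped = rng.random(n) < 0.40          # the "compliers"
D = baseline_smoker & ~(Z & quits_if_helped)

# Y: birth weight in grams; assume smoking lowers it by ~200g (made up)
Y = 3400 - 200 * D + rng.normal(0, 400, n)

# Wald / IV estimate: effect of encouragement on Y, scaled by its effect on D
itt_y = Y[Z].mean() - Y[~Z].mean()      # intent-to-treat effect on birth weight
itt_d = D[Z].mean() - D[~Z].mean()      # intent-to-treat effect on smoking
print("Estimated effect of smoking on birth weight:", itt_y / itt_d)   # ~ -200
```

The ratio at the end is the classic Wald/instrumental-variables estimator; under the usual IV assumptions it recovers the effect of smoking for the women whose behavior the encouragement actually changed.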

10:21 Tim: Well, causal inference being like such a… Like in the data science world becoming like this big topic, trying to figure that sort of causality out in the limited world… Moe’s laughing because I, as I was starting to read the book, I’m like, “It is. It’s all about causal inference.”

10:37 EO: It’s all about causal inference. Yes, secretly, I took my job as a person who researches causal inference, and I made it into a book about babies.

10:47 Moe: Tim’s like, yes! Causal inference is taking over the world. One of the questions I also wanted to understand a little bit about from you was, I guess, the standards to appear in medical journals nowadays. And it seems to be a topic that’s coming up; the ABC news here in Australia had a show on this recently. So basically, some of the studies you talk about in your books are really old, which I was kind of shocked to see, and that’s still informing a lot of opinions today. And part of that is, the threshold to get into a medical journal is you need to have a significance level of 95%, which means that if it’s 94%, people think it’s irrelevant. And what about publishing studies where there’s no effect? Like, do you think there’s something in that space that needs to change, overall?

11:36 EO: Yes, I do. I think that there are sort of two pretty significant problems that I see in the space of papers that I spend a lot of time with. So one is just that these kinds of public health literatures have not really confronted the kind of credibility revolution that I think other places have gotten to. So that means that there are a lot of studies published which are effectively just, “I’ve compared these two groups whose moms did different things, and I adjusted for the moms’ education, and that’s all I control for.” And that kind of approach, that is not going to lead to causal effects. Like the selection, the differences between those families, on observable things and on things that the researcher doesn’t see, are just really vast.

12:21 EO: And I think we just need to recognize that a lot of that research, it’s not just that it could be better. It’s that, like, we should learn nothing from it. People are like, “Well, it could be something else.” No, no, it is something else. It’s not the thing you think. It’s just differences in the families; like, at best, it’s a tiny amount of the thing you think, and almost all just differences across families. So I think we just need to confront those problems more directly. And I think the other issue is just a sort of general publication bias issue, which is problematic for this kind of research, but also randomized controlled trials, which we tend to think of as better, also have this kind of… John Ioannidis at Stanford has done a lot of interesting work on this idea that publication bias, in general, causes us to publish a lot of results which are probably wrong. He has a really nice paper which is called ‘Almost all published results are wrong’ or ‘Most published research findings are false’ or something, that I like a lot, which I think is true.
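As a rough illustration of the publication-bias point (my sketch, not Oster’s or Ioannidis’s analysis), here is a small simulation of what happens when a hard p < 0.05 threshold acts as the gate to publication and most tested hypotheses are truly null: the share of published “findings” that are false positives gets uncomfortably large. All parameters are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_studies, n_per_arm = 10_000, 50
true_effect_share = 0.10        # only 10% of tested hypotheses are real (made up)

published_true = published_false = 0
for _ in range(n_studies):
    has_effect = rng.random() < true_effect_share
    effect = 0.3 if has_effect else 0.0
    treat = rng.normal(effect, 1, n_per_arm)
    control = rng.normal(0.0, 1, n_per_arm)
    _, p = stats.ttest_ind(treat, control)
    if p < 0.05:                # the journal's "95% significance" gate
        if has_effect:
            published_true += 1
        else:
            published_false += 1

false_share = published_false / (published_true + published_false)
print("Share of published results that are false positives:", round(false_share, 2))
```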

13:16 Moe: When you were doing your research, how did you dig out, I suppose, the… ’Cause there’s obviously a lot of published content, and there’s a bias toward assuming that’s more accurate. How did you make those judgment calls, if you wouldn’t mind talking us a bit through the methodology?

13:31 EO: Yeah, so I think the big… I think the issue of publication bias, I kind of punt on, in the sense that I have, in the back of my mind, the idea that we’re gonna see more significant results, and I sort of try to use that to filter a little bit. I think the bigger piece of what I’m doing is, just within the results that are published, trying to distinguish the ones that I think are more compelling from the less compelling ones. And so, breastfeeding is probably the best example of that. In the second book, which is sort of most of the work of that analysis, it’s just reading the hundreds of papers on breastfeeding and deciding which 90% of them are just junk. And then, what are the 10% that we are actually learning something from? And that’s a lot of what I did.

14:13 EO: And I think that’s actually a big distinction between the way that I think about combining these pieces of evidence and the way that a sort of traditional, epidemiological meta-analysis approach would do it. Because in the latter case, I think they spend a lot of time kind of averaging things which have some bias, and saying, “Well, there are a million studies that show that breastfeeding is correlated with IQ, so that must make it more likely that it’s true.” But all of those studies have the same problem, which is that more educated women are more likely to breastfeed. And if we isolate the papers that are able to do the best job addressing that, it’s true there’s not that many of them, but they show a different thing. And I like to focus on those papers as opposed to saying we have tons of other junky, crappy papers and we’re gonna also use those. I don’t like to use those.
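A quick numeric sketch of the averaging point (mine, purely illustrative, not one of the studies discussed): if a hundred studies all share the same confounding, averaging them keeps the bias, while a handful of better-identified studies can still land near the truth.

```python
import numpy as np

rng = np.random.default_rng(2)
true_effect = 0.0    # suppose the real effect on the outcome is zero
shared_bias = 0.5    # confounding all the weak studies share, e.g. maternal education

confounded = true_effect + shared_bias + rng.normal(0, 0.1, 100)  # 100 correlational studies
credible = true_effect + rng.normal(0, 0.1, 5)                    # 5 well-identified studies

print("Average of 100 confounded studies:", confounded.mean())  # lands near 0.5, i.e. wrong
print("Average of 5 credible studies:    ", credible.mean())    # lands near the truth, 0.0
```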

15:01 Tim: What is it like to actually go through a paper? What’s the process as you’re going through it, that you’re reading kind of the study design, presumably? Do you kinda scan the… What the conclusions and the findings and then jump into the study design and reading that in detail and just trying to kinda poke holes in it and asking questions, how does that work?

15:25 EO: Yeah, I think you’re overstating how complicated these things are [chuckle] to do. Basically, for many of these kinds of studies, it’s very simple to see what the research design is. I’ll sort of read the abstract, figure out what the research design is, and look at the tables and see what they show. And I look at many, many of these papers all the time, and many of them look pretty similar. And then a lot of times I like to read the appendices. Usually the good stuff’s in the appendices. That’s where they put the extra robustness check, or they show that actually, if you control for one additional thing, the results go away. So you gotta read the supplemental appendices, that’s the secret.

16:04 Moe: I was just gonna ask, after, particularly, Cribsheet was published, and some of your articles like on FiveThirtyEight and whatnot about breastfeeding, there has been some criticism. I guess I’m just curious how you manage that. And particularly, one of the things that I do need to draw out, which I think is really interesting, is the fact that you have actually published work that has contradicted some of your previous work, which I think is completely incredible. But I suppose I’m curious to understand, particularly as analysts, like we often have to do this. How do you do that well, without, say, impacting your credibility and your trustworthiness and how do you manage it personally?

16:43 EO: So there are sort of two separate things. One is, yes, I did publish something that was wrong when I was in graduate school, and then I published something else saying that it was wrong. That experience, while ultimately very formative, was at the time pretty awful, but I got through it. But I do think about it a lot when I write things that are gonna be controversial, and kind of… First of all, I think about how do you make sure that you’re right, or you’re as right as you can be given the evidence? And then I think about how to process the criticisms that people bring. I will say, in the first book, I got a huge amount of criticism on the alcohol stuff, way, way, way, way beyond.

17:24 Tim: Did you see that coming? Did you…

17:26 EO: No, I did not see that coming. That was like… I was so naive.

17:30 Tim: “What about… Here are the studies, the evidence says!”

17:33 EO: I don’t know, probably because I actually had talked to a lot of doctors. I had talked to many doctors who were like, “Yeah, it’s fine, I tell my patients it’s fine.” You can read surveys; it was like 50% of doctors in the US tell their patients, or at least some of their patients, that it’s fine to have a little bit of wine. I hadn’t sort of processed… I’d understood some people are gonna go like, “Meh.” But I hadn’t processed that this was gonna be such a significant thing. And that was really, I found that really hard. It was just being kind of attacked in this way and feeling like I hadn’t sort of come into it the right way. I would have… I would have presented the information differently, or talked about it a little differently; not in terms of conclusions, but just the way I talked about it, I would have done differently, if I had known where we were going or how people would react. So that was really hard. I thought I would get some of that same kind of pushback on breastfeeding, and there’s been a little bit, but it’s like nothing, nothing like that.

18:31 Moe: Oh, really?

18:32 EO: I got… I got some people saying, “Well, breastfeeding is good.” But I think part of it is that… Actually the book is a little bit more measured about… It doesn’t say that breastfeeding doesn’t have some benefits, it does have some benefits. It just says that they’re kinda… People overstate them relative to the truth. So, I think there was something for people to glom on to there, if they were really into breastfeeding and I think the other thing is it was much easier to help people understand why it’s very costly to be so… Be so aggressive about this, and why it can be really harmful for some women to get this kind of pressure, particularly if they can’t breastfeed or it doesn’t work for some reason, that that’s kind of… I think people had an easier time seeing that. But then there was some lady who said that I was as bad as the Antichrist.

19:17 Tim: Yeah.

[laughter]

19:18 Moe: Yeah. Let’s just ignore that… Let’s ignore that human completely.

19:23 EO: It’s… You can’t please everybody.

19:26 Michael: So you’ve mentioned a couple of times… Talking more abstractly about some of these things where you made a recommendation based on the analysis of these papers, and the response you got was disproportionate in terms of the emotional reaction that people had because of their ingoing beliefs. And I think that’s a really good thing for analysts to think about. So could you talk a little bit about how do you, when you speak to a lot of people, when you’re talking to lots of different groups of people, when you know something may cause someone to sort of be like, “Whoa, what are you talking about?” How are you framing that, and in what ways are you doing that? ‘Cause I think that’s a really important thing for people… We don’t do the kind of work you do, but we do present ideas to people who are like, “I don’t believe your idea.” And we’re like, “Well, the data supports it.” And you end up in a very, like, oppositional place. But I’m curious, how have you seen that work out, maybe some positive ways, or maybe people are really stuck and they can’t, I don’t know?

20:31 EO: Yeah. So I have practiced a lot over time, both when I talk about my academic work and when I talk about this work, sort of projecting an atmosphere of openness, which tends to work better. And so when I talk about something like breastfeeding, I open with the good stuff, like, “I looked into breastfeeding, and there are some real benefits, they are blahbity blahbity, blah. But some of the things that you heard are overstated, and here is some of the evidence for that.” And so I think you kind of get people a little bit more on your team. I find I get people a little bit more on my team that way. And then you kind of hit them with, “Here’s what the data says.” And I think that that works some of the time. I have had a lot of interactions, or some interactions, around the book where I felt like I was just not making any progress, particularly around vaccines.

21:22 EO: So in the book, I said that you should vaccinate your kid and that vaccines are safe and effective, and I spend a lot of time on why I think that. And I have talked about that with people. And that’s where I get people who are just like, they’re not hearing it. They have an additional reason why: “Well, actually, yes, that study is from Denmark, but in Denmark, they give the first vaccines at three months, and here we give them at two months. And so you can’t ever learn anything from that evidence,” which, of course, doesn’t make any sense, but is also a very hard thing to refute, because if you believe that to be true, what am I gonna say? Like, that’s crazy, but okay, like, we’re not… So I think there are places where people are so ingrained that it’s almost impossible to sort of pull them out with anything, data, rhetoric.

22:11 Michael: Yeah, I remember our second child was born right around that time where the whole vaccination thing was really at its height, and my wife is a nurse, and we were questioning whether or not we should even vaccinate like, so it was crazy. And eventually we were like, “Come on, this is not cool.” But yeah.

[chuckle]

22:32 Tim: But how many people share personal anecdotes of their own children with you at this point? Have you had it…

22:38 Michael: Yeah.

22:39 EO: 100%.

22:39 Michael: 100%.

22:40 EO: 100%. Yeah.

[chuckle]

22:41 Michael: Let me tell you about our journey. No, I’m just kidding.

22:43 EO: Yes. No, I get it. I think it’s great. It’s… That is like many of my conversations, which is good. That’s, I guess, why I got into it.

22:52 Michael: Well, but people are very stuck out there. And actually, that’s why, like, I was very fascinated with your books, because I was like, when I was about to become a father, I think most people are in that position of not knowing anything. And all the information they’re getting is just sort of what… Like the cure-to-the-common-cold type stuff, like, “Well, you take three of these and you do four of that.” And so for someone to actually create some kind of, like, well-researched information is kind of a big deal. So I kinda just wish you were around about 10 years earlier, that’s all.

[laughter]

23:25 EO: Right.

23:27 Tim: So there’s this other idea that… And Moe touched on this a little bit earlier, with kind of, all odds are not created equal. But coming at it from just, in general, human beings’ ability to kind of do probabilistic thinking, and think about a 90% chance versus a 98% chance, and everything wanting to see it as binary. Do you feel like that is… I feel like that’s a challenge in the business world, even when we have much, much better data, in trying to say there is no such thing as a 100% confidence level.

24:05 EO: Yeah.

24:05 Tim: And that makes people think they want the simplification. And therefore they’re resistant to a nuanced discussion or where they need to kind of actually consider probabilities?

24:15 EO: Yeah. I often say I think people basically have only three probabilities: always, sometimes, and never. And you can understand that something doesn’t always happen, or it kind of sometimes happens, but understanding the difference, particularly in the tails, between 99% and 99.9%, that somehow those are actually really different, and they’re as different as some things you might perceive as very different. And yet people kind of don’t process that. And so this is something I think about so much in writing, because it’s so important to some of these decisions. So something like, “What kind of prenatal testing should I have?” really requires people to actually understand what are the probabilities of these different events happening. And they’re very small probabilities. And so I usually try to think about framing them either in ways where you’re trading off things you do understand, or framing them with a probability that’s closer to something you understand. So I spend a lot of time talking about, like, the risks of being in a car as a frame: that’s a risk you sort of understand. You could put things as more risky than that, or less risky than that; at least it would be a benchmark as a starting point.
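A tiny sketch of the benchmarking idea, in case it helps to see it concretely (the function name, the benchmark, and the numbers are all hypothetical, not figures from the book): express an unfamiliar probability as a multiple of a risk the audience already understands.

```python
def frame_risk(p: float, benchmark_p: float, benchmark_label: str) -> str:
    """Express probability p as a multiple of a more familiar benchmark risk."""
    ratio = p / benchmark_p
    if ratio >= 1:
        return f"about {ratio:.1f}x the risk of {benchmark_label}"
    return f"about 1/{1 / ratio:.0f} of the risk of {benchmark_label}"

# Hypothetical numbers purely for illustration:
print(frame_risk(1 / 5_000, 1 / 10_000, "a familiar everyday activity"))
# -> about 2.0x the risk of a familiar everyday activity
```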

25:25 Moe: I did actually, I noticed that quite a bit in the book where you’ll be like, “This is the probability of these.” And then you’ll have five other examples, which I guess provide, like a bit of a scale. So that someone who’s not familiar with the concept of probability can be like, “Oh, that’s yeah, the same as traveling in a car or getting hit by lightning or whatever the case may be.” And I think that’s something that analytics could probably do better is trying to contextualize that.

25:50 EO: Right. I think I always try to explain to people, like, the chance of winning the lottery, ’cause one time in graduate school, I wrote a book… I wrote a paper on the lottery. And so sometimes people ask me to talk about the lottery. And they’ll be like, “You know, there’s a one in 80 million chance.” [chuckle] I’m like, “That’s zero.” And they’re like, “Well, I’ve got the one.” It’s like, “You should know that number is zero.” So in case you’re wondering what the number is, it’s zero.

26:10 Tim: But then you find a person whose third cousin’s friend’s neighbor actually won the lottery, and therefore they have blown up your entire premise because…

26:17 EO: The best experience I had with that was, I went on Good Morning America and I talked about this, and I said, it’s exciting to play because you feel like, “I could win,” but of course you won’t, or something of that nature. And they cut the quote at “you might win,” and then they cut my quote, and so my college roommate calls me, and she’s like, “My mom said you were on Good Morning America, and you said she might win the lottery.” [laughter] So, that was not what it was.

26:44 Tim: Oh, why don’t we go on editing this? I mean, this content. Yeah.

26:47 Michael: Yeah. There’s probably…

26:47 EO: Exactly, you do… And that’s okay, though, TV is a lot of fake news.

26:54 Michael: We cannot let that air. We get too many advertising dollars from state lotteries.

[laughter]

27:02 Moe: So one of the concepts I’d love to touch on, which I think is framed really well in your second book, Cribsheet is about cost-benefit and obviously working in analytics, it’s something we do talk about in the business context a lot, but it also… I’ve been reflecting on this a lot and I’m gonna share an example from a close person to me which I’m sure everyone can guess who that person is. So she was making a decision about daycare for her son, and she called me up and I was a kind of shit human when we first had the conversation, ’cause there were three top choices, and one was a really good school that’s really close by. They serve kind of crappy food for the kids. The kids aren’t reading yet. Not too sure about the educational standards. The other one was a really, really great school. Best reviews. All the kids seemed to perform really well. They were already reading and then… Yeah, there was another third option, but the Principal wasn’t nice, so that was kind of mixed.

28:00 Moe: And I said to her, I was like, “Well, why wouldn’t you just choose the best school with the best, I guess, education program for your kid?” And she said to me, she goes, “You know, that is important, but what is also important is that’s a 45-minute drive, so that’s an hour and a half I don’t have with my kid every day. Are the education standards of a three-year-old going to make a difference in the course of their entire lifetime? One year at a pre-school is probably not gonna…” And we ended up having this massive discussion. And I’m not gonna lie, I often use a matrix and I score different things when I’m making an important decision, which I told her to do. And then I was reading your book and I had this, like… “Whoa, this is exactly what Emily is talking about, it’s like cost-benefit is also so subjective.” And when it comes to parenting, that’s actually okay. For you as a parent, something might be more important, and that might be the time with your kid, and I think you talked about this with meal prep and cooking at home. I suppose… Are you able to shed a little bit more light, like, is this something that evolved, or did you… I feel like you’ve just always got this, and it’s taken me a while to get there.

29:06 EO: Yeah, so I should say, this goes back to my parents, who are both economists, and so in my house growing up, the sort of idea that you would trade off the costs and benefits, that was just a thing that… That was just the way it operated. So we did not go grocery shopping; my parents had figured out, even in the early 80s, how to get the local grocery store to deliver. They would fax over their list, and then they would bring these boxes. I don’t know what they did before the fax, but… I remember thinking going grocery shopping was very exotic and I would like to do it, because that’s what everyone else did. Now I realize that was wrong. And so I was like, “Why don’t we go grocery shopping?” And my mother was like, “Well, my opportunity cost of time is very high.” And I remember being like, “Oh okay, yeah, that makes sense,” that’s… Of course, that’s the answer.

29:53 EO: I should have thought of that. And so I kind of grew up with this idea that somehow when you’re thinking about stuff in your household, when you’re sort of thinking about how to allocate your time, the idea that you would use these kinds of almost decision tools is a big piece of it, and that you would actually try to think about your time the way that you would, almost, at work. And so I think I found that very easy to apply and a very natural way to operate my life. But my husband, who is also an economist, when we first met and he was with my family, he was like, “You know, this is super weird, right, your family… [30:28] __” I know that people are not like this, and I said, “Huh, that’s… I didn’t think about that.” And so I recognized it’s a bit weird, but of course, now, he’s also an economist, so it’s very easy. We can just go right into doing it all of this way, but I think it is… And for me, it is very helpful.

30:44 Tim: Is there, like, the “Who’s gonna do the dishes?” and it’s a whole theory-of-comparative-advantage debate in the Oster household?

30:50 EO: There are many principles that you could use, of which that is one. I think the important thing to think about with the dishes is that there is also a learning-by-doing component to this. So actually, before the kids, I did more of the dishes, but I also do more of the cooking, and now he does the dishes, and now when I do the dishes he is very upset because I don’t, I like…

31:09 Tim: Do it right.

31:10 EO: Yeah, I don’t do it right. I mean… I don’t really think that, but you know… It’s sort of like… Now it’s like, “Just step away now. I have to move everything.” I was like, “Uh, okay,”.

31:24 Michael: Systems are important, yeah.

31:25 Tim: When are you guys pitching the reality show that is literally just following you guys around, and the dialogue? And the kids?

31:33 EO: Yes, and my family has drawn the line at being characters in the book. That’s where the line is. That’s not… We’re not going… We’re not going beyond that.

31:42 Tim: I was giving you the opportunity to tell me that’s not the first time you’ve been asked that?

31:46 EO: No that is the first time, yes. No one has approached me about reality TV.

[laughter]
[music]

31:54 Michael: Yeah, as we step away for our multi-touch moment, Josh, I recommend to everybody that they get somebody on their team that’s a growth hacker, you know what I mean?

32:03 Josh: Yeah, too many companies just focus on static marketing.

32:06 Michael: I know, and it’s a tragedy, but more and more people are getting into the swing of things and there’s a great new tool out there. I know you’ve heard about it, growthalytics.cool. They’ve been so visionary, they’re now getting you to that chief growth hacker that every company needs.

32:23 Josh: Yeah, and the best thing about it is you don’t even have to involve your IT group, there’s actually no code that needs to be added to your website. Talk about lean analytics, Michael.

32:32 Michael: I love it. This is the kind of technology… It was created by three Stanford grads who realized that every other platform is out there doing it wrong, and that’s just it. You deploy it or don’t deploy it, actually, you just log in with the mobile app, there is no desktop, and you just choose from a pre-defined set of North Star metrics and the insights just start rolling in. I mean, what could be simpler?

32:53 Josh: Start up the Model S, because their valuation just tripled.

32:56 Michael: Boom. I’m getting two new pairs of Allbirds, and another Patagonia vest after this one.

[music]

33:03 Josh: The best.

33:05 Michael: Alright, great stuff. Well, let’s get back to the show.

33:09 Moe: So, I do have one question from a listener that I’d like to ask. So, Nicole Ferman… We have this thing called Measure Slack, and it’s basically 5,000 analysts that just nerd out and chat to each other on Slack and help each other with problems, which is really lovely. And I did share with some of the people on there that you were gonna be on the show. So Nicole was really curious to understand, I suppose, when you’re going into doing analysis like this, whether it’s on breastfeeding or whatever the topic is, naturally as humans we always have something in our mind of what we expect the outcome to be. So she wanted to understand, what about your findings most challenged your prior assumptions? And I guess, from a practical standpoint, how do you… We’ve talked about bias quite a bit on the show before and how to mitigate it in analysis, but do you have any tips or tricks or techniques that particularly work for you, or… Yeah, I don’t know.

34:03 EO: Yeah, I mean, I struggle with this. I do struggle with this and I try to de-bias. It’s not like that’s really a thing that you can do, but I try to kind of acknowledge to myself before I start writing this stuff, what biases I am bringing. I would say, most of the stuff that I was surprised by were things where I just didn’t think there would be any data and then there was. So the one that comes up in Cribsheet is the issue of discipline, sort of just looking at whether there is evidence for any particular kind of discipline where I thought that there would not be and then it turned out actually there was pretty good evidence in favor of some forms, some sort of specific discipline programs, mostly around total positive interactions and timeouts and so on. So that was a place where I… Actually, that was a place where I changed what I was doing parenting-wise because of doing the research for the book which doesn’t happen that much ’cause usually I’ve already learned about the thing before I… Before I write it in the book.

35:00 EO: But then there were a few places… I mean, actually, this was a… Like in one place, which is about spanking, where I’m pretty explicit in the book: there is no evidence… I’m gonna tell you what I think the evidence says here, but I also wanna be clear there is no evidence that would make me think that this was an okay thing to do, just because that is a personal bias that I bring that was strong enough that I wanted to acknowledge it. So, sometimes I do think about, should I say… Should I say how I feel about this? And the books are pretty open about the choices that I made, particularly in pregnancy. I’m pretty open about, like, I chose not to have an epidural, and that was important to me, but I try to also be like, I can see why you would not choose that.

35:39 Michael: Yeah, so in my research, what I did is, I went back and I read all the one and two-star reviews on Amazon…

35:45 EO: Oh, nice. That sounds nice…

[chuckle]

35:47 Michael: So… ‘Cause I… What I wanted to see was sort of how people who were having a problem were interacting with it, and that was very interesting because, between books, the tenor of the disagreement changed. And in your second book, it was much more strident, and I think we’ve covered that a little bit, about some of the topics. It’s sort of, after you have the baby, people get real emotional about how that baby should be raised. And so that… Cribsheet, that book, talking about some of these things, that is, I think, really interesting. But the other thing that we also talked about at the top of the show is, you also changed or modified your tone, and in some of the feedback, the people who were negative about the book were like, “This book is less well-researched.”

36:31 Michael: I wonder if even the way you talked about what you were doing, ’cause I honestly don’t believe that you researched things less, I think you modified your style to be more adaptive to probably a broader audience, but some people then actually perceive that as being less authoritative and I’m wondering if you’ve experienced that or you could talk about that dynamic a little bit. And I always… I’m trying to bring this back to the context of how do we present results to a business and like if the CEO is asking us tough questions, do we dive into the stats or do we stay high level and talk about some of the broader concepts? So anyway, I just thought that was a really fascinating takeaway that I got from doing some sort of alt-research…

37:12 EO: Opposition research.

37:13 Michael: Yeah, exactly, I was like… Who is this Emily Oster anyways? Let’s check out the one and two-star reviews.

[chuckle]

37:19 EO: Yeah, so this was… I found that particular criticism to be super interesting, and I think part of it is that the data is not as good in a lot of these… Like, it’s not as… It just does not give you the answer. And so, I think, for a lot of things in pregnancy, when people read that book, they were like, “Okay, the data says it’s fine to have sushi,” and there were many things like that. Like, the data says blah. And it’s not that there weren’t choices, or that some people wouldn’t choose differently, but there was a lot more where it was just like, “Here is what the evidence says.” That’s something you can kind of take to the bank, and you can really learn something from the evidence.

37:58 Michael: Right.

37:58 EO: In the… Many things in early childhood, it’s like, “Look, here is some evidence that we have, and it’s not super compelling and it’s not always that good.” And so, I think even people who liked the book were like, “Yeah, I thought this was just gonna… I expected better. You’re gonna tell me, do this, don’t do that. And why didn’t you tell me that?” And the book was more like, “Well, you can make your own choices.” And so, I think that’s some of what people are reacting to. I think the other piece is, yes, I clearly moved towards a slightly more chatty style, which I think does make the book more accessible, but also has the feature that it kind of masks a little bit more of the work. And so the book is something between a memoir and a meta-analysis. In both cases, the underlying thing is a meta-analysis. The question is, when you write it, how close you get to writing the meta-analysis, and I think the first book is closer to that than the second, even though the research isn’t different.

38:54 Tim: Are we trending, then, towards the third book being “The Teenage Years: I Don’t Know, We Have No Idea”?

39:00 EO: It’ll be like, “Do whatever, who cares.”

39:03 Michael: Yeah, it doesn’t matter, they’re gonna hate you.

39:06 EO: Yeah, exactly. Just try to live through it. Don’t die.

39:09 Tim: No one has cracked that code.

39:12 EO: Yeah, I don’t know.

39:12 Moe: I’m thinking out loud here, but I do wonder, because you have talked before about the concept that lots of parental decisions, and I guess the justification for why a parent made a particular decision, stem from loss aversion: basically, “that decision was so tough for me and involved sacrifice, so, therefore, it must have been a better decision.” I wonder… And this is literally me just hypothesizing, but I just wonder whether a person rating a book, reading one of your books, pre being a parent, would be able to have a more open mind than a person reading your book post-children. Although I don’t know, Tim read them all and he seems to have an open mind, but he’s all about the data, so, I don’t know.

39:53 Michael: The reviews clearly indicated that someone should read this book before having a child.

39:58 Moe: Okay. [chuckle]

40:00 Michael: Absolutely, is my perspective.

40:01 EO: Yeah, I think once you have a kid, you can get pretty locked into the things that you’re doing, and it can be very hard to either say, “Well, I did this wrong,” or, even if the book’s like, “All choices are good,” or many different choices could be the right choices for you. If you made a choice that was very difficult, then to hear there wasn’t a lot of benefit from that, I think that’s the thing that’s hard, right? So I write in the book about how sleep training, like, there’s no evidence that that’s bad for your kid, and it does help them sleep better. And one of the pieces of pushback I got there was from people who were like, “I suffered through postpartum depression and my kid didn’t sleep through the night until they were two, but I would do anything for them. And if you choose to do this, you must hate your kid.”

[chuckle]

40:44 Michael: Yeah.

40:45 EO: And I think some of that comes from the feeling… It should just be like, “Ah, I suffered through postpartum depression and my kid didn’t sleep through the night, too, but I guess that was on me ’cause I could’ve done it a different way.” I think that’s not something we like to say.

40:57 Michael: It’s an emotional response.

40:58 EO: Yeah.

41:00 Tim: Well, or that, as a wise person repeats to me regularly, “We’re all doing the best that we can at any given moment in time, it’s the best we’re capable of.” Which means, everybody made decisions and they made the decisions based on what they knew or felt or believed, whether they researched it or not. And so then, it is going back and saying, “Well, wait a minute, if that wasn’t maybe the optimal decision, now I have to re-evaluate. Yeah, that’s getting deep.”

41:27 Moe: Right, sorry, is this Julie? Does Julie say this?

41:30 Tim: She quotes my son’s therapist.

41:33 S?: Julie. [laughter]

41:34 Michael: Yeah, she’s gonna be like, “I don’t remember saying that to you, Tim.”

[laughter]

41:39 Tim: So I do have a couple of… They’re a couple of total curiosity things around specific… And this is more from Expecting Better, and also kind of hot topics. So we’re back in the anti-vaxxer world, or at least the vaccine world, but there wasn’t a whole lot about autism. Did you come across things that were any sort of indicators, and consciously say, “There’s just not enough strong evidence for anything from a causality perspective”?

42:07 EO: Yeah, I looked into this a little bit. And I’ve read a lot of papers on different things that people say are about autism: Father’s age, father’s old sperm, mother’s age. And I know there’s a lot of correlations, not with vaccines, to be clear. But there are a lot of correlations, but I think that there’s nothing that is especially compelling, causally, and this just feels to me very poorly understood. Particularly, the fact that it’s gone up over time; how much of that is diagnosis, how much of that is some real thing that we don’t understand, so I didn’t put it in because we don’t know anything about it. Basically.

42:42 Tim: That’s what I suspected. The other… There seemed to be a lot of studies, and this seems like it was just an outcome measure that the studies had, where they would measure IQ down the road. And not having studied it exhaustively, it seems like I picked up through various sources that IQ is kind of questioned as a great measure. So is that… Did you find yourself saying, “Oh my God, I could be researching for 100 years, falling down one trap after the other”? And it’s just… That’s the data that was collected?

43:12 EO: Yeah. IQ is a shit measure. And…

[chuckle]

43:13 Tim: Okay.

43:14 EO: And that’s just like… But it’s not the only thing people measure, right? There are other neuro-development measures, there’s behavior stuff, there’s test scores, and these all sort of line up in most of these studies as being the same. I think there are many outcomes which are missing, and where we don’t actually see… You don’t see anything about them. So… People often ask me, “Well, how do I make my kid happy?” Where is the study where the outcome is, like, is your kid happy?

43:45 Moe: Wow!

43:46 EO: And I was like, “Well, we don’t have… That isn’t a study.” We like things we can measure in administrative data or with tests, and so we’re pretty focused on those things. [chuckle] But it is an inherent and very deep limitation of what you’re even able to do with data.

44:00 Moe: God, you must get that all the time, though, like, “Here is my every problem, about my child and… “

44:05 Michael: Yeah.

44:06 Moe: “You’re the expert.” That must be hard to…

[laughter]

44:09 Moe: Do you just say, “No, I don’t know that.”? How do you…

44:12 EO: Yeah, I’m usually, I’m just like, “We don’t know.” People don’t ask me that many things about the older kids. They ask me a lot of questions about pregnancy like, “Is it okay to have tonic water?” That was yesterday. Somebody is like messaging, like, “Is it okay to have tonic water?”

44:24 Moe: Tonic water?

[laughter]

44:26 EO: You’re not allowed to have quinine. Like, quinine as a malaria medication is contraindicated. And so that’s in… Right, yeah, yeah.

[laughter]

44:36 Moe: I’m like, mind blown. All this… All this shit. I’m just like, I can’t believe people think about this. I just feel like, “I don’t know, unless someone tells me definitely no, then…”

44:44 Michael: Stick to the white cloths.

44:45 EO: Yeah. And people like… The internet has conflicting… People are… People are… Yeah.

44:50 Michael: Oh, Moe.

44:51 EO: Yeah.

44:51 Michael: Oh, Moe.

[laughter]

44:55 Moe: To be clear, I’m also the only person on the show who has not had a baby, or ever been pregnant, so yeah, I’m…

45:04 Tim: It’s possible she has a sister who has a son, who may or may not have been an unnamed reference earlier in the show.

45:09 EO: Right, it’s possible.

45:12 Moe: Who also just messaged me saying, “I’m so fucking jealous. Can you ask Emily if we can have drinks,” so…

[laughter]

45:18 EO: Are you in Australia?

45:20 Moe: Oh, my sister is in the Bay Area, but yeah.

45:22 EO: Oh, okay.

45:22 Moe: Yeah. She does the same job as me, so she actually used to work with Tim. So it’s kinda nice.

45:26 EO: Oh cool. That’s funny.

45:28 Michael: Yeah, such a small world. Okay, we’re gonna keep chit-chatting, but we do have to start to wrap up. And it’s unfortunate, because there’s so much value in this conversation, and I’m enjoying it so much. So one thing we love to do on the show is go around the horn and do a last call, and it’s anything we found interesting that we wanna tell our audience about. Emily, you’re our guest, so I’ll give you first crack at it, or we can skip you.

45:53 EO: No, I wanna go… I wanna go… I wanna go last so I see what you do.

45:56 Michael: Okay. Perfect.

45:57 EO: Yeah.

45:57 Michael: Then we’ll start with Tim, ’cause he’s the quintessential last call.

[laughter]

46:02 Tim: Oh man, I’m torn between the silliest one or the… Oh, I’m gonna do the terrible thing. I’m gonna recommend something that I haven’t read. I’m gonna recommend a post that is very compelling; it’s three must-read statistics books for aspiring data scientists. And I just thought the recommendations, the way they were described, sounded like they would be ones for me to put on my bookshelf. So, it’s Naked Statistics: Stripping the Dread from the Data; Practical Statistics for Data Scientists: 50 Essential Concepts; and Statistics Done Wrong: The Woefully Complete Guide. Don’t those sound like page turners?

46:41 Moe: They all sound like great night time reading, although, I’m not gonna lie, I was reading Cribsheet before bed every night, and it scared the shit out of me, and I was like, “this is a day-time book.”

46:48 Tim: And she hasn’t slept in a week. [laughter]

46:54 Moe: Michael, why don’t you go next.

46:55 Michael: Oh, yeah, of course. Well, so I’ve got two things, really quick mention, I’ve mentioned on the show before, but tomorrow, literally tomorrow, the Digital Analytics Association is having their first national conference in Chicago. And if you’re coming, I will see you there. If you’re not coming, you can start planning for next year. So, if you mention the show to me, though, at the conference, I will give you some kind of special swag that I’ll figure out by the time I get there. And then the other thing is, I read a really great article recently, so Ben Thompson…

47:30 Tim: I’ll be there with ya, so.

47:32 Michael: Oh yeah, that’s right, Tim will be there, too.

47:35 Tim: But thanks, buddy.

47:36 Michael: Hey, I couldn’t remember.

47:38 Tim: No swag for me. I’ll just point them all your way.

47:43 Michael: I’m gonna have you order the swag, Tim, so that’s how that will work. And then I read a really great article, a little while back, by Ben Thompson, on Stratechery, where he’s talking about how Amazon is functioning. And I really enjoyed it because, I think I’ve mentioned on the podcast before, how I always read Jeff Bezos’ letter to shareholders every year. And in them, he always talked about how Amazon wanted to be a Day 1 company. And in this post, Ben Thompson’s able to demonstrate how they’re slipping on that a little bit and what they need to do to return. And I thought it was just a very good analysis of Amazon as a business, at a time where Amazon is either the devil or the best thing ever, depending on your perspective. Alright, Moe.

48:27 Moe: Okay, I’m up. So, for full disclosure, Emily, both of your books have actually been previous last calls of mine, but today, I’m doing something really odd, which is that I’m not talking about anything technical or anything that I read or saw. But last night, I was sitting at home, having a glass of wine, prepping for the show, and I was going through this really long list of last calls, ’cause Tim makes me keep them in a note so I can find them. And I was just sitting there being like, “Damn, how did I get so lucky?” Lucky to not only be on the show, and Tim and Helbs, obviously, are amazingly supportive, every crazy idea, they always back me. But I was just sitting there, so excited to speak to you, Emily, and thinking that I could just spend an hour with this person that I never thought in my whole life that I’d probably ever get to talk to, and ask her all these really fascinating questions.

49:23 Moe: So my last call today is to encourage people not to read something or listen to something, but to reach out to someone that they admire or respect. Even reach out to someone that you think you could help, instead of just always asking other people for help. You could reach out and offer a hand to someone, someone that you have a curly question for. Because, I guess, basically, with all the people that we never dream of interacting with, you don’t get the chance unless you ask. So have a think, also, when you’re reaching out to those people, about how you could help them and not just ask them for favors. And then the next time someone asks you, pay it forward and say yes to that other person that probably looks up to you. So yeah, sorry, it’s a bit schmaltzy, but I did confess I was drinking wine at the time, so you know.

50:06 Tim: Moe, on repeat?

50:08 EO: That is so nice. Okay, I don’t think I can follow that so nicely, but let me say two things. First, I will put a plug in for some time in December, which I realize is far away, when I will be on a Netflix show. David Chang, the restaurateur, has a Netflix show, and I’m going to be on an episode of that. I filmed an episode of that with his wife, who was pregnant at the time, about whether you can eat sushi, and it was really fun. I will actually say that is one of three Netflix shows that I have filmed, and Netflix has so much content that none of them have aired. But someday, I may be everywhere on Netflix. So, wait for that. And the other plug I will give is for a book by a former colleague of mine named Matt Taddy, who is at Amazon, called Business Data Science. It is basically the textbook from a really successful class he taught at Chicago Booth on how to use data science as a business person, and it is a really, really, really good book, combining ideas from causal inference in a language that people who are more familiar with the data science side would like. It’s very good. I highly recommend it.

51:18 Moe: Tim is ordering it on Amazon as we speak. I can watch him typing, “Order now.”

51:22 Tim: Why did it read Data Science for Business and not Business Data Science? Amazon let me down.

51:28 Michael: No, that’s awesome. So is that David Chang’s Ugly Delicious show?

51:33 EO: Yes.

51:33 Michael: Okay, that’s a great show. Oh wow, okay, well.

51:37 Tim: Michael just got really excited to have you on the podcast now that he knows that you’re an upcoming…

51:42 Michael: Whatever. I… Listen…

51:45 EO: I’m still… I only did this because my husband was like, “can you become friends with these people? And then we can go eat hand pulled noodles with them in Queens.”

51:52 Michael: Right?

51:53 EO: And I was like, “okay.” So now I’m working really hard on becoming… Actually David’s wife, Grace, is extremely nice. She’s a very nice person and I’m optimistic about eventually being able to eat hand pulled noodles in Queens.

52:06 Michael: Queens. Nice.

[laughter]

52:06 Michael: That’s awesome, and yeah, Tim, just so you know, I was very excited about this episode, but someone had to anchor Moe’s otherworldly excitement, so I just had to pull it back.

52:17 Moe: Yeah, yeah. That’s fair.

52:18 Michael: Just balance.

52:19 Moe: It’s all about balance. It’s all about balance.

52:22 Michael: Exactly. Okay. Well, you’ve probably been listening and you are thinking to yourself, “Wow, this is amazing. Where do I find these books?” And of course, you could buy them on Amazon, or wherever books are sold, find a local book store, if you’re so inclined. We would love to hear from you about this and how using some of these tools can help you as an analyst. So reach out to us on the Measure Slack, or on our LinkedIn group. Obviously, the show wouldn’t be possible without our great producer, Josh Crowhurst, so we always like to give him a shout out. And Emily, I can’t thank you enough for coming on the show. This has actually been so delightful. And I think, even just maybe, I almost sort of, because of how Moe feels about it, I feel so much better about it too. I was already okay with it but…

[laughter]

53:12 Michael: I’m saying it so badly. Here’s what I mean. It was super awesome to have you on the show, but to hear Moe talk that way about what it meant to her makes it even more meaningful that you’re here. So thank you so much for taking your time. Your research has helped so many people. So keep doing what you’re doing, and we’ve been delighted to have this conversation with you.

53:38 EO: Thank you guys so much for having me. It was really fun and very nice.

53:42 Michael: Awesome. Well, if you’re out there and whether you’ve got a baby or you’re thinking about having one, just remember after you read Cribsheet and Expecting Better, the other thing you need to do is keep analyzing.

53:58 EO: Thanks for listening, and don’t forget to join the conversation on Facebook, Twitter, or the Measure Slack group. We welcome your comments and questions. Visit us on the web at analyticshour.io, facebook.com/analyticshour, or @analyticshour on Twitter.

54:18 Barkley: So smart guys wanted to fit in. So they’ve made up a term called analytics. Analytics don’t work.

54:23 Hammerschmidt: Analytics, Oh my God. What the fuck does that even mean?

54:33 Moe: But don’t… And don’t worry. No matter what happens in your life, both my sister and I buy your books for every single woman we know that gets pregnant, so you’re always gonna have a great stream of income coming your way.

54:45 Tim: ‘Cause they’re real popular, so they have a lot of friends.

54:48 Michael: They do have a lot of friends.

54:50 Moe: No, we don’t!

54:51 Michael: That’s why you’re on the show, Moe.

54:54 Moe: I think we’re just women in our 30s.

54:55 Michael: Tim and I have no friends, and so it’s all on you.

[laughter]

55:02 EO: Yes, my academic work, the rest of which is about econometrics, is not as funny.

[laughter]

55:07 Michael: It’s real hard to lighten it up.

55:12 Tim: Did you go searching for, “What else has she written? Wait a minute.”

55:17 Michael: We do try to maintain an explicit rating on this podcast, so feel free to use profanity, if you’d like to. You don’t have to, but it is something that’s become a part of our brand. We were looking for an early differentiator and we decided that it looks so cool on iTunes to see the little ‘E’ next to it that we decided to keep doing it.

55:41 Tim: Rock, flag, and evidence!
