#150: The Curiosity of the Analyst with Dr. Debbie Berebichez

Did curiosity kill the cat? Perhaps. A claim could be made that a LACK of curiosity can (and should!) kill an analyst’s career! On this episode, Dr. Debbie Berebichez, who, as Tim noted, sorta pegs out on the extreme end of the curiosity spectrum, joined the show to explore the subject: the societal norms that (still!) often discourage young women from exploring and developing their curiosity; exploratory data analysis as one way to spark curiosity about a data set; the (often) misguided expectations of “the business” when it comes to analytics and data science (and the imperative to continue to promote data literacy to combat them); and more!

Items Referenced in the Show

Episode Transcript

0:00:04 Announcer: Welcome to the Digital Analytics Power Hour. Michael, Moe, Tim, and the occasional guest discussing analytics issues of the day and periodically using explicit language while doing so. Find them on the web at analyticshour.io, and on Twitter @AnalyticsHour. And now, the Digital Analytics Power Hour.

0:00:26 Michael Helbling: Hi, everyone. Welcome to the Digital Analytics Power Hour. This is Episode 150. You know what I’ve always wondered? Why was that cat so curious, and why did that lead to the cat’s demise ultimately? And why in science and analytics does curiosity seem to be such a necessity? Curiosity’s a trait that I’ve always looked for in analysts that I’ve tried to hire in my career, so why is it a good thing in analysts and a bad thing in cats? Well, we’re about to find out. Well, maybe just the first part of that. Hey, Moe, would you say that being curious is important to your success?

0:01:09 Moe Kiss: Yeah, I’m pretty sure that’s a big part of it.

0:01:12 MH: So you’re not a cat, then, is what you’re saying.

0:01:14 MK: I am not a cat.

0:01:15 MH: Right. So that’s… We’re making sure, just going down the list. So Tim, what about you? Does your curiosity drive you to deeper insights?

0:01:23 Tim Wilson: Well, I’m curious as to whether you think that intro that was a play on curiosity actually landed, and where it fits in the pantheon of Michael’s intro witticisms.

0:01:36 MH: Yeah, I was looking for a way to introduce Schrödinger’s cat in there too, and I just didn’t seem to find a good angle. But it might be in there.

0:01:45 TW: I’m curious about whether you can do that.

0:01:47 MH: Okay, and I’m Michael Curious Helbling. Okay, well, okay, we needed a guest, someone with deep curiosity and passion for science. Dr. Debbie Berebichez is the first Mexican woman to graduate with a doctorate in physics from Stanford University. She is a physicist, data scientist, and TV host. When she’s not hosting hit TV shows on Discovery, Nat Geo, or NOVA on PBS, she speaks at conferences and inspires young people to pursue careers in STEM. She’s the Chief Data Scientist at Metis and is launching her own science-focused courses at sciencewithdebbie.com. And today, she’s our guest. Welcome to the show, Debbie.

0:02:26 Debbie Berebichez: Thank you, Michael.

0:02:27 MH: It’s a pleasure to have you. Your body of work is extremely notable, but I think one thing that seems to always have been a very core part of your story is your own curiosity. Do you wanna talk a little bit about that?

0:02:42 DB: Yes, thanks for the question. I think what led me to study science was that I was an extremely inquisitive child. From a very young age, I was asking my parents questions about everything, like, why does this work the way it does? And now that I have a three-year-old daughter, I can see how a child constantly asking questions can be a little bit annoying for parents. And so I think physics became this secret love, because the books were incredibly profound and interesting, and the answers to my questions led to more questions, and more and more questions.

0:03:31 DB: And so it was this never-ending landscape of questions and answers, and it just felt like the only field of study that allowed me to keep asking questions forever. And so I think at one point my parents didn’t really know what to do with me, because I wanted to know the why of everything: why do the planets circle, well, orbit, around the sun, and what’s the size of our galaxy, and why are we in the corner of a galaxy? And every answer they gave me, I wanted to ask more and more. And I think science, data science, any kind of analytical career, lends itself really well to people who are curious.

0:04:19 TW: So are your parents somewhat curious as well? Like, when you reflect on them, do they have it, or did you just kind of emerge with it? It’s interesting, because you’ve got a ton of videos, and it’s hard not to just sample some of you speaking in various places. You actually bring up curiosity pretty quickly, so it seems like on the spectrum of curiosity, you were pegged at one end of it. I feel like I’ve run into people who were maybe pegged at the other end, and I can’t quite understand them. So I guess, do you look at people and say, “How can you not be asking questions about X or Y?”

0:05:06 DB: Yes, it definitely colours my life, Tim, because I grew up in a conservative community in Mexico that discouraged young women from pursuing careers in STEM. So when I confessed at a very early age that I loved mathematics and physics and that I was very curious, I was told that that would intimidate people, and that I should just ask the right amount of questions and not go beyond what was needed. To cut my curiosity, cut the edges, and be more malleable. And that hurt me a lot. When I grew up, I had the courage to go against what everybody was telling me to do, and I met a significant number of young women around the world, not only where I grew up, who had similar societal pressures to make them less curious, to not stand out by asking too many questions, because it doesn’t look good for a lady to constantly be asking questions. And so I do think there was a bias against that trait for women. Men can ask endless questions. My father was a civil engineer, so he was a deeply curious man, and he always asked me questions. But it’s funny.

0:06:33 DB: Even though he respected my choices and my career and all that, I think he was always a little worried that I was going too far by doing a PhD. When we spoke on the phone, he would be like, “Oh, that’s great, but what about your personal life? Enough of these questions, enough telling me about research. What’s going to happen with you? When are you gonna settle down and stop this curiosity machine, so to speak?” So I have learnt to fight very hard against this stigma that exists for women when it comes to curiosity, and that’s why it’s such a central theme in my story and in my everyday work.

0:07:22 MK: And so when you speak with children or young people now… I suppose I probably wasn’t quite as curious as you, but I would say I was definitely on the more curious side as a child. I’m curious to understand: we know we wanna bring more girls into STEM, so is this something that we can start to foster? How do we actually encourage young people, and particularly young women, to feel that this is acceptable behaviour, and even encourage this, I guess, asking questions and wanting to know about the world?

0:07:56 DB: There’s a key book that I cherish, and it’s called Mindset. It’s by Carol Dweck, who’s a Stanford psychologist, and she talks about two kinds of mindsets, the growth mindset and the fixed mindset. She says traditionally, young girls are brought up with a fixed mindset, which tells you that you have a fixed amount of intelligence, so you’re either good or bad at some things from the moment you’re born; you can’t really grow intelligence, so you’d better spend your time doing things that you’re already good at. So girls are discouraged from taking risks and doing things where they can fail, and that already disqualifies science, data science, where failure is an everyday occurrence, and then you get up and you try again, and again and again, and you have the grit to continue until you get the answer. And so that’s why, on average, women tend to choose careers where the possibility of failure is lower and there are fewer risks. In physics, I had a professor who spent his entire life at Stanford searching for the magnetic monopole, and it turns out it doesn’t exist.

0:09:14 DB: We have magnetic dipoles, a magnet having a north and a south pole, but there’s no such thing as a single magnetic charge; it’s always a dipole. But that alone is not a failure. His entire career was very fruitful, because he discovered that there was no magnetic monopole. And boys are traditionally brought up with the growth mindset, which is, “You don’t have a fixed amount of intelligence, you can always grow it, you can always become an expert. Don’t worry about failing, ’cause that only means that at one point you’re going to succeed, just keep trying.” And so that encourages boys to be risk-takers: to study subjects in school where they’re not gonna get good grades, and it’s okay, and to go for jobs where the job description is way more sophisticated than their current skills, whereas women tend to avoid applying for those kinds of jobs.

0:10:17 DB: So what I do is, I encourage people from the youngest age to foster curiosity by not simply answering the questions that children ask, because what you want is not for them to have the answer and memorize it. What you want is for them to have pleasure in finding things out, which is actually the title of one of my favorite books, The Pleasure of Finding Things Out, by Richard Feynman, the famous physicist.

0:10:47 DB: And so, it’s really, like when my daughter asks me stuff in the playground about why the sun is casting a shadow over the tree, or why I think something is more dangerous than her climbing on that rock, or whatnot, I tell her to try to figure it out with me. So we go step-by-step; I don’t just give her the answer. And step-by-step we go, “Okay, well, what do you think can happen if you climb to that level? What do you think are the potential risks? Oh, okay. And why do you fall down? Remember, we studied it in the book.” Because there are some amazing books, Quantum Mechanics for Babies, Gravity for Babies. Since she was two years old, we would read about gravity and quarks, and the quarks one is her favorite book. And it’s not that I want her to memorize what quarks are, it’s that I wanted her to think that nature is made up of little blocks that you can take apart and study if you want. And she comes up with her own reasoning. Today in the elevator she was telling me, “Mom, I don’t know if I should wear my mask in the playground because I’m short, so I can’t spread the virus to the tall people like you.”

[laughter]

0:12:10 DB: And I said, “Well, thank you, but there are gonna be little people like you in the park, so we have to protect them as well.” And then she’s like, “Oh yes, you’re right. You spread the virus to the tall people, and I spread the virus to the short people.” So it’s making them feel empowered, not embarrassed, if they come up with conclusions that are funny or silly; rather, laugh with them and tell them that you sometimes come up with really silly and embarrassing conclusions too. Experiment, use the opportunity when you’re in the kitchen.

0:12:46 DB: Make pizza with them and then try to say, “Oh, what if I combine this olive oil with a little water? Oh look, they separate. Why is that? And what happens if I make a cake and I try to mix butter with water? Why won’t they mix, and what ingredient do I need to use to make them mix?” I say, “Let’s try an egg.” The egg is an emulsifier, which basically has a hydrophobic and a hydrophilic side to the molecules of the yolk. So one side grabs the water, the other side grabs the oil, and it brings them together, and that’s why you can mix them. And you actually see it when you’re doing it, how it brings them together. So when a child sees that, “Oh, it makes sense.” And you just draw the little molecules, and now they understand what’s going on in the microscopic sense. They can’t see it, but they have a theory that they can use to explain and continue to ask questions.

0:13:46 MH: Wait, this is the three-year-old, you’re doing hydrophobic? Okay, she’s gotta be having some fun conversations on the playground.

0:13:54 DB: Well, I try to use words that describe things, and she has a sophisticated vocabulary, but I don’t care so much about the terms. That’s just some fancy word to say hydrophobic means that it has a phobia to hydro, to water, which really just means it doesn’t like water. That, a three-year-old can understand: “I don’t like water, I like the oil. And now let’s play that you like to get in the water and you don’t like the oil, but if you and I hug, we bring them together.” That kind of thing.

0:14:31 MK: So, one of the things that you haven’t touched on, but it sounds like it’s very closely related to curiosity, is confidence. With children, they can be a bit fearless, and this sounds exactly like one of my favorite people who helps me learn technical stuff; she uses the exact same method as you, where she just keeps asking me questions until I figure it out myself. But sometimes with analysts, there is a real lack of confidence, and so it can be quite difficult to encourage that curiosity and them knowing that making a mistake is not a failure. I’m coaching a girl at the moment, who I keep trying to tell, “Mistakes are when you learn stuff. It’s okay to make a mistake; I expect you to make mistakes.” But she kind of has it drilled in her brain that if she makes a mistake on anything, that’s a failure. And I feel like it does get in the way of curiosity. Have you found that you have to work on the confidence first, or do you think that they’re kind of a package deal?

0:15:37 DB: I think we’re all born with confidence, and that’s why, if you look in playgrounds, little children go far away from their parents and try to explore trees and nature and all that, and we slowly beat it out of them, to the point where now, especially in the US, it’s more important to get a correct answer on a test, everything is multiple choice, rather than learn how to think about solving a problem. And so, I don’t think it’s entirely their fault to have lost that confidence. It’s that the education system and the parenting style that we have say that what’s important is the goal, and if the goal is success in terms of a grade, then of course they’re not gonna be confident if asking more questions is leading them further away from their perceived success.

0:16:36 DB: So what I do is, I divorce, I don’t know what other word to use, any kind of performance metric from their work when I’m managing a data scientist or an analyst. That is very useful, because what I say is, “This is your time to play, to be curious and to have grit, so I’m not gonna judge you on whether you solve the problem correctly or not, but on whether you think about the problem and the solutions, and you tell me the uncertainty in your result.”

0:17:14 DB: If you show me that you’ve gone through that whole process the right way, even if your result is wrong, then in my eyes, that will be a plus. It’s bizarre when you compare it to, say, a software engineer, who more often is judged by the results. If you’re programming a website and after five days you have nothing to show for it, then obviously your boss is gonna be pretty upset, like, “Hey, you should have just followed these principles and come up with a design. Where is your body of work?” Whereas in analytics, especially when there is funding, at other times you need to be more cautious about how far you can deviate from your hypothesis. But I always think the healthiest thing is to have an R&D arm to the data science team that goes and brings in new tools, new algorithms, new platforms, whatever they need, to ask better questions and to use the data that they have to create insights that are aligned with whatever the company is doing. And so, sometimes, when I give talks, I look at successes of companies that are based on some creative thing, many times simple things, that somebody who was curious decided to analyze.

0:18:47 DB: In fact, I’m thinking of an example of a retail store that we were analysing when I worked at ThoughtWorks, and we called it the “second camel hump problem.” Because normally what you expect in a retail store is that when you plot the transactions, with price on the X-axis and the number of transactions on the Y-axis, you’re gonna see a normal curve. That is, the majority of people tend to transact or buy things around a certain price, and there’s a minority on each side: some who buy a lot more and pay a lot more, and some who pay a lot less. And that average price will be a lot higher if you go to a very fancy retailer, and a lot lower if you go to a not-so-fancy retail place.

0:19:47 DB: So then, the analysts received this graph, and instead of just one hump, one normal curve, there was a second hump. That’s why we called it that: a normal curve is like a camel with one hump, but this was a camel with two humps. That simple analysis, just a histogram of how many people were buying a lot versus a little, led them to discover the second hump, which led them to discover a second kind of customer: people who were coming from abroad at a specific time of year, buying lots of things, taking them back home, and reselling them. They were vendors. And so that opened opportunities for the company to create franchise stores abroad, as well as to sell online and open up the international market. So that’s just a simple example. But it happened just because somebody was curious enough to ask, “Hey, what’s this second hump here?”
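For readers who want to poke at this example, here is a minimal sketch of the kind of histogram that surfaces a “second hump.” The transaction amounts and segment sizes are invented for illustration, not taken from the episode.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)

# Hypothetical transaction amounts: most shoppers cluster around $60,
# plus a smaller reseller segment clustered around $450.
regular = rng.normal(loc=60, scale=15, size=5000)
resellers = rng.normal(loc=450, scale=60, size=600)
amounts = np.concatenate([regular, resellers])

# Price on the X-axis, number of transactions on the Y-axis.
plt.hist(amounts, bins=50)
plt.xlabel("Transaction amount ($)")
plt.ylabel("Number of transactions")
plt.title("The second 'camel hump' is the reseller segment")
plt.show()
```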

0:20:47 TW: Well, so… But using even that example, I feel like where I have run into challenges, and I’ve been pretty much mostly consulting for the last decade, so it’s kind of a bunch of different clients, there are people who have kind of drifted into a role as an analyst. And I could imagine, in that example, one of those people who kind of lacks curiosity just picks a bin size out of thin air, and the way they pick their bins for that histogram, they don’t see that double hump. So they look at it, it’s kind of what they expected, they report the result, and they move on. I guess… Are there examples? Do you feel like you’ve run into those people? ’Cause I really, really struggle when I run into analysts where that’s kind of how they operate. They are focused on, “I need to produce the report. So I pulled data, I charted the data, I presented the data.”

0:21:56 TW: And it just doesn’t seem to be there. If anything, all the examples I’m thinking about are dudes, actually. Like, there’s the flip side of confidence, the overconfidence, where you pulled the data one way, you looked at it one way, that must be the truth, you present it, and you move on. And they don’t stop to explore. And I’ve never managed to crack the nut, as much as I’m trying to guide them, prompt them to say, “Well, yeah, but what about… What if we sliced it this way? What if we changed the bin size? What does that look like?” And I’ve never figured out how to… Like, that’s where it feels like some aspects of curiosity either just weren’t encouraged in the right way when they were children, and I can’t build a time machine, or it’s an innate trait that they just don’t have, or it’s some combination.

0:22:49 DB: Yeah. By the way, that’s called the Dunning-Kruger effect, and you get…

0:22:53 TW: Oh, yeah. [chuckle]

0:22:54 DB: Traditionally, men have it more than women.

0:23:00 TW: I don’t, actually. I’ve never actually fallen prey to the Dunning-Kruger effect, so…

[laughter]

0:23:06 DB: The first thing… Feynman used to say, “You should not fool yourself, because you are the easiest person to fool.” And data scientists should have that at the top of their mind, because the definition that MIT came up with for data literacy, not even data science, just data literacy, is to be able to recognise something as data, work with it, and argue with the data. And I love that part. Not only because it suggests that data literacy is a skill that anyone can learn, but also because when you argue with data, that means… For me, the most important part of data science is what we call Exploratory Data Analysis. In a way, it’s mathematically the simplest, but conceptually the most complex. Because you really have to try… It’s as if you’re in a debate, and you have to try to prove both sides. So first go to the limits, like we do in physics. Make the bin size super, super small. Now, you know that’s one limit. Make the bin size super, super large, and you know that’s the other limit. Now, try to explore what happens in the middle.
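Debbie’s “go to the limits” advice translates directly into code: sweep the bin count from absurdly coarse to absurdly fine and look at what happens in between. A rough sketch, reusing the same invented two-segment transaction data as above:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(60, 15, 5000), rng.normal(450, 60, 600)])

# One limit: a handful of bins hides the structure.
# The other limit: thousands of bins drown it in noise.
# The interesting views live somewhere in the middle.
fig, axes = plt.subplots(1, 3, figsize=(12, 3))
for ax, bins in zip(axes, [3, 50, 2000]):
    ax.hist(data, bins=bins)
    ax.set_title(f"bins={bins}")
plt.tight_layout()
plt.show()
```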

0:24:25 DB: That’s when they find the answers. If somebody… I can’t really talk about the psychology of working environments, because I do think there’s a lot at play here. We are teaching people to conform, or to try to get in the good graces of their bosses and their managers, and so they may act in a certain way to try not to bring news that could be disconcerting, or uncomfortable, for the company to see. A result like the second hump could very well have backfired, because they didn’t wanna confirm that, or whatnot. So organisational behaviour people may be able to answer that question better than me, but I can tell you that it’s a skill: anybody can flex that muscle of curiosity. And it’s up to us to encourage an environment where failure is a good thing. As long as they’re trying, they’re finding things out. And even a negative result is a good result.

0:25:36 TW: So that… I like the idea of doing the EDA, the Exploratory Data Analysis, because there are kind of formalised approaches for it. It seems like when you’re learning statistics, it goes through and says, “Look at the median, look at the mean, look for outliers.” It somewhat can be kind of formulaic: “Get familiar with the data, and while you’re doing that, keep a record of what looks like it might be popping out in a different way.” Which I don’t feel like happens for people who came up… If you came in from marketing to do analysis, that step never really happens, whereas for a data scientist, that seems kinda core: “Oh, new data set? First thing I have to do, do my EDA.”

0:26:21 DB: Yeah, exactly.

0:26:23 MK: So it almost sounds like you need to reshape the ticket or the task, to introduce an exploratory section before they try and answer the question. ‘Cause it sounds like in that example you had, Tim, it’s straight to like, “How do I answer the question in the quickest, easiest way possible?” And then, like Debbie mentioned, being overly confident with their results.

0:26:46 TW: Which, you may have a time constraint, right? ’Cause it’s like you don’t have…

0:26:50 DB: Yeah, my…

0:26:50 TW: In a lot of organizations, it’s like you have to produce this thing, so it’s like, “What’s the quickest way to get to an answer? I wanna move on to the next thing. I have too many things to do.” Which is an understandable pressure, but I think if you’re heading that way, you do need to maybe adjust the assignment a little bit.

0:27:05 MK: Yeah, well, I feel like that’s the way to do it. And if it is a time pressure, then that’s a different discussion, right?

0:27:11 DB: Exactly. That’s what I was trying to get at, yeah. If it’s a management issue or a personality issue or a time constraint, then there are other ways of dealing with it. But given that all other things stay the same, it should be innate for a data scientist or an analyst to be curious about what kind of data they’re working with. And it doesn’t take very long: most discoveries are made within the first few hours of playing with the data. But you do have to spend that one day, that one afternoon, giving yourself the time to explore it in many different ways to see what comes up. What outliers are there? Are they really outliers, or is the data just not clean? What checks and balances should you set to know when there are errors and when it’s an outlier?
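Those first-pass questions (are these really outliers, or is the data just dirty?) can be as simple as a few lines of pandas. A toy sketch with an invented column; the 1.5 × IQR rule used here is one common convention, not anything prescribed in the episode.

```python
import pandas as pd

# Invented measurements, including one suspicious giant and one
# impossible negative value.
df = pd.DataFrame({"weight": [300, 250, 150, 280, 9999, -5, 310]})

print(df["weight"].describe())  # quick feel for the distribution

# Flag candidate outliers with the 1.5 * IQR rule...
q1, q3 = df["weight"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["weight"] < q1 - 1.5 * iqr) | (df["weight"] > q3 + 1.5 * iqr)]
print(outliers)

# ...then apply sanity checks that separate "interesting outlier" from
# "dirty data": a negative weight is an error, not an outlier.
print(df[df["weight"] < 0])
```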

0:28:03 MK: So it sounds like, Debbie, just… I wanna make sure I’m capturing all of your amazing tips. So number one is that you need to foster an environment where the goal is not to get the right answer but to explore and to come to a type of answer; whether that’s wrong or right is irrelevant, as long as you can show your thinking. And then the second point is to encourage people to explore before they set out on a particular task. Sorry, this is a topic that’s on my mind a lot. Is there anything else in the work context that you can do to really encourage an analyst… Yeah, I guess, to let go of that needing to be right and focus on the curiosity side, or to bring out that quality?

0:28:49 DB: I just wanna correct something first, because I don’t mean, or I hope I didn’t say that, the right answer doesn’t matter.

[laughter]

0:29:00 MK: That’s fair, that’s fair.

0:29:02 DB: It actually matters, for sure, and they should want to get the right answer…

0:29:06 MK: Yeah.

0:29:06 DB: What I meant to say is, sometimes getting an incorrect answer is a good… Not a correct, but it’s a good answer because at least it’s data. It tells you this is not the way to solve the problem.

0:29:22 MK: Exactly. They learn something from getting an incorrect answer, yeah.

0:29:26 DB: Exactly, that’s what…

0:29:29 TW: But aren’t there… There are organisational pressures. There are some gross misperceptions about data that exist, if you take kind of the casual business user, right? They assume there is a truth, that you’ve got kind of a magic machine: “I’ve thrown a data set at you. Hey, I’ve heard about machine learning and AI, throw it in the machine and spit out the truth.” That misses the need to really understand the data, which means in many cases understanding the process that generates the data. It leaves out the exploratory data analysis.

0:30:04 TW: I feel like data scientists are kind of constantly trying to explain that 80% of the work is just the cleansing and the prepping of the data. And then the third thing is this idea that there’s the truth, there’s just the answer. And even, we just kind of touched on it as well, a lot of times the answer has got some uncertainty baked in. We’ve explained some of what’s going on here, but the rest of what’s going on, maybe we haven’t found it yet and we need to continue digging further, or maybe it is just an incredibly noisy environment. So it does seem like the analyst has a lot of pressure to quickly get to the right answer, when that’s kind of not a fair ask. Which goes back to saying, “We’ve gotta educate the organization.” And it does go, I guess, to the management of those analysts.

0:30:58 DB: I think there are several points in what you said, Tim. One has to do with the misconceptions that executives have about what data science is and what data scientists can do. And the second one is about the data scientists themselves not remembering that it’s a probabilistic science and that there is always uncertainty in the results that they get. So there are two different things, and I don’t know which one you want me to answer.

0:31:31 TW: Well, I really… That second one. Since you went from a deterministic physics and math world into a probabilistic one, if we don’t manage to have a discussion about how you actually inhabit that dichotomy of spaces, then it would be a travesty. A travesty. [chuckle] So let’s do that one.

0:31:51 DB: Okay, so first of all, contrary to our, I guess, common perception, physics is not a deterministic…

0:32:04 TW: Deterministic.

0:32:04 DB: Science. Yes, the physics that we learn in high school is, for sure, deterministic, but if you study quantum mechanics or statistical mechanics, you’ll see that nature, for physicists, is really probabilistic in essence. In quantum mechanics, we don’t report quantities that are fixed; we report the probability of a position or a velocity having a certain value. And that’s a probability distribution. So that’s similar to data science. However, where they do differ, and where I do agree with you, is that experiments in science can be done thousands of times. They can be repeated, is what I mean. Whereas experiments in data science cannot.

0:33:01 DB: So this is what I mean. If I want to calculate the distance that my baseball is going to travel in the air if I throw it with a certain angle and a certain initial velocity, then whether I do it here or in Australia or in China, I’m going to get, within certain uncertainty parameters, a very similar result. Whereas if I decide to do an A/B test today on a random set of strangers, on whether, when they look at my website, they like the colour being pink or being blue, by sheer randomness I could very well have gotten a one-in-a-million result: my A/B test says they like something that they wouldn’t like any other time out of hundreds of thousands of times.

0:34:03 DB: So that’s the difference. When we run a political survey or when we do… Every time, it’s a different population, it’s a different question, it’s a different context, it’s a different time. We cannot provide the same conditions to do the experiment again, to test. And so that’s why one has to be very careful making assumptions in statistics. So in that way they’re very different. So yes, science gives you way more confidence… By the way, not science. I should back-track that. Certain areas of classical physics give you way more of a confident answer than data science does, for example, where your population and the behaviour of people is way more unpredictable than whether water is gonna get hot if I put it on top of a stove.
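Debbie’s point about repeatability is easy to demonstrate with a simulation: give two website colours the exact same true conversion rate and rerun the “same” A/B test many times. The rates and sample sizes below are made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

true_rate = 0.10   # identical for pink and blue
n_visitors = 500   # per variant, in one finite test window
n_experiments = 10_000

# Each test window is a different sample of people, so the observed
# "winner" flips from run to run by chance alone.
pink = rng.binomial(n_visitors, true_rate, size=n_experiments)
blue = rng.binomial(n_visitors, true_rate, size=n_experiments)

print(f"Pink 'wins' {np.mean(pink > blue):.1%} of identical experiments")
print(f"Blue 'wins' {np.mean(blue > pink):.1%}; the rest are ties")
```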

0:35:06 TW: Yeah. That seems like a whole other topic, trying to help the business world understand that. The A/B test… I was actually talking to one of Moe’s co-workers a few weeks ago, and we were talking about A/B tests, and it was like, “The population you care about is all future visitors to your website. Your sample is some finite period of time. Definitionally, you are not getting a random sample of your population, which is a pretty fundamental thing you’re trying to get.” And the co-worker was like, “Oh yeah, that totally makes sense.” Trying to explain that to a business person, I may get it for a glimmer, ’cause they’re still wanting that truth. I don’t know, it feels like the analyst is in a tough spot, because we’re trying to get to something black and white, and the more curiosity we exhibit, the more we understand how much variability and uncertainty there is, how many different factors, and how much we don’t know, which then leads us to be maybe even more annoying to that business person who wants us to just give them a binary answer. We’re doomed. I quit.

0:36:29 MK: But I do wonder sometimes… I don’t know. I feel like there can be uncertainty in the data, but I don’t think that takes away from having a strong recommendation. You can have uncertainty in your data but still make a strong recommendation. And that, I feel like, from a business perspective, is sometimes how you balance the two. I don’t know.

0:36:53 DB: I think what I get from what you said, Tim, is why data literacy is so important for C-level executives and decision-makers, because if they don’t know what goes into calculating some of these results and insights, they may demand absolute certainty: “Tell me what exactly is the truth.” Whereas an analyst, again, in a good environment where people are trained to read the insights in a probabilistic way, should say, “Well, we analysed the data for that timeframe and in that context, and these are our results at this confidence level. That suggests that this is the correct colour to build my website with.” Something like that. So it’s a correlation, not a causation.

0:37:47 TW: Well, Moe… Moe, you’ve… ’Cause Moe has given some talks, and we’ve talked about it, I think, on the podcast: the ACH, analysis of competing hypotheses. Is that what it is?

0:38:00 MK: Nailed it.

0:38:00 TW: Where I kinda wonder if that… ’Cause I’m gonna completely bastardise this. You basically come up with all your hypotheses, and then you look at your evidence and start trying to disprove them, which I’ve always… I haven’t actually managed to read the book or fully apply it. But to me, that can be a tool to start saying, one, you have to have multiple hypotheses to compete, and in order to have multiple hypotheses, you have to have curiosity. You have to think through what might be going on here. It also seems like it can be a communication and data-literacy-developing tool for the stakeholders, to say, “Let’s run through these hypotheses. What hypotheses do you have?”

0:38:48 TW: I’m now gonna apply some rigor. I’m gonna explore the data. I’m gonna look at what evidence we have. I’m going to assess it. I hadn’t tried to make that connection until literally just now. But would that be a way… The curiosity comes out, and it engages the business in understanding that none of these are absolute truths. There will be evidence that supports and contradicts each hypothesis; even the one you think is probably the most likely has some evidence that says it’s not the right one. And it kind of injects that little bit of uncertainty, and maybe over time can help the business get comfortable with the need for the curiosity and the uncertainty that’s inherent in the process.
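The ACH idea Tim is gesturing at is often laid out as a matrix of hypotheses against evidence, scored for consistency. A toy sketch of that layout; the hypotheses, evidence items, and scores here are all invented, not from the episode or the ACH book.

```python
import pandas as pd

# -1 = evidence contradicts the hypothesis, 0 = neutral, +1 = consistent.
matrix = pd.DataFrame(
    {
        "H1: campaign drove the lift": [+1, -1, 0],
        "H2: tracking change inflated it": [+1, +1, -1],
        "H3: seasonality": [0, +1, +1],
    },
    index=[
        "Lift started mid-campaign",
        "Lift appears in untargeted segments too",
        "No lift in the same week last year",
    ],
)

print(matrix)

# ACH focuses on disconfirmation: the hypothesis with the LEAST
# contradicting evidence survives, not the one you liked first.
print((matrix == -1).sum().sort_values())
```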

0:39:39 MK: I actually… I had a manager who said to me, and I’ll never forget it. He’s like, “Moe, I need to know that you thought about everything,” and I was like, “Huh?”

[laughter]

0:39:51 MK: Like, obviously I thought about all of the options. And he was a very… He was an analyst at BCG before he went into people management. And one of the things that I found worked really well with him about that ACH approach was that I could show him all of the things that I and all the other various stakeholders had thought through. And I think that’s the bit that we struggle with. And to be fair, not all stakeholders need to see that level of detail, so often I would just shove it in the appendix. But it’s about knowing that you’ve not jumped to one hypothesis and then been like, “Okay. Here’s some data, it’s true, let’s crack on.” You’ve really explored the range of what could be driving it. And yeah, you could miss something, but at least it’s a more holistic process and you can show that you’ve done that thinking.

0:40:38 TW: I think that’s where I… I sometimes refer to some business users as hypothesis-generating machines. Whether they realise it or not, they’re talking and they’re speculating. And then I’ve sat with analysts, saying, “If you’re listening actively, you’ll realise the subject matter experts, your business partners, actually have hypotheses, whether they recognise it or not. Part of your job is to actually recognise those. And it doesn’t matter if yours are kinda zany and nutso, just get practiced in saying what those hypotheses could be.”

0:41:21 TW: I’m not gonna be able to go validate or analyse every single one, but it does need to start with that somewhat freewheeling thought of what might be going on here. It’s like the person who starts looking at the data for a website and they’ve never actually gone through the website experience. They’re like, “I’m gonna get into Adobe Analytics and I’m gonna pull this data.” I’m like, “What does the page look like?” They’re like, “I don’t know. It just… It was Ivar Fore gave me the answer.” I’m like, “Come on.” You’ve gotta…

0:41:52 MH: Yeah.

0:41:52 TW: You need to try it on a desktop, you need to pull it up on your phone. And if that doesn’t spark some thoughts, then what are you doing?

0:42:01 DB: Yeah. Tim, you just reminded me. This is one of my pet peeves, exactly what you just said. I gave the keynote speech at the Grace Hopper Celebration a couple of years ago, which is the largest conference for women in tech and women in computing, and there were… I forget how many thousands of women in the audience; it was almost like a stadium. And I told them that my worry for women is that now there are all these coding courses and data science courses sprouting like broccoli everywhere: Girls Who Code, Black Girls Code, Women Who Code, the Hour of Code, this and that. And many schools, in an effort to quickly check the mark and say, “We’re teaching girls how to code,” are just teaching coding as an end in itself, not a means to an end, just coding for coding’s sake. And so these young girls are very good coders, but then they’re not really using their critical thinking skills to solve problems. So this happened to me, and this is the example I gave in my talk.

0:43:16 DB: I went to a museum of science where an afternoon program for high school girls was happening, and I visited each one of the teams; they were analysing data that the museum had given them. Each team was composed of four to five girls, young women. And so I went to one of the teams, and they were analysing turtles, the turtles that the museum had in place. So we’re talking, and they were incredible practitioners of SQL, the database language. They knew it. In fact, I was jealous; I never had access to anything like that at their age, and this is so wonderful. They’re already acquiring a skill that is one of the most important ones that industry requires when you get a job. And so I was very proud of them.

0:44:08 DB: And then we started to talk about the turtles themselves. And so I said, “Oh, I’m curious, how big are these turtles?” And they’re like, “Oh, we went to see them once, they fit in the palm of my hand. They’re very, very small.” I said, “Oh wow, I thought you were talking about huge turtles.” They said, “No, no, the little ones.” Okay. And then we continued to talk, and at one point I was looking at all the columns in their Excel spreadsheet, all the factors that were involved in analysing these turtles, and one of them was height, one of them was weight. And the weight didn’t have any units. There were just numbers, like 300, and 250, and 150. And I said, “Oh wow, what is this weight in? Is it pounds, is it kilograms?” None of them were able to answer. There was complete silence.

[laughter]

0:45:03 DB: And I said, “Well, should we ask the teacher or the supervisor?” They’re like, “No, no, no.” And then one girl finally raises her hand and says, “I’m pretty sure it’s in pounds. I know it’s in pounds.” I said, “Wow, it’s funny, because I weigh around 130, 140 pounds, and you’re telling me that something that fits in the palm of your hand weighs double, or even in some cases three times, what I weigh. Isn’t that incredible?” And they’re like, “Oh, no, no, no, okay, this must be wrong.” So they were clearly able to think through the problem. And finally, after we went to the supervisor, we realised the unit was grams. So the weight was in grams. And you may think, “Oh, it’s unfair for me to criticise that,” and I don’t think it’s the girls’ fault; they were doing such a great job.

0:46:01 DB: But it’s our exaggerated emphasis on getting them to code and to be proficient; we have forgotten that coding is a language. What did I learn English for? Not just to speak English, but to be able to communicate my thoughts, to read beautiful literature and poetry, etcetera. In the same way, we have to include a critical thinking component. What do we expect from small turtles? What kind of behaviour? That’s what we call domain expertise. You at least have to somewhat inform the work that people are doing with it. So you just reminded me of that.

0:46:46 MH: Yeah.

0:46:46 TW: But that’s back to the curiosity. If I’m studying a dataset on turtles, if that’s my assignment, then I probably need to make sure I’ve got some curiosity about these turtles. I need to think through what they are. And the same thing goes for business problems. What are we really trying to do? What does this mean? Email’s another one, a classic. I feel like all the time we’re getting data on email. I’m like, “Can I see the email?” I get terrible emails every day, I get good emails every day.

0:47:18 DB: Yeah.

0:47:19 TW: You’re having me analyse open rate and click-through rate, can I just see and touch…

0:47:24 DB: Sure.

0:47:25 TW: And sometimes it’s crazy how hard that is. They’re like, “Well, I don’t know.” And it’s like, “Wait, nobody’s seen the email? This is just some agency executing an email? Maybe we should put on that cap and look at the email before we dive in and try to have the data find an answer about it.”

0:47:44 DB: Yes. Exactly.

0:47:45 MH: Well, and it’s clear to me in the turtles example, it’s probably the metric system that’s at fault there.

[laughter]

0:47:51 MK: The funniest thing was, as Debbie was telling it, I was like, “They’re in grams. It’s gotta be grams.”

0:47:55 MH: Yeah, I know, I was thinking the same thing, “It’s grams, it’s grams.” It’s so good. And actually, Debbie, that ties back a little bit to the MIT definition you shared about data literacy, which I had not heard before but now I’m also in love with, which is to read, work with, analyse, and argue with data.

0:48:13 DB: Yeah.

0:48:14 MH: And it’s interesting ’cause it’s not just any data, it’s the data that you’re using in that particular space, for that particular set of problems. So that… Anyways, I love the connectivity there.

0:48:25 DB: Yes, thank you.

0:48:26 TW: Plus, go Beavers. That son I just road-tripped with is at MIT, so I can ask him whether he’s getting a data literacy… I just had him try to explain how to design a recursive function to me so I can…

0:48:40 DB: Wow.

0:48:41 TW: I can throw data literacy at him next.

[chuckle]

0:48:45 MK: Debbie, I’ve gotta say I have a whole bunch of tips that I’ve written down to work on with this colleague, and I do also just wanna correct myself. Earlier, I called her a girl on my team; it is a woman on my team, and it’s something that I’m really trying to be aware of.

0:49:00 DB: Yeah.

0:49:00 MK: I keep using the word girl instead of woman, so I’m just calling that out as a correction.

0:49:05 DB: Thank you.

[chuckle]

0:49:08 MH: Anyway, okay, well, it seems like a natural enough point for us to start to wrap up, and actually it’s been so amazing.

0:49:14 TW: I was curious about how you were gonna wrap this up, actually.

[laughter]

0:49:17 MH: Yeah, thank you, I was curious too. Honestly, I feel like I was a listener for this podcast, because I was just fascinated to hear… I was like, “The science people are sharing all this cool information. I’ll just sit back and… ” Anyway, one thing we love to do for our listeners, to maybe help increase their curiosity, is to share a last call. Debbie, you’re our guest, do you have a last call you’d like to share?

0:49:42 DB: Well, actually, I wanted to point people to an article I wrote which breaks the typical assumptions with which we judge the world. It’s called Outrageous Acts of Thinking, and it’s on The Huffington Post and on my LinkedIn and on my website, sciencewithdebbie.com, or just Google it. But I wanted people to know about that and hopefully get their feedback and comments.

0:50:12 MH: Awesome.

0:50:13 TW: Wait, is that the one… Does that have the 2 x 2? Is that… You did a talk… Have you done some talks on that one? Is that the one… Does that have your matrix? Okay.

0:50:21 DB: Yes, yes, absolutely.

0:50:23 MH: Yeah, awesome. And we’ll have a link to that in our show notes on our website too so people can find that quite easily. Okay, well, Moe, why don’t we go next to you, do you have a last call?

0:50:32 MK: I have a twofer. One is a very short one: for those who are using Looker, the JOIN conference is coming up on October 13 to 16, and it’s free. So I will be watching with bated breath. But the other one… It was a little while ago that it was in the news, but it started to get shared around, this particular article on Medium, and it just went straight to the core. So I really encourage everyone to read it, men included. It’s the article “The Pinterest Paradox: Cupcakes and Toxicity.” And I can’t say who… Is it Françoise or Frances or… She was the COO at Pinterest, and I feel like I’m gonna butcher someone else’s name. I feel like it’s Frances, but every time I heard it, it was said in a much more fancy French way.

0:51:27 DB: Françoise Brougher.

0:51:30 MK: Okay, nailed it, Debbie, thank you for saving me. But she talks about her time at Pinterest, and a lot of women in tech, women in data circles, have been talking about this article, and there were just so many bits of it that I’ve experienced in different ways at work. And I just encourage people to read it and imagine what it must have been like to be in her shoes. Because it does sound pretty horrific, and I think the only way we fix systemic issues like this is by reading these stories, when people are brave enough to share them.

0:52:03 MH: Yeah, that was a very powerful article, I agree. Okay, Tim, what about you? What’s your last call?

0:52:12 TW: I’m gonna do a twofer as well, but they’re not as deep as that.

[chuckle]

0:52:19 TW: So one’s like the logrolling, and a little bit of a hat tip to Stéphane Hamel, because this was shortly after SUPERWEEK. O’Reilly put together a book, done through the International Institute for Analytics, called 97 Things About Ethics Everyone in Data Science Should Know. And I have a little piece in that that’s short, but Cassie Kozyrkov is in there; Brent Dykes, past guest on our show, is in there. I have not actually received a copy yet, so I have not read the other 96, but I’m looking forward to that. Then, maybe more fun: I discovered a new data-viz person, Erin Davis. She is at erdavis.com, and Walt Hickey, past guest, had an interview with her in his newsletter. She’s done some stuff on Pudding.cool and some other places, and she just does fun projects, like the topography of tea, and what’s the biggest one-hit wonder on Spotify, and a map of emoji similarity.

0:53:31 TW: So it’s just fun data exploration stuff, which more and more people are doing. But every time I come across somebody who… She clearly is curious. Even in the interview that I read with Walt Hickey, she talked about how she would have these casual conversations that would then just launch her brain into curiosity exploration, and then she would go track down the data, and next thing you know, she’s created something beautiful and fascinating and quirky. So, erdavis.com.

0:54:06 MH: Nice.

0:54:07 TW: What do you have, Michael?

0:54:09 MH: Well, I’m glad you asked. So, last year, the Digital Analytics Association, which I sit on the board of, had our inaugural national conference, and we are holding it again this fall, in about a month, in October. And obviously, because of Coronavirus and those things, it is a virtual conference this year, but that actually makes it more accessible, not less. I got a lot out of attending it last year, and I know it’s gonna be even better this year. We’ve learned a lot, put a lot of learnings into practice. And in addition to that, the awards for the Digital Analytics Association, the Quanties, will be held. And Tim Wilson is a winner of a Quantie Award, so he knows what it’s like to be at the peak of our profession.

0:55:08 MH: But anyway, I would highly encourage you to attend; give it a chance if you have not attended one before. There are gonna be some excellent speakers and some workshops and things like that, so a lot of good content and topics for anyone in the analytics and digital analytics space. Typically, we have it in Chicago. So just order a deep-dish pizza and eat it at your computer or something like that.

0:55:32 TW: Typically, we have it in Chicago?

[laughter]

0:55:34 MH: Well, we were gonna have it in Chicago.

0:55:36 TW: It’s only happened once.

0:55:38 MH: Well, but we were gonna have it in Chicago again.

0:55:41 TW: Okay.

0:55:41 MH: So that was the plan before… Listen, Coronavirus has made 2020 very difficult for all of us, okay, Tim? Anyway…

[chuckle]

0:55:50 TW: I have heard that… By the way, I’ve heard that the King James… No, King Arthur Flour. King Arthur Flour, not King James.

0:56:00 MH: King Arthur Flour. King James Flour. [chuckle]

0:56:01 TW: Yeah, King Arthur Flour’s deep-dish pizza recipe has come highly recommended. We have not tried it yet, but if you’re gonna make yourself a deep-dish pizza to attend the Quanties, you could…

0:56:12 MH: Well, I went and looked at Lou Malnati’s home delivery option, and now I’m getting plastered with retargeting ads on every website telling me to go order pizza online. It’s ridiculously expensive, but at some point, I feel like I’m gonna cave. Anyway… Okay, you’ve probably been listening, and if you’re like the rest of us, you’ve been delighted, you’ve been challenged, and you’re curious and more excited about pursuing science. We would love to hear from you, so reach out to us with questions and comments; the easiest way to do that is through the Measure Slack, our Twitter account, or our LinkedIn group. And you can also find out more about what Debbie is doing with science on her new website, sciencewithdebbie.com, so be sure to check that out. I don’t know, Debbie, do you have a Twitter account that you use?

0:57:02 DB: Thank you, Michael. It’s debbiebere. So that’s D-E-B-B-I-E-B as in boy, E-R-E.

0:57:11 MH: Excellent. So that’s probably a pretty must-follow as well. So anyway, Debbie, thank you so much for coming on the show. This has been delightful. It’s such a pleasure to have you.

0:57:21 DB: Thank you, guys. I had a ton of fun.

0:57:23 MH: Yeah. I’ve learned a lot, and it’s not every show where I get to say that. So that’s a pretty awesome thing. And…

[chuckle]

0:57:31 MH: No, it’s not… Sometimes we cover stuff that’s more in my wheelhouse, so I’m more sharing knowledge as opposed to receiving it. So that was a…

0:57:38 TW: Those are the ones where I don’t learn anything.

0:57:40 MH: Yeah, exactly. Tim just tunes me out. Anyway, we wanna thank Josh, our producer, because I almost forgot that. Josh is our producer, and we couldn’t do the show without him. And for all of you analysts out there, remember, I speak for Moe and Tim, I’m sure, when I say: even if you’re not getting to the answer you think you should, just believe in science, trust your curiosity, keep analysing.

[music]
[automated voice]

0:58:14 Announcer: Thanks for listening. And don’t forget to join the conversation on Twitter or in the Measure Slack. We welcome your comments and questions. Visit us on the web at analyticshour.io or on Twitter @analyticshour.

0:58:28 Charles Barkley: So smart guys want to fit in. So they made up a term called analytic. Analytics don’t work.

0:58:35 Thom Hammerschmidt: Analytics, oh my God. What the fuck does that even mean?

0:58:44 DB: How did you three get together initially?

0:58:47 MH: Well, that is a great question. It all began at a lobby bar at a conference; Tim and I met, I wanna say maybe in 2010 or 2011, and we got to know each other. And we started this podcast because we would argue with each other about different topics, and somebody had a bright idea: “Let’s put it in a blog.” And then we were like, “No, blogging is way too much work. Let’s just record ourselves.”

[chuckle]

0:59:18 MH: And recording ourselves actually turned out to be way more work.

0:59:22 MK: But it actually was pretty similar, like, Tim and I met in a bar after a conference in Australia, and I had just moved into data analytics a month before. And I basically just chewed his ear off for three hours, asking him every question under the sun.

0:59:40 TW: I pulled strings to get Moe access to the conference ’cause I used to work with her sister and have known her sister. And then Moe was like, “Oh, I don’t need that. I already got my own in to the conference. Thanks a lot.”

0:59:50 MK: No, but that’s because they were trying to black-ball me ’cause I worked at a competitor. So I had to have three different ways of getting tickets, ’cause they really didn’t want me there.

1:00:00 TW: But I think her opening line was, “Thanks, Tim, but I don’t need you.” And I was like, “Oh, you’ll be back.”

1:00:04 MK: I don’t think that was it at all.

1:00:07 TW: Oh, it was brutal.

1:00:09 MH: Moe holds her own very well.

1:00:12 TW: As does her sister. So…

[chuckle]

1:00:14 MH: Yeah, that’s right.

1:00:15 MK: That’s true.

1:00:17 TW: Rock, flag, and curiosity. Save the cats!

