We’re baaaaaaack…! Shorter show name, a rebrand, some minor formatting and structural updates, but still “Moe Kiss with a couple of guys who listeners can’t keep straight.” On this episode, we talk for a little bit about what we’ve been doing while we were on hiatus and then dive into a topic that only Cassie Kozyrkov has dared to deeply explore before: the distinction between analysts, statisticians, data engineers, ML engineers…and data charlatans. Well, really just the first two. But, Cassie(‘s content) has made numerous appearances on the show, so it seemed like high time that we dug into some of her ideas.
0:00:05.7 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally, and sometimes with explicit language. Here are your hosts, Moe, Michael and Tim.
0:00:21.0 Michael Helbling: It’s the Analytics Power Hour. Do you realize how hard it was for me not to say digital? But this is Episode 171, ’cause we’re the same show with a new name and we’re back. Alright, let’s introduce you to our wonderful set of co-hosts. Moe Kiss, could you introduce yourself?
0:00:48.5 Moe Kiss: Hi, I’m Moe. I do data stuff at Canva and hang out with lots of data nerds in my spare time and you know… That’s me.
0:01:00.2 MH: Excellent. And Tim Wilson.
0:01:02.2 Tim Wilson: I am Tim Wilson. I am terrible at self-introduction, so I’ll say I’m a Senior Analytics Director at Search Discovery, where we do data things, as well.
0:01:12.2 MK: That sounded very polished.
0:01:14.2 MH: Yeah, that was good. So you think you’re not good at it Tim, but you are.
0:01:18.3 MK: Nailed it.
0:01:19.1 TW: A lot of times, if I had to do that in front of a client then I’ll have some… The biz dev person will be like, “And Tim has a podcast,” so I feel like I couldn’t throw that in. At which point I didn’t throw in that. Yes, I am a middle-aged cis white dude in 2021. So of course, I have a podcast. But that didn’t feel like it would flow into this podcast intro.
0:01:40.9 MH: Yeah, not so much. And I am Michael Helbling, the founder of Stacked Analytics. And here we are. We’ve been doing a podcast for a long time, although from our intro, we’re trying a lot of new things, and actually, if you’re a listener to the podcast, you may have noticed we’ve been on break for a little while, and this is our first episode back in quite some time, and so we’re trying a lot of new stuff. We’ve got a new name. We’ve done a complete redesign. What’s the hiatus been like for you, Moe?
0:02:11.7 TW: Anything interesting happened?
0:02:14.7 MK: Yeah, there’s a small human in my life, which has… Yeah, the hardest thing I’ve ever loved. Sleep is fleeting but yeah, it’s been good. Apparently, you become a mom and you’re allowed to have lots of indecision, so that’s been a nice new learning. What about you, Tim?
0:02:33.2 TW: But do you have hour by hour tracking of his sleep patterns and everything else?
0:02:38.7 MK: Oh yeah, that gets pretty… I’m like, Jamie, did you log the nappy? Jamie, what time did he wake up? And he’s like, Moe, this is for us to help us. And I’m like, No, but every data point must be recorded.
0:02:48.8 TW: It’s the data set.
0:02:50.0 MK: Yes.
0:02:50.8 TW: I don’t wanna have to… I don’t want missing data.
0:02:52.4 MH: Yeah ’cause it used to be the baby book where you’d record certain things, but now you could just have data… Like a baby BigQuery.
0:03:01.4 MK: There is and I specifically picked a particular app so that I could export the data to CSV format, even though it has the least functionality of all of the apps, so, you know.
0:03:12.3 MH: Data export is important.
0:03:13.5 TW: Well, that’s funny, while we were off, outside of the podcast stuff we’ll talk about in a little bit, I did do some playing around with a Fitbit R package that was not on CRAN, and therefore was kind of outdated, which gave me a lot of challenges trying to pull self-quantification data. I’m like five years later, following in Michelle Kiss’s footsteps. But literally just within the last couple of weeks, now there is an identically named package by a different person that is on CRAN, so I’m not sure whether I’m gonna have to go and completely refactor my code. But no new humans. Actually, just today, my youngest child got her driver’s license, so there’s passing that milestone.
0:03:56.0 MH: Wow. Awesome.
0:03:56.5 MK: So you took a real break then, Tim, it sounds like. Stepped away.
0:04:01.3 TW: Well, I spent a lot of time stressing about what we weren’t doing and working on ancillary aspects of this. I wanted to have four full-on rehearsal shows so that we wouldn’t have such an unpolished opening…
0:04:11.6 MH: Intro… Yeah.
0:04:13.1 TW: But no.
0:04:14.5 MH: No, I won’t have it. We gotta build back up from the beginning, we’re gonna try lots of new things, it’s gonna be awkward as heck.
0:04:23.7 MH: We did change the name and… Tim, you wanna talk about that a little bit? Why did we get rid of the name digital?
0:04:30.3 TW: [chuckle] I feel like we’ve… That’s like in every year when we’ve been doing our planning, we’ve talked about, is this the year to drop it, which is like harkening back to when the Web Analytics Association was wringing their hands about dropping web and then they went to digital and even at the time there were people saying Digital is gonna get kind of dated for the DAA. And I think, just as the podcast has evolved and the guests that we’ve gotten on, there’s plenty of them that we point to that say, Is this really about this broad, but still narrow in the grand scheme of things, world of digital analytics, or are we really trying to embrace the broader… Even if it makes the… We’re not dropping the Power Hour.
0:05:13.9 MH: No.
0:05:14.0 TW: It’s still a power hour.
0:05:15.4 MH: And will always be exactly one hour.
0:05:19.9 TW: God… Goddammit.
0:05:20.4 MH: That’s right.
0:05:21.4 TW: And will stay explicit.
0:05:23.0 MH: That’s right.
0:05:23.5 MK: But it’s funny, I probably should have done some research and looked back at our shows over the last year, but I would argue we’ve done more discussions on data science, and we’ve done quite a few on data engineering. We’ve definitely widened the scope, a lot.
0:05:37.4 TW: Well, that brings up one of the functional changes to our website, which honestly, this is probably the sole thing that we had a request for, and it actually does… It is really, really useful, which is we’ve gone back and applied kind of a taxonomy to the shows, and we now have a search on the site, so if you wanna go and say, What are the episodes where you touched on data science or which ones are around career development, that I think actually… And it’s not perfect, there’s no perfect flagging of shows, but we realize that we’ve got enough of a backlog and there are people who are still discovering the show and would want to say, Hey, what are the shows that were about career development or that had aspects of managing teams? So that to me is totally a functional aspect of the site, but I think it’s actually… It’s useful for us even to…
0:06:32.4 MH: To be able to be like, What show was that in? It’s gonna improve so many things, and honestly, hopefully will improve a lot of things in other areas as well. But one thing that will always remain the same is we’re gonna have a topic, we’re gonna talk about it and we’re gonna have a guest on. But one thing that’s different is on this episode, the guest we really want isn’t really available right now because apparently she’s taking a hiatus, which I’m just like, who approved such a thing? [laughter] We just didn’t know. No, what I’m talking about is, there’s someone who the three of us have been reading for quite some time, we think super highly of and we really would love to have her on the show. Her name is Cassie Kozyrkov, and she is the Chief Decision Scientist at Google, and she is a great writer, speaker, just a great thinker in our space around these ideas of Data Science, what an analyst role is, statistics, a lot of educational material. Her main writing, I think is housed on a blog website called Towards Data Science but you can also find quite a few YouTube videos, she’s got a podcast with a lot of her articles. So it’s…
0:07:45.1 TW: Hacker Noon. She winds up with those sometimes.
0:07:46.5 MH: Hacker Noon. So yeah, there’s many places where you can find her writing, but we really think highly of her. We reached out to her during the break to see if maybe she’d wanna be a guest, and she was like, Well, I’m not really doing that right now, so we’re like… We were like, Well, we’re gonna talk about you anyway, and then hopefully…
0:08:03.1 MK: Well, a lot of the stuff she says is really contentious, and I listen to a lot of her posts while walking Harry around the block trying to get the little brat to sleep. And my first thing was like… I messaged Tim being like, I need to call you and rant at you about some of these posts because I feel like she’s right, but I feel conflicted because what she’s saying is often quite different to what we, I guess, traditionally thought in the industry, and I need to talk about this.
0:08:32.8 TW: I said, Look, I’m not talking to you unless it’s on the mic, and we’re getting an episode out of it.
0:08:34.8 MH: That’s right.
0:08:36.7 TW: So don’t you dare.
0:08:37.7 MH: Don’t you dare. So here we are, we’re doing a show about a person we’d like to have on as a guest, so hopefully we get it so wrong, she just feels the need to come on and correct the record. And so there’s a lot to pick from, but I think one area that all of us had a lot of interest in discussing was there’s some articles that Cassie’s written around what the role of an analyst versus a data scientist versus the statistician. There’s some of that discussion, so why don’t we start there and sort of see where this takes us?
0:09:12.2 TW: Should we start with maybe trying to recap the highlights of the distinction, as we understand them?
0:09:18.9 MH: Yes. I think, Tim, you’re probably the best qualified, ’cause I think you probably should also talk about the little mini project you’re running internal to your company as well, ’cause I think that’s pretty neat what you’re doing.
0:09:31.4 TW: Here’s where I’m conflicted, and this is where… I’ve said this now for several years, that Cassie has a very, very clear view, definitely in her mind. Sometimes they are things that seem like they’re contradictory and they don’t really square with the way I’ve thought about things, but my way is kind of fuzzy so I suspect she’s probably right, and I think there’s a lot of value in having some definitions, and again, this is me trying to describe. As I understand her view, an analyst really is purely descriptive. It is taking historical data and looking at it and describing the facts about what is in the data. There is not modeling, there is not inference, there is not prediction. And the statistician is all about uncertainty. There was one of her articles, and I should have pulled it like it just… It was kind of like… It was almost verbatim what I’ve heard Matt Gershoff say which, Statistics is all about uncertainty. And we’re living in a world of uncertainty, which I kind of think is okay, except she starts to make some pretty strong statements like once you have used the data set for analysis, it is now sullied, it cannot be used for prediction, which…
0:10:54.2 TW: And maybe that would be the sort of thing she would come on and say, Well, you kind of oversimplified because I don’t know, does that mean you can take a data set and split it into a test and a training, and then you’re kind of okay? But she may say, no, a statistician would be doing that. The statistician is building the model. I feel like she kinda steers clear of data science a little bit. She uses the term a lot, but she tends to say there’s data engineering, there’s machine learning, there’s statistics, there’s…
0:11:24.1 MK: Analysts.
0:11:25.7 TW: Analysts. And that, I kind of like. I don’t know, is that a fair…
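[Editor’s note: Tim’s earlier question, whether splitting a data set into test and training portions gets around the “once you’ve analyzed it, it’s sullied” objection, can be sketched concretely. This is a minimal, hypothetical Python sketch, not anything Cassie prescribes; the function name, the 70/30 split, and the data are all invented for illustration. The idea is simply that exploration and confirmation never touch the same rows.]

```python
import random

def split_for_analysis(rows, holdout_frac=0.3, seed=42):
    """Shuffle and split so exploration and confirmation use different rows:
    the analyst digs through 'explore' for hypotheses, and 'holdout' stays
    untouched until a statistician has a specific hypothesis to test on it."""
    rng = random.Random(seed)
    shuffled = rows[:]  # copy so the caller's list is left alone
    rng.shuffle(shuffled)
    cut = len(shuffled) - round(len(shuffled) * holdout_frac)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical example: 100 fake observations
data = list(range(100))
explore, holdout = split_for_analysis(data)
print(len(explore), len(holdout))  # 70 30
```

Whether this actually satisfies the “sullied data” standard is exactly the open question in the conversation; a statistician might well insist on collecting fresh data instead.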
0:11:32.6 MK: I really like that, too. And one of the things that she talks about, and I didn’t realize that until I read it, that I actually do this, and I didn’t even notice that I did it until she pointed it out. But she said that analysts really will present the facts and they might hypothesize why something’s happening, so they might be like, Oh, I found out this interesting insight, it could be because this happened, or it may be because of this, and that is then the starting point for when the statistician would pick up the work to investigate. And in my findings, I always do that, I’m always like, Oh, it may be because it is or it could possibly be a result of this without being like, It’s definitely this, or, we’ve concluded X, Y, and Z. And I didn’t realize I did it until she literally said it in her post, and I was like, Oh, uh.
0:12:29.4 TW: Well, I feel like it’s a limited number of years back where I feel like I’ve started to… And I’ll say I think it’s because as I’ve gotten a deeper understanding of causal inference and what causality is and what it isn’t, that I’ve now found myself… I guess the negative way to put it is like hedging. Our clients want us to come do an analysis and give them an insight, a truth, a universal thing, and I cringe at the word. We’ve done a whole episode on what is an insight. So I think that’s… Because, at one point, she goes as far as saying, Look, your job is to present the facts. And she even says, Don’t make really anything of it, it’s up to the business to decide how they wanna interpret it. And I’m like, Well, that’s going a little far… The line between that and really just doing data puking. Here, business user, I’ve now sliced the data, there are eight dimensions, I’ve sliced it by every pair of dimensions and given you 25 pages, “You, dear marketer, look at this and tell me what you see and what you wanna hand off to the statistician.” That feels too far. What you just described, Moe, feels like the perfect, “Find the stuff that looks like something might be going on, and that then becomes evidence for a hypothesis.” It doesn’t validate a hypothesis or fail to reject the null hypothesis.
0:14:08.9 MH: It sort of like… I think there is, if I read between the lines correctly, which I probably do not, there’s a purity to the role that I think that I perceive her as trying to create. And I can, on some level, very much appreciate that. And what most of us who work in this field feel is, we feel the very muddled crossover of so many different roles that we play, and so our… I call myself an analyst, and I think storytelling is a really big part of what I do as an analyst, and I obviously think statistics and other things are important, too, but I can actually have some like… Okay, yeah, she’s saying, “Okay, you’re not a storyteller, you’re an analyst or you do statistics, so here’s your function.” And in an academic sense, or sort of like a system sense, systemic, systematic, whatever sense, that kind of makes sense to me, but if you start doing it in the real world, it gets really muddled and you start playing a lot of those roles. But I think there’s a real value to sort of the way she thinks about it. At the same time, I read a lot of her writing and I sort of bristle and be a little bit like, “No, I do that. Don’t belittle that part of my job.”
0:15:32.0 MK: But I think that’s why I had this really strange reaction to a lot of her writing, was like, “Oh, that’s the opposite of what I’ve been thinking this whole time and this is really confronting.” But I guess the point to also draw is, she works at Google. You can have these really hard delineations of responsibilities at a huge company with a shit ton of money. When you’re in a smaller company…
0:15:53.0 MH: That crossed my mind, too.
0:15:55.6 MK: You do sometimes have to do a bit of the stuff that, guess what, maybe you’re not really super qualified for or that great at, but you have to do it, ’cause there ain’t anyone else there to do it for you.
0:16:04.0 TW: Well… And I guess the other piece… ‘Cause on the one hand, her stuff could be read as saying, “Okay, if you’re gonna use R or Python, if you’re doing stuff with SQL, it’s purely just to pull the numbers and you’re basically gonna do visualizations of the facts, you don’t need to be using R or Python and going and doing statistics.” She actually says, “No, not so much.” And just a couple of weeks ago, I literally took a data set, 1600 rows of survey responses, and I cleaned it up in R, and I just ran a regression on it just to see what bubbled up. And I was like, “Oh, that’s what she’s talking about.” That could be… I’m doing it. I wasn’t gonna put that in front of the client, but it was gonna bubble up like, “Well, these are the ones that showed up. Now, I may wanna be a little better and a little more rigorous.” That was just kind of the one quick cut in exploration. Okay, maybe do a little bit more and say, “Okay, well, this is just telling me which of these variables in this data set looked like they had a relationship.” And I think she was saying, “That’s okay.” She talks about how an analyst might actually need to have more intuition and depth of thought around the p-value in a t-test, than even… Well, certainly than somebody who’s doing machine learning in her definition, so…
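[Editor’s note: the quick exploratory cut Tim describes, running something over a cleaned-up data set just to see which variables bubble up, might look like the rough Python sketch below. Everything here is invented for illustration (the column names and numbers are not from the episode), and a simple correlation screen stands in for his regression; it ranks candidate variables by how strongly they track the outcome, which is a first look, not a confirmatory result.]

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation; enough for a quick exploratory screen."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def screen_variables(columns, outcome):
    """Rank candidate variables by |correlation| with the outcome:
    'which of these look like they have a relationship', nothing more."""
    scored = {name: pearson(vals, outcome) for name, vals in columns.items()}
    return sorted(scored.items(), key=lambda kv: abs(kv[1]), reverse=True)

# Hypothetical survey columns (invented data)
outcome = [1, 2, 3, 4, 5, 6, 7, 8]
columns = {
    "tenure": [1, 2, 2, 4, 5, 5, 7, 8],   # tracks the outcome closely
    "region": [3, 1, 4, 1, 5, 9, 2, 6],   # mostly noise
}
print(screen_variables(columns, outcome))
```

As in Tim’s anecdote, the output is a pointer for further digging, not something to put in front of a client.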
0:17:25.0 MK: I really liked her comment where she was like, “Statisticians bring rigour, ML engineers bring performance, and analysts bring speed.” And then she talks about the fact that the analyst way might be semi-sloppy coding styles that baffle other engineers. But basically analysts just do whatever they’ve gotta do to get the job done, and yes, they might use the same formulas as statisticians or ML engineers, but there’s a different purpose. And it’s often the messy hack to get an answer quickly. I actually was like, “Oh, wow, someone’s just described my life.” Maybe I need to stop feeling shit about messy code or doing things quickly, because that is the point of an analyst, right? I’m not trying to productionize my model, I’m not trying to build something to forecast the next 12 months of some amazing KPI that we’re looking at. I’m trying to get someone in the right direction of where to go to next, where to point our resources. Does that make sense?
0:18:23.0 TW: It does. Yeah, that definitely makes sense. I think there’s part of the challenge is that, even if we as, say, analysts, and our statistician counterparts and our machine learning… Machine learner counterparts all said, “We are totally locked into this… We’re still working with the business, and we’re kind of grappling with and trying to figure that out,” and when we go to a marketer, and they say, “What should I do? What are you? Certainly, on the consulting agency side, the question of, “I need to get my insights and recommendations in my monthly report,” and I do a lot to coach around, “No, let’s just measure performance.” Maybe this is where I’m okay with that. If I say a half of an analyst job is… Or one of the functions is saying, “Let’s make sure we’ve established meaningful KPIs. Let’s make sure we’re setting our expectations and setting targets. Let’s efficiently and objectively report those.” That, I would say, is one function of an analyst, and that is working with the business, having the domain expertise, and totally squares with what Cassie defines. I think when it gets into the… What I’ve historically called hypothesis validation…
0:19:44.1 TW: And I think I started realizing that, Oh, it’s partly a difference in definition. I have for years used Merriam-Webster’s number two definition of a hypothesis: “a tentative assumption made in order to draw out and test its logical or empirical consequences.” And I can talk to business users about that and say this starts with the ideas that you have, that may or may not have been informed or sparked by data. But the fact is, the way that you draw out and test its logical or empirical consequences is actually, many times, kicking into statistics, where all of a sudden your hypothesis is your null hypothesis, and you go into that more rigorous definition of a hypothesis. So now we’ve got the same word used in two different contexts, and I think it’s really valuable with clients to get them to think about what their assumptions about the world are, which really may just be assumptions: “this is what I’ve always seen in my reports, this is what I’ve heard from my analyst.” It’s helping them say, Okay, that’s your assumption. That’s what you think is going on, and you may be right. Is there value in putting rigour behind that?
0:20:58.9 TW: We need to run a test, we need to model this out, we need to look at that with more rigour. And then I hear the voice of Matt Gershoff saying, Yeah, but don’t try to do that for everything. How important is it that you actually put that… What are the decisions where you need that rigour versus what are the ones where you say you don’t.
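[Editor’s note: “putting rigour behind” an assumption, once an analyst has surfaced it, can be as lightweight as a resampling test. Below is a hedged Python sketch with invented numbers; a permutation test is just one of many tools a statistician might actually reach for, and none of the specifics come from the episode.]

```python
import random

def permutation_test(a, b, n_iter=10000, seed=0):
    """Crude permutation test: how often does randomly relabelling the
    combined data produce a mean difference at least as large as the
    observed one? A small p suggests the gap is unlikely to be noise."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b  # new list; a and b are untouched
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_iter

# Invented example: a metric for two variants of something
control = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9, 1.1]
variant = [1.4, 1.6, 1.5, 1.3, 1.7, 1.5, 1.4, 1.6]
p = permutation_test(control, variant)
print(p)  # tiny here, because the two groups don't overlap at all
```

And, per the Matt Gershoff caveat in the conversation, this kind of rigour is worth the effort only for decisions that actually need it.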
0:21:18.9 MK: I think you and I both have used that word hypothesis a lot, and reading, I guess, her work about how it’s really the statistician’s job to validate hypotheses, I was like, Uh, what does this mean for me? Because I always have preached about analysis of competing hypotheses.
0:21:34.4 TW: Analysis of competing hypotheses. Right. The same, yeah.
0:21:37.6 MK: Yeah. And I guess if I was to assume how Cassie might interpret that, it might be to get to a point of like, we’ve been able to whittle stuff down enough that we now have one or two hypotheses that we’re pretty like… Well, that we can’t disprove and now, we would throw a statistician at it to really add the rigour to the process and confirm… It’s basically taking the 10 or the 12 hypotheses you have and whittling them down, and then it’s about how do we assign the right resources to do the next step of that. I think that’s what she would say.
0:22:17.7 TW: And many times that may be, you need to collect data in a way which it checks the box of saying, This is not the data you did analysis on, now you’re doing an A/B test or a randomized controlled trial, or potentially that’s not the only way for the statistician to work, but… Yeah, the analysis of competing hypotheses. That’s analyst and hypotheses… Entire book written on the subject, so you’re analyzing competing hypotheses, but is that really just grooming your competing hypotheses and then putting the decision-making… And I think I’ve seen her… By the way, could we also acknowledge that South African who lived in Russia and is… And she’s hilarious, I guess that’s the other… She is somehow both approachable and I’m just like there are times where I’m like, I can see her saying things in my head around… Now, I’m gonna blank on which one exactly it was on the hypothesis front, but I lost my train of thought as I was wanting to acknowledge how fun she is to watch or listen to.
0:23:36.5 MK: She’s like the teacher you wish you had.
0:23:41.6 MH: Let’s step aside for a second to talk about something that happens to every data professional, no matter what level you’re at: you work really hard to bring insights back to the business, but poor data quality can undermine everything you’re trying to do. Moe, has this ever affected you?
0:24:01.2 MK: Just, it’s pretty much what my nightmares are full of. It happens all the time.
0:24:05.9 MH: Yeah, it’s sort of like this constant thing, and so it’s sort of like, Well, what are we gonna do about it? Well, luckily, ObservePoint is a product developed to give data professionals confidence in their data and their insights. They’ve built a lot into the product, but obviously one of the main features is automatically auditing data collection for errors across all of your websites.
0:24:31.0 TW: It means actually checking your most important pages, so instead of the analyst sitting… It’s almost like a little mini analyst sitting and checking every page on a regular schedule to say, Am I getting accurate data collection? And if I’m not, that little automated intern is pulling the alarm bell and saying, We have an issue. So they’ve got immediate alerts, as well.
0:24:52.7 MH: So keeping your data quality clean over time, tracking it, that’s what you need. I would check out ObservePoint. They actually have some pretty nice socks, too. I know that’s a big thing in our industry, some quality socks. Anyways, if you haven’t looked at their product in a while, I think it might be worth your time to go over and check out a demo. You can actually go to observepoint.com/analyticspowerhour to learn more about their full suite of data governance tools and capabilities. Alright, let’s get back to the show.
0:25:25.0 MK: The one thing I did struggle with a little bit is talking about the role of the analyst. She really emphasizes that what makes an excellent analyst is speed, and she even has this disclaimer of, Don’t be fooled by simple interpretation of speed, because an analyst just chasing shiny insights is gonna just distract the business and, yeah, that’s not worth their weight in gold. But I still have this thing where, I don’t know if the word speed… And I think I understand theoretically what she means about how statisticians and ML engineers… And to be fair, that’s basically the structure we have at Canva is we have data scientists and we have ML engineers and data engineers. And she talks about how their role is really to go narrow and deep on a particular problem and what makes analysts so unique and useful to the business is the fact that they’re speedy, they can get answers quickly and point people in the right direction.
0:26:26.5 MK: But I feel like the word speed… I don’t know, there’s something about it that just really rubbed me up the wrong way, which is probably overly strong, but I think it’s because I do read… You guys all know that I love Adam Grant. I really read a lot of his work, and he does talk a lot about how you need to step away from a problem to really think deeply about it. And the word speed to me… And I know what she means, like an analyst can get an answer in hours or days, and an ML engineer could work on something at Canva for weeks or months. So I know what she means by speed, but I still feel like it perhaps doesn’t capture the fact that we still need to think very deeply about the work that we’re doing, and sometimes take time away from it, and I haven’t been able to reconcile exactly how I’m thinking about this, but it’s all churning around in my head at the moment.
0:27:20.4 TW: Well, and I will say another, what we’ve started with… The former head of Data Science at Search Discovery and I had started kind of going around with was that… And this was kind of the data scientists versus analysts, and he started saying, You know a lot of the stuff that we’re doing, he’s like, It’s being talked about like it’s data science, but it’s really just advanced analytics, which I was like, Okay, I’m fine, that I kind of like that. But that was back to this kind of potential progression of a career, ’cause what we found was that it was statistical work, it was designing experiments, figuring out how to rigorously interpret the results of an RCT, and what we’ve found is that having analysts do a lot of the data prep work and then even potentially do some of the work that is of crunching the data that then kind of has a… Let’s call it a statistician, sort of review it and provide guidance on the techniques, and so there’s a part of me that was thinking, well, there is a degree of speed… And then I fall into the trap of saying that an analyst is, Oh, do the tedious prep work although, frankly, I think it’s a lot of fun to take a messy data set and find out that now you’ve got 300 lines of code getting it just into a pristine, long format.
0:28:49.9 TW: So yeah, that’s again where Cassie’s perspective starts to challenge that, and maybe that goes back to, well, in the smaller company, or if your statisticians are a really scarce resource and you have analysts that can start at a simpler level with some reporting and learning some tools, then maybe it does make sense that… Well, there’s a degree of the theory of comparative advantage. If you’ve got somebody who’s an absolute rockstar with R or Python and is building models and can do all this crazy stuff, do you really want them doing the reporting? If it takes them two hours what it takes some analysts six hours, well, what’s the pay difference? I don’t know, but that framing gets back to like, Oh, it’s a lesser and a greater, as opposed to different roles, but in practical terms, it works pretty well.
0:29:51.9 MK: The bit about the full stack data analyst, or full stack data practitioner, or whatever you wanna call it… That’s obviously a term that gives me the chills, ’cause I’ve seen a workplace try and tackle that, where basically everyone’s gonna be a full stack data scientist, which means we can do analytics, we can do ML engineering, we can do data science. And I feel like Cassie’s the only person in the industry who’s talked about this, which is basically: you have to look after your analysts or they’ll leave. They are not second-class citizens. Not all analysts wanna grow up to be data scientists, which is the thing that I’m always harping on about. They have a different skill set, and if you try and hire this, yeah, like the unicorn of the full stack data scientist, you’re mostly gonna end up with people that don’t really do any of them well, or they’re strong in one area but don’t do the other ones well. And if, as a business, you can only hire one, you should really be hiring analysts. It’s literally the first time I feel like I’ve ever heard someone say this. And I’ve watched a company go down this route where they’re like, We’re not hiring analysts anymore. In fact, they got rid of all the analysts; they just started hiring data scientists.
0:31:14.0 TW: No.
0:31:14.9 MK: PhDs. And guess what happened? They got their PhDs to do the work that analysts should be doing, but paying them four times the price, all the data scientists got incredibly bored because they were doing work that didn’t interest them, and they’re like, Oh, but we’re hiring the people we’re gonna need in five years time, and it’s like, Cool, but those people are not gonna stay here for five years and do a job that they didn’t wanna do in the first place, and I feel like I’ve been sort of burned with this.
0:31:42.4 MH: And I think a lot of what Cassie writes about, the value proposition of it, specifically addresses this kind of thinking that’s so prevalent and so wrong-headed, of trying to take everything and jam it into one mold. Even all of us practitioners out there, we kind of run at the thing we think is the most popular at the moment. Like seven years ago, not everybody was clamoring to throw data scientist on their title, and then… And it sort of became a joke: What’s a data scientist? It’s an analyst from California. So it’s sort of like, that happens as a result of pressures and demands that live outside of the discipline of what we do, and I honestly kind of look at Cassie as sort of a defender of the sanctity of those things.
0:32:34.7 TW: We haven’t talked about that one. That’s the data charlatan.
0:32:38.2 MH: Oh yeah, I was gonna bring that article up because I think that’s the article where she made that definitive split, and also that was an article where I really had to look at my life and be like, Wow, am I a data charlatan? [laughter] I feel like sometimes maybe I am a little bit, but you know it usually works out pretty well when I’m involved, so…
0:33:00.2 TW: Thanks to Google and the rabbit holes you can go down, they now have the profile box or something. If you search for Cassie, a box shows up on the right with her, and then it’s got the other people you might be interested in, and I was like, “Oh, okay, that person, that person,” and I was like, “Oh, I haven’t heard of that person,” and clicked through. And this was a lady who, she’s like data scientist. She has a company that, it’s all about data science. I started reading, and she’s like, “Well, I was doing finance and I got into Tableau,” and she’s one of the data visualization type people but not one that I’ve come across, but she’s got apparently a massive following on the LinkedIn. And I was reading what she was offering and even a few interviews with her, ’cause I’m like, “You’re doing data visualization,” which there are lots of times you’ll hear about how data visualization is this critical skill for the data scientist. I’m like, which I think is fair, but I think saying, if that’s your core focus like… That is a label, which I do think, even though Cassie will use it, it does seem like she… Once she starts talking about definitional thing, she steers clear of that label specifically.
0:34:16.7 MK: Which I think is a blessing.
0:34:18.1 MH: It’s hard not to think that maybe she doesn’t care a lot for data storytelling and things like that. In one of her articles, she’s called it data journalism, and then basically said data journalism isn’t really that great. I took that from her article, I guess.
0:34:36.1 MK: And I really do wanna delve into this because we had a data analytics meet-up where I brought up this topic and it was at the end of the night. It was like 9:30, everyone’s had a few drinks and opinions start flying around, as they do, about analysts and data storytelling, and I think Cassie also sometimes talks about marketing your ideas and sees that as a thing that analysts shouldn’t be doing.
0:35:03.7 MH: Or it is not within the role of an analyst in her definition, and that’s the thing is I think she’s creating these really strong definitions… Anyway, sorry, keep going with your point.
0:35:16.1 TW: Except, she does talk about, as you progress, the more experienced and seasoned you are as an analyst, the more you might bring ideas and bring up hypotheses… I don't know. Building off your point.
0:35:27.8 MK: Yeah, I think she couches them. I think she puts them in the frame of, "It could possibly be because of this, or it may be because of this," versus you actually championing a particular course of action. But I suppose the thing that I was struggling with the first time I heard the particular podcast where she talks about analysts shouldn't do data storytelling… Yeah, I had this real gut reaction to it of like, "Uhh, but this is what my entire career was based on!"
0:36:01.2 MH: Yeah, yeah. Oh, you had a reaction! I literally live to explain things to people.
0:36:07.6 TW: But there’s another post where she says analysts are data storytellers, right?
0:36:09.5 MK: Yeah.
0:36:11.8 TW: But then she kind of caveats, "Okay, then what's the nature of a story?" So it's like every one of these semantic things when you say data storytelling. It's actually interesting. I would think it would be fascinating to watch Brent Dykes and Cassie debate data storytelling, 'cause I feel like he's maybe on the other extreme: "Oh, you found this insight and that's coming from analysis." He's not, I don't think, generally questioning where that's coming from. So I don't know, there's data storytelling, and there's effectively communicating what you did, what it means and what it doesn't mean. And the what it doesn't mean is the thing. I go back 15 years and I would be frustrated when I talked to a statistician who couldn't give me a straight answer. That was my perspective, and I kept that with me, and now I'm becoming the analyst who can't give a straight answer, and that is because there is a deluge of misinformation out there teaching the market that, "Oh, if you have enough data… "
0:37:18.9 MH: Yeah. You can answer any question.
0:37:22.0 TW: I think for our next episode, with the guest we're crazy excited about, he talks about in his latest book this confusion that, if I have all the data and I throw it into the machine, it's gonna spit out an answer. And that's just wrong, but that is the simple and alluring idea that's out in the marketplace with people who aren't spending their time listening to podcasts about analytics. We should've rebranded as the Data Science Power Hour. What was I thinking? It's not too late.
0:37:57.6 MH: We are A/B testing that on a different podcast simultaneously, right now, so [chuckle] folks in certain parts of the country are hearing that podcast.
0:38:07.9 MK: But so one of the things that I keep thinking about is that analysts spend so much more time… Well, I would like to think, considering their biases and trying to really challenge whether they've been persuaded to a particular perspective, much more so than I think the traditional stakeholder would. How we think, and why we think what we think is… Well, it's something that keeps me up at night. I find this stuff really interesting. And I don't think the typical marketer that I present findings to really gives a shit about that. They're like, "Of course, I'm incentivized to say Facebook looks great. I look after Facebook," but they wouldn't give it much more thought than that. And that's one of the reasons I do think that analysts have a really strong place in, I don't wanna say data storytelling, 'cause we now know there's a whole bunch of caveats about that language, but doing analysis and making recommendations to the business. And I don't know if it's arrogant to say we spend more time thinking about our biases than other people, maybe that's like a jerk thing to say, but I feel like we do, and so therefore… I don't know, when I brought this up, even one of the guys was like, "Oh, Moe, it's not about us checking our biases, it's that we're truth-tellers." And I'm like, "Ooh, I don't know. I don't think I would say truth-tellers."
0:39:35.4 TW: That’s rampant in the space. That’s the, Oh, we’re the… What’s the father of statistical control, In God we trust, all others bring data. That gets thrown around and abused like, “Oh, you’re going off of gut, I am the mighty analyst, I will bring you the truth”, and it’s like, “Well, okay, that is the truth about what happened in the past”. And, oh, by the way, let us talk your ear off and put you to fucking sleep talking about ITP and cookies and all the other reasons that you’re not actually even bringing the truth, you’re bringing a rough either over or understatement of some metric. So you’re not even bringing the truth that you’re claiming like what data set is pristine?
0:40:22.9 MH: Gross misrepresentation based on the faultiness of the model.
0:40:25.4 MK: But in their defense, I do think that when it comes to presenting things back to the business, we would have more neutrality because we’re not as motivated by the particular KPIs that we’re reporting on or we’re presenting findings on, and I do think that is a difference. I’ve talked about this before, I had a CPO once who said to me, “Moe, your job is not to give us recommendations, it’s just to give me the insights and then we’ll decide on the recommendations,” and I’m like, “Oh, wow. Way to put me back in my box, but okay, whatever.”
0:41:00.1 MH: Well, then did that same ass also then tell you that those weren’t insights, whatever you provided?
0:41:07.3 MK: Oh don’t… Anyway, let’s not go into my PTSD about that one. But I do think the business wants to hear our recommendations because we do spend more time with the data, we spent hours looking at it. And maybe that actually in turn makes us more biased, but the 25-page thing that you talked about, Tim, of us doing all of this work, we cannot possibly present that to a stakeholder and expect them to digest that and make a decision about a course of action in the 20 minutes they have to look at it versus the other 50 things they’re doing that day, when we’ve had a week to analyze that same information and make a course of action.
0:41:49.6 TW: Well, they’re not supposed to come up with the course of action, they’re supposed to come up with the hypotheses from it.
0:41:54.1 MK: The course of action might be that a statistician would look at the hypotheses, just to be clear.
0:42:01.1 TW: Which again… In my years now, the simplified version of "We believe X; if we're right, we will do whatever," or "If we're right, we will run a test or experiment," like that has kind of fit in the construct. But I think that's again the challenge of… Yeah, you've got a limited attention span… I mean, not attention span, that's pretending that it's an attention problem; it's a time problem, and to your point, it's about having put more time in thinking about it. I would say the other thing about analysts… It's rare for 10 analysts to be supporting one stakeholder; it's more likely there are 10 stakeholders that each have their specific area of responsibility supported by one analyst, which means the analyst often does have a broader view of what's going on. If I'm the digital marketing analyst, I may be looking regularly, even in my digital analytics platform, at paid search and display and social, whereas there may be a team managing paid search, a team managing social, a team managing display. Which means I might actually have a little bit of a broader view, which means when I'm working with them, I can actually maybe bring some of those ideas to the table of what seems to be happening. But we've gone a little bit far afield of Cassie… At this point, she's either so furious that she just stopped listening or she was like, "I've got to set the record straight, you morons."
0:43:39.0 MH: That’s right.
0:43:40.2 TW: I will give you time to… ‘Cause I think the other… Oh, sorry, go ahead. I was gonna shift a little bit.
0:43:47.2 MH: No, I think it's fine to shift. The only thing I wanted to say, just generally speaking, is that one of the reasons we can have this discussion is because Cassie's been willing to make bold statements in public with her writing that create space for a conversation. That's hard work. In a lot of her writing you will even see, "Here come the people with the pitchforks," so you know she gets a lot of pushback. And in the world of the web and social media, people are generally unkind, so I do want us to recognize, and I think we all agree, that we very much deeply appreciate the fact that Cassie is willing to make such strong statements about things, whether we agree with them or not. That's a benefit, a huge benefit, so… Please come on the show, Cassie. That's right. We think you're awesome.
0:44:42.6 TW: Well, and the other thing is that she will go very narrow and deep. I literally cannot count the number of times that I have read her explanation of the birthday paradox, or where she explains p-values with puppies, which is really just, what's your prior… I think she even starts to get a little bit into frequentist versus Bayesian views when it comes to, let's explain what a p-value is: just a measure of your surprise. And she kind of rants; she's like, really, you shouldn't be looking at somebody else's p-value. If you're an analyst, it's a quick little shortcut, 'cause, she said, it's a measure of your surprise. But she puts that in a very, very approachable and, frankly, hilarious kind of context.
0:45:33.2 MK: She’s probably the best author I’ve ever read to explain technical terms in a way that I understand them. I feel like I’ve read 50,000 articles on p-values, and she has this way, I’d suddenly be like, “Oh, this really makes sense now,” and I do have to give credit actually, the way that I first discovered her work was actually through a former guest, Lizzie Allen, who’s a Data Scientist at Google, and had done some of her courses and was like… She’s so relatable in how she explains things. I feel like… Yeah, we all know stats is not my strong suit so I need all the help I can get.
0:46:13.0 TW: She did like a three-part series that I found chopped up into mini segments. I watched the first hour-long one, which was actually internal at Google, but then they posted it, where she's explaining a model, and it's kind of like trying to guess the number of calories in a smoothie. She's giving you different pieces of information and then saying, Guess how many calories are in the smoothie? And she's kind of explaining that the more data you have… She even gets to saying, Oh, look, what if I've actually gotten the calorie count for 12 smoothies, and I ask you, what do you think the calorie count in this smoothie is gonna be? What are you gonna do? You're gonna average it. That's a model. So yeah, I think on that front… I will still say I look forward to the day when, more than an hour after having read about it, I can sit down and explain the birthday paradox, not necessarily in a way that anybody other than me can understand, but I love the birthday paradox, and yet it does not stick with me. I'm pretty sure that might have been one of the Last Calls from years ago.
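The smoothie example really is the whole idea of a model in miniature: with no other information, the best guess for the next smoothie is the average of the ones you've already measured. The calorie counts below are made up purely for illustration.

```python
# Hypothetical calorie counts for 12 smoothies we've already measured.
past_smoothies = [210, 305, 180, 250, 330, 275, 190, 220, 310, 265, 240, 295]

def predict_calories(history):
    """The simplest possible model: predict the mean of what we've seen."""
    return sum(history) / len(history)

prediction = predict_calories(past_smoothies)
print(f"Predicted calories for the next smoothie: {prediction:.0f}")  # -> 256
```

Everything fancier, from regression on up, is in some sense a better-informed version of this guess: use what you've seen to predict what you haven't.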
0:47:20.9 MH: Yeah, I know she’s been a Last Call a few times.
0:47:22.8 TW: But it was like, Oh yeah, Cassie's explanation of the birthday paradox is… And not just the explanation of the birthday paradox, but thinking through probability and the ways to approach probability and how that leads to the birthday paradox, so it's a much deeper exploration than just the math behind the birthday paradox.
0:47:43.6 MK: But it’s also the first time that math has made sense to me, interestingly enough. I’m like, Oh, I get it. Well, I think I get it. Let’s be real.
0:47:52.6 TW: It just hasn’t stuck with me. If she did it more as a data story, maybe it would have stuck with me more. [laughter]
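For anyone who, like Tim, can never hold onto the birthday paradox for more than an hour: the math itself fits in a few lines. This sketch multiplies out the probability that every birthday is different (ignoring leap years and assuming uniformly distributed birthdays), then flips it.

```python
def prob_shared_birthday(n_people, days=365):
    """Probability that at least two of n people share a birthday:
    one minus the probability that all n birthdays are distinct."""
    p_all_distinct = 1.0
    for i in range(n_people):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

# The counterintuitive part: just 23 people puts the odds past a coin flip.
print(round(prob_shared_birthday(23), 4))  # -> 0.5073
print(round(prob_shared_birthday(70), 4))  # -> 0.9992
```

The surprise comes from counting pairs, not people: 23 people give you 253 chances for a match, which is why the probability climbs so much faster than intuition expects.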
0:47:58.8 MH: The question is, how did she end up writing about the analytics translator role? [laughter] Alright, we're working on a little new segment that's actually sponsored by a great friend of the show, Conductrics. I think we're gonna call it the Conductrics Quizzical Query this week. Who's Conductrics, you might ask? Well, they're a company that for over a decade has been helping the world's largest companies discover and deliver the best customer experiences via the industry's first APIs for A/B testing, contextual bandits, and predictive targeting. They empower marketing and IT professionals while respecting customer privacy. They offer best-in-class optimization technology, and it can be said their true competitive advantage is their willingness to go beyond expectations to help their clients achieve their goals. And the Conductrics quiz is set up like this: In one corner, we've got Moe, and we won't talk about who's weighing in at what here. Moe, you're representing a listener today. Are you ready and willing to represent our listener?
0:49:12.6 MK: I am, I just feel really badly for our poor listener.
0:49:16.6 MH: Alright, well, your represented listener today is Cory Underwood, who’s actually, I think has been a guest on our show. And Tim Wilson, are you ready and willing to represent a listener?
0:49:32.4 TW: Oh, I’m not gonna do the whole sandbag thing, but… Yeah.
0:49:36.6 MH: He’s ready. Okay.
0:49:37.7 TW: Ready… Not ready, but I am willing, how's that?
0:49:40.2 MH: Perfect. Alright, well, your listener, is also a great friend of the show, Sarah Hoffman. Alright, so here is the setup, I’m gonna read the question, you two will give me your answers or need additional hints or whatever, and will figure out who’s the winner and maybe…
0:49:58.4 TW: 42.
0:50:00.0 MH: Will win… Dang it. Come on. Alright, let me read this to you and let's see how we do. Alright, here we go. In computer science, binary digits are called bits, short for binary digit. Interestingly, the term bit first appeared in print in Claude Shannon's 1948 paper, A Mathematical Theory of Communication. The term, however, was not without its detractors. A review in The Vocabulary of Science panned the term, claiming "bits for binary digits has nothing but irresponsible vulgarity to recommend it." Who was it that coined the vulgar bit?
0:50:43.3 MK: Is this multiple choice?
0:50:45.7 MH: It’s multiple choice. So I’m gonna give you… Okay, so first, I was sort of half expecting somebody to be like, Oh, I know it, and it’ll be over, it would be like, We got a winner.
0:50:56.1 TW: It wasn’t Ben Franklin, so…
0:50:58.1 MK: So clearly, Helbs doesn’t do trivia with me every Sunday because I take score for a reason.
0:51:03.7 MH: I now would love to learn more about your trivia team. Okay, so let me give you some options. Is it A, Alan Turing? Is it B, Kurt Gödel… Gödel? Gödel? I don't know, it's got an umlaut, so I'm trying to pronounce it correctly. C, is it Grace Hopper? Or D, is it John Tukey? Alright.
0:51:29.9 TW: Claude Shannon in the…
0:51:35.5 MK: I’m gonna make a random guess, which is not based on anything other than knowing Matt.
0:51:42.6 TW: Good, that’s a good…
0:51:43.0 MK: Which could be also wrong. I’m going with C.
0:51:46.1 MH: C. Grace Hopper, alright.
0:51:47.0 TW: Was that Grace Hopper? Ugh.
0:51:49.6 MH: Moe has the answer.
0:51:49.9 TW: C.
0:51:50.4 MH: Tim?
0:51:50.5 TW: I feel like you got into Matt’s head in the exact same way I would wanna get into Matt’s head… Not for long and not without a head lamp to find my way out, but I also… I was kind of also drawn to the Alan Turing just from thinking about what’s really going on here, but I would have gone with Ada Lovelace if he’d said that, but that would have been I think too early, so without asking you to restate the question, I think I will go with Alan Turing.
0:52:19.9 MH: Alan Turing. Alright, here is the exciting point in the show where we reveal that you’re both wrong, and now it’s a 50/50.
0:52:29.1 MH: Between Kurt Gödel and John Tukey.
0:52:31.9 TW: Okay then I go with John Tukey then ’cause I’m…
0:52:35.4 MK: Oh good, ’cause I wanted B.
0:52:35.6 TW: All about Tukey post-hoc. Okay.
0:52:36.8 MH: You wanted B. Alright.
0:52:38.1 MK: Yes.
0:52:38.5 MH: So now we have Moe with B, Kurt… Come on Moe, you have to say the name if you’re really…
0:52:44.9 MK: I can’t say the name.
0:52:46.2 MH: Okay. Kurt Gödel and John Tukey. And here we go, there’s somebody probably in their car right now, driving somewhere listening to this podcast yelling at the show, but the correct answer is D, John Tukey. So guess what that means? Sarah Hoffman, you’re a winner because Tim Wilson’s second guess was good enough. So yes, John Tukey.
0:53:15.4 TW: Fun fact, I’ve looked for pictures of John Tukey… There’s like one picture of John Tukey that has him like with a little… I think it’s like a distribution on a chalkboard behind his face, he was not an often photographed man, at least according to the Internet.
0:53:29.5 MH: Interesting. Well, along with developing the box-and-whiskers diagram, which should be part of every analyst's toolkit, according to Matt Gershoff of Conductrics, Tukey also came up with the algorithm for the Fast Fourier Transform, or as we like to call it, the first fast and furious. The FFT is a key technology behind almost all digital signal processing, and you know what's a key technology behind some of the world's largest A/B testing programs? That's right, Conductrics. They're a sponsor for this segment, and we're happy to have them, but let's get back to the rest of the show. As always, there's a segment we always do on this show, which is our Last Call, so not everything is changing around here. Let's go around the horn and share something we found in these last few months we haven't been with our listeners that we felt was worth sharing. We probably have a few built up. Moe, do you have a Last Call you'd like to share?
0:54:31.5 MK: I have intentionally chosen to share my favorite Cassie article, because me listening to it was basically the whole motivation for us doing this show. So basically, she has a bunch of blog posts, but she also has a podcast on The SoundCloud where she reads them out, and because I was walking Harry around, I got to listen to them. And the one that I just can't get out of my head is the one on Data Science's Most Misunderstood Hero, which is really the one about the analyst, and basically how you should be hiring and how to not undervalue your analysts. Like I said, it's the first time I've really heard anyone speak about that kind of in-depth and not just be like, We should hire all the data scientists. So it gave me a lot to think about in terms of how to structure the team and the data efforts at Canva, which has been kind of nice. So I highly recommend reading that one.
0:55:32.1 MH: What about you, Tim?
0:55:33.6 TW: Well, I guess… We've said that we're trying new things out, so I'm now kind of wondering… We talked about Cassie and then your Last Call's about Cassie, so I have this sneaking suspicion that on the next episode we do, my Last Call may pander to the actual guest and actually be something that he's written or produced. Maybe that's a new thing we're doing, maybe it's just coincidence. Maybe I can predict the future. Maybe time is a fickle thing.
0:56:01.4 MH: I feel like you’re being a data charlatan right now, or… [chuckle] No, I’m just kidding.
0:56:06.9 TW: So my Last Call is… It's an R package, so, you know, why not? And it actually came to us from a long-time, loyal listener, now a co-worker of mine, Ben Woodard, with whom we created the Adobe Analytics R package. He did a thing internally at Search Discovery where he talked through the Anomalize R package, which came out of Twitter and has been around for a couple of years. They actually wrote a really nice paper that explained some of the challenges of finding anomalies in time series data, and one of the approaches they took, which I kind of understood, is working off of the median instead of just working off of the mean. Basically, that's an over-simplification, but it's one of those things… Once I discovered median absolute deviation, or MAD, I just thought that was one of the most brilliant and elegant things. When you're a kid and you're learning statistics, you learn mean and median and mode, and you're like, Who the hell cares? Why do you care about median?
0:57:17.0 TW: And then you have one little example, and the Anomalize R package does some pretty cool things from an anomaly detection standpoint. Ben actually built an anomaly detection function into the Adobe Analytics R package that uses Adobe's anomaly detection, so it's kind of comical that he's now exploring the Anomalize package as well. It's pretty cool and kind of goes to some of the challenges of detecting anomalies, so that was a stem-winder of a single Last Call. What about you, Michael?
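The median-over-mean idea Tim describes can be sketched in a few lines. To be clear, this is the classic modified z-score rule built on the median absolute deviation, not the Anomalize package's actual algorithm (which layers time series decomposition on top), and the pageview numbers are made up:

```python
import statistics

def mad_anomalies(series, threshold=3.5):
    """Flag indices whose modified z-score, based on the median and the
    median absolute deviation (MAD), exceeds the threshold."""
    med = statistics.median(series)
    mad = statistics.median(abs(x - med) for x in series)
    if mad == 0:
        return []  # every point sits at the median; nothing to flag
    # 0.6745 rescales MAD so the score is comparable to a z-score
    # for normally distributed data.
    return [i for i, x in enumerate(series)
            if abs(0.6745 * (x - med) / mad) > threshold]

# Hypothetical daily pageviews with one obvious spike. A mean-based z-score
# gets dragged toward the outlier; the median barely moves.
pageviews = [120, 118, 125, 122, 119, 980, 121, 124, 117, 123]
print(mad_anomalies(pageviews))  # -> [5]
```

This is exactly why the median feels so elegant here: the spike can't inflate the yardstick used to judge it, which is precisely what happens when you standardize with the mean and standard deviation.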
0:57:46.7 MH: Well, let's see, I have one… So if you've ever written a line of code, you've probably gone to Stack Overflow, and recently I discovered another company that the folks who founded Stack Overflow created, called hash.ai, which is actually an initiative to help make simulations accessible, and I thought that was pretty cool. Basically, it's a website, you can use it, I think, for free right now, where you can build simulations and run them, and they've got technology behind it. Anyway, it's pretty neat. Simulations are something we've covered on the show a couple of times and in a couple of different contexts: once when we were talking about the politics stuff, and once when we were talking a long time ago with Kevin Hillstrom a little bit. So it has a lot of applications in a lot of different areas of data, but I think that was pretty interesting. Alright. Well, it's great to be back. Can I just say that? It's really great to be back. And you know what? We're happy…
0:58:53.5 TW: Speak for yourself.
0:58:54.5 MH: Yeah, we’re happy that everybody is back. We’re happy to be back.
0:59:00.0 MK: I missed you guys.
0:59:02.0 MH: Yeah, exactly. This is fun.
0:59:03.9 TW: It’s like we’ve had no contact. We had no contact, yeah.
0:59:07.1 MH: Complete silence, and that, along with COVID and everything. Yeah, it’s made it real rough on all of us, no. But of course, we’d be delighted to hear from you. How have you been weathering this storm of not having new Analytics Power Hour podcast episodes to listen to? So please reach out, you know where to find us, Measure Slack, and Twitter and LinkedIn. And obviously no show would be complete without a shoutout to our awesome producer, Josh Crowhurst, who by the way, you’ll hear this in every episode at the outro, but the music, Josh made that music. Like, wow. And it’s like, that’s the jam… I already said this before, but that’s gonna be Hot Analytics Summer, right now, that song.
0:59:55.4 MK: Can I also just add, thank God we’ve got Josh because the three of us are really not known for our creative streaks or like…
1:00:01.9 MH: Not at all.
1:00:02.5 MK: He has this amazing attention to detail, like really good with design and music, and thank God. We really need him on the team.
1:00:13.0 MH: He’s shown up over and over again entering this process, so.
1:00:16.6 TW: Literally, a version of the logo that didn’t get used, he pointed out pixel-level imperfections in the logo. That we then decided was not a pixel-level imperfection, it was actually technically accurate, and then we didn’t use that logo anyway, but that’s what we’ve been doing.
1:00:32.2 MH: That’s right.
1:00:33.2 TW: We have been interacting with each other.
1:00:36.0 MH: Somewhat, and… So that’s the show, and we’re excited to be back. We’ve got another new episode coming to you in a couple weeks, and we’re really excited about that one, as well, so if you’re not subscribed, get us back into your rotation. We’re back and we’ve got new content every other week on Tuesday. Alright, so, Cassie, if you’re listening, come on the show, it’ll be fun. It’ll be a good time. And if you know Cassie…
1:01:05.8 TW: We don’t wanna sound desperate.
1:01:06.8 MH: No, it’s not… It’s more like we are a huge fans, and if you know Cassie, put in a good word for us. Anyways, alright, Tim and Moe, I know I speak for both of you when I tell our audience, ’cause this will never change, that no matter whether you’re an analyst or a data scientist or a statistician, you should still keep analyzing.
1:01:34.0 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @AnalyticsHour, on the web, at analyticshour.io, our LinkedIn group and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.
1:01:51.6 Charles Barkley: So smart guys want to fit in, so they made up a term called analytics. Analytics don’t work.
1:01:58.3 Thom Hammerschmidt: Analytics. Oh my God, what the fuck does that even mean?
1:02:06.5 MK: Hi, I’m Moe.
1:02:09.0 MH: You don’t wanna say more than that?
1:02:09.4 MK: You wanted more than that? Oh okay.
1:02:11.8 MH: I figured you could do like a whole like…
1:02:14.4 MK: A spiel.
1:02:15.3 MH: Yeah, like if this was the first episode we ever did, how would you… Yeah. There you go.
1:02:20.2 MK: Okay, I do data stuff at Canva.
1:02:22.8 TW: Wait, again, if you could maybe do a hard pause, ’cause I’m pretty sure Josh is gonna wanna edit.
1:02:28.3 MH: Oh, I was gonna let that be part of it.
1:02:30.2 MK: Yeah, I was gonna let it all stay in. [chuckle] We’re gonna keep…
1:02:35.4 MH: We’re a conversational podcast, Tim, geez.
1:02:40.7 TW: Josh, I’m so sorry.
1:02:42.9 MH: You know what?
1:02:43.5 TW: I’m so, so sorry.
1:02:46.7 MH: Now we do need to do a hard pause. Selling our podcast souls, take one.
1:03:00.0 MK: I have got a little bit of my zing back. Alright, I will stop. [chuckle] Oh, my God.
1:03:08.7 MH: I think we’re waiting on you. [chuckle]
1:03:11.1 MK: Oh, shit. [chuckle]
1:03:13.7 MH: And I’m Tim Wilson.
1:03:18.7 MK: Oh, fuck. I thought I’ll just start again.
1:03:27.8 MH: I’m glad you got your zing back, Moe.
1:03:33.1 MK: Oh, my God, I’m crying.
1:03:35.1 MH: So this is all staying in, Josh. This is the beginning of the podcast.
1:03:39.3 MK: Oh, shit.
1:03:40.3 MH: Alright.
1:03:40.9 MK: How am I gonna get myself together? [chuckle] Between Helbs not responding to stop and Moe's indecisions, Tim, how are you on the anxiety level?
1:03:54.4 MH: Oh, I’m sure he’s on his last free nerve. I don’t care anymore, I can’t manage it. It’s gonna be alright.
1:04:04.3 TW: You stopped caring about my last nerve… You got past that, I don’t know…
1:04:06.9 MH: Years ago.
1:04:07.0 TW: Three, four, five years ago?
1:04:45.6 TW: Rock, flag and we’re back!
1:04:51.7 MH: Nice.
1:04:55.0 MK: Good one.
1:04:55.7 TW: Could have held that longer, I guess.
1:04:55.8 MK: Oh, I think that was plenty long enough.