#238: The Many Problems in Dealing with Data Problems

The data has problems. It ALWAYS has problems. Sometimes they’re longstanding and well-documented issues that the analyst deeply understands but that regularly trip up business partners. Sometimes they’re unexpected interruptions in the data flowing through a complex tech stack. Sometimes they’re a dashboard that needs to have its logic tweaked when the calendar rolls into a new year. The analyst often finds herself on point for any and all data problems—identifying an issue when conducting an analysis, receiving an alert about a broken data feed, or simply getting sent a screen capture by a business partner calling out that something looks off in a chart. It takes situational skill and well-tuned judgment to figure out what to communicate, when, and to whom when any of these happen. And if you don’t find some really useful perspectives from Julie, Michael, and Moe on this episode, then we might just have a problem with YOU! (Not really.)

Resources and Other Items Mentioned in the Show

Photo by Joshua Hoehne on Unsplash 

Episode Transcript

[music]

0:00:00.4 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

0:00:14.5 Michael Helbling: Hey everyone, welcome. It’s the Analytics Power Hour. This is episode 238. You know, one of the more memorable initial consultation calls I’ve ever had in my career involved one of the business stakeholders describing the problem with their analytics as, “Everything is broken and nothing works right.” And armed with this, ah, precise definition of the problem, we were able to make a lot of progress on their problems. But I mean, maybe this is an example that’s a bit extreme, but it is often an underrated aspect of analytics, which is handling it when business partners, shout out to Vela Apgar, tell us that something is not working right. So let’s bring in two of my co-hosts and get this conversation started. Hi Julie, how are you?

0:01:01.9 Julie Hoyer: Hello. Hello. I’m doing great.

0:01:06.3 MH: Awesome. It’s great to be on the show with you again. Moe, how you going?

0:01:12.8 Moe Kiss: I’m going great.

0:01:14.0 MH: Alright, well I’m so excited to talk to both of you about this because I think both of you bring a really great perspective and a lot of experience. But let’s dig into it. Have you ever received a screenshot of a broken report and a Slack message that was taken such that you can’t even tell what report is being talked about?

0:01:30.1 MK: Oh my God, I feel like such like trauma hearing that and like, and then my favorite is when like the screenshot doesn’t even have like the title of the graph or like…

0:01:42.3 MH: And no dates.

0:01:43.4 JH: You took the words out of my mouth.

0:01:45.2 MK: And you’re like, how would I possibly help in this situation?

0:01:50.4 JH: Yeah. Oh my gosh. Yeah. Where to start? But I love that you said that Moe ’cause that’s exactly where my mind went when Michael, you brought that up. So many bad screenshots with not enough information and it’s kind of funny ’cause it’s not the first time they’ve asked you about an issue. So you’re like, okay, here’s my classic rundown. What date? Where are you pulling the data from? What are you comparing it to? How are you calculating that metric that you’re screenshotting for me?

0:02:16.5 MK: Julie, I love that you just like rattled those things off so incredibly quickly. Like I don’t feel like… I feel like I’ve been in that situation so many times and I’ve never even thought of like a series of questions the way you just pose them. And I think this is the first time I’ve actually reflected, and maybe this is kind of shoddy on my part that I haven’t self-reflected on this before of like, there are things that your stakeholders should provide in this moment. And I think it’s actually pretty reasonable to have some like expectations on them to come to you with the right information. Do you think that’s fair?

0:02:54.7 JH: I do. I definitely do. And actually, it makes me think of my coworker Anna Silom, shout out. She and I work together on one of our biggest clients, and she is amazing because, again, I was coming at it with a list I could ramble off, but I never formalized it past that. But we get so many tickets coming in from this one client. A lot of them are bug fixes. And so she and the implementation team set up a perfect template to say, hey, if we have these bits of information, we can get going on it and start running without having to follow up and have a live chat and whatever. So she even took it further and said, these are the actual templated items that, if you tell us, we put them in our ticket. And then they’re also disseminated to our team so much faster, where people have those basic things of like, okay, I can get my hands around what the problem actually is to start digging into it.
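[An intake template like the one Julie describes can be sketched as a small checklist validator. This is a rough illustration only—the field names are assumptions, not her team’s actual template:]

```python
# A minimal sketch of a data-issue intake template, inspired by the
# ticket checklist described above. Field names are illustrative
# assumptions, not the team's actual template.

REQUIRED_FIELDS = [
    "report_name",       # which dashboard/report shows the issue
    "date_range",        # dates where the numbers look off
    "data_source",       # where the numbers are being pulled from
    "comparison",        # what the figure is being compared against
    "metric_definition", # how the metric in question is calculated
]

def missing_fields(ticket: dict) -> list:
    """Return the intake fields a bug ticket is still missing or blank."""
    return [f for f in REQUIRED_FIELDS if not ticket.get(f)]

# A reporter files a ticket but leaves some fields out or blank:
ticket = {
    "report_name": "Weekly Revenue Dashboard",
    "date_range": "2024-01-01 to 2024-01-07",
    "data_source": "",  # left blank by the reporter
}
print(missing_fields(ticket))
# -> ['data_source', 'comparison', 'metric_definition']
```

[The point of the check is the follow-up it prevents: the analyst can bounce the ticket back with exactly the missing fields instead of starting a live chat to reconstruct the basics.]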

0:03:43.5 MK: Okay. I’ve kind of got like two directions of like, I love that. I do wonder whether you could do it as effectively in-house versus like, I feel like when you’re a client, you’re kind of incentivized to have to reduce the comms and all that sort of stuff. But I do wonder, it’s like one of those things. I see the advantages and I definitely think it’s a practice my team should probably adopt. But by the same token, does it mean that you then don’t have that first conversation? And would you potentially lose out? Like, are there negatives to not having that five minute call or that Slack exchange? Because often that’s where you dig deeper and you find out other stuff.

0:04:27.7 JH: Yeah, that’s a good point because I think one of the things I’ve been reflecting on leading up to this episode was yeah, like when someone sends you a single screenshot that we’ve all had and the follow up of it, the one good thing about the conversation that I’ve experienced before, to your point Moe, is sometimes it’s broader than you think. They’re like, oh yeah, I saw this number was off and I saw this and I saw this. Or you start to get a feeling of like, what were you even looking at when this came up? And then sometimes it does uncover that it is a broader thing than that single example that they sent you. And so I think, to your point, for being able to actually define the problem you’re trying to go figure out, the conversation can be really, really valuable.

0:05:16.7 MH: Yeah. I think one thing before we dive too far into solutions too is, how awkward it is to have these problems and how unfair it can sometimes be to the analytics team that the problem even exists in a way. ‘Cause even sometimes when a report… Let’s say someone sends you a report and it’s like, hey, this report isn’t working right, there could be any number of reasons, including some reasons that are the user’s fault. I’ve literally had people report problems to me because they went and applied a segment to something and then forgot that they had applied it…

0:05:56.7 MK: Stop it.

0:05:56.8 MH: And then looked at something else and then were like, this number is way off. And it’s like, well you do have only like a third of the traffic being represented in that particular report right now. But, so like those kinds of things happen too. But one of the other things that I feel like happens is it is sort of credibility damaging or can be.

0:06:15.1 MK: 100%

0:06:16.3 MH: And that’s really hard reputationally to overcome. And I’ve seen organizations who… I worked with a company who literally went through multiple iterations of trying to build out a series of dashboards and reports and because they didn’t think they were “getting it right,” they would just tear it all down and start over again and again.

0:06:37.9 MK: Oh no.

0:06:38.3 JH: Oh my gosh.

0:06:43.3 MH: And the team was like literally being like beaten to death basically, metaphorically speaking just to be clear. But it was really overwhelming their whole organization by the time we got involved because no one was really taking sort of an iterative approach to the problem. There were deeply entrenched problems that needed to be worked out, for sure. But like the way they were going about solving it was just like everybody just solving it harder. Like, and it was really like breaking up the whole org a little bit around it. And it was just… So it’s sort of like the simple problem or the funny problem of oh, somebody sends a bad report, but like when these things start to layer onto each other, like it can really metastasize into like just a lot of different directions or go out of control, I guess is the way I’d say it.

0:07:37.6 MK: And I think the thing that you hit the nail on the head with is like the analytics team are often the like representation or the… I can’t think of the word, but they’re the people representing data at the company. And this has certainly happened and it’s come up a lot in teams that I’ve worked with where there are upstream things that are affecting the quality of the data, like the data warehouse team have made changes or the people responsible for implementation of the tracking and things like that. And it’s normally the analyst who has built the report that is like, I guess, dealing with the brunt of the stakeholder frustration and, like, the things that aren’t working. And sometimes, it’s very much out of their control. And I think that honestly is one of the most difficult situations to be in as an analyst because like you don’t wanna point the finger at your colleagues and be like, well they broke something and I’m trying to like manage it, but like that is the reality. And you as the analyst deal with that reputational damage or like that loss of trust and confidence, which there isn’t a good answer for. It’s a hard situation.

0:08:46.0 MH: Yeah.

0:08:46.8 JH: It’s like don’t shoot the messenger and the analyst always has to be the messenger of like, oh, bad news.

0:08:53.7 MH: Yeah. No, it’s difficult because it’s sort of like how do you separate the challenge of the data and how it’s looked at from the people who are looking at it and preparing it and basically understand the nuanced differences which people don’t understand? You know, I don’t mean to make this like a woe is us kind of, our job is so hard thing but more just culturally it’s really important to understand that sort of, there’s a group of people making a best effort to make sure this stuff is coming out correctly and yet there are things outside of their control and we have to kind of balance or understand those two things, which I think is something that some organizations do extremely well. And actually, I’ve never really sat down and thought hard about what that looks like, but I’ve seen some orgs just really like take that in stride and then some orgs, it’s just been like firing squads. Like why is this report not right? How could you show up to this meeting with an incorrect report and this not be okay? And as a consultant a lot of times if you show up with an incorrect report, I mean you’re done, you’re toast. So you really need to make sure everything is correct if you’re on the consulting side.

0:10:04.2 MK: Do you think there are cultural differences there in those two scenarios you just described?

0:10:09.3 MH: Yeah, that’s sort of what I’ve been thinking about is, I wanna say maybe in a way like, what I’ve observed is that maybe the way to talk about it is the one where that happens in a positive way. It’s sort of a culture of believing that everyone is giving their best effort and sort of like, in other words, sort of like we walk in with the idea that everybody is actually trying really hard and we trust in everyone’s effort or everyone is pulling in the same direction. So it’s a more united kind of team focused, how are we gonna solve this problem as opposed to the finger pointing or blame shifting kind of culture where it’s sort of like, you caused this, you better get it right or else, or I’m gonna talk to your boss or I’m gonna do this, or whatever the… There’s gonna be repercussions type of thinking, which sometimes that’s warranted. I’m not saying people don’t do a terrible job sometimes, but it is interesting.

0:11:12.4 JH: Yeah. And actually on that same culture piece, I love that you brought that up ’cause it’s even making me think. Again, I’m coming from the consulting point of view and I’ve been the analyst to bring the message and I’m working with our implementation team who are also consultants, right? So they’re hired and so when you find a bug, you have clients where either they’re really mad when you say, Hey, we found a bug. And you have other clients that are like, oh I’m so glad you were a diligent analyst and found a bug. And of course, we communicate that to them and say we’ve put in a ticket with our team, it’s gonna be a fast fix, it’s high priority, we’re taking care of it. Some clients, that’s like they’re really upset that there was a bug at all because you’re in charge of the implementation. Other clients are thankful for like the partnership of like looking out for them. So Michael, when you said that it really just sparked that and I’ve never necessarily verbalized it that way, but I’ve definitely experienced both of those scenarios.

0:12:06.9 MH: Yeah. ‘Cause I just think about different clients I’ve worked with. I was like, well sometimes it’s super pleasant and we just tackle it together and it feels like we’re a team. And then other times, I feel under the gun and stressed out because I’m not sure how this is gonna go. And it’s like, oh well there’s a difference.

[music]

0:12:23.8 MH: It’s time to step away from the show for a quick word about Piwik Pro. Tim, tell us about it.

0:12:30.0 Tim: Well, Piwik Pro has really exploded in popularity and keeps adding new functionality.

0:12:36.5 MH: They sure have. They’ve got an easy to use interface, a full set of features with capabilities like custom reports, enhanced e-commerce tracking and a customer data platform.

0:12:48.2 Tim: We love running Piwik Pro’s free plan on the podcast website, but they also have a paid plan that adds scale and some additional features.

0:12:53.5 MH: Yeah, head over to Piwik.Pro and check them out for yourself. You can get started with their free plan. That’s Piwik.Pro. And now let’s get back to the show.

0:13:05.7 MK: Do you think, let’s say the goal is like better handling of data problems. Do you think there is something that analysts can do to have, I guess, that better relationship where it really is about tackling it together versus it being, like, quite combative?

0:13:25.7 MH: I mean I think probably going back to sort of things we’ve talked about in the show before, I think as an analyst you have to be making sure you’ve done everything you can do to bring forward the right solution. So in other words, if you haven’t kind of double checked your work and those kinds of things as an analyst, I think you’re opening yourself up a little bit. And I think there’s other ways of framing problems as well, that kind of help people understand or help people not necessarily put you in the cross hairs. It’s a tricky communication thing. ‘Cause like on the one hand you are like, okay, yeah this needs to get fixed. It also comes from where we think the problem is generating from, right? So like, okay, let’s say a data feed goes down and we are missing three days of data from this particular API.

0:14:16.2 MH: Well, as an analyst there’s really nothing you could do about that besides get with the data engineering team or the analytics engineering team and figure out what needs to happen to get that data API restored so that data starts flowing in again. And so it’s not really like a you kind of a thing to solve. There’s other problems where it’s like, oh, okay, well I don’t know. I’m trying to think of like what else might show up in a reporting error. But where maybe it’s more in your control and you’re like, oh, okay. Oh here we go. Power BI dashboard needed 2024 added to it for one of our clients so they could start viewing 2024 data. Okay, that took 15 minutes. We probably should have thought of it at the beginning of the year, we didn’t. We just added it. So problem solved. But it was easy. Nobody was upset. It was easy to solve. So that kind of stuff is more in your control, maybe.
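[The “dashboard needed 2024 added to it” failure Michael describes is a classic hard-coded-year filter. One common guard, sketched here as a rough illustration rather than anything from the actual client’s report, is deriving the year window from today’s date so the filter rolls over on its own each January:]

```python
# A sketch of the "new year rollover" fix: instead of hard-coding a
# list of years in a report filter (which silently breaks every
# January), derive the window from the current date.
from datetime import date

def report_years(history_years=2, today=None):
    """Years a dashboard filter should cover, ending at the current year."""
    today = today or date.today()
    return list(range(today.year - history_years, today.year + 1))

# Just after New Year's Day 2024, the filter already includes 2024:
print(report_years(today=date(2024, 1, 2)))
# -> [2022, 2023, 2024]
```

[The same idea applies in whatever the reporting tool’s own language is (a relative date filter, a `YEAR(CURRENT_DATE)` expression in SQL, etc.); the point is that the window is computed, not enumerated.]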

0:15:10.4 JH: I do think a key part of that, even what you said Michael, is the timeliness of your response: once you know of it, how quickly do you let your stakeholders know?

0:15:22.1 MH: Oh yeah, it’s good.

0:15:23.2 JH: Like in my mind, again, I’m kinda coming from the consulting side, having clients, but it’s like if I find it, I’m not gonna let it sit for a few days. I’m going to immediately, you know, talk to the people on my team that know about the implementation. I’m going to raise the flag internally immediately. I’m going to ask some questions, I’m going to kind of get together like a, hey, okay, put this in the backlog ticket or we’re gonna take care of this, get an explanation together. And to me, it’s almost like the clock is running for how quickly can I let the client know that we found this. Not only that I’ve found it, but that we’ve already kicked off the solution part, because I think that helps build the trust. And sometimes I won’t even say like maybe I don’t even know why the problem happened and I’m not gonna point at another team and say, we forgot to put 2024 in, but I am gonna be transparent and say, 2024 hasn’t been put in yet in the data.

0:16:14.9 JH: So we have three days of data missing, but it will be fixed by end of day, right after I’ve talked to my team so that we can commit to that. So I feel like it is that, like, not lie by omission, but you maybe leave out some of the wording that would put blame on someone, but then you also are just very transparent about the timeliness of you found the issue and it’s gonna be taken care of. So hopefully, you start to build the partnership feel with your stakeholders so the next time something comes up… I think that can be really beneficial.

0:16:43.0 MK: I think the timeliness matters. I don’t know. As I was listening to you, I was like, I actually think of it maybe from a different perspective and maybe it’s the wrong one, which is that the timeliness probably matters more to me from the, like, catching-it-first side. Like you want to communicate it quickly so that you are the one that finds it and you’re being proactive versus your client or your stakeholder finding it and then coming to you. But I don’t know, I agree that the timeliness in talking about, like, what’s being done from a solution perspective matters. I do wonder sometimes though about the urgency. And before you started talking about this, the one thing that was kind of on my mind is that some stakeholders think that a data problem they’ve got is like the most urgent thing in the world. And it’s like, you’re missing two days of data. You have a check in in five days and you make decisions weekly. This is not an urgent problem. And so I do wonder, I’m not saying don’t be timely, but you can create urgency where maybe there shouldn’t be any. Maybe I’m just also being really antagonistic and saying the total opposite of whatever you say today.

0:17:58.0 JH: No, no, I think that’s a great point. It’s hard. It’s very nuanced because depending on where you sit, the people you’re working with, the type of issue you find, I really do think those are just a lot of things you have to take into account. And I actually love that you said that because another thing I think is crucial is the idea of quantifying the impact of it. Because, like you said, not everything is actually urgent, and if everything’s urgent, nothing’s urgent. So if it’s a small data quality issue, then it’s like, okay, let it ride. We know it’s there, it will get slated to get fixed. But if it’s actually like a huge impact, can you quickly quantify that and say, hey, half of your customers are having this issue? Then like, yeah, it needs to be done asap. So I’m glad you brought that up Moe ’cause it’s such a good point.
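[Julie’s “quantify the impact before declaring urgency” idea can be sketched as a toy triage rule. The thresholds here are illustrative assumptions, not anything from the episode—every team would tune its own:]

```python
# A toy sketch of impact-based triage: measure how many users an issue
# actually affects before deciding how loudly to communicate it.
# Thresholds are illustrative assumptions only.
def triage(affected_users, total_users):
    """Map an issue's affected share of users to a rough urgency tier."""
    share = affected_users / total_users
    if share >= 0.25:
        return "urgent: fix ASAP and notify stakeholders now"
    if share >= 0.05:
        return "schedule: slate a fix, mention it in the next update"
    return "backlog: note it internally and let it ride"

# "Half of your customers are having this issue" clearly clears the bar:
print(triage(500, 1000))
# -> urgent: fix ASAP and notify stakeholders now
```

[Even a crude number like this gives the conversation an anchor: two missing days ahead of a weekly decision cycle lands in a very different tier than a bug touching half the customer base.]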

0:18:44.0 MH: Yeah, I like it. So a couple of things I think I heard you say in some of this, Julie, was sort of taking on a proactive approach of, okay, identifying, walking through resolution steps. And I think maybe a lot of times what I’ve observed is that one of the ways to defang this a lot of times too is letting them know the process you’re going through in detail to fix it, how we’re gonna approach the fix and the timeline of that fix. So that way they’ve got some input into the process. So it’s like, oh, it looks like we’ve got a data quality problem here. We’re gonna assess it this way. We’re gonna do this, we’re gonna do this. We’ll probably have an update or a fix in place by X time. Here’s what we’re gonna do about it. The other thing, and this is just something that I’ve randomly started doing in the last few years that has sort of been very interesting to me ’cause I sort of started it as a little bit of a psychological experiment, is I stopped apologizing. I never say “sorry” or “I apologize” in emails.

0:19:50.5 JH: I don’t either.

0:19:50.6 MH: Almost ever. And not ’cause I’m not sorry or I don’t care. But I’m sort of like fascinated by how you get treated if you do versus if you don’t. And people don’t necessarily need you to say, I’m sorry, as much as they need to have clarity and know what’s gonna happen. And so I feel like that’s a more valuable thing for me to do or say a lot of times anyways, it’s just sort of one of those weird things. It’s like for a long time in my career I felt like, oh, if I’m being like, oh, I’m so sorry that happened, we’ll get that fixed, like that’s better customer service. I’m not convinced anymore that’s true. And so it’s just fascinating like again, that’s more of like a communication thing, but it sort of is in line with that a little bit of like okay, listen, I’m not gonna tell you I’m sorry ’cause it’s not really my fault. Or maybe it is my fault. But I’m still not gonna apologize. I’m just gonna explain why…

0:20:39.3 MK: I don’t think it’s the thing that you should apologize for because the reality is like mistakes happen and human error happens.

0:20:48.8 MH: Yeah, exactly.

0:20:49.7 MK: Look, if you are like not doing your job, that is very, very different. And you probably should apologize, but like there’s a whole… But I do think…

0:20:58.8 MH: There are situations where I still do, which is like if I made a mistake and I’m owning up to that mistake, then I’ll be like, Hey, that was my bad. I’m sorry about that. Here’s how I’m fixing it.

0:21:08.9 MK: I think I have on occasion said I regret that happened, which is genuinely true. But I would, I generally steer away from apologizing as well.

0:21:19.5 JH: Yeah. Unless you’re gonna own it. Like Michael you said.

0:21:21.7 MH: Yeah.

0:21:21.7 MK: I have a question, which is kind of controversial. Do you think there’s ever a time that you shouldn’t tell your stakeholders or clients when there’s a problem?

0:21:32.2 JH: So it’s funny you say that. I was kind of mulling that over, not like recently in a real scenario, but as we’ve been talking, because when you raise an issue that they haven’t seen, you know what I mean? Like you’re bringing it to their attention, you’re adding to their worry. You’re bringing it to light. So when is there a scenario, yeah, where that’s not worth, like, raising the alarm? Like crying wolf, like if you do it all the time type thing. Are they just gonna be more stressed out?

0:22:07.2 MK: Totally.

0:22:07.3 JH: So I do wonder about that. Like I guess if it’s something they wouldn’t feel the pain of down the line in their reports, if it’s not something they’re gonna like catch you. I’m doing air quotes, that’s not helpful on a podcast. Catch you on later. Maybe it’s not worth it. Or you let them know once it’s already fixed like Hey, we noticed this, we took care of it, it’s all good. Like maybe it’s in a passing status update or something. But I don’t know. How do you guys feel?

0:22:36.1 MH: Yeah, I’m trying to think of examples of times where I’ve not informed people ’cause I know I haven’t. I think it’s all about, kind of like you mentioned earlier Julie, sort of measuring the impact. So like what is the impact of knowing versus not knowing, also like if it’s just gonna get fixed. ‘Cause on the one hand, like you said, it could be sort of like this crying wolf thing. It could also be kind of perceived as like a, hey look at me, I’m doing stuff kind of an attitude, which nobody really likes either when it’s not really germane. So in other words, if you see a problem that no one else has seen and you fix it and it fixes the problem, no one needs to know about it. Like maybe you need to make sure someone knows on your direct team just so that there’s something to follow up on or something like that. But like I don’t think you gotta send out a system wide email to be like, hey everybody, I helped us avoid a massive loss of data today. Ah, I don’t think so. And then a lot of organizations are dealing with a ridiculous amount of data. Like you think about all the different dashboards that go on. Probably at Canva, Moe, there’s hundreds probably, and in organizations we work with.

0:23:51.4 MK: I would say thousands.

0:23:53.2 MH: Yeah, maybe thousands. And that’s probably not atypical in a large company with BI reporting tools and all those kinds of things. Someone might not look at this report for another month. It might’ve been a report that got built because someone wanted to “do an analysis.” Like we’re always having to kind of balance those things. So you got zombie reports everywhere. So sometimes you wanna maybe leave the problem to see if anybody notices, because maybe you think no one looks at this report anyway. So it’s like.

0:24:22.3 MK: Yeah.

0:24:22.4 MH: Is this even data we’re using?

0:24:24.0 MK: I think in my case it actually comes down to, I only get so much, like, “face time” or, like, opportunities to bring up data stuff with senior stakeholders. Like that time is not indefinite. There are certain windows I get and it’s like, is this mistake or this problem big enough that I should sacrifice some of that time? Or is it more important that we discuss alignment on a particular metric or, like, a target that we’re missing? I feel like I have to juggle that a lot. Especially, I think… I don’t know, and this is not to say Canva’s like the most amazing place in the world, but a lot of our data problems are really complex. So it’s like, I can’t just be like, oh hey, there’s this thing that’s broken, don’t worry, we’re fixing it. You have to provide some detail and then you’re adding to their mental load. ‘Cause they’re trying to understand how this thing works and it’s like, do I really wanna spend energy on that, or do I wanna spend this 25 minute meeting I have focusing on something that is probably more important for this particular person at their level? So I think.

0:25:36.6 MH: That’s a good point.

0:25:38.8 MK: I would say, like, when it comes to data problems, we are often quite selective with who we share the information with, because I don’t think it needs to be every single stakeholder we work with. It’s like, okay, maybe this person needs to know, ’cause they look at that report daily, but their manager doesn’t need to know, and their manager… Like, yeah, I do find that we need to dance that dance a lot.

0:26:00.8 MH: Yeah. As a consultant, I think there’s a temptation to wanna take credit for everything you do. So you wanna write down in a status report that you did something. So you do have sort of a tendency to maybe mention stuff, but, to your point, maybe don’t dive deep, because yeah, sometimes they’re really complex problems and they’re really nuanced to understand. And I’ve seen meetings go off the rails where someone has a bullet point in a status doc and it’s like, yeah, we did this. And they’re like, whoa, whoa, whoa. So what was that? And it’s like, oh great, we have 15 minutes of trying to figure out and explain this to everybody now. So like that’s a good point. It’s a really good point, Moe.

0:26:44.6 JH: Actually, along those lines, I have a question for you guys. What do you think is the most effective form to communicate, like, a data issue and its, like, solution? A PowerPoint, an email, a face-to-face explanation where you’re showing the dashboard? Like, what works?

0:27:02.8 MK: It depends on the stakeholder and it depends on your company. I don’t know, my company is really Slack heavy and so we do like data announcements in like a bunch of public channels. But I don’t read all of Slack. Like if I read every channel that I’m in, I literally would not have time to do my job. So I actually think it’s a lesson I learned from my old CEO. It was actually when we were trying to get the business to adopt a new metric and he was like, Moe, you need to tell them many, many times in many, many different formats. And we would share the same information in sit-downs, via email, in Slack. And like, I don’t think for a data problem you need to broadcast it that way, essentially, or like comprehensively. But there definitely is a thing where you can’t assume just because you’ve told one person one way one time that they’re gonna have absorbed that information.

0:27:54.1 MH: Yeah. As a typical consultant, I feel like it does depend because, well and you work with different companies so different companies have different cultures around this stuff. So one of ’em is well we keep track of all that in Jira, right? So just go update it there. Or Asana like where their ticket lives. And so that’s gonna be the best place to have conversations and updates around it. A lot of times it’s email, sometimes Slack. I don’t like Slack for that only because it’s easy to miss stuff in Slack.

0:28:28.3 MK: It is.

0:28:29.7 MH: Like especially when you’re busy.

0:28:31.8 MK: You have to find it again.

0:28:31.9 MH: Yeah. And so stuff that you need to be able to like access again quickly, it’s just not a great system for that. And then sometimes when explaining stuff to people, I find little Loom videos to be really helpful to just walk through and show people like, okay here’s the steps that went into this and blah blah blah. And then record yourself kind of replicating and explaining and sometimes that really is helpful. But again it goes back to the culture, right? So what that team is used to, what their preferences are. But yeah, those are some examples. I don’t know Julie, what have you seen?

0:29:05.8 JH: Well it’s funny. I feel like it goes along with that old saying, “sorry I didn’t have time to write you a shorter message.”

0:29:15.8 MH: Nice.

0:29:17.4 JH: Type thing because.

0:29:17.6 MK: So true.

0:29:17.8 JH: Again, under the scenario someone sends me a screenshot, like minimal evidence that something weird’s going on, and they’re like, I need you to dig into this. I kind of go about it, it’s funny, with probably the same steps I would say are good for, like, data presentations or, like, storytelling. Like I try to start with, let’s just define the problem. So like, can I even figure out where this starts and ends and what’s being affected? And then if I can define that and show what I can see, like I like screenshots plus very clear text, and I try to say, so here’s the issue we’re seeing. Then I try to narrow down to the possible cause and give them the background, and then it’s like, okay, and what are we gonna do about this?

0:29:57.9 JH: Or what’s the impact? I try to quantify what the actual impact is, like we talked about. But to do all those bits and pieces in my head, I have to format it almost like in an email, ’cause it’s gonna be long, it’s gonna be kind of wordy, I need to be able to throw screenshots in. Or I’ll do a deck depending on the complexity, ’cause once you have that running email, like that gets too long and your eyes glaze over and that’s annoying. So sometimes I’ll do a deck if I have to be like, this is complex and there’s like multiple pieces going on. So those are probably the two most common ones. Not that I have a true final format, but.

0:30:29.8 MK: The deck is to be absorbed by reading it. Like not that you would walk them through it. Is that…

0:30:36.2 JH: Yeah.

0:30:36.6 MK: Is that how you do it?

0:30:37.9 JH: Well usually I would walk them through it live but it’s definitely like a leave behind as well. But you’re right. Normally paired with a live meeting.

0:30:47.2 MH: It’s interesting ’cause I think we’re sort of walking into the resolution phase or whatever of these data problems, and what’s interesting to me over my career is how much helping resolve issues, or taking problems and really resolving them well, turns into sort of opportunity in a way that builds trust between stakeholders and you sometimes. And that’s kind of another interesting aspect we can kind of dive into. But that email that you were talking about, Julie, reminded me like, oh yeah, I’ve written a ton of those emails in my career where you like really methodically step through. Like, okay, so here’s what we did. And that’s the thing: when something bad goes wrong, mostly people want to know what happened, what are we doing about it, and do we need to do something different going forward to make sure it doesn’t happen again? Like that’s really it. Anyway, your email example really brought back to mind the few times where I’ve written those emails.

0:31:46.0 JH: I mean I spent hours.

0:31:46.9 MK: Do you think…

0:31:48.8 MH: Yeah.

0:31:49.3 JH: Doing it.

0:31:49.4 MH: It’s a long email and you really.

0:31:49.5 JH: Just the email part.

0:31:52.2 MH: The crafting of that email is actually something very important. I agree.

0:31:57.4 JH: Yeah. Yeah.

0:32:00.6 MK: Do you think it’s a skill, like you just get better with practice? Like I have seen some of those emails or messages and stuff where like there ends up being a lot of technical detail and like as those three points that you just mentioned Michael, that people really care about like that I don’t know, I would probably at this stage of my career err away from like really any detail unless it was super important to be understood. Like for example with a compliance risk like that is generally a time where like you might have to provide some technical information but do you see when you’re working with younger people, like this is a skill you really need to teach them to understand that balance or do you feel like most people intuitively get it from the start?

0:32:41.9 MH: I think it’s a big time skill. And the other challenge you have is so often you’ve got so many different people on that email that need to read it at so many different levels. So you’ve got the CMO who really only cares to know that this got solved, and then you have a bunch of directors, and then maybe some managers, and they all wanna know all the details of how it got… So this email is trying to solve so many problems for so many people that crafting it well is actually kind of something I think people go to grad school for, probably. I don’t know how people learn how to do this.

0:33:19.9 JH: And I was gonna say too, like as a young analyst, I really do think that it is a crucial skill to just be able to clearly communicate complex ideas, and it’s not going to come naturally necessarily. Like to verbally do it offhand, you have to be really comfortable with it. I mean even stepping into the space, just being comfortable with the technology that’s going on. Me going through crafting those emails as a young analyst helped me learn so much about like, what do I call these things? What is this? How do these things work together? Why is this an error? Why is this even an issue? Why do they care? What would be the consequences of this happening? I feel like you have to get all those facets. And so it’s kind of this weird like slog of studying it and then having to put it in your own words, and being able to put it in your own words, actually like fingers to keyboard, is a really great way to like ingrain it in your brain, and so you get faster at it. But I really vividly remember the first couple of times I had to take in one of those weird requests of like, this looks wrong, why is it wrong, can you fix it? And it was an uncomfortable learning situation, but the more you do it… I remember I felt like, wow, I finally understand how the bits and pieces of this all go together. So definite skill. Definitely worth doing, I would say.

0:34:38.5 MH: I think another component of that is you’re also having to make an assessment about how well your stakeholders understand the underlying data and how that data works. Like I’ve had some of these conversations where you’re like okay, the email marketing manager has a question about something and they have no idea about how this aspect of the website works and how orders are processed or any of those things, but they’ve got a concern. And so like you find yourself explaining in an asymmetrical fashion and it’s sort of like you gotta do a bunch of upfront education on a bunch of things or it makes you have to kind of think through how you explain a little differently. So it’s very… Honestly this might be one of the more nuanced things, especially if the problem is fairly complex. If it’s easy then you just sort of fix it and move on. But if it’s sort of like a complicated problem then, yeah that is interesting ’cause yeah it does take practice.

0:35:36.9 JH: One other thing that just made me think of: if you’re a young analyst out there and you’re stuck in this role, ’cause usually you have a lot of time and they’re like, go dig and do your thing and tell us what you find. I actually also think it is a great like foot in the door to communicating directly with your stakeholder, even if it’s through email. I remember also being that analyst, and you don’t necessarily get the face time, they don’t necessarily know your name. Like live, right? You’re sitting in on meetings but you’re not a key person speaking to things. But if you are suddenly the person helping fix their problems and learning to communicate really well to them about their issues, like they like you really fast and you make a name for yourself really fast. Like it’s a hard spot to be in. But like now looking back on it, I’m really glad I was put in those positions on different clients.

0:36:28.1 MK: I love that Julie.

0:36:34.1 MH: Yeah. That happens where you suddenly become that executive’s favorite analyst and then suddenly you start fielding these random questions for them all the time about, okay so what about this? It’s like they just had these pent up questions that they couldn’t ask anybody else but now they know they have someone they can trust. That’s all.

0:36:47.6 JH: Which slippery slope. I mean don’t become…

0:36:50.7 MH: Yeah exactly.

0:36:51.5 JH: That’s the only thing you do in your whole job. And like to be honest, like you know you’ve made it when you’re asked to duplicate yourself, and so you have to train someone else to do it so you can like let it go. Like make sure you don’t forget that part of the process. But like it’s still a really good starting point.

0:37:06.2 MH: Yeah. Oh man. Alright, well did we cover it? Did we get it all? Anything else we missed?

0:37:15.0 MK: I don’t feel like you can ever really cover it, right?

0:37:19.7 MH: Well, yeah, I mean it’s gonna be tricky every time ’cause it’s sort of like, okay, how do we define the problem? Can you please send me something that helps me know what you’re even looking at? How do we figure out where the problem exists? How do we make sure we’re not getting run over by the train? And then how do we schedule and align on a fix and communicate that?

0:37:40.6 MK: So one thing we haven’t talked about, which is like looking forward: data problems moving into the future are getting significantly more complicated, and generally speaking, like we are starting to use models where explainability is reducing. And so like all the examples we’ve talked about are like a broken report. But the truth is, I mean, we actually had an issue with a model that broke a little while back and it had a very significant impact. But the explainability was far more difficult. And so it’s like, I actually feel like this whole space is going to get harder, not easier.

0:38:26.7 JH: Yeah, that’s a really good point.

0:38:32.4 MH: Yeah. Well if Tim was on here he could probably say something pretty useful but. No, I’m just kidding.

0:38:37.9 JH: No, but I think it goes back to some of our big things we were talking about like people are still gonna be needed. You gotta watch over your AI that’s coming. Like you have to document things to solve for those future pain points. I feel like I mean there’s no easy answer here, but it’s just making me think of some of those big things we’ve talked about. Like you have to be very thoughtful about the fact that an issue may arise and maybe the only way to help, to your point Moe, because things are just gonna get crazier, is like the forethought of how if something happens, how can I make this easier on myself to pinpoint what is going on?

0:39:14.4 MH: Yeah, I mean it is gonna be tricky to trace explainability problems through the minefield that is LLMs. So that’ll be fun. Although if you’re writing an email to explain a big problem, just drop it in ChatGPT and say, rewrite this for an executive audience, and you can get a summary real quick like that. So there are tools that’ll help you with your explanations at least. Anyway. All right, well why don’t we wrap up. This has been really fun to talk about, only because I think this show idea happened because of a bad report, a problem report that we all sort of were laughing about together, and I think we were like, we need to do a show about this. So thank you Moe and Julie for sitting in and helping solve some of these problems, and I think it’s relevant ’cause what organization doesn’t struggle with this? Alright, well let’s go around and do some last calls. Something that might be of interest to our audience. I don’t know who wants to start. Julie, do you have a last call you wanna share?

0:40:23.0 JH: Sure. This one actually, I was perusing Netflix and came across a maybe a newer documentary, it’s called Bitconned. And it was pretty good. It was about this like lifelong scam artist and how he stood up an actual bitcoin company, I mean with toothpicks and how much money they made off of it. And it’s just, it was really interesting and to be honest it just had me thinking about all of the like startups and the schemes that are out there, the sketchiness of even maybe some AI companies coming out. I don’t know. But it was a great documentary and it definitely got me thinking of all the possible future scenarios like this that could happen.

0:41:07.3 MH: Alright Moe what about you? What’s your last call?

0:41:15.5 MK: Okay, one is a repeat from Tim. Tim has mentioned it before, which is how I got onto it in the first place. I just can’t stop listening to Katy Milkman on Choiceology. Like I’m absolutely obsessed. Like I am completely binge listening. Anything she’s ever done, I’m now like stalking her on her university page, seeing what research she’s doing. Like I am next level. But since that’s a bit of a repeat, I will also share something junky, which is, I look at Instagram, like I am of that era, I will be full disclosure. But it’s a post from @librarymindset: life hacks I wish I knew at 20. And I don’t know, these things always just get me reflecting a little bit. So I’m not gonna read out all 30, but a couple, like, the best productivity app on your phone is called Airplane Mode and you should use it.

0:42:07.0 MK: And I’m like, oh yeah, that’s a really good point. I know now phones are more sophisticated, etcetera. But like there were times for that. The one that made me think is like, make “no” your default, whether it’s new projects or social gatherings; saying yes to non-priorities ruins your priorities. And the funny thing is I don’t agree with this advice, but it made me consider it, because I used to have a rule of saying yes to everything, and I did it especially when I moved, when I started a new job. Like when you’re like… And so I think that advice is really important for the life stage you’re in. If you’re in a stage where you’re really actively like trying to grow connections or like practice a new skill or learn something, then you should have that yes mindset. But when you get to other stages where you’re actually trying to be like a bit more thoughtful with your time and like really focus on your priorities, then you actually need to like rethink that and perhaps move to a no model.

0:43:02.4 MK: But number three is one that I really like, which is like, normalize “I don’t know” as like a successful answer. Or like the way I often think about it is like, can you tell me more about that, or, I’d love to hear more about that idea, or whatever. Like you don’t have to know about everything. And I just… I don’t know, some of this stuff, I know it’s corny and it’s on Instagram, but it does often make me like rethink my own priorities and like why I’m doing something or whether I should be. And anyway, very long-winded ramble there. But what about you Michael?

0:43:32.1 MH: Well this is a little bit tangential to analytics, but still somewhat useful. I ran across a really in-depth article by a guy by the name of Patrick Campbell, who was the CEO of a company he sold very successfully. But in that he wrote this big thing about how he does competitive research and strategy, for I think primarily product. But as analytics people, a lot of times we’re in a position where we need to kind of help product teams think about analysis and about the competitive landscape. So I just thought it was a really great primer on sort of thinking through all the different things to think about, in terms of how to think about the landscape, how to think about pricing, how to think about monitoring, all these different things. So anyways, good read, very useful. We’ll share a link to that in the show notes.

0:44:18.7 MH: Alright, well this has been excellent and no show would be complete without a huge thank you to Josh Crowhurst, our producer. Thank you Josh for everything you do. And I think as we’ve gone through this, it’s reminded me of so many awesome and fun experiences throughout my career, which reminds me of how great it is to work with so many talented people. So thank you Moe and Julie for this conversation. It’s been a lot of fun and I know I speak for both of you when I tell all those great people out there, no matter what problem you’re dealing with, keep analyzing.

0:44:58.8 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

0:45:21.9 Charles Barkley: So smart guys wanted to fit in, so they made up a term called analytics. Analytics don’t work.

0:45:31.8 Kamala Harris: I love Venn diagrams. It’s just something about those three circles and the analysis about where there is the intersection, right.

0:45:41.8 TW: Rock flag and could you at least tell me the date range.

[music]
