#230: First, We Must Discover. Then, We Can Explore. With Viyaleta Apgar

Seemingly straightforward data sets are seldom as simple as they initially appear. And many an analysis has been tripped up by erroneous assumptions about either the data itself or the business context in which that data exists. On this episode, Michael, Val, and Tim sat down with Viyaleta Apgar, Senior Manager of Analytics Solutions at Indeed.com, to discuss some antidotes to this very problem! Her structured approach to data discovery asks the analyst to outline what they know and don’t know, as well as how any biases or assumptions might impact their results, before they dive into Exploratory Data Analysis (EDA). To Viyaleta, this isn’t just theory! She also shared stories of how she’s put this into practice with her business partners (NOT her stakeholders!) at Indeed.com.


Episode Transcript

[music]

0:00:05.8 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

0:00:14.2 Michael Helbling: Hi everybody, welcome to the Analytics Power Hour. This is episode 230. Analysis in all its forms: we love it, or most of the time we pretend we do. And all too often we see other people (and that's never us) jumping feet first into analyzing a data set before ever considering what's in there. This is a topic about structured data analysis: what is the "ready, set" part of "ready, set, go" when doing analysis? So let's talk about it a little bit. But like analysis, there are pre-reqs for a podcast, which include introducing my co-hosts: Valerie Kroll, optimization director at Search Discovery. Great to have you on this episode.

0:01:00.5 Val Kroll: Hey party people. Excited for this one.

[chuckle]

0:01:03.4 MH: Yeah, me too. And Tim Wilson, the quintessential analyst. And I know this is a topic you have a lot of interest in and opinions on.

0:01:15.4 Tim Wilson: Yeah, and opinions about that intro, but save that for off the mic.

0:01:20.0 MH: Everyone I’ve ever explained it to agrees with me that that is true. And I’m Michael Helbling, but we wanted to add some depth to this conversation, so we needed a guest. Viyaleta Apgar is the senior manager of analytics solutions at Indeed.com. She’s a data scientist and has held roles in data science and BI throughout her career, and today, she is our guest. Welcome to the show, Viyaleta.

0:01:44.4 Viyaleta Apgar: Thank you so much, Michael. It’s an honor to be here and to have this conversation with you guys.

[laughter]

0:01:49.9 MH: Well, let’s see how you feel after talking for an hour. So I think probably to kick things off, I’d love to just sort of talk a little bit about how you started down this path and some of your first findings and sort of what got you thinking about this in a broader context, and then we can kinda jump into more details about it.

0:02:12.8 VA: Yeah, for sure. So I started working with data probably right out of college. I majored in math, so it was a very natural path for me. I spent some years working in non-profits doing a lot of database administration and inventory control, and then I went into business intelligence, and I was like, I have no idea what this is, but it sounds interesting, and it seems very similar to what I'm doing right now. When I got to BI, the type of work was pretty much: okay, get a request from stakeholders, do the analysis, fulfill the request, provide them with insights and results, and then they'll take the information and make the decisions. So very typical BI. And I remember when I joined Indeed for the first time, I started doing this, but Indeed has complex products, different product offerings, so many different divisions of the business, and so many types of stakeholders and clients that they serve, and the information tends to be really, really complex. And I remember being like, okay, I get a request, I have a very little amount of time to complete it, I have to think on my feet, and I have to just do the analysis and move on. The team I was working with specifically was client insights, or client success.

0:03:19.8 VA: So they essentially provided Indeed's clients with information and insights about their products or their competitors, or general market information about various jobs. And I remember some requests would be easy and we'd do them, but some requests would come in and I'd be like, okay, I have no idea what this means, I'm just gonna go ahead and roll with it. But at the end, when you provide the information, your stakeholder is like, well, that's not necessarily correct, or that's not necessarily what I was thinking about, or let's redo it. And you'd redo it and provide the results again, and you kinda started to do this dance. What I eventually started to realize is, okay, if we just take a step back and spend a little bit more time and learn what they need specifically, or what they're asking about, we can probably serve them better and not have to go back and redo the analysis many times. I think we've all been in similar positions, but being in that situation, just redoing it over and over again for a couple of years...

0:04:26.2 VA: That really framed it in my mind. But it wasn't until a couple of years ago, when I had the honor of starting my own team, called analytics solutions, that we could act on it. For us, the focus was a little bit more about having that broader vision and broader picture. My manager at the time, Peter, and I had this vision. We were like, okay, in BI we used to not have the ability to see the big picture, to have that discovery process and be able to investigate. Now let's actually frame a team around that process and see if we can make analytics solutions: analysis by first spending a long time in that discovery phase. Yeah, so that's kind of how it came about.

0:05:12.2 MH: It seems to me that's one of the signs of a maturing analyst: realizing that just taking the request and taking the data at face value is a recipe for disaster. And we probably should have said this earlier: you wrote a post on Medium last year, and that's what inspired us to reach out to you, because it really talks about what you just described. We have this urge to jump into the data, and if we just jump in and make assumptions (oh, we understand what the business was asking, and it's a new data set, but surely "subscribers" means exactly what I think "subscribers" means in that table), you get burned so consistently. And then, as you were describing, you end up having to do re-work, 'cause you either don't meet the needs of the business or you actually made a bad assumption about what the data was saying or how complete it was.

0:06:21.8 VA: Yeah, yeah, exactly. And I think the big part of it is if you make those assumptions and you don’t relay those assumptions to your stakeholder, somebody is going to end up making bad decisions based on that information, and really, I think one part of it is knowing what you’re dealing with but the other part is knowing those assumptions and telling your stakeholders, here’s what we don’t know, or here’s what we have to assume, because that kind of puts them into a frame of reference for their decisions. So can I really make a decision if I made this assumption? Sometimes yes, sometimes no, but that risk kinda gets delegated to the decision maker.

0:07:02.5 VK: So that's actually one of the things that I was curious about, especially as I was reading your very well-written article, with the approach and all the questions that you ask yourself as an analyst. One of the questions in there is: is the data quality good? So when the answer is no, what is the logic tree that you think through? I'm assuming that sometimes it might stop your analysis altogether, sometimes it might mean that you have some additional data to collect, and perhaps sometimes it shows up as a footnote or a big "use results with caution" sign when you're presenting results. I know you talked about assumptions, but I'm sure there's a variety there, and I would love to hear how you approach the answers to those questions as you apply your framework.

0:07:43.0 VA: Yeah, for sure. So I think data quality in itself is just such a big issue in analysis and data science, but I think for data quality, there has to be a strategy. Going into it, you have to know: okay, is it feasible to stop this analysis, or not? And if it's not feasible, then what can I do? Is it possible to collect more data? Is it possible to write out those assumptions? Is it possible to reframe the question, or provide insights that are framed differently, so that the data quality issue isn't a hindrance? And it really depends on what the issue with data quality is, right? Is it missing data? Is it missing data points? Is it incorrect data? Is it incomplete in a specific sense that biases the information? You can really address that piece by piece. But I think if you have a strategy and you communicate with your stakeholders well, you know: okay, I can stop this if the data quality is bad, or I have to keep going and provide something to them. Because, especially if you work in the corporate world and you work a lot with sales and CS and client-facing teams, you almost always have to keep going. So communicating that strategy ahead of time, I think, is the most important key to dealing with that.
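To make that piece-by-piece triage concrete, here is a minimal sketch in Python with pandas. The file, column names, and checks are hypothetical illustrations, not anything from the episode:

```python
import pandas as pd

# Hypothetical jobs extract; file and column names are illustrative only.
df = pd.read_csv("jobs.csv", parse_dates=["job_post_date"])

# Missing data: which columns are incomplete, and how badly?
print(df.isna().mean().sort_values(ascending=False))

# Incorrect data: exact duplicates that would silently inflate counts.
print("duplicate rows:", df.duplicated().sum())

# Incomplete in a way that biases results: does every segment actually
# appear in the extract, or is part of the business missing entirely?
print(df["region"].value_counts(dropna=False))
```

Which of those findings stops the analysis, and which merely becomes a documented assumption, is exactly the strategy call Viyaleta describes.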

0:09:00.0 MH: Can I ask on that? 'Cause that brings back memories of some specific instances where we dug in, sure the data was complete, that it went back six years or whatever you needed. Or an experiment was run and the experiment was botched: there wasn't randomization. All sorts of those. If you find that, you go back and say, wow, this is not what we thought the data was going to be; it's not the data the stakeholder assumed it was. And then, like you said, you still have to give them something, so you say, well, we're not gonna be able to do everything we thought we could, but we can do this other little thing, or we can use this other technique and sort of try to get at it, and it's not gonna be what we thought we were gonna do. Have you run into the challenge where the assumption then becomes, oh, well, the smart analytics and data science people figured out how to work around the crappy data, so we don't have to go and fix that in the future?

0:10:14.1 MH: It's one of those where you have to be aware of: wait a minute, we're gonna do this thing that's really making us uncomfortable, but it's the best we can do, and somehow we have to make sure there's not an assumption that, oh, everything's still just as good as if the data had been in good shape. If you run into that, do you have thoughts about how to deal with it?

0:10:36.2 VA: Oh yeah. Yeah, all the time. And I wouldn't even say it's the stakeholder's fault, because a lot of it is, there are data producers and data consumers. Sometimes they're the same person, sometimes they're not. If they're the same person, you're lucky, because you can say, okay, well, for next time, let's do this, and let's figure out a strategy to make sure we can run analysis on this. But if they're not the same person, then you're stuck with: okay, now we are dealing with data producers who produce immature data, and data consumers who need an answer. And I think that really comes down to the risk of the decision-making. Are they sending a space shuttle into space? Are they performing surgery? Or are they just telling their clients something to upsell them? It really depends on that threshold, because there's only so much risk they can take on with faulty data. You can produce some insights, and you can produce insights under those assumptions, right in that box. But if you're asking them to take on a lot of risk, that could end up being really bad for whoever their stakeholders are or whoever their client is, so I think you really have to do...

0:11:44.4 VA: And it's a lot to do in an analysis, but you have to do that risk assessment yourself and also make sure that your stakeholder knows what risk they're taking on. Because in some situations, it may not be as dramatic or as bad, and maybe they can get away with some... not necessarily faulty analysis, but analysis that relies on heavy assumptions and poor-quality data. But in some situations, it's just not possible, so you have to go back to your data.

0:12:15.7 MH: That gets back to your point that you really need to understand the domain, and not just the domain of the business. In that case, it's: what are the stakes of the decision being made? And if they're really high stakes and the data is just not gonna be there, better to have that discussion than to say, well, it's a really high-stakes decision, so I guess I have to come up with something and cross my fingers that it works out.

0:12:47.8 VA: Yeah. Exactly. I couldn’t have phrased it better myself, that’s literally exactly the point.

[music]

0:12:52.8 MH: Alright, it’s time to step away from the show for a quick word about Piwik PRO. Tim, tell us about it.

0:13:00.0 TW: Well, Piwik PRO is easy to implement, easy to use, and reminiscent of Google's Universal Analytics in a lot of ways.

0:13:04.4 MH: I love that it’s got basic data views for less technical users, but it keeps advanced features like segmentation, custom reporting, and calculated metrics for power users.

0:13:13.0 TW: We're running Piwik PRO's free plan on the podcast website, but they also have a paid plan that adds scale and some additional features.

0:13:19.6 MH: That’s right. So head over to Piwik.pro and check them out for yourself. Get started with their free plan. That’s Piwik.pro. Alright. Let’s get back to the show.

[music]

0:13:30.6 VK: And when you're having this conversation, this risk-tolerance calibration with your stakeholders, I'm assuming that's happening before you're delivering the analysis, because this is a process that you're asking the analyst to think about before they get into the actual coding part of it all, right? So is that a conversation you also have with stakeholders beforehand?

0:13:52.7 VA: So, yes and no, right? Because sometimes you don't necessarily know until you dive into the data, but a lot of the time you should be able to assess: okay, I'm getting this data from data producers, can I talk to them about it? Can I talk to the engineers, or whoever is making that information, or the scientists who are running the experiment, to verify its quality? But sometimes it just has to be a communication process, and it can't always be once and done. You just have to exercise a little empathy and go back and forth with them: here's my honest opinion, let's keep going back and forth and see what we're trying to do together and where that threshold is.

0:14:28.5 VK: I like that a lot. One of the things that makes me think about is, you talked about starting this new team with your manager, and you wanted this type of practice to be a new way of working, like a new ethos that you were adopting. Can you talk a little bit about whether there was any change management that you were guiding your stakeholders through? 'Cause I'm assuming, you talked about really fast turnaround times and some quick analyses, and now you're inserting a much more thoughtful process, and a lot of times having those conversations upfront, and it's framed in a completely different way. So what was that like? If you could share some details.

0:15:03.3 VA: Yeah. Yeah, that was difficult.

[laughter]

0:15:08.1 VA: But it was, I think, well received after a while. We kinda had to do, I guess, an elevator pitch with some of our stakeholders, some of our business partners. Well, first of all, we went from calling them stakeholders to calling them business partners.

0:15:22.0 MH: I love that.

0:15:22.8 VA: And then they were like, oh, we're partners. Cool. Now we're starting to work together. And then we're like, okay, cool, so here's what we're gonna do. We're actually gonna try to learn as much as we can about what you're doing, what you're trying to do, and about the subject matter that you're working with, and then we're gonna do the solutioning process. So we moved it away a little bit from just doing analysis to more like analytic solutioning. In that sense, we're like: we will take a look at everything, we'll map it out, and then we'll tell you, okay, we know you think you know what you want, and we will either confirm that or we will tell you yes, but a little bit differently. So it's a little bit different from just doing analysis. The team I'm running right now does similar work: we do analysis, but we also do data modeling and other solutioning, and we employ, or try to employ, the same principle around it. We found the change management difficult, but eventually, after we had a few successes with it, it was easier to work with the business partners and be like, oh okay, this is working, so let's keep doing what's working. And we had a few failures too.

0:16:30.7 MH: So was that the business partners... 'Cause this, again, is giving me flashbacks to the case of: no, no, no, I just asked you to do this thing. The business partners, the stakeholders, think the way this should work is they make a request through whatever mechanism, you do stuff with it, and the results come back, and that's it. And if you go to them and say, no, help me understand more about your business... To me, it feels like it's a split. Sometimes people love to talk about their business. They're like, oh, let me pour my heart and soul out to you, you're a receptive ear, nobody understands all the challenges I have. And that's great. But I definitely feel like I've run into the ones who say, no, I made a request, give me the answer. And I'm curious: was it literally that you just kind of muddle through with them until you have some successes with the other partners and eventually they come along? Or are there ones that... Don't name names, but is there anybody in your mind where you're like, oh yeah, this one was a doozy and it took years?

0:17:48.0 VA: It’s still taking years. [laughter]

0:17:51.2 MH: Okay. [laughter]

0:17:53.4 VA: Yeah. I think there are also individuals who know exactly what they want and are not as willing to really open their mind and think about the other potential information that they could be provided. But I think that's okay too, because what we can do is, for example, if they're asking for one thing, we can also surface all the other insights we have. Another example of what we've dealt with: if a person asked for something, we'd say, here's the information, here are all the assumptions, and here's all the other info you should know about what you're asking for. And I think that's good, again, because if it helps them make their decision, that's what's best for them. But they have to know, again, what risk they're taking on in making that decision, and my team, hopefully, is guiding them through that.

0:18:42.1 MH: So one thing I wanted to come back to is around knowing the data, or getting up to speed with the data. Sometimes you're working in an environment where you're dealing with the same data again and again, and so you develop a really deep understanding and fluency with that particular set of data. But a lot of times in organizations, if you're in the role of an analyst or a data scientist, you might have brand new data sets thrown at you in a project. Can you talk a little bit about, okay, how do you build up data fluency? What process do you use for that? 'Cause I think in some cases it's an ongoing thing that you use all the time, and in other cases it's, okay, now I've gotta go in and explore this data specifically.

0:19:26.6 VA: Yeah, I've really dealt a lot with this at Indeed, especially because there are a lot of product changes and new products and new features being rolled out, so it happens all the time, and you kinda have to go into it prepared. So what my team and I do is we have this discovery document, at least for some of the analysis projects, and I list out: okay, here's everything I know about this product, about this feature. And if I don't know, then I'll schedule some calls, schedule some follow-ups, and read the wiki, and literally just learn everything. Here are the facts. And then I'll have a dictionary: okay, here's what this term means. Because even at Indeed.com, right, we have a site that theoretically hosts all the jobs, but even the definition of a job can be different depending on who you ask and what kind of decisions they're trying to make. Is it a job posting? Is it a job position? Is it the number of openings? Is it one of the openings? It really changes, and even simple definitions like that can get lost in translation depending on who you talk to. So having that upfront, especially if you don't have that vocabulary and you're starting something new or dealing with a new subject matter, is so, so vital to the success of your project.
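As a purely hypothetical sketch of the kind of term dictionary such a discovery document might carry (the definitions below are illustrative, not Indeed's actual ones):

```python
# A discovery-phase glossary: the value is in writing the definitions
# down before any query is run. All entries here are hypothetical.
job_glossary = {
    "job posting": "a single advertisement hosted on the site",
    "job position": "a role at an employer; one position may have many postings",
    "opening": "one hireable slot within a position",
}

for term, definition in job_glossary.items():
    print(f"{term}: {definition}")
```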

0:20:49.4 VK: This is definitely one of the things that I'm interested in. I'm very much a process person, and as people know, I say I'm a process person 'cause I actually hate talking about process. So in my mind, I was thinking, okay, these are so many great questions, so many great thought experiments to put yourself through. I know where I work, at Search Discovery, there were some analysts that collaborated on an analysis approach outline document, so shout out to Ryan Dupont, Samberg, and Julie Hoyer. I'm sure she would have been dying to be on this episode with you. That document has some similar things in there to what you talk about in your article, not everything the same, but there are definitions and assumptions and approach and the hypothesis. And so you mentioned a discovery doc, and that's why my ears perked up. So can you talk a little bit about how you operationalize what you do here? Is it the same document all the time and you just customize it for the different lines of business that you support, or does it evolve? Is it pretty stable? I would love to hear you talk a little bit about the process side.

0:21:48.6 VA: Yeah, so that is actually a good question, but it's actually not the same document, and there's no template for it. Each individual analysis or each individual project will typically require a new one, because usually we tailor it to whatever analysis we're doing or whatever decisions are going to be made at the end. Even two similar questions can be so, so different and will involve different parts of the business. For example, say my business is moving from a pay-per-click model to a pay-per-application model, like Indeed is doing. All of your questions will be framed completely differently depending on whether you're on one pricing model or the other. It could be the same question, but a slight change in the definition of what the customer is paying for calls for a completely different perspective. So a lot of the discovery is actually framed around the question that's being asked, as opposed to just general knowledge.

0:22:51.4 VK: So that's really interesting that the document is different every time, and the way you explain that makes perfect sense. But I'm curious, again thinking about operationalizing all of this: does that make it really hard for analysts to join your team? Can someone come to your team new from the outside, or does someone have to kinda be promoted from within so that they can really appreciate the nuances of the business and the unique way that your team approaches it? I'm just curious on that one too.

0:23:15.7 VA: Yeah, I think it definitely could be difficult, but we try to make it easy. I usually have individuals work together in small teams with a division of labor, or a division of responsibilities: we'll have a reporting lead, an analysis lead, and a data modeling lead for a project, and generally they tend to learn from each other and support each other. But they can also refer to old discovery documents and the old documentation from different projects. I think we're still trying to find our footing, so I can see that it could be difficult, but I hope the previous knowledge makes it a little easier.

0:23:53.5 MH: The putting two or more analysts on it, I think that makes so much sense. And again, I'm probably processing trauma here. Until you've been burned a few times (that is one way to learn: keep touching the hot burner), you don't learn that maybe you should not make a bunch of assumptions and not write them down: assumptions about the data, assumptions about what the business user needs, and then run off and do stuff. But to have a little bit of that slow-down: what are my assumptions, and how can I write them down? And the idea of having a couple of people saying, let's do this together so we can bounce things off each other. This field says "job post date." Okay, do we both think it's really obvious what that job post date is? Somebody may say, that had better be one of the fields we look at the distribution of, 'cause if there are a bunch of them posted in 1972, that's probably an alarm bell. The reality of work says, oh, but that feels like redundancy. But it also seems like in that early phase you get a ton of payoff from that upfront work: four eyes instead of two seems like it could head off all sorts of problems.

0:25:24.7 VA: Yeah, for sure. Well, first of all, it makes it easier to gather as much information as possible, too. But also, to take your example, job post date could be the day the job was posted, or maybe the day it was reposted, or maybe the day it went live for the first time, or maybe the day it went live the most recent time, 'cause you can pause jobs. There's so much... You see it on Kaggle all the time: you see a simple data set and it looks so simple and so easy, right? But then you start digging in and you're like, oh, there's actually so much that this column could indicate, could be, and we know nothing about it. In that way, I feel like analysts are victims of the Dunning-Kruger effect: when we first look at a data set, it looks so simple and so easy, and unless we start digging into it more, we'll just assume that we know everything about it.
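That "posted in 1972" alarm bell translates directly into a quick distribution check before any real analysis starts. A minimal sketch, assuming a hypothetical jobs.csv with a job_post_date column:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("jobs.csv", parse_dates=["job_post_date"])

# Eyeball post dates by year: a spike in 1972, or any year before the
# product existed, suggests the field means something other than what
# we assumed (repost date, first go-live, most recent go-live, etc.).
print(df["job_post_date"].dt.year.value_counts().sort_index())
print(df["job_post_date"].agg(["min", "max"]))
```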

0:26:24.1 MH: Have you ever had the opposite, where the data is sort of stopping you at first, but then as you dig in and learn more, you find these alternate paths? 'Cause I think sometimes that happens too.

0:26:37.7 VA: Yeah, I think so.

0:26:40.6 MH: Yeah.

0:26:42.3 VA: I think the more complex processes you deal with, then you’ll see that probably, especially with financial systems and things like that.

0:26:50.1 MH: So Val had started to go down the path of your team a little bit, and I kinda wanna go back to that. I love the idea, but I'd love to understand a little bit more about what drove the creation of it. I'm also curious: do you hire people directly in from outside to be on that team? I'm not about to ask for a job, even though this question may start to sound that way. It does feel like, to Val's point, that they would be people who are a little bit more experienced; it would be hard to take someone straight out of university and drop them in, because it seems like there's some nuance there. But at the same time, you also said you're trying to train them and team them together. So I'd love to understand how you put together a team like that.

0:27:53.1 VA: Yeah, okay. So maybe I can give you a little bit more information about my team. I have hired very experienced people and I have hired somebody right out of university, and both have been very successful, from internally and externally as well. My team is a solutions team, like I mentioned, analytics solutions. Some of what we do is analysis: statistical analysis, data analysis, insights and recommendations. Some of what we do is putting together core data sets that enable our business partners to do analysis as well. And some of it is just data visualizations and things for individuals who are not trying to dig too deep. When we came to be, really the goal was: we need somebody who knows the business, or who at least wants to know the business, so we can empathize with the business and provide them with better insights and recommendations, so they can make better decisions that are not inaccurate or based on heavy assumptions.

0:29:03.1 VA: So in my interview process, I actually have an analysis as part of the interview, and the analysis is very, very nuanced; it's actually a very, very simple data set. I hope I'm not giving away too much to any of my future hires. But it's a very, very simple transactional data set. To give an example, think of a housing data set: a set of houses that were sold last year in a specific area, and it's like five columns, super easy. And when you look at it, you can say, oh, that's so simple, I know everything about it. Or you can look at it and say, let me assume that I know nothing about it and try to figure it out. The people who look at it and say, I know nothing about it, let me ask questions about it, let me clarify all the assumptions, let me find all the biases in the data set, or within myself when I look at a data set to analyze it, those tend to be the most successful people, and those are the ones that I hired.

0:30:08.0 MH: In the assignment, do you leave it open so they can come back and ask? 'Cause that seems like one of the challenges with assignments: when it gets handed over, it's like, well, this is totally artificial, I would never just take a data set and run with it. So is there a mechanism for them to come back and actually probe? And now you may be giving people tips, 'cause probably the way they disqualify themselves is by not doing that. So...

0:30:33.4 VA: Yeah, so they can definitely come back and probe, but usually, by the time they come back and probe, they've already made assumptions about the data set and have based their recommendations on those assumptions. So we encourage them to ask questions, of course, but they're encouraged to get as much information out of the interviewer as possible.

0:30:58.1 MH: Got it.

0:31:00.1 VA: Okay, we should change the interview…

[laughter]

0:31:03.6 TW: I am kind of curious, are there any particular job sites out there that you get good candidates through? No, that's kind of a joke. I was...

0:31:15.3 MH: Come on Tim.

0:31:15.7 VK: Good one Tim.

0:31:18.4 TW: Don’t share the secrets. Disclose.

0:31:21.5 VA: There's a site that starts with an I and ends with a D.

0:31:24.5 MH: Oh there you go.

0:31:27.8 TW: I was making it funny, Michael, if you weren’t following.

0:31:29.7 VK: You weren’t picking up what he was putting down, Michael.

0:31:31.3 TW: Yeah, yeah. I never share where I find candidates from. Those are the secrets.

0:31:38.2 VK: So one of the things you talked about earlier is creating empathy for your stakeholders, and one of the things that you did with this team is reframe even that whole concept of your relationship, and change it from stakeholders to business partners. And I'm curious about how much the activity of sitting down together and talking about the data and the definitions and some of those assumptions has gone a long way toward building some of those relationships too. Obviously, the proof is in the pudding when your recommendations are way more spot-on and something that they can immediately take action on, versus, well, actually, we need to massage this, I need you to go back. So I'm sure that's a great proof point. But I'm curious about the exercise of sitting down shoulder to shoulder and really understanding what it means to interact with that data and what they're charged with. I'm wondering if that had an impact too, on the way this was all adopted and on some of the successes.

0:32:29.2 VA: Yeah, definitely. Again, we don't deal with the same exact data set every single day, and a lot of it is actually just piecing together different data sets from different engineering teams and putting them together. I think a lot of the processes and a lot of the things that we do as a team have emerged from those conversations that my teammates had, and from literally just applying in practice what we talk about. A good example: we would assemble a team to do an analysis or build a dashboard for our business partners, and then come back a month later saying, oh, we should change this, or we should change that. Calling our stakeholders business partners actually just emerged from putting everyone in one room. One of our direct reports, Shawn (shoutout!), that was actually his idea. He was like, hey, why don't we call them business partners? And it really has, I think, had a tremendous impact, in my opinion.

0:33:31.7 VK: Yeah, I like it a lot. I like that a lot.

0:33:34.0 MH: It's interesting, 'cause we teach the same thing: they're partners, and that helps frame the relationship differently. Yeah.

0:33:44.6 VA: Yeah. But I think they also need to know that the partnership goes both ways, just because you’re my partner doesn’t mean that I’m not yours, so it has to be like reciprocated.

0:33:56.2 MH: Yeah, but the stakeholder concept kinda makes it that order-taker, they just tell you what they want and you just have to figure out how to fix it or do it or provide the analysis without necessarily the interaction that you’ve developed as the process, which I think is excellent.

0:34:16.9 TW: Does that reframing of both how you think about them and how they hopefully think about you change how you interact? Does it lend itself to them getting a deeper understanding of some of the nuances of the data? You mentioned earlier that some of the business partners come in and they know exactly what they want, and Val, maybe you were getting at this as well: they know what they want, but they may not realize that what they want isn't directly available. If what they want is, pull data from the job posting system and pull data from the leads system and just smash those together and then do me an analysis, what they're missing is the messiness where that join doesn't work as they would want it to. Does that come up, where you can say, well, I've gotta sit down with you, and we've gotta talk about this, because we have to make some decisions?

0:35:29.6 TW: I'm seeing some people from my past saying, I don't want to get into details, I know what I want, and it's your job to go make it happen. But there are judgment calls involved there, and you really need their help on making those judgment calls. Which means that then, the fourth time you're working with that stakeholder, the fourth time you're doing a project with them, hopefully they're more open. I'm envisioning this kind of nirvana where you're moving forward and it becomes much more Kumbaya, and the business partners are getting a more nuanced understanding of some of the limitations of data, I guess.

0:36:19.7 VA: Yeah. So I think then you have to ask them why, and I think why is such a good word to ask, because sometimes they don't even know why, and it helps them in their own discovery journey to figure out why they need this. They could say, oh, I need this, this, and this. And then I say, okay, why do you need this? And they'll start actually thinking through the entire process, because something sounds different in your head than it sounds out loud to somebody else. When you say it out loud, it starts to form into: oh, actually, maybe that's not necessarily what I need. So that's a really good strategy, but it also really depends on how much trust you have with your business partners. There was one project where I was working with an enterprise sales team, and we took all of their requirements and we just said, no.

[laughter]

0:37:06.5 VA: They even provided us with a spreadsheet. We were like, that's fine, but I don't think that's what you're looking for. And we were just like, let's start with just one metric: what is the most important? They're like, we don't know. So, okay, let us think from your perspective. We sat with them and we shadowed them and we talked to them: okay, so the most important thing you want to know is revenue, so let's start there. And we gave them a dashboard: here are your revenue numbers. What else is gonna help?

0:37:37.0 VA: We let them play around and we'd do the cycle again, and I think just doing that cycle helped a lot, but also just asking, why do you need this?

0:37:45.7 TW: In that process, it sounds like you’re doing a lot of education as well. Is that accurate to say?

0:37:52.5 VA: Yeah, that’s right.

0:37:55.2 TW: And in a certain sense, you're building the data fluency of these external teams, these business partners, in terms of their own data. That example you gave, there are so many versions of it that you or other people could probably share: they don't even know their own data, they don't even know what they're trying to get from the data, necessarily. So it just shows the validity of the approach, I think, in a lot of ways.

0:38:25.4 MH: But I think there's a piece of it which I've often struggled with: when they come with that big list, it's with the best intentions, and they're often trying to help. They're saying, I don't wanna come to you with "I don't know what I want," so let me just write all this stuff down, and more must be better. And I've worked with some analysts who will come back at that. I mean, you said "no" earlier; you were being a little flippant, but I'm pretty sure you didn't say, thank you very much, no, throw it out, sit back down. I have dealt with analysts where it was kind of... They almost had contempt for their... They weren't business partners. They were like, they don't know anything. And Val, I think you'd said empathy earlier as well: there's a piece of even asking, why do you need that? There's an attacking, abrasive way of doing it, and then there's more of a, well, help me understand why, so that I can better serve you. So I can't believe somehow this worked its way into Michael's emotional intelligence world.

0:39:41.8 TW: Yeah!

0:39:42.2 MH: But when you think about who you're hiring and how your team is working, it does seem like the empathy, emotional intelligence, communication skills, the relationship-building... Do you look for that? Do you coach for that? Do you have activities to help drive that? Or is it not that critical after all, and you just tell them to piss off and come back with better requirements?

[laughter]

0:40:18.6 VA: So yes, 100%, we look for that, we hire for that, and then we also train for that. Again, my previous manager, Peter, and I had discussions about this before, because we found that those soft skills are so much more important than the technical background. You can give me a stack of resumes with information about all of your technical abilities, but that's so easy to teach. That ability to think outside of the box, to think from somebody else's perspective, to think critically, that's so hard to teach. So we try to hire people who already have that, even if their technical background is not as strong, and then we'll continue to train them.

0:41:04.9 TW: That is such an important message. I wish every analytics team everywhere could just hear you say that right now. Awesome. Yeah, I so agree with that.

0:41:22.5 VK: I have another question for you, in a slightly different direction. In your article, you juxtapose your approach against the Exploratory Data Analysis book. I had heard of it, but I don't have as deep a mathematical background or training as you do, not by a long shot. So I'm just curious: was that a really big part of your training? Was that practice something that you brought to the role in your early days and then you kinda discovered this over time? Or was it just something you knew was out in the ether, and something that you had to kind of combat with your approach? I'm just curious about some of your experiences and inspiration there.

0:41:57.4 VA: Yeah, so that's a good question. So exploratory data analysis, EDA, is essentially done a lot in statistics and data science primarily, but it can be done in data analysis too, and it allows you to take a look at your data set and twist it and turn it like a Rubik's cube multiple times. But I think the reason I juxtaposed it is just that discovery has to be a preceding step. Actually, John Tukey himself said that statisticians get to play in other people's backyards all the time, and that's what they do.

0:42:31.3 VA: And by that, he meant you have the skill set to work on any problem within any subject matter. But I think you have to explore that backyard to understand what you're dealing with and what kind of information you see, 'cause backyards are made completely differently. So if you have discovery, if you have that knowledge, then even the exploratory data analysis comes easier. When you look at the data and twist it different ways, you know what your columns mean, you know all of your definitions, you understand where that information is coming from, what it means, and what kind of bias you can have. Because you can have some kind of confirmation bias already, and unless you check yourself, you may produce insights and recommendations that confirm some opinion that you might have.

0:43:24.9 TW: So they're complementary. I like the backyard metaphor. If you just dive into the data... This is my knock against somebody who says, I wanna be a data scientist, and did a six-week boot camp: hand them a data set and all of a sudden they assume that the data set defines the backyard they're playing in, because they haven't built the skills to say, let me understand what that yard looks like. Once I understand what the backyard is, I probably wanna take the data set for the backyard and dig into it and ask, does this match up with what I'm hearing? That's the first part of exploratory data analysis: does this make sense? You told me this backyard was representative of the population, but my data set says that 95% of the people in the backyard are over 70 years old. That doesn't make sense. So the fact that they're complementary... And we will definitely link to the article. It is short, but it is really, really interesting, and it makes the case that you gotta figure out what the backyard is first and then do the other parts. I think that's such a great point.
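That "95% over 70" gut check is easy to make explicit once discovery has told you what the backyard should look like. A minimal sketch, assuming a hypothetical people.csv with an age column and a threshold learned during discovery:

```python
import pandas as pd

# Hypothetical file, column, and threshold, purely for illustration.
df = pd.read_csv("people.csv")

# Discovery said this population skews young; encode that expectation
# so data that contradicts the domain knowledge fails loudly.
share_over_70 = (df["age"] > 70).mean()
assert share_over_70 < 0.25, (
    f"{share_over_70:.0%} of records are over 70, which contradicts what "
    "we were told about this population. Back to discovery."
)
```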

0:44:43.4 MH: Yes, Tim, it's such a great point. And now we definitely gotta start to wrap up, but this has actually been an entirely great conversation, and Viyaleta, thank you so much. The way that you've expressed some of these ideas, and the thought process you've put into them, has done such a beautiful job of taking some fundamentals of analysis and data, framing them in a new light, and putting a new facet on them. This whole conversation has been incredible. So thank you so much for coming on the show. I really appreciate it.

0:45:17.2 VA: Thank you so much.

0:45:18.8 TW: Oh no. Seriously, this has been fun. And so one thing we like to do is go around the horn and share something that might be of interest to our audience, anything at all, it’s called our last call, and Viyaleta, you’re our guest, do you have a last call you’d like to share?

0:45:35.1 VA: I do have a last call. I was gonna say, 'cause I prepared for this. My last call is actually a video by Veritasium, I don't know if you guys are familiar. He's a YouTuber, a physicist YouTuber; I think YouTube is now his full-time career. One of the more recent videos that I watched, and shared with my husband and with my team, is called "Is Success Luck or Hard Work?" In that video, Veritasium essentially shares that we all have some egocentric bias, because we experience our life from inside ourselves, right?

0:46:16.2 VA: We spend most of our time with ourselves, so we think we actually do more than everybody else, or we perceive ourselves to do more than everybody else, just because we spend more time in our own body. So we tend to underestimate the role that other circumstances or other people play in our life and in the things that we do, which also means that we tend to underestimate how much circumstances, or even luck, play into our success. And he talks about how it can be helpful to think that our achievements are completely our own, because it gives us confidence and gives us that boost to keep going and working really hard. But at the same time, if we do succeed and we are very successful, we also tend to have survivorship bias, because we don't see all the people that worked just as hard, if not harder, but failed due to circumstances or bad luck. I think it was a very powerful message for analysts and for all people, really, so I wanted to share it.

0:47:22.2 MH: Awesome. Thank you. It’s awesome.

0:47:22.4 TW: I think the upper middle class white dudes really, really need to hear that message.

0:47:29.3 MH: I think... I'm here because of hard work. Tim, well, you know, pull yourself up by your bootstraps. Okay, Tim, what is your last call?

0:47:44.7 TW: I am gonna do a quick two-fer. First, people should go look at Viyaleta's Medium page, because, to your point, it is a broad range of topics. Today I wound up getting sucked into Making a Sky Map in Python, which is an older post, and then there's stuff that is way over my head; Diving Into Julia was another interesting one. So that's the quick one. I kinda wanna get into your brain: that "I think I'll do this other completely different thing and write it up" instinct is fun and fascinating. Then my main last call is that Lea Pica, past guest on the show, friend of the show, I think all of the co-hosts know her, her book is now out: 'Present Beyond Measure,' which is also the name of her podcast. Seth Godin blurbed it: simple, clear, and constantly overlooked wisdom; it's time to stop wasting time and start making change happen. It is a hefty tome about designing, visualizing, and delivering data stories that inspire action. So check that out.

0:49:03.7 TW: I can't say I've finished reading it, 'cause I just got a copy, but I'm diving in and have flipped around a bit in it. She has been building up this wisdom for years and years and years, and now she's written it down, in four acts, with intermissions even. So check that out.

0:49:29.0 MH: Nice. And Val, what about you? What’s your last call?

0:49:33.2 VK: So this one is completely unrelated to this topic, but throughout the pandemic we were all on Zoom more and more, turning ourselves into potatoes and using all the interesting backgrounds, and it kind of felt like Zoom wasn't gonna surprise me anymore. But I was on a call recently with Manuel Da Costa from Effective Experiments, and he had this crazy dynamic background with a live agenda and a time-keeping aspect to it, and I'm like, what are all these shenanigans on your Zoom? He made me aware of the Zoom app marketplace, and I double-checked with some people at work, and I am not the last person to discover this, so hopefully this is relevant for some folks. There are actually a lot of powerful tools with some kind of cool integrations, beyond just those note-takers, 12 of which join every call nowadays, which... God, I love that. But even the timer one, which you don't have to ask for extra admin approval for, can be really helpful if you're trying to keep people on an agenda and you've got a status call or something like that. So I've had fun playing around with that. If you haven't checked out the Zoom app marketplace, there are some fun things in there for productivity and scheduling meetings and interactive agendas; maybe you might find that fun. How about you, Michael, what's your last call?

0:50:55.2 MH: Nice. Well, you know I like to run a crisp status meeting, so that sounds right up my alley. Alright, well, my last call is something we've probably talked about on the show before, but I wanted to bring it up again: our good friend Kelly Wortham, who's also been on the show, has a community of folks who are dedicated to the discipline of experimentation and optimization, called the Test & Learn Community. It recently became a non-profit, which is super cool, and it's just so cool to see that community grow. I think they have a Slack group now that's over 1,000 people, so if you're involved in that space at all, experimentation, that is a great group of people who are very similar to you, and you should go. We'll have a link to that in the show notes so you can check it out. So that's my last call. Alright, you've probably been listening in and having some brainwaves happen: oh my gosh, that's a great idea, or, I wanna learn more about that. Well, if you've got any thoughts, we would love to hear from you, and the best way to do that is probably through the Measure Slack community, or our LinkedIn page, or on X, because we're still there for some reason. And that's it.

0:52:15.5 MH: And I don’t know, Viyaleta, are you active on social media anywhere that people could reach out to you, or obviously, you have your Medium blog post, so we will provide a link to that.

0:52:26.5 VA: Meet me on LinkedIn.

0:52:30.2 MH: Perfect. LinkedIn. That's a perfect place. Great, and thank you so much for coming on the show. It's been a real pleasure to get to know you a little bit better, and the work you're doing at Indeed is pretty awesome. And of course, no show would be complete without a huge thank you to Josh Crowhurst, our producer, maybe especially this time.

0:52:50.5 TW: Especially this one.

[overlapping conversation]

0:52:52.5 MH: Things that the listeners…

0:52:53.9 VK: Sorry Josh.

0:52:55.0 MH: Won’t hear, but Josh will. But we really appreciate everything you do to make things happen for the show behind the scenes and make the show a reality. I appreciate it very much. Once again, Viyaleta, thank you so much for coming on the show, we really appreciate you spending the time with us, and I know I speak for both of my co-hosts, Tim and Val, when I say, no matter where you’re at, exploring that data, remember, keep analyzing.

[music]

0:53:25.9 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions, and questions on Twitter at @analyticshour. On the web, at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

0:53:44.3 Charles Barkley: So smart guys want to fit in, so they made up a term called analytics. Analytics don’t work.

0:53:51.8 Kamala Harris: I love Venn diagrams. It’s just something about those three circles and the analysis about where there is the intersection right.

0:54:01.9 MH: This Riverside platform doesn’t like you Tim.

0:54:05.8 VK: But the Riverside platform makes me sound like a jerkasaurus rex, ’cause in the last episode, I kept cutting him off.

[laughter]

0:54:15.9 TW: I can’t hear Val…

0:54:17.5 MH: I also cannot hear Val.

0:54:19.8 VK: I was on mute that time. The times.

[laughter]

0:54:25.0 TW: You know what…

0:54:25.1 VK: Gotcha.

0:54:25.2 TW: Now you’re just fucking with it.

0:54:25.6 VK: Gotcha.

[laughter]

0:54:29.3 VK: Josh is like, this is the worst experience ever. Okay, regroup.

0:54:37.2 TW: Val, she already said her nerves were coming down; you don't have to do more screw-ups.

[overlapping conversation]

0:54:45.2 TW: Val is going out.

0:54:46.0 VK: This is what happens when you give me a producer responsibility. My face is red. This is too much pressure. Okay.

[music]

0:54:47.3 MH: Rock flag and business partners…
