#204: Data as a Product with Eric Weber

Have you ever built a data-related “thing” — a dashboard, a data catalog, an experimentation platform, even — only to find that, rather than having the masses race to adopt it and use it on a daily basis, it gets an initial surge in usage… and then quietly dies? That’s sorta’ the topic of this episode. Except that’s a pretty clunky and overly narrow summary. Partly, because it’s a hard topic to summarize. But, data as a product and data products are the topic, and Eric Weber, the data scientist behind the From Data to Product newsletter, joined us for a discussion that we’ve been trying to make happen for months. It was worth the wait!

Episode Transcript

[music]

0:00:05.9 Announcer: Welcome to the Analytics Power Hour, analytics topics covered conversationally and sometimes with explicit language. Here are your hosts, Moe, Michael and Tim.

[music]

0:00:22.0 Michael Helbling: Hi everybody, this is the Analytics Power Hour, and this is Episode 204. The use of data to transform decision-making and business outcomes, that’s certainly something we all believe in. And the methods involved in doing this are many. And you might have heard of the idea of data as a product before. Should all of our data be a product, or what does that even mean? Well, we’re gonna explore that thought in a lot more detail in this episode. Let me bring in my two co-hosts. Moe Kiss, do you use any data products?

0:00:56.2 Moe Kiss: I’m starting to dabble, yes.

0:00:58.6 MH: Well, you presented on this topic at the Marketing Analytics Summit this past summer, so I’m sure it’s got some relevance for you.

0:01:09.0 MK: It’s very near and dear to my heart, yes.

0:01:12.1 MH: And Tim Wilson, how are you, sir?

0:01:12.8 Tim Wilson: Good. My middle name is Data Product actually.

0:01:16.9 MH: Oh, Tim Data Product Wilson. I love it. And I’m Michael Helbling. Alright. Well, we needed a guest, someone with some experience on this topic. Eric Weber is the senior director of data science at Stitch Fix. He has 15 years of experience working in and teaching others about data. Prior to Stitch Fix, he’s held numerous data science and experimentation roles at companies like Yelp, LinkedIn, and many others. His newsletter, From Data to Product, is excellent. And today he is our guest. Welcome to the show, Eric.

0:01:48.8 Eric Weber: Thanks so much for having me.

0:01:51.0 MH: Oh, it’s a pleasure to have you.

0:01:52.8 EW: My middle name is not Data Product, so I feel like I’m behind.

0:01:56.3 MH: Not yet.

0:01:57.7 EW: Not yet.

0:01:58.3 MH: Not yet.

0:01:58.8 EW: Can change, can change.

0:02:00.7 MH: Well, you know, there’s a new king of England, so he might start off by just knighting a bunch of people. So you could become Sir Data Product maybe, I don’t know. [chuckle]

0:02:09.7 EW: Will that allow me to retire?

[laughter]

0:02:13.6 MH: I don’t know. Anyway, we’re very excited to have you on the show. Moe has talked about you and your conversations together quite a bit. And so I think maybe to get us kicked off, maybe just tell us a little bit about what you do and your role and some of the things you write about, and we’ll jump into the conversation.

0:02:33.6 EW: I think the best way to think about my role at Stitch Fix is maybe just think more broadly about consumer-facing products. And I think this is, back to your point, about… We talk about decision science and making better decisions and setting up businesses to do that. The reality is, it’s really, really hard to do, because you have people making these decisions [chuckle] and not… You haven’t automated them away.

0:03:02.1 EW: My role is to focus on how to create reusable tools, platforms, capabilities that at least remove some of the pain of making these choices and decisions. Certainly one of those areas is in experimentation, trying to actually establish some degree of causality so that when someone says, “Why?” you actually have a reasonable answer. But there’s a whole bunch of adjacent things you can do to that, where, whether it be through causal inference or even really thoughtful analytics, and we can get to that later, “analytics” is such a tough term.

0:03:43.9 EW: But done in the right way, you can still make a lot of thoughtful, reasonable choices understanding this one way or the other. My team is focused on building those capabilities, and in some cases those come in the form of internal user-facing products; in other cases, there’s a lot of just technical architecture behind the scenes.

0:04:05.8 EW: So Stitch Fix, like a lot of companies, is trying to leverage technology to innovate. And as you see experimentation crop up across many different companies, people keep asking like, “Why don’t we just build one product that does it for all these companies?” The reason is that each company is fairly different in how they make choices, and very few of them are comfortable outsourcing that to a third party product.

0:04:34.2 MK: But so I’ve been reading your blog for a really long time, and anyone who has spent any time with me at a conference or in the office knows that this is a topic I’ve really been kind of going deep on and reading pretty widely about. And yeah, it’s fair to say that you and your blog were a big inspiration for me in kind of going down this path on data products.

0:04:56.6 MK: But what triggered it for you? What made you start? ‘Cause you write about it pretty often. And you can… I, in my conversations with you, have seen you… That your thinking has been shaped over time.

0:05:09.7 EW: Yeah.

0:05:10.9 MK: Where are you drawing from?

0:05:12.8 EW: I wish I had a great answer, other than I had enough conversations where the gist of it was we kept talking about these companies that just have essentially internal graveyards of stuff they had built and then discarded. And they were like, “Hmm, we have a lot of stuff that we’re not using. We probably paid a ton of money to produce those things.” And I think it’s more a question of like, “Was that necessary?” like, “Could we do it differently? Could we do it better? Could we have actually thought about our approach to building these things?”

0:05:46.7 EW: And with… I think with how data science came onto the scene, it came onto the scene through, in some cases, just using old SQL databases that people were like, “Oh, I’m gonna do new things with.” Or in some cases, it was starting to write code in R or Python. So you just got all these things tossed together all at once, and it was kind of almost okay in the name of like, “We’re investing in data science.”

0:06:15.6 EW: I think now, we’re at a point where a lot of companies are asking the right questions around, “Is this worth it for us? What does it mean to actually do this the right way?” And so to me, it’s actually trying to be a bit more responsible with, “What do we build? Why do we build it? Should we continue to invest in it?”

0:06:36.9 EW: I had enough of these conversations that… I was like, “I should write.” I like writing, and at that point, I had no idea other than like, “Okay, Substack seems alright,” and one day I just sat down and started doing it. I think what keeps it going is that I get feedback from people that actually just changes how I think about things, it challenges some of the assumptions that I have.

0:07:04.8 EW: And that’s pretty rewarding because it ends up actually benefiting me at work to be like, “Oh, I should do this differently, or think about this differently,” or, “Oh, I have no idea what I’m talking about.” And that’s often a good experience when someone can teach me that.

0:07:19.2 TW: So, as you were saying that these things get built and then they kinda wind up in the graveyard of, like, we invested a lot in this, but they’re not getting adopted. I couldn’t… I can’t help but think back to the data warehouse boom, and it feels similar. There was a time when an earlier wave of organisations wanted to use their data, realised it was not really a good idea to just be writing queries against production systems where the data was not in a good format, so they put ’em in data warehouses.

0:08:00.0 TW: But there was a lot of hand-wringing back in the late ’90s, early ’00s, of like, we keep building data warehouses and we think that’s gonna be the answer, and then the data warehouses kind of died on the vine as well. And it seems like, from… If I understand a lot of what you talk about with data as product, it was a similar symptom.

0:08:26.2 TW: It was kind of the myth of, “If we build it, they will come,” which, is that fair? And is there also a risk that even with data science people are like, “If I throw a bunch of Python at this stuff, or if I stand up the ability to build a model, they will come,” and what gets missed is, “No, wait a minute, what are we doing? How do we need to do it effectively and efficiently? Who’s using it?” And is that the miss?

0:08:56.1 EW: I think that seems right in terms of the miss. What I’m writing about now probably could apply to many other periods where companies go through some period of innovation and a new set of tools. If you thought about when everyone in maybe 2010 or 2011 was like, “Hadoop is gonna be the thing that changes everything so we need to definitely go that direction,” and now it’s like who would do that? It doesn’t make sense. So I think it’s… Maybe part of it is just, can we… I should not assume that data science and a lot of these tools are the last time we’re gonna go through this explosive change.

0:09:39.8 EW: I think it’s a question of, five years from now, is there a way for us to actually plan or think through what we’re investing in so that we don’t end up in the same position. Or like, “We have a whole bunch of stuff. We might not need it.” ’Cause honestly it impacts, maybe even more so in a major way, the cost of data science, especially a lot of modeling. It has a real tangible cost that shows up on the balance sheet in a way that is very obvious, and the more we can get ahead of that, I think the better.

0:10:15.6 MK: And the funny thing I think, I actually was having a conversation about this yesterday, exactly about the cost. How do we know when say like a machine learning model should be deprecated? And we were having this conversation with a senior data scientist who was like, “Well, you just compare it to another version of the model and see if it’s performing better.”

0:10:30.7 MK: And we’re like, “Yeah, but how do we know we should be doing this model instead of something totally different?” There does become a point where the cost of maintaining it, whether it’s to do with the volume of your data, whether it’s to do with how often you have to update, whatever the case may be, even in an automated world, there is a cost, and I sometimes feel like data has gotten away with it.

0:10:53.5 MK: I work with data teams all the time that are like, “Oh, there’s no way to quantify our impact,” and I’m like, “But we’re telling our stakeholders to do that. We have to be held to the same standard.” And I think a big part of this for me is that it has been easier to quantify the impact, because you have something that the team has worked on, they’ve put together, and you can be like, “It’s reduced FTE or it had a substantial impact on our foundational goal.” Or, like, I personally have found it easier to tie it to, I guess, the business value.

0:11:25.4 EW: Yeah. Maybe it’s a reasonable question to say, and I ask my team this, I’m like, “Where do you think we show up on the balance sheet, on the income statement? Are we… ” Honestly, it’s a fairly… They don’t love that question because they’re like, “Well, I do science, not business.” I’m like, “Well, your paycheck is definitely not signed by science, it’s signed by a business.”

[chuckle]

0:11:50.2 EW: So I think it’s an important question to ask, to say, “What is responsible from a business standpoint to invest in?” And in some cases like this focus on data as a product, for me it often shows up in the context of how can we be more responsible with what we’re investing in, when we’re investing in it, and when we should deprecate it or get rid of it. It’s not an easy question, but again, I think asking the question is pretty important.

0:12:22.4 MH: So this may be stepping back a little bit, ’cause I’m still a little confused about what we’re talking about when we say “data”. I get that when we say “a model”, so that can be thought of as a product, but now it’s kind of in the vernacular of “products and services”, so data-as-a-service. And I think you had one… You did have an article where you were like, “Not everything to do with data should be considered a data product,” but I guess I even get wrapped up around the data products, so I think of things that are like these…

0:13:00.6 MH: You know, they’re an application; there’s something that is a thing you can draw a box around, and it can be product managed. But there are times where data as a product… is that the same thing? And the service of performing ad hoc useful analyses, the kind that needs a skilled analyst or a data scientist working with the stakeholder on a one-time thing, does that still fit in this world or is that something separate? No, okay.

0:13:34.7 EW: I think it’s separate. So I think about things like analysts, consultants, whatever, we create a lot of useful assets. So I think those are more one-time reports, analyses, like, “Here’s what’s true right now, here’s how to think about this, here’s why this metric is the right one to use in our business.” And actually those are really, really important and should not be like… Not everything should become a product, primarily because businesses operate off of those analyses, those discussions, those… And so those can be really, really valuable.

0:14:17.5 EW: Now the question of, like, how do you make something like that live on and continue to have business value? Maybe does that analysis need to be repeated? Do you need, do you want someone to be doing the repeating of it, or do you want that to be automated? That’s maybe where you start talking about a product. And a good question for me, if I’m trying to define something as a product, is, “Who’s the customer? Who’s interacting with it? Who’s benefiting from it?” And that often helps me try to think through that.

0:14:47.8 MH: So in a way, kind of taking that work structure and going up a level and sort of saying like yeah, that cadence of business interest, analysis, response is some sort of product itself in a way.

0:15:05.9 EW: Yeah. Or it can be. I think that’s an important question. It’s really common for people to start, when they’re developing metrics, by hacking together a query that is like, “Here’s the metric definition.” And if they know what they’re doing, they can potentially implement that and actually put it into production so that it works at business scale.

0:15:30.6 EW: But do you want to make that easier? Or do you want to make that layer accessible to a large group of people who are maybe not experts? We have a metrics layer and metrics product internally that allows people to define and actually manage these definitions. I think that’s an example of going from a one-off, “Let’s get this thing so it works,” to making it easier to do something and manage it over time.
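
For a concrete picture of the shift Eric describes, from a one-off query to a managed definition, here is a minimal sketch of what a metrics-layer entry might look like. It assumes a Python-style layer; the Metric class, its fields, the registry, and the example metric are all illustrative, not Stitch Fix’s actual product.

from dataclasses import dataclass

@dataclass(frozen=True)
class Metric:
    """One agreed-upon metric definition, reused by dashboards,
    experiments, and ad hoc queries alike."""
    name: str    # canonical name, unique across the business
    owner: str   # team accountable for keeping the definition healthy
    sql: str     # the single source-of-truth definition
    grain: str   # level the metric is computed at ("day", "week", ...)

REGISTRY: dict[str, Metric] = {}

def register(metric: Metric) -> None:
    """Publish a definition; forbid silently redefining an existing one."""
    if metric.name in REGISTRY:
        raise ValueError(f"'{metric.name}' already defined; edit it, don't fork it")
    REGISTRY[metric.name] = metric

register(Metric(
    name="weekly_active_clients",
    owner="growth-analytics",
    sql="SELECT COUNT(DISTINCT client_id) FROM orders WHERE ordered_at >= CURRENT_DATE - 7",
    grain="week",
))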

0:16:00.3 MK: And also just the reliability. For us, we’re starting to really talk about metrics, I guess, that you do wanna productise or whatever you wanna call it. And for me, it’s also, are there core metrics across the business? And this does come back to that whole idea of a metrics layer that you need… Reliability is a real issue.

0:16:18.4 MK: If it goes down for a day or two, is that gonna be problematic? Versus you running some code on your local machine and turning out a report for one stakeholder, where it’s not gonna be the end of the world if it’s delayed by a day. That for me is a big differentiator as well.

0:16:33.9 EW: Yeah, I think experimentation is a good example because it tends to be a fairly high leverage use case. For my team’s experimentation platform, certainly at Stitch Fix, every single piece of traffic goes through it. And so we have to decide, what experience are you gonna be exposed to? If the platform goes down, or it’s not doing what it’s supposed to be doing, it essentially throws hundreds of experiments simultaneously into chaos and they’re no longer reliable.

0:17:04.8 EW: Arguably, if it goes down for five hours, it basically invalidates a huge set of experiments. So things like that. I think there’s levels too like tier one, tier two, tier three type of systems. Metrics certainly are important, if they went down for an hour, potentially it would be okay. But you start to get into these questions of, how long can your business function without this thing working?

0:17:35.5 TW: Well, so that’s… An experimentation platform in this world, whether it’s home-grown or one that you’ve bought that you’re managing, that is one class of data products.

0:17:49.3 EW: Yeah.

0:17:49.3 TW: Right? You just mentioned the metrics. Can you help me understand what a metrics data product is?

0:18:00.0 EW: I mean, in this case it’s… I would think about it as an interface that allows people to interact with metric definitions, health of that metric, lineage of how that metric has been defined over time. So in some cases you can… There are certainly third-party tools that do it.

0:18:19.0 EW: For me it’s like, people are actually able to interact with, understand and explore those metric definitions. It’s not just the management layer where you’re actually defining the ETL; you start to actually expose it to other customers internally. That would be an example.

0:18:38.7 TW: So that could be a data product that the analyst may be using as they’re doing ad hoc services, but it’s also tied into some self-service tools that a business user… So that’s like a… It’s a multi-use… And it’s gotta figure out back to your, “Who’s the customer?” Like treating it with, who’s the customer, where’s the value this is delivering, and then product managing the data product?

0:19:04.3 EW: I think that is really important to just think about, who’s benefiting or who do we want to benefit from this thing? ‘Cause if you don’t answer that question, it’s really hard to build something effectively and actually even decide how much to invest in it, when should we invest in it.

0:19:24.9 EW: For large… Arguably, for small versus large data science teams, the difference between having five people and having hundreds probably shifts what you’re prioritising in a pretty major way, just from a company standpoint. ’Cause you gotta also ask what’s right for the company as you’re thinking about who benefits from this.

0:19:47.6 MH: So what are the other… I feel like we’re starting to tease out… I remember Moe presented this list, but I think she totally lifted it from you, it was like different types of data products and we just hit a couple of them.

0:20:00.7 EW: Yeah.

0:20:01.3 MH: Can you rattle off the types of data products you see out there off the top of your head?

0:20:07.4 EW: I think I have a… There’s a danger in saying “analytics”, but there are enough products that purport to be analytics products that I think it’s worth talking about, at least. So if you think… I mean, analytics is such a broad area, but certainly one would be a data visualisation layer. It’s like, how do you want to enable people to interact with the data that you have, the metrics you’ve defined, experiment results.

0:20:40.7 EW: So I think like UIs often are like, if they are enabling people to interact with data, and actually this is maybe more on the decision layer, that’s really important, like how you design these, how you create these. I’m gonna bash Looker here for a second. I hate it.

[laughter]

0:21:03.5 EW: I hate their UI layer. In particular, if I think about it, it’s like how do you enable people to get to what they’re thinking about and to actually think more deeply about the data? I think products sit across that spectrum, like who they’re trying to do it for and how easy it is to use. So analytics is the UI layer, and then I feel like we’re gonna get into controversial, like, “What is analytics and what is it not?” here super quick.

[laughter]

0:21:32.9 MH: We’ve never shied away, and that’s why we’re the top-rated explicit analytics podcast.

[chuckle]

0:21:42.2 EW: So I think the other one that’s probably really relevant to talk about is just ML-related products, just because I think they’re probably a good illustration of how do you make ML… Even beyond what machine learning engineers need, how do you expose layers of ML and models to people in a business so that they can make sense of it? We have… One of the teams in my organisation is called our Algorithm UI Team, and they just build UIs so that we can think about how to enable people to interact with our recommendation systems.

0:22:22.3 EW: One of those is personalised style, so we have this huge visualisation layer that’s just called… Basically it allows people to visualise, in three dimensions, what someone’s style is and how we’re measuring it. And so I think for ML especially, because it can become so complex so quickly, there are huge opportunities to build these products that allow people to make sense of things that even five years ago would have been really challenging to interact with.

0:22:57.6 TW: Does the ML platform… So there’s the making sense of what comes out, but is it also the explosion of startups that are pretty much auto-ML: throw your data in, they will do it for you…

0:23:10.5 MK: Well yeah, see, I’ve always thought of it more as like an actual pipeline so that engineers can push their models back to the product, but maybe I’ve totally misinterpreted this one. [chuckle]

0:23:25.0 EW: No, I don’t think you’ve misinterpreted. I think…

0:23:26.5 TW: Depends on who the customer is. Depends on the customer.

0:23:30.1 MK: Mm-hmm, mm-hmm.

0:23:31.1 EW: Arguably, the word “platform” is a horrible word, because even within experimentation, you probably have some products that you can fairly easily define. Similar for machine learning, you go back to who’s the customer. For people who are benefiting and need to understand what these models are doing, that can potentially be one group.

0:23:58.2 EW: For ML engineers who are just trying to make sure they have stable and reliable pipelines that are feeding back and forth to the product, that’s another entire class of these things. And so I think what I’ve learned and probably what I should start doing more often is to just try to not use the word “platform”.

0:24:16.4 MK: Good luck. [chuckle] I feel like I say it 12 times a day.

0:24:21.7 EW: “Platform” is just this… The image of it makes it seem so much cleaner and well-defined than it ever is. Like saying, “Oh, we have this platform thing.” But it’s like, what makes it up? What makes up that platform? And I think for data products, saying the word “platform” probably hides a lot of that complexity, when it’s actually a bunch of different things that make it up.

0:24:50.8 MK: You know, the funniest thing is, as you were saying this, I was thinking… So the one… For anyone that has heard me talk about this, I have taken some of Eric’s ideas and adapted them very selfishly for my own purposes, and one of the additions that I’ve made is a measurement platform.

0:25:06.7 MK: But as you were talking about this, I was like, “Does measurement platform fit in? ‘Cause it’s not really one tool. It’s actually a suite of tools that we use for measurement.” And now I’m like, the problem is just that I’m using the word “platform”, which implies that it’s one thing, when it isn’t, it’s actually like five or six totally separate tools we use…

0:25:26.3 TW: What does a measurement platform do? What are those five or six or three or four? Is that the collection of the data? Is…

0:25:35.7 MK: We have an MMM, we have an experimentation tool, like a match market test, then we have brand health surveys and panels, we have a causal inference tool that we use. And I’m forgetting… Oh, and then we have a reporting interface. So there are actually like five totally separate data products, if you will. Some of them are in-house, some of them we have bought, but ultimately the goal is that that suite of tools tells us how we’re performing from the measurement perspective.

0:26:04.5 EW: Something that brings up for me is that the people who are designing these products might think about this very differently than the people who are actually interacting with them. Ultimately, at the end of the day, I wanna give someone an easy way of doing these things, and I don’t want them to think about the design decisions or how these things fit together. I just want them to do things. For experimentation especially, I don’t want them to think about how many multiple things we’re trying to bring together.

0:26:36.0 EW: I want them to log in and be able to see something and make a choice or understand the trade-off as efficiently as they can, and so the design is different than the experience, which I think is pretty… Which is pretty important.

0:26:52.0 MH: That’s kind of getting at the… The reason that I would… Looker, Adobe Analysis Workspace, Google Analytics’ GA4 Explore, you name your tool: when it’s a third party, they’re definitionally trying to design the interface for a very broad and deep… Like, their business model is they have to appeal to a broad set of people, so it almost feels like there’s a…

0:27:17.6 MH: Once you hit a certain level of scale of an organisation, it can start tilting… From the data product perspective, it can start tilting you more toward build versus buy. Or maybe you buy components. But asking that question, “Who is our customer?” within Canva or within Stitch Fix, “Which customer are we trying to serve?” and probably being a little more diligent that the data scientist is a different persona than the marketer, and so you can have a little more control over, is this one product serving both with different interfaces? Is it two products?

0:28:00.6 MH: It feels like… I used to make a… It’s a terrible joke, but I’ll make it anyway. It’s pretty simple what people want from their analytics platform: they want a clear, simple, intuitive, easy-to-use interface that gives them access to all of the data. Great, not possible, ’cause the data’s complicated.

0:28:23.5 MK: I thought you were gonna say they want certainty, and I was like… Or a soapbox.

[chuckle]

0:28:29.6 MH: Well, that too. I guess that’s this other piece, like when you started to say you don’t want them to have to think about how all the nuts and bolts are connected, like I struggle with wanting some friction. Because it is, it’s to that point too. They’re like, “Well, I just want this to give me the answer.”

0:28:44.1 MH: And you started, Eric, by talking about causality. I’m like, well, if somebody’s wading into one of these data products and they don’t understand some basic intuition around causation and uncertainty, where do they have to pick that up? Can the tool add some friction and force education, or is that a totally separate beast? I’m sure that’s a simple answer. [chuckle]

0:29:11.5 EW: I think the tools… I mean, I think friction is good. It’s just a question of like, do you wanna make it really obvious? So for experimentation, people log in and look at experiment results. We explicitly introduced a cost to adding additional metrics. We adjust confidence intervals significantly. What counts as significant, as you’re adding… Like, we make you pay for it in the product.

0:29:41.5 EW: You don’t get anything for free. Which is, again, arguably pretty heavy-handed, but the alternative is that people show up with 70 metrics from an experiment and are like, “Look, this is directionally positive,” which creates tons of chaos and churn and swirl. So I think within that specific space, I think it’s possible to make some design decisions.
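
What Eric describes is, in effect, a multiple-comparisons penalty: every metric added to an experiment readout widens every confidence interval. The episode doesn’t specify Stitch Fix’s actual adjustment, so as a rough sketch only, here is what the simplest such correction (Bonferroni) looks like in Python:

from scipy.stats import norm

def ci_half_width(std_err: float, n_metrics: int, alpha: float = 0.05) -> float:
    """Half-width of a two-sided confidence interval after a Bonferroni
    correction: each added metric shrinks the per-metric alpha, so every
    interval gets wider and 'significant' gets harder to reach."""
    adjusted_alpha = alpha / n_metrics
    z = norm.ppf(1 - adjusted_alpha / 2)  # critical value grows as alpha shrinks
    return z * std_err

# With a standard error of 1.0: 1 metric -> ~1.96, 10 -> ~2.81, 70 -> ~3.38.
for n in (1, 10, 70):
    print(n, round(ci_half_width(1.0, n), 2))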

0:30:09.3 EW: Within analytics though, to your point, people want to hunt around for something that seems like, “Hey, it seems plausible. Here’s my story and my narrative. Let me carry it forward from here.”

0:30:21.4 MH: I love the idea of the tool forcing trade-offs, like teaching trade-offs by saying you have to…

0:30:29.8 MK: It’s like The Hunger Games for data. I’m like really into it.

[laughter]

0:30:33.9 TW: Well, what that says to me is, no matter how good of a product you create, there are still gonna be plenty of people willing to use it wrong. And that’s just part of it. But I think it’s a discipline that a lot of people, when they sit down to do these exercises, don’t bring; they don’t really put their product management hat on.

0:30:53.4 TW: Which I kinda think is what we’re talking about a lot: sort of, how do we wanna enable this, or what behaviours are a good priority for us in designing this, versus what wrong thinking we need to address and help corral, like this need for certainty that people might want on the business side, because of the increasing cost curve that it creates in terms of the data.

0:31:20.3 EW: The thing that I struggle with the most with anything like this is people, if they are using any of these products, they probably have some preconceived narrative of what’s true and what’s not, and it’s really hard to disrupt the narrative that people have mostly decided to operate by. And so that often… Like within analytics platforms especially, it produces…

0:31:49.9 EW: They go on the hunt for things that fit the narrative and kind of explain away things that don’t. And I think that is one of the hardest parts. It’s like people want to simplify the story to a degree that makes it easy to share, and these products, I don’t know how to design them in a way that breaks some of that.

0:32:11.8 MK: But isn’t that the point?

0:32:13.9 MH: The cognitive bias, the cognitive bias corrective data product, is that the…

0:32:17.6 EW: Yes, that’s what… No, honestly, we probably just got 10 million in valuation and funding…

[chuckle]

0:32:25.5 MH: A machine learning product that finds you the metrics that put your idea in the best light.

0:32:32.0 TW: People are just throwing money at…

0:32:33.4 MH: We could get so much money, yeah.

[chuckle]

0:32:34.9 MK: But isn’t that the… Okay, so the thing I most wanted to talk about, and I’m really conscious that I wanna turn the conversation in that direction, is like that is the role of your embedded analysts. No data product is gonna fix that problem, because that is an inherent problem of just people’s biases. That’s what your embedded analyst should do. And actually, everyone will have to indulge me in a short anecdote.

[chuckle]

0:33:00.4 MK: But one of the things that I have been talking about, and actually when I talked about this at Marketing Analytics Summit, I hadn’t actually reached this conclusion yet. I was still in the like, “How do we get everyone from data-as-a-service to data products?” And the truth is, like I’ve really evolved my thinking there due to Eric, where that isn’t actually the goal, the goal is not to move your whole team from data-as-a-service to data products.

0:33:25.5 MK: Because in my mind, data-as-a-service was very reactive and data products are proactive, ’cause you’re thinking about the future, you’re thinking about the long-term problems of the organisation that you need to solve. But Eric, can you share a little bit about what your embedded statisticians do at Stitch Fix and how that works to support the data products? ’Cause this is the bit that over the last couple of weeks has blown my mind.

0:33:49.3 EW: Yeah. For us it’s… So there’s product design, but also there’s organisational design that has to reinforce it… If you want these products to do what they should, you have to have the organisation set up around them. So if you don’t have really thoughtful senior people who are trying to incentivise and generate behaviours and ways of approaching problems, that are then potentially reinforced by the products you have, you’re just gonna create tools on one side and people on the other, and people are gonna get an entirely different experience and set of expectations, because they don’t reinforce each other.

0:34:34.5 EW: So it’s really common to have a consulting model where people just kind of go where they’re needed and it’s like, “Cool, we’re doing the right thing for the business.” But then how do you translate some of what they’re doing into these repeatable scalable capabilities? I think that’s really hard to do.

0:34:57.0 EW: ’Cause they often have context: knowledge, specific understanding of people and time and place, and politics, I guess, is another thing to throw in there. You’re not gonna put that into a product. And so for me, it’s an org design thing too, for leaders to be like, “How should we set up this organisation and what should we build alongside people so that we actually have this function?” I think that’s really hard to do.

0:35:23.4 EW: For us it works because we have super senior people who are doing most of the consulting, and so they work very closely and try to incentivise the right behaviours that our products also try to support.

0:35:37.1 MK: And I think the thing that I just love about this approach is that you’re really elevating the role of the embedded data practitioners, because it’s about them answering the most complicated questions, the ones that a data product can’t answer, for various reasons. They’re just too complex to build some kind of automated product around.

0:36:01.0 MK: And so what you’re actually doing is you’re making the work of those embedded folks super interesting, because as we started to move towards this approach and we did set up some data product teams, they were doing all the cool shit. And I really felt for the embedded data folks, because they’re kind of like, “Oh, those people get to work on the fun stuff, and I’m answering these random stakeholder questions.”

0:36:22.4 MK: And now it’s been really amazing to take the team through, “Yeah, but the point is, they’re going to automate and build solutions for the boring shit. Like what can we automate, or what can we take off your to-do list that we don’t need to spend analyst time on, so you can answer the really interesting questions?” And for me, that has been like this huge light bulb. It’s really invigorated me.

0:36:46.1 EW: Yeah, that is something I wish we could get more right. And maybe it comes from data science having matured after engineering organisations did. It’s like, what do you have your most senior engineers doing? They’re certainly deciding that we should invest in something scalable and reusable, but they’re also probably tackling the thorniest and weirdest and hardest problems that still matter.

0:37:18.8 EW: Whereas senior engineers are definitely builders, they can make things that are really powerful for a company. But it’s also, if you look at really senior people, they’re able to actually navigate fairly complex areas that are not well understood enough to even build a product around. And I think it’s like defining, what does it even mean to be a senior analyst?

0:37:47.9 EW: That’s a hard problem and something I don’t think we’ve probably spent a ton of time on, which is like, there’s a lot of these senior staff, principal archetypes for engineering. We have not fleshed that out in the same way for data, that is certainly true. Long way off.

0:38:09.0 TW: Is there, I guess Moe, the risk or the challenge? I’m really intrigued by the customer-centric aspect of the data product side, that we don’t just build. Because I feel like that’s a huge suck of value, is the jump to build product. We’ll stand up the BI tool or we’ll build you the dashboard, and there’s the risk that it’s like, I’m just trying to shed manual, repetitive stuff, and it becomes…

0:38:44.8 TW: The goal is, I’m doing this repetitive thing, I wanna shed the repetitive thing, it winds up in some form of a data product, and nowhere in that hand-off or continuum is anyone saying, “Wait a minute, what are we doing? Why are we doing it? Where’s the value?” And it seems like on the analyst doing the service, if they’re good, and Eric, even as you’re talking about, a good analyst is gonna say, “I may be able to hack together and automate some stuff, I’m basically building a product prototype, but I may not have the discipline to actually do the customer assessment.”

0:39:28.9 TW: So it feels like there’s kind of a slippery slope and I’m probably guilty of it in a small way, where I’m like, “Sure, I’ll hack this thing together with Excel or R, or something.” And then say, “Great, I have proven that this works and people like it, so now let’s go build it,” and it becomes a data product. And somehow that just manages to miss the whole validation of, “Is this useful?” Does that make sense? Is that a…

0:39:57.8 MK: I think the issue is, in that process you just described, there is a step missing, that’s like, “Is this valuable?”

0:40:02.1 TW: Yeah. But…

0:40:04.6 MK: And if you have…

0:40:05.2 TW: Yeah.

0:40:07.2 MH: But the challenge is, if I ask, “Is this valuable?” and I show it to a customer, and they’re like, “Yeah, that’s cool, I want that,” that’s not doing the step. That’s just the feeling that you did the step, because it was pretty and they liked it.

0:40:19.8 MK: But we all know you can’t literally listen to your stakeholders, they’re gonna ask… They’re not always asking for the right shit.

0:40:24.9 EW: That’s like asking someone, “Would you like some insights?” Sure, of course they would. Everyone wants some insights and they want more of them. That’s why just asking them for their opinion and approval is often not gonna get you there… Because then you just end up with a lot of pretty graphs and not much else.

0:40:50.0 EW: Instead of saying, “What do you want them to do? What choices do you want them to trade off on?” For experimentation, this is pretty explicit: we intentionally limit the number of metrics they can look at, because we want to force that question of, “What is important to us?” and it has to come at the expense of something else.

0:41:16.8 TW: In essence, you need to conduct your own user research and drive into not just sort of, “Oh yeah, this will give you insights,” but what insights are you getting, what are the top five actions you’re taking? Or if you had this, what would that enable for you? And almost, I think that’s where a lot of teams… Like, we’re all bashing Looker this time, so I just keep picking on it, but a lot of…

[chuckle]

0:41:46.1 MH: You know what, I tried to spread it out a little bit, but no, if you don’t wanna go there with me…

0:41:50.6 EW: Gonna get sponsorships taken away.

[chuckle]

0:41:52.9 TW: We’ll use Google Data Studio.

0:41:56.9 MK: No, Mode. Mode.

[chuckle]

0:42:01.4 TW: Whatever. You name it, whatever it is. The problem is it gets built once and then never enters a product life cycle type of structure, where we’re adapting and in a back and forth iterative feedback loop that helps us enable enhancements to, in that case, the UI layer that you described, in terms of the layers of the products that analytics would encompass.

0:42:31.0 MH: I feel like those tools, they do, they have product managers who are talking to customers, figuring out what they need.

0:42:32.8 MK: I totally disagree. I’ve got like thousands of dashboards that are sitting there, and it drives me absolutely bonkers.

0:42:43.2 MH: No, no, no, no. But the dashboard-enabling technology. I guess that’s the distinction.

0:42:49.6 TW: It’s not the technology, it’s the interface between… Reports come in and out of use. They start out potentially useful, but then fade as companies grow in sophistication or start asking different questions, or the business cycle pivots; there are all kinds of reasons why a report will just not be useful anymore. And that’s what I’m talking about.

0:43:07.7 MK: But that’s exactly why data as a product makes sense, because this is a concept, some process, that we can follow.

0:43:16.2 TW: That’s what I’m trying to say. But that’s what I think we’re not doing very much of as analytics teams, which is having those conversations and spending the time to do that work. Honestly, I don’t think we’re even given that time, a lot of times. Nobody’s like, “Oh, do you need time to figure this out?” It’s like, no, there’s 40 bazillion other requests they’d like you to work on.

0:43:37.8 TW: Most data teams become order takers, and they’re just pulling down orders and trying to answer them like short order cooks all the time, and just trying to get stuff out of the kitchen as fast as possible. That’s no way to live. We all know that. But it’s hard to kind of…

0:43:55.8 MK: That’s exactly what we need to shift.

0:43:57.8 TW: Yeah, make the shift.

0:44:00.2 MK: For example, our team now reviews dashboards every three to six months, and we actually look at who’s looked at them, and if people haven’t looked at them, we put them in a deprecated folder for six months, and if they’re still there six months later, we delete them.
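
As a rough illustration of that cadence, here is a minimal Python sketch; the Dashboard record and its usage fields are hypothetical stand-ins for whatever usage data your BI tool exposes:

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

STALE = timedelta(days=180)  # roughly the six-month windows Moe describes

@dataclass
class Dashboard:
    name: str
    last_viewed: datetime
    deprecated_since: Optional[datetime] = None

def review(dashboards: list[Dashboard], now: datetime) -> None:
    """Periodic review: park unviewed dashboards in a deprecated folder,
    then delete anything still untouched six months later."""
    for d in dashboards:
        if d.deprecated_since is not None and now - d.deprecated_since >= STALE:
            print(f"delete: {d.name}")    # still unused after deprecation
        elif d.deprecated_since is None and now - d.last_viewed >= STALE:
            d.deprecated_since = now      # flag it, don't delete yet
            print(f"deprecate: {d.name}")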

0:44:14.2 MH: Back to Moe, your point: if there is a data product that is the, “This is how we produce and deliver dashboards,” however that’s built, it should also provide visibility into who is accessing and using them. Not in every case, but that generally seems like a great feature, and there are digital analytics platforms that make that very hard or impossible. They definitely have heard that from their customers, and they definitely aren’t incentivized to provide data on how people are not using the products that are created with their platform.

0:44:53.9 MH: But I think there’s still, there’s a level of detecting things that aren’t being used, and sometimes you just say, “Probably you know what, these things naturally tend to have a life cycle of 6 to 18 months, this one’s probably at the end of it, let’s just kill it. That’s natural.” Versus, “Wait a minute, we’re two months in and no one’s using it.”

0:45:15.9 MH: That feels like we may need to go back to our customer and say, “We think we must have missed the mark. Can we go back to our understanding of what you were trying to do?” It goes back to the customer.

0:45:28.1 MK: But can I just harp on one last… One last soapbox? [chuckle] I promise this is my last soapbox for the episode. One of the things that I’m really chatting to the team about now, that I love about adopting ideas from other disciplines: product is all about an iterative process, and I think data folks are so obsessed with this idea of, “A stakeholder has asked for a dashboard, I’m gonna build out wireframes. I’m gonna double-check it’s what they want, and then we’re gonna go away, I’m gonna build the most perfect dashboard, it’s gonna have all 42 metrics they want, by five different filters, with…” yadda, yadda, yadda.

0:46:04.9 MK: And it’s like, product… The thing that we wanna learn from product is, no, this should be about iteration. You build them the skateboard first. So you say, “What are the five most important metrics?” You build that first. You go back, you check if they’re using it. Are they still using it? Great, you add another six. Or whatever the case is, right? And that process that we can adopt from product is, for me, part of the genius of this. It’s stopping us from seeking perfection.

0:46:30.2 TW: I’ll play the old guy card again: that’s literally where the consensus was 20 years ago of where data was. ’Cause it was very similar with data warehouses: they threw traditional developers at it, and they wanted to use a waterfall development methodology, get all of the requirements, spend six months developing it, roll it out.

0:46:48.7 TW: People were like, “This is great, used it once. By the way, the business has moved on.” It didn’t have this new metric, and they didn’t use it. And so I just very clearly remember this kind of epiphany of, “Oh no, a data warehouse,” which is definitely like a monolithic data product, I think, “needs to be living and iterating and not just growing in complexity.” So I agree 100%.

0:47:18.7 MK: Shit. How did that happen?

0:47:23.1 TW: I don’t know.

[chuckle]

0:47:23.9 TW: I’m drunk.

[chuckle]

0:47:27.7 EW: You could argue about which internal product affects more people’s decisions on a daily basis. I asked someone this internally. They were like, “The org chart.” That’s it.

0:47:41.7 MK: Oh my God.

[laughter]

0:47:43.7 EW: That affects more people’s decisions about what to do and how they interact. Honestly, we went and looked: on our platforms, we can see pretty much what everyone accesses. The org chart is by far, by like a factor of 10X, the most accessed thing.

0:48:00.9 MH: I’ve had clients that don’t have their org charts readily available and it drives…

0:48:07.0 EW: It’s insane.

0:48:09.1 MH: It drives everyone crazy.

0:48:09.2 EW: It’s the most commonly trafficked page in nearly any company.

0:48:12.6 MK: But see, Eric, what is the purpose of asking that question, like how do you use that with the team?

0:48:17.3 EW: I was just curious from my… If you think about the idea that you’re arguing that people are gonna re-use something and come back to it, I’m like, “Why would you come back to something?” In this case, they’re coming back to it trying to figure out, “Who does this person report to? Who owns this thing? What do you… How do I… I want something to happen. Who do I go to?”

0:48:39.6 EW: It’s not the same thing as looking at a metric, but I’m trying to think more about what causes people to come back to something habitually. At LinkedIn it was interesting, ’cause our VP-plus team would get this dashboard sent to them on the hour, every hour of every day, that was basically, “How is the product functioning?” It was essentially an executive dashboard that popped into their email at the top of every hour.

0:49:09.0 MK: Every hour?

0:49:11.0 EW: Every hour.

0:49:12.2 MK: They made the decision to send it every hour?

0:49:16.8 TW: Come on. Come on, it was LinkedIn, clearly no one was looking at it or acting on it.

[chuckle]

0:49:17.7 TW: Sorry.

0:49:17.8 EW: They were just… They wanted it every single hour. It’s fascinating to think about just the scale of it, ’cause they would notice things that were weird and they would ask a lot of questions, once they got used to seeing something regularly… Honestly, the org chart is probably a good example: if I go to a company that doesn’t have an org chart, I’m like, “I don’t know how to function. Who do I talk to? What am I supposed to do? Who do you report to? I don’t know who to talk to.” So I would lose it.

0:49:48.2 MH: “Do you matter?” [chuckle]

0:49:50.2 EW: Yeah, do you matter? Exactly where do you fall on the Workday org chart? “Workday.” Anyway, I’m not gonna bash another company.

[chuckle]

0:49:58.8 MH: Yeah.

0:50:01.1 TW: Well, it’s fine. They’re not a sponsor.

[laughter]

0:50:05.9 MH: Okay, we do have to start to wrap up, but it was worth waiting for that bombshell ’cause that was outstanding, so thank you for that little last piece, Eric, ’cause that rang really true. Alright, we do have to start to wrap up. This has been such a great conversation, but one of the things we like to do is go around the horn and share something that we think might be of interest for our listeners. We call it our “last call”. So Eric, you’re a guest, do you have a last call you’d like to share?

0:50:32.6 EW: Yeah. I think product management is fascinating. Universities continue to want people to teach about this, and then they run into the fact that there’s no real agreement on how it’s defined. “What the hell is product? Like, how do I teach a course on this?” So I really admire newsletters, and people that are trying to at least take those ideas apart. Will Lawrence, who used to be at Meta (he’s moved on now), has a newsletter called Product Life. It’s on Substack.

0:51:08.6 EW: It’s very popular, and I think the reason is that it actually gives structure to thinking about product. Frameworks are all mostly wrong to some degree, but I think it really helps give a tangible idea of like, “Okay, what is product management, how do these different companies think about it? How do I actually be productive in this type of role?”

0:51:35.5 EW: I just find it really useful. I think it makes things tangible to a point where you can actually discuss them, instead of just getting lost in generalities and hand-wavy statements. I really admire it. I think it’s really hard to write in the first place, but he’s built up a really solid community of people, and I tend to learn through the things that he writes. I think that’s very useful.

0:52:03.4 MH: Outstanding. Thank you.

0:52:03.5 TW: Nice.

0:52:04.4 MH: Alright, Moe, what about you? Do you have a last call to share?

0:52:08.6 MK: I do, but I can’t vouch for it yet because it’s a recent discovery and I’ve only listened to part of one episode, but it’s actually got me really excited and I’m gonna be downloading a bunch of them to listen to on the plane. There is a…

0:52:25.1 TW: Poor form to be listening to that first part while you’re recording this one, but…

0:52:30.9 MK: Yeah, yeah, I’m that good at multi-tasking, Tim.

[chuckle]

0:52:36.9 MK: It’s called Making Sense of Martech, and it’s an irregular set of interviews. I think the reason that I got really excited about it is because it’s obviously at this intersection of marketing and technology, and I don’t actually feel like there’s a crazy amount of people talking about this space. It tends to be like a marketing podcast that does an episode on something to do with tech, or whatever the case is.

0:52:58.6 MK: So I started to have a listen to the episode about the privacy winter; there are some about the tech behind attribution and that sort of stuff. So that’s why I’m super pumped, and I will keep you updated on how I find the episodes.

0:53:10.5 MH: Nice, thank you. Alright, Tim, what about you? What’s your last call?

0:53:14.8 TW: So I’ll start, it’s in the very near term, but CXL Live is next week in Austin so I will be there for part of that. Michael will be in the vicinity.

0:53:28.8 MK: Oh it’s such a fun conference.

0:53:32.2 TW: If you’re heading to Austin for the F1 race at the Circuit of The Americas, it’s immediately following the F1 race, so I expect to see Daniel Ricciardo talking optimisation.

[chuckle]

0:53:46.4 MK: Are you presenting, Tim?

0:53:48.0 TW: I’m leading some discussion groups.

0:53:52.3 MK: It’s such a good conference, I love that one.

0:53:55.3 TW: So yeah, it’s short notice, but I’m pretty sure you can still get there. And if not, if you’re already planning to be there, then I’ll look forward to seeing you. Please stop by, say hi. I’ll have some podcast stickers, maybe other swag.

0:54:14.9 MH: You can’t get those just anywhere.

0:54:16.1 EW: F1 models.

[chuckle]

0:54:21.2 MH: That’s right.

0:54:23.6 TW: What about you, Michael?

0:54:27.4 MH: Recently, I ran across a blog post by a company called Count, and apparently they’re working on a new product. We were talking about data products and things like this, and it sort of was like, “Oh, that brought to mind this post I read.” They’ve been trying to figure out how to integrate decision-making and the UI of data reporting, and so they’ve come up with this canvas-type approach, which I described to my team internally as like Miro for data. Which some of my team was like, “Ugh, that sounds awful.”

0:54:55.0 MH: But it actually is kind of an interesting concept. I don’t know where it’ll go, and I don’t know where it actually can be useful, but I found it quite interesting. We’ll put a link to it on the website, so if you were interested in something like that. But it’s kind of cool how they’re trying to let companies have everyone participate in the discussion about the data and the decisions being made in a real-time environment or a collaborative environment, and I thought that was at least an interesting attempt to solve some of the problems we were talking about even on this episode. So anyway.

0:55:28.6 TW: It’s called Count?

0:55:29.3 MH: Yeah, it’s Count.co.

0:55:35.8 TW: And two episodes ago, I believe, if I recall right, your last call was Equals?

0:55:42.6 MH: Yeah, I think so…

0:55:44.1 TW: Are you fucking with us?

0:55:47.9 MH: No, I just pay attention to what’s going on in the industry.

0:55:50.3 TW: Okay, it’s gonna be some “dot CO” thing next time, I don’t know.

0:55:55.7 MH: Well, they’re not cool like us with “dot IO” but you know. It’s fine, it’s fine.

0:56:02.2 TW: Okay.

0:56:03.3 MH: Anyways, that’s my last call, and we’ll just put a little editorial lens on yours in the future too, Tim, how about that?

[chuckle]

0:56:09.8 TW: I’m just seeing a pattern. I’m gonna set up a side project to do a little text mining on that.

0:56:15.5 MH: That’s perfect. Alright, well, I’ll make sure to throw a couple of… There’s nothing I love more than disrupting a pattern, so challenge accepted.

[chuckle]

0:56:28.5 MH: Alright. Well, one thing that’s definitely been a pattern was how awesome this conversation has been, and so Eric, thank you so much for coming on the show and sharing some insights with us. Really appreciate it.

0:56:40.8 EW: Thanks so much for having me. I really enjoyed it.

0:56:42.9 MH: And as listeners, you’re probably wondering, “Where can I learn more? How can I participate in this conversation?” Well, as luck would have it, Eric has an excellent newsletter, which we did mention earlier; I’d highly recommend subscribing, which I have, and you can read a lot of his thoughts there. Are there other places where you’re active, Eric, like on social media? Or is that the best way for people?

0:57:08.7 EW: LinkedIn a little bit, but I’ve primarily shifted to just trying to write newsletters. I have more time to think through things.

0:57:14.2 MH: Excellent, so that’s the best place and we’ll put a link to that on the site as well. And if you wanna get in touch with us, we’d love to hear from you. You can reach out to us on the Measure Slack group or on Twitter or on our LinkedIn page. So if you’ve got questions, comments or anything like that, we’d love to hear from you. And of course, no show is complete without a massive thank you to Josh Crowhurst, our producer, for everything he does to make this show happen. So thank you, Josh.

0:57:44.0 MH: Alright. I know that as you’re listening to this, you’re gonna be thinking, “How am I gonna manage all of these things in my day-to-day and become a product manager as well as an analyst, as well as a data scientist, as well as a machine learning expert?” Well, there’s one thing I know I can say with confidence, for both of my co-hosts, Tim and Moe: as you figure out what your future job title should be, just remember one thing. Keep analysing.

[music]

0:58:15.5 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @AnalyticsHour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

0:58:34.7 Charles Barkley: Smart guys want to fit in, so they made up a term called “analytics”. Analytics don’t work.

0:58:38.7 Tom Hammerschmidt: Analytics. Oh my God, what the fuck does that even mean?

[music]

0:58:45.3 MH: This is the thing, the three layers, you can pick which way you wanna go, ’cause most people don’t remember how hard it was to use data in Tableau. Which, Tableau, I think, gives you one of the most beautiful…

0:59:01.1 MK: I’m still a deep Tableau lover. I don’t care how complicated it is, I still…

0:59:05.1 MH: Visually it is sort of at the top, I think, for a lot of people, but for many years you had to do so much to put your data in a shape where you could actually get it in there. And the companies that simplified that other end, the, “Here, we get your data in really easily,” never had the work done to get the visualisation piece super slick. And so you just…

0:59:27.7 TW: That’s why it’s…

[overlapping conversation]

0:59:31.4 TW: It’s so intuitive.

[laughter]

0:59:37.2 MK: What beverage are you drinking?

0:59:37.7 MH: This is an iced coffee. In a Mason jar, ’cause I’m…

0:59:44.6 MK: Cool. You’re hip.

0:59:49.1 MH: It’s just a…

0:59:52.2 TW: Vodka.

0:59:52.3 MK: Is it actually? No, I didn’t think so. You’re also trying to be cool.

0:59:57.3 MH: No, I just had leftover coffee.

1:00:00.0 TW: We’ve aged on this podcast. When we started recording it, we literally always had a drink.

1:00:03.5 MH: Yeah, we would always have a cool beer or whiskey or something, and now it’s like, “Well, some water is nice.”

1:00:12.4 TW: We’re too old for that. Back in… My 2017 self could handle that and work the next day.

1:00:19.0 MK: We had a couple of episodes where we had to shift the time because of Europe, which meant that I could drink, and that never ended well, because I would be on the show like a few glasses of wine deep after a big day of work, and it just… They had to keep me on the morning schedule. But mostly it’s me saying stupid stuff, let’s be real, because Tim knows when he’s hit the “record” button, so he always says his stupid shit beforehand.

1:00:47.7 MH: Tim is always trying to catch us saying something damaging.

1:00:52.4 TW: Alright, it’s usually you’re saying something damaging, and I’m like, “Crap, I need to be recording,” so I’ll hit “record” and hope you do it again.

[chuckle]

1:01:01.3 TW: I feel like you’re… You’re refuting something that I did not remotely say. Michael, on the other hand, said, “We don’t even get time to do this,” which is a complete load of crap, and we have an episode on that.

1:01:12.9 MK: But you have to make time.

1:01:17.8 MK: Yeah.

1:01:18.3 TW: Yeah, that I’m gonna react to separately, but I have no idea what…

1:01:19.6 MH: I didn’t notice myself saying it exactly like that, Tim. That’s fascinating…

1:01:24.6 TW: I’m just getting triggered by our owning versus helping…

[chuckle]

1:01:29.0 MH: Yeah, yeah, that’s fine. And remember, Tim, there’s a couple of differences as it pertains to you specifically, because as the quintessential analyst…

1:01:36.3 TW: ‘Cause I’m a jackass? Yeah.

1:01:38.0 MH: People listen to you when you walk into a room. I mean, you have this command and respect and authority that others of us in the industry just don’t possess.

[chuckle]

1:01:46.0 MH: And so that’s one of those things.

1:01:52.2 TW: You’re such an arsehole.

1:01:53.4 MH: Thank you, thank you.

[chuckle]

1:01:54.4 MH: Alright.

1:01:56.9 MK: Poor Eric is like, “What the fuck is… ”

[laughter]

1:02:07.4 EW: After the meeting, honestly, the stuff that we’re saying here, I wish this could translate to actual business meetings where people are like “Yeah, you generally are full of shit,” type of thing.

[laughter]

1:02:22.8 EW: And everyone knows it in the room.

1:02:24.4 MH: Yeah.

1:02:26.4 TW: “Rock, flag and hookers. Amazing.”

[laughter]

1:02:32.5 EW: Oh wow, doubled down on that one.

1:02:32.9 MK: You really did.

1:02:34.2 TW: I figured what the heck.
