#273: Data Products Are... Assets? Platforms? Warehouses? Infrastructure? Oh, Dear. with Eric Sandosham

Is it just us, or are data products becoming all the rage? Is Google Trends a data product that could help us answer that question? What actually IS a data product? And does it even matter that we have a good definition? If any of these questions seem like they have cut and dried answers, then this episode may just convince you that you haven’t thought about them hard enough! After all, what is more on-brand for a group of analysts than being thrown a question that seems simple only to dig in to realize that it is more complicated than it appears at first blush? On this episode, Eric Sandosham returned as a guest inspired by a Medium post he wrote a while back so we could all dive into the topic and see what we could figure out!

Articles and Other Interesting Items Mentioned in the Show


Episode Transcript

0:00:05.8 Announcer : Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

0:00:16.2 Tim Wilson: Hi, everyone. Welcome to the Analytics Power Hour. This is episode number 273, and I’m your host, Tim Wilson. You know, we occasionally get asked how we decide which hosts are going to be on any given episode, and our process for that has varied over the years. One of the considerations is ensuring that we’re keeping kind of a reasonably balanced rotation of different co-host combinations. There are various other considerations, more than you’d think, that sort of prevent us from having a strictly formulaic process; we’re not going to go into them. But for tracking the rotation of which co-hosts are on any given episode, I built a chart that visualizes the frequency of different combinations of co-hosts. It dynamically updates, but it’s all within this highly sophisticated platform called Google Sheets. It’s kind of a unique chart. It’s a derivation of what’s called an UpSet chart, which I’d never heard of until I was trying to figure out what visualization to use, but it turned out to be pretty useful. So does that mean that I built a data product?

0:01:22.4 Tim Wilson: I mean, it’s just one chart, but maybe I did. Maybe I’ll know one way or the other by the end of this episode, because the topic of this show is just that, data products. I’m joined by a couple of co-hosts. Val Kroll, what’s your initial reaction? Can a single chart be a data product?

0:01:41.2 Val Kroll: I cannot wait to get into this because old Val would have said, no, it’s just a chart. But I feel like we’re going to challenge that on this one. So maybe it is.

0:01:52.9 Tim Wilson: Ooh, the perfect ambiguous, ambivalent answer. And we’re also joined by Moe Kiss from Canva. Moe, what say you? Can a single chart be a data product?

0:02:03.5 Moe Kiss: Well, also, I’ve been having this exact discussion with my team over the last week. So I don’t think I have a strong opinion yet, but let’s see where I land.

0:02:13.7 Tim Wilson: All right. Well, by the end of this show, I’m expecting us all to have a definitive answer one way or the other. Otherwise, we really whiffed on our guest. But I know we didn’t, because we have had Eric Sandosham, founder and partner at Red & White Consulting Partners, on the show before. Way back on episode number 254, he was on for a lively discussion about benchmarks. And it turns out that it was Moe, Val, and me as the co-hosts for that episode, too. In addition to his consulting work, Eric is on the adjunct faculty at Nanyang Technological University, Singapore Management University, and the Wealth Management Institute. We’re actually catching him right in the middle of a two-day session with Singapore Management University. So he was talking a lot yesterday and he’ll be talking a lot tomorrow. He was previously the Customer Intelligence Practice Lead for North Asia for SAS. And before that, he was the managing director and head of decision management at the Asia Pacific Consumer Bank at Citibank Singapore. And he’s now well into year two of publishing a weekly article on Medium about data and analytics. I look forward to reading each installment when it lands in my inbox every Saturday.

0:03:27.4 Tim Wilson: But one of those recent articles was about, shocker, data products. We’re excited to have him back on the show. So welcome, Eric.

0:03:35.7 Eric Sandosham: Thank you. Thank you for having me back. Yeah, it was a real pleasure.

0:03:40.4 Tim Wilson: We’ll also say that it is early morning for Eric as we’re recording this. He said that was fine, but I’m wondering if that was one of those things that sounded fine when we were scheduling it, and now he’s second-guessing that decision.

0:03:54.7 Eric Sandosham: Yeah.

0:03:55.9 Val Kroll: He’ll also be here for some hot takes.

0:03:56.9 Tim Wilson: Yeah. So Eric, in that Medium post that I just referenced, you started off by trying to put together kind of a reasonably tight and clear definition of what a data product is. We already covered that maybe we’re not all that clear. So maybe we can start there. Like, where you landed, maybe what you found problematic in some of the other definitions that you found.

0:04:22.0 Eric Sandosham: The article actually started in a casual conversation with an in-house data lake person. And even before, even during my time in Citi, and from an infrastructure perspective, they would say, how can we up our game? How can we provide more value, more utility? Can we create products? And I suppose it also fits into thinking about this space in IT and data warehousing. If you think about the role in Amazon, for example, in AWS, they call themselves engineers. So they are making stuff as opposed to my time when I grew up, IT was IT. We didn’t refer to them as engineers. And so nothing wrong with that. So it got me thinking, do you really have products if you’re part of the infrastructure or your role is sort of managing data and all of that? And even then, if you flip it to the front end, if you’re a data scientist, are you truly making products or doing analysis? I mean, if you’re building a predictive model, is that a product? And then, it sort of got me into a little bit of a rabbit hole. And I realized at the very core of it is this lack of clarity between what would be seen as a product versus a feature, even in the physical world, actually.

0:05:47.9 Eric Sandosham: Maybe less so in the physical world because you can touch and feel something. It’s a chair. But if I took the chair and I had to replace the legs, for example, and it’s not bespoke, right? And it’s sort of part of a modularized kind of chair. Would that be a product? So like a chair leg that could be swapped out and swapped in, would that be a product? I think most people would say the entire chair should be seen as the product, and maybe those are sort of components to it, right? But once you get into a virtual digital space, where do you draw the line between something that’s physically a whole versus a component, right? Because you can disambiguate almost everything at an atomic level. So then I realized, oh, actually, maybe that is the central challenge in the digital space and the virtual space, because you don’t have this clear distinction between where the product ends and where the component begins. So I sort of just took a crack at it myself from a data practitioner perspective. Could I put forward perhaps a much clearer definition? When I scoured the internet, I found two opposing kinds of very extreme ends, which I sort of highlighted in the article.

0:07:09.8 Eric Sandosham: One being that if I’m creating some sort of mucked up meta tag table, right? So if it’s a… And for lack of a better term, let’s call it, let’s say, a database, a form of database that’s referenced and all of that, it’s a data product because someone can use it and it’s curated, it’s put together, it’s organized and all of that. Great. And then on the other extreme, what someone would also define as a data product is a tool or an application that’s processing the data, right? So if I throw some data into it and it spits out some kind of output, and both are fundamentally different. One is a processor, one is an ingredient in a sense, right? Of a database. And there was no hard and fast rule to say which should or should not be a data product. So I was eventually quite fascinated and say, why would, you know, why even this term, why is there even a need for this term? Why can’t we just say it’s a data processor, it’s a database, right? Yeah.

0:08:20.9 Tim Wilson: Well, if you say data processors, then you wind up in like European privacy regulations, right? Because there’s…

0:08:30.0 Eric Sandosham: I took a stab at it and said, well, if we define a data product, maybe it had to have sort of these attributes, at least for me. One, it uses data assets. And data assets, in my view, are fundamentally different from just data, because there’s a sort of intentional curatedness to them. It’s curated data, it’s identifiable, and then it’s reusable. And it has a specific utility, this data product, of solving or reducing uncertainty for decision making, right? So if it’s just there and it doesn’t do anything to fix someone’s decision making, then to me, I don’t call it a data product. But in making a definition, and because of this discussion, I went back to reread the article and all of that. There’s sort of a lot in that definition, because it suggests intention and purpose, as opposed to, oh, I made some data and I threw it there, let’s see who can help themselves to it, right? Versus saying, I built this for a very specific utility, for a specific target audience, to solve a specific decision-making uncertainty challenge. And to me, perhaps then that would make it, say, a data product.

0:09:49.4 Announcer : Imagine this, your data, instantly, wherever you need it, like having cold brew coffee on tap.

0:09:58.1 Tim Wilson: Ah, yes, data on tap, giving you that little pick-me-up, that’s the dream. But in reality, it’s more like data on hold, that is, until now. Fivetran, already the kings of automated data integration, they just acquired Census. Yeah, that Census, the reverse ETL pros.

0:10:18.4 Announcer : Which means now it’s all connected. Source systems? Check. Data lake? Check. Apps like CRM and marketing tools? Check.

0:10:26.8 Tim Wilson: It’s like your data finally turned 16 and got its driver’s license.

0:10:32.0 Announcer : And stopped asking you for rides everywhere. With Fivetran and Census, your data just works. Everywhere, more insights, less waiting, zero headaches.

0:10:43.2 Tim Wilson: Fivetran, the only pipeline you’ll actually want to maintain. Go check out their site with interactive demos and more at fivetran.com/aph. Once again, that is F-I-V-E-T-R-A-N.com/aph.

0:11:00.5 Moe Kiss: Okay, so I’m going to go back to an idea that you mentioned kind of at the start. But Val started cackling with delight. And I feel like I need to fully flesh it out so I can understand it. So you said that sometimes it’s hard to know where the line is between the data product and the feature, right? And I love that chair example, because I feel like that is such a tactile thing that I can understand. Like, the chair is the product, the leg is the feature. But can you give an example of where that line starts to blur and it’s hard to know where the feature and the product start and end?

0:11:38.1 Eric Sandosham: My history is all in banking, financial services. So let’s take, and you know, in banking, if you dare to say financial products, the products are not physically real. What’s a checking account? You never see it, you can’t touch and feel it, right? So most people would say, well, a checking or savings account is a financial services product. But in that checking account, you could do remittances, right? Now, in pre-digitalization days, your remittance would, say, just be a feature of your checking account because you would have to call up, put out to the branch and wire money, right? But then we introduced digital banking, mobile banking. Now, mobile banking as a platform or application could arguably be said as a product. And within that digital banking product, you have the checking product that is enabled as a front end to it. But then within digital banking or mobile banking, you could do direct remittances and set it up, which sort of disambiguated from the checking account. Obviously, it has to debit the funds from the checking account, but it could be argued as a standalone capability.

0:12:58.6 Eric Sandosham: Now, in that context of a mobile bank, most people would say remittance, the remittance ability, would be seen as a feature or functionality. Because you wouldn’t have a remittance product manager in a bank, you would have a digital banking product manager, a mobile banking product manager that says, I want to now create more utility, more functionality for my tool or application. Remittances would be great. And it could reimagine the space of remittances in a way that would have been different from how it would have been done in the physical past, right? When you get into that space, so is remittance now a product? Technically, you could make that argument because it could stand alone with its own capability, taken out of the mobile banking now and perhaps monetized and white-labeled.

0:13:57.7 Moe Kiss: Wow. Definitely more complicated.

0:13:57.7 Val Kroll: I love these examples and thinking about exactly what you’re talking about as well. I was like, I can’t believe we’re already getting into it this early in the episode. But in a previous role, I supported a digital account opening team at two different financial services organizations. They were called the DAO team, that was the shorthand. And it was not even aligned to products. So it wasn’t just checking, wasn’t just savings, it also was credit card and doing those soft checks. And so it was the functionality of just the customer’s interaction with the site to even get you into talking about some of those. But that was one of their largest digital product teams. So even thinking about how it cuts sideways now and it’s not just these nice little categories, and thinking about what is the value, the business value and the utility, of digital account opening separate from the features and the products that they’re signing up for themselves. So I love all the layers and complexity.

0:15:02.3 Tim Wilson: Well, I mean, not to head down a side path, because we’ve kind of shifted to talking about digital products. I’ll throw in the example of like when checkout gets treated as it’s a product, like there’ll be a product team that is checkout when they’re buying products. I mean, I was super naive years ago when I had no idea that there were like digital products and the components of an e-commerce experience were considered products. I got into a fairly heated argument with somebody else who was completely right. And I thought that was the dumbest thing ever. So I have admitted that to him since.

0:15:40.7 Val Kroll: More on that later.

0:15:43.0 Tim Wilson: So, I mean, anyway, if Ryan Garner is listening, I will once again say, yep, I was way in the wrong on that. But there’s data products and there’s digital products. And arguably, even that starts to be this kind of confusing, like are we using digital products and trying to get definition around those as a way to get clarity on the boundaries around what a data product is? Is that where we’re heading?

0:16:14.0 Eric Sandosham: I think there’s some similarity because both are sort of unreal in the sense of non-physical, right? And they are sort of understood in the utility. The digital, you can only understand this checkout process because it serves a particular utility or functionality. Same like the data.

0:16:38.7 Tim Wilson: So it’s like obscenity? It’s like obscenity? We’ll know it when we see it. No, sorry. I’m not adding value here, but I’m like, is it a product? We’ll know it when we see it.

0:16:48.0 Val Kroll: Okay, so I have a way to help us get back on track, I think, Tim. So back to the Medium article, because this is kind of the connection point between the two. When I was first reading it, you had used an example, and if you want to elaborate on it more, I’m sure you’ll do a better job than I will, about the credit bureau. And I was like, okay, I’m following. And you were talking about the different API calls and that the API itself was a data product. I’m like, yep, I’m following, I’m following. And then you talked about how it should be obvious that the credit score itself was a data product. And I was like, hold the phone. Hold the phone. Like a single number? And like my brain wrapped around it, and I probably just sat there for 10 minutes processing that one. But I was like, yeah. And I went back to your definition, that it’s an asset, that it’s curated data, that it has utility. I’m like, yeah, it works. Anyways, if you want to give our listeners a little bit more of that example, I think that that’s a really strong connection. And I found it to be an amazing kind of way to use that.

0:17:48.7 Eric Sandosham: So I think most people know what credit bureaus are, right? And at the heart of it, it’s similar to the earlier definition of a data product. It’s a database, right? But in this case, it’s a database designed for a very specific target audience for a very specific target functionality and all of that, right? And governed and structured.

0:18:10.8 Val Kroll: To make a decision, right?

0:18:12.4 Eric Sandosham: It doesn’t exist for you to sort of go and just sort of play around the data. And even in the master data bureau, you could only withdraw certain types or query certain types of information that’s very much, again, aligned to the entire lifecycle of how you would do underwriting and all. So it’s not free for all. It’s very specific, very curated. And in many cases, even transformed information. So it’s not the raw form of the information, but they may give you counts like the number of times you’ve been delinquent, your leverage ratios, and those sort of things. But they are sort of post-computed in the credit bureau. Now, one, I think, can make an argument that the entire bureau, it’s a data product because it has a P&L, it has a business model to it, they sell, and they upgraded their versions and all of that.

0:19:11.0 Eric Sandosham: But then there are things within the bureau which could then also be thought of as products. So one of the examples I suggested: if I had the bureau as a standalone itself, in the days before there were APIs, the way you would have done it, you would have to send in a batch request. And they could do a single pull, or you could batch hundreds of thousands, and you would then wait for the next day or a couple of days later and they would send it back, this batch query. And that was just a way of passing information back and forth. But now you could sort of do it in real time. And that API, with the database structures built intentionally for a specific use case and target audience, I felt qualified in my view as a data product. It is sort of coherent as a single value proposition to give me quick and fast access to information that I could use for underwriting. Now, the bureau over time, in many cases, started to offer additional services. Now, whether we call those services products or not becomes the dicey piece.

0:20:31.5 Eric Sandosham: So they could offer a report on market share, for example. That report could be something you just log in, view it, and then print it out as a PDF or whatever, or even data. Is a market share report from the credit bureau a data product? In my view, I tend to lean maybe not. And then we can argue why. But then there’s also, on the other side, the data processing definition. The bureau now produces a credit score. And is this credit score a data product? I look at it, not as the output of the score itself, but sort of the entire process of solving a specific problem. So the score required some engineering work, obviously, to figure out whether the data was good, valuable, and those sorts of things. And it had to be created in a form that could then be immediately used in the decisioning lifecycle of the underwriter. So it’s, in fact, almost fully thought out. You didn’t have to take the score and say, hey, can I pick out a subcomponent, or do I have to do extra lifting and shifting just to make it work for me? It became really a template, in a sense.

0:22:05.5 Eric Sandosham: And because of that templatized approach from start to end, I thought it was a product, because now it could be monetized. It’s an asset, right? The underlying data, obviously, is an asset. But this score, which is now a transformed abstraction of those raw data, has a different kind of asset value, and even arguably a higher value than the original data that it was built upon, right? And you could just lift and shift it, reuse it in different ways. So when I thought about that, I could say, yeah, it’s almost, you could put your arms around it because in a virtual sense, it becomes real.

0:22:54.6 Val Kroll: That’s a really good example.

0:22:56.5 Moe Kiss: Okay, I want to take us down a completely different path. One of the challenges I found, so I’ve been banging on and thinking about data products for a really long time, and there’s actually someone new on my team, and she shared something with me the other day, and it had this, like, this thinking about data products. And I was like, oh, have you been, like, reading some of my old decks? And she’s like, no. And I was like, oh, cool. Then we’re just, like, super aligned. That’s amazing. But one of the things that I’ve kind of experienced as, like, a bit of a tension, and I’m curious to hear your thoughts. The reality is that all products now have such a big component of data that it’s really hard to know where the line is of, like, what is a data product that data folks own and build, and what is an internal product that is a big user, relier of data that should go to a product team? Because I just feel like it’s pretty murky, because especially when you work in marketing, we have these huge data sets. They’re used for our decision making.

0:24:00.3 Moe Kiss: And so naturally, anything that our internal product team build to serve marketers is going to have this big data component. And it’s like, I sometimes feel like there is this, like, push and pull of, like, should it sit with a product manager and software engineers? And, like, do you embed a data person? Or is it actually a data product that a data team should build and own? And I just don’t know where the line is.

0:24:26.3 Eric Sandosham: Let’s think about food, right? So let’s say that jar of peanut butter on the grocery shelf, right? Obviously, once it’s made into peanut butter, it’s quite clear it’s a product. You go, you pick it up, and you check it out. But behind the scenes, the manufacturer of that peanut butter buys ingredients, buys chemicals, buys stuff. And those companies that sell to the peanut butter maker, in their view, they’re selling products. They’re selling ingredients. They’re selling chemicals, right? And maybe the way we think about products is who’s the buyer at the end? Because if we think about the maker, so like in this case, let’s say the peanut butter maker, and you say, I grow my own peanuts in a field, I harvest them, and then I turn it into peanut butter. It’s organic, right? From farm to table. Does that manufacturer think of the peanut as a product? Maybe not, because they own the entire thing. In their view, they only have one product, which is making that peanut butter, because that’s the thing that I eventually monetize. But if I had to buy it from a supplier, then the supplier is going to think of the ingredients as a product, because I make money off it, and I have to figure a way to sell it at scale, and I have to figure a way to templatize that whole process, right?

0:25:58.8 Eric Sandosham: And so if we are sort of thinking in the same data space, and we are now sort of buying or ingesting someone else’s data, then that data could be seen as a product that that supplier provides to our process. But if we are internally managing our data lifecycle, should we think of that data as a product? Because we don’t have a P&L attributable directly to it.

0:26:29.6 Tim Wilson: It’s clear as mud. I mean, I feel like there’s been another layer introduced there. And Moe, you kind of, I think where you were heading, and it goes back to what’s a digital product? I’m using some online, say, presentation creation software, and there is data that has to be curated to feed into that to help me use that, to guide me towards how to use the tool better, to recommend what templates I should look at, or whatever. So there’s data flowing into literally unambiguously the product, the digital product that the customer is working with. On the other end, there are data products that are, this is a data asset that is being curated and used to deliver to inform decision-making, like a dashboard to help somebody decide, or I guess even the credit score. The data is leading to a decision. And then Moe, I feel like you brought in sort of something that falls in between, that a team is using it, and they’re saying, we’re doing personalized offerings to prospects or customers to cross-sell them, and that is bringing in data that’s being curated and modeled, a recommendation engine, but it’s being used by an internal team, so maybe a recommendation engine isn’t the right case. Part of this feels like, oh my God, is this so academic that it doesn’t matter?

0:28:20.9 Val Kroll: What’s the value of calling something a data product? I think we should touch upon and talk about that too. Or even Moe, like the example of if it lived within the data team, what would that mean different from if it went to a product team? Like if it went to a product team, does that mean that a PM is like looking over the roadmap and thinking of like feature enhancements and reduction of tech debt and thinking about extensibility to other teams? And if it was a data product, is it about enriching it and improving it and narrowing the confidence intervals or whatever the goal of that product is like? And I’m obviously using some broad brushes here, but let’s talk about the value of why the name could be helpful or maybe harmful if that’s the opinion, but like what’s different from the other side?

0:29:07.9 Moe Kiss: Okay. The thing that I have observed that I love about this whole concept of data products, I think it really shifts the role of data folks from being like a service function and tickets in a Jira queue to really elevating our work and forcing data folks to think more deeply about how they’re spending their time. And so what I mean by that is when you’re building something that you want to call a “data product”, you’re normally stealing some of those things from the product folks. You’re doing like, what is the business problem we’re trying to solve? Who are our customers? How will we know if we’ve been successful? You’re really leaning on those, just like that product way of thinking. And I think that’s a really strong thing because you also give thought to like, how do I automate this? What is the type of debt we’re going to incur? What’s the maintenance for this like and the effort we need to put behind that? And I think that’s such a good way for data folks to think about their work. And also just as someone who is going through this at the moment, I know, Tim. I know, Tim. You’re really smart. It’s in your book. I bought three of them the other day and gave them to the team. Wait, let me finish. Let me finish. Let me finish.

0:30:24.9 Tim Wilson: No. I’ll let you finish, but I have a counterpoint.

0:30:28.0 Moe Kiss: That’s what I’m saying. But the thing that I find uniquely different as well is I think you have a more sophisticated conversation with your stakeholders because you can be like, here are the “data products” or whatever you want to call it, big rocks, whatever, that we’re focused on for you. And I feel like you can make trade-offs more easily than that like service function of like, here is just an endless queue of tickets and a dashboard is another one in the line. And I don’t know. All right. I’m going to stop.

0:31:05.6 Tim Wilson: So let me provide my counterpoint and then I’ll let Eric in, because the thing that terrifies me, as you were talking, and I realize I get triggered at points, is where it starts to treat the goal as being to build data products. And there are data scientists out there who say, we build data products. And I think you kind of perfectly set it up when you said, oh, a Jira ticket to give me another dashboard, which we definitely want to get out of, the ticketing queue for building more things. But what I think is really problematic when we put data products completely at the center is that it treats it as though that is the be-all. And if we just build good data products, and there are definitely people out there who live in that, where is the value of, I actually need a real, I have a hypothesis that needs to be validated one time, and I needed the right data scientist or the right analyst to help me validate this hypothesis? And that feels like a serious danger of people saying, well, which data product solves that? Which is where I think dashboard bloat comes from: people keep saying, if I have a question, you need to point me to the technology that answers it. And that’s missing a part of the role.

0:32:34.9 Moe Kiss: Okay. This is just my early, like…

0:32:36.7 Tim Wilson: How long until we let Eric weigh in?

0:32:37.9 Moe Kiss: I know, I know. We’ll let Eric weigh in, I promise. This is my very early thinking. But what I have found is that when you try and get the same person to do all of those things, they do none of them well. And so for me, when I talk about like data products, it’s about how do you have a team who are really good at that, or a person, or whatever it is, depending on your company. And in some companies, it might be part of a person, right? So how do you allocate a percentage of your time to building those solutions? And then how do you have a discrete bucket for exactly what you’re talking about? I think the challenge is that we try and get “data scientists” to do all of them. And they do. They end up with a ticket system that is just endless and you can’t trade those things off. And so I feel like you really need to put the, like, this is the build work we’re doing, the longer term stuff that fuels that. And then I actually think the hypothesis generation and the “insights work” is like, that is a different thing. And you might be using data products to answer those. Sorry, done. All right.

0:33:41.8 Val Kroll: Over to you, Eric. Can you settle this?

0:33:47.5 Tim Wilson: He’s like, you’re all wrong.

0:33:50.1 Val Kroll: Thank God I’m here.

0:33:57.6 Eric Sandosham: So I think it’s a great perspective and question, right? So I will use this term data and data scientists interchangeably, right? So it depends on how we want to view those roles. But regardless of how you want to label the role, actually there are sort of two large hemispheres of work that any analyst or data scientist engages in. So there is a sort of problem framing that ends with a diagnostic output, you discovered something, right? And you’re messing around with data, you’re trying to figure out, sense-make and all of that. And then there’s the solutioning piece where it starts from that output of the diagnostic and says, now I have to figure a way to extract value or integrate into a decision framework in the organization. In my view, I think when we talk about data product, it really exists in the solutioning hemisphere rather than in the problem articulation or problem framing diagnostic hemisphere, right? And I do take Tim’s concern to say that if I think of my job as solutioning, then actually it misses a big chunk of the work of a data scientist and analyst where the solution only comes because I’ve done the diagnostic and problem articulation piece.

0:35:13.1 Eric Sandosham: And I think in many different discussions, even on this podcast, the challenge really is there’s a gap there, right? How do people sort of do that kind of work well? And the focus, therefore, if you say a data scientist is now ultimately driven with KPIs on a data product, they are in some sense polishing the apple before figuring out how to grow it. That may be a real concern, right? But there is also something to be said that polishing that apple as a way to say, give attention to that last mile, how do you not just say, well, you need a dashboard, I pipe in a little bit of data, the charts look sufficiently okay, and then just get it out. Versus thinking, well, can this dashboard be sort of generalized into a different kind of utility, right? Can I learn something about the way I piped in that data and if that data can be reused as an asset for something else, right? I think you begin to now think like a product manager because you’re trying to think of internal monetization, right? Because if you had to sell the company tomorrow, how would you price the stuff that you’re doing, right?

0:36:28.4 Eric Sandosham: And should it or should it not be an asset? And I think there’s also value in thinking like that, sort of the engineering hat that they wear in the solutioning hemisphere. Yeah.

0:36:41.9 Tim Wilson: So it seems like there’s a gating point. It seems like it’s a very real concern that our business partners will go, just give me the polished apple. They jump to the solution, and that’s where having some clarity of thinking in the problem identification comes in, to figure out: is this something served by an enhancement to an existing product? And hopefully, those are people who have lived in the world of dashboard and report bloat. Is it a new data product, which means is there clarity on the decisions that it will support, and is that the framing under which that product will be built? Or is this something that is way, way upstream? We need to design an experiment to do this. We need to just go commission some research. We need to have a data scientist go and build a model to validate this hypothesis. And be able to sort of push back, if somebody came and said, can you add X to the dashboard, that that’s not what’s needed. A data product doesn’t play in that here-and-now solution.

0:38:04.9 Eric Sandosham: I suppose for me, the trigger point is really just the word product, because I think all the stuff that everyone’s saying is completely valid. How do I make sure that I get the last mile and do it well, make sure it’s integrated and make sure it’s reusable, it’s scalable and all those stuff, right? Thinking I’m not just here to produce a piece of report for an end user. But the minute we use the term product, we want to take on a whole universe of product management vocabulary and apply it, right? Because I could say, look, a data scientist role is also to try and think of creating solution assets. If I didn’t want to use that term, right? It’s a data-driven solution. It’s a data-driven asset. Why the choice of the word product? Because in my view, if you come back to the standard literature, then product has a very strong P&L connotation to it. And the P&L is not just by internal monetization. We don’t call the help desk a product, even though obviously they produce or they generate a lot of internal value to the organization and assign a value in that case. But within the organization, they would have organized themselves particularly to what is the face off to our market? And that will define product, right?

0:39:37.4 Eric Sandosham: And so, should products only be reserved for external-facing part of the business because there’s a real external P&L that could be generated from it? Versus looking inward and thinking, you could always formulate some internal P&L, you can, right? For every function, you could do that but is it helpful?

0:40:02.2 Val Kroll: I think about, I was on a platform team internally that supported, well, obviously a platform team is, you know, different pieces. There was a part of that team that created email templates that marketers leveraged and used. And so it was all about like repeatable, scalable things that had utility to help teams do their work faster, more efficiently, whatever. And so it was really easy for us at the time, and now I’m questioning some of the way we used the term and the product, but to draw a boundary around those discrete pieces, where we knew when it was clear that we had to define, design, develop, deploy, and then that whole adoption phase, and then we handed it off. But I think the, like, the thought of platform teams not working on something that’s considered a product, I think is hard. I understand what you’re saying, that if you were to set up a P&L for every single team internally, but I think that even if you were to take some of the core components of your definition and get into that, like, just digital product space, it is kind of, you know, you can atomize it in different ways, like to your point.

0:41:07.7 Val Kroll: But I do think that there’s like a way to draw a boundary on it that is helpful vernacular to talk about those specific pieces, to know when you’re done and you can pivot to the next thing and which one are assigned to those big rocks. ‘Cause otherwise it just turns into, like, it seems like it would be reductive to call it a project, right? Because it does have repeatability and utility. I don’t know. I’m kind of running around in circles at this point, but I think there is value in naming it because of the way that people will treat it. Not just what… Because I agree with you, Moe, not just because of that upfront, but because of the maintenance that that will get and the attention to make sure that it’s supporting the business processes and the decisions that those are being made. In the same way that you would maintain a model similarly. So anyways, I’m on team give it a name when it deserves it. That’s my camp.

0:42:03.8 Moe Kiss: Val, we actually do have platform PMs for that specific reason. We’ve started to delineate because the skillset, especially for like an internal platform PM is much more technical, I would say. Well, not always, but you’re often building internal products that speak to different systems. And like, there’s a complexity there that requires a slightly different skillset than an external facing product.

0:42:29.0 Tim Wilson: But is the platform PMs, is their kind of key stakeholder the product PMs?

0:42:38.6 Moe Kiss: It’s the wider business in general.

0:42:41.2 Val Kroll: Yeah, the decision makers of the business, right? Most commonly, or at least in my experience.

0:42:47.7 Moe Kiss: Well, it’s building product and infrastructure that serves the rest of the business. So it enables them to do their jobs, right? And we do actually have a data platform team that sits in there as well. Which is why I’m really curious about Eric’s perspective on data warehouses not being a product.

0:43:06.3 Eric Sandosham: I tend to think of data warehouses as an infrastructure capability, right? Because it’s…

0:43:11.3 Tim Wilson: I mean, that feels like a platform, right?

0:43:14.0 Eric Sandosham: Yeah, it’s not a singular purpose. It’s sort of, you could sort of bend it, twist it, curate it to serve multiple solutioning.

0:43:30.4 Tim Wilson: If you tried to apply the P&L and try to specify P&L on a data warehouse, like, oh, God forbid, right? I mean, that’s like, that would be an absolute nightmare. Like there’s, it’s like, I don’t want to say treat the data warehouse like the help desk, but kind of in the context of this discussion, maybe that’s a step too far. But to me, platform there, it feels very clearly different that a platform or infrastructure is different from product. I guess maybe that was my misguided question of saying, why isn’t the platform there to support products? And the products are there to support the business. So it’s messy and it’s complicated. It’s why an architect, they’re a more senior person because they got to be kind of saying, how do we put something together here that is going to be able to effectively serve a bunch of different needs coming at it from different directions?

0:44:41.8 Val Kroll: I think we’ve hit our all-time high for amount of analogies and metaphors on this episode. Credit scores, peanut butter. I love it.

0:44:53.4 Tim Wilson: We got an apple, we got a chair. So maybe let’s play a little game, with most of the credit to Val on this, although I think Moe and I have been contributing: we’ll play the data product or not a data product game. And this will be a four-hour episode because we have like eight things to run through here. We’ll see. I mean, it’ll be interesting, given this discussion, whether there’s a quick consensus. So we’ll just maybe run through these. So let’s say we have an internally built A/B testing sample size calculator. Is that a data product? How do we vote? I say no.

0:45:39.3 Moe Kiss: Okay. We’ve got to have a system here.

0:45:40.2 Val Kroll: Yeah. What’s the game?

0:45:41.2 Moe Kiss: So is Eric going first or are we doing thumbs up and Tim’s going to report on the thumbs up or thumbs down after?

0:45:48.1 Tim Wilson: Yeah, let’s do it all visually. Let’s just put it in the chat and then we’ll all kind of chuckle and not share. We could put Eric on the spot for every…

0:45:58.6 Eric Sandosham: We could just all show a thumbs up, thumbs down simultaneously, right?

0:46:02.2 Tim Wilson: This is an audio format. Let’s not do that.

0:46:04.9 Eric Sandosham: Oh, that’s true. That’s true.

0:46:07.1 Tim Wilson: It’s like Michael Helbling is here. I will take it. I will take a strong stand, I will take a stand that I think it is not. I’ll take a stand on every one and then you guys can shoot me down.

0:46:19.7 Moe Kiss: I think it is.

0:46:20.9 Val Kroll: I think it is.

0:46:22.9 Tim Wilson: Damn it.

0:46:23.7 Eric Sandosham: Yeah, I think it’s not.

0:46:27.4 Val Kroll: Oh, split. You got the heavyweight on your side too. Okay, so Eric, why is it not?

0:46:37.0 Eric Sandosham: I think it’s a data solution for sure. And with an API, if you build it into a wrapped-around template, maybe even an application. But to say this has reusability, generalizability, I’m not sure, because it’s very, it’s bespoke. It’s for a specific type of work. And the data is also very specific to that kind of use case. Would it not be?

0:47:07.5 Moe Kiss: Okay, but what about if you have 150 data scientists and you’re running a thousand experiments a week? Would it then meet the criteria? And all those 150 are using it.

0:47:18.6 Eric Sandosham: Okay, so if you say we built an A/B testing, again, we get into words and semantics, A/B testing platform, right? That is like a recommendation engine, an A/B testing engine, where a thousand data scientists can now use, reuse the same kind of functionality and capability. Then yeah, perhaps, because then in itself, I could even think about monetizing this commercially outside of the organization.

0:47:47.6 Tim Wilson: But I think it’s not a data, there’s no… This is you’re entering some numbers or maybe it’s pulling them, but you’re entering numbers and it’s doing math. Like it’s a tool for the data scientists, but it’s not..

0:48:04.6 Val Kroll: Yeah, so can I give you my use case? The one that we built when I was on this platform team, because this was one that I had submitted. It was built for marketers to figure out, is the list size that I’m about to send an email to big enough to deliver it as a 10/10/80, where you send it to 20% and then the winner gets the 80%, you know, 24 or 48 hours later? Or does my sample size not support it, for like the type of change or my baseline open rate? It was 10 years ago, but it was for marketers to make a decision about how they could be exploring their hypotheses. So that was the…

0:48:39.8 Tim Wilson: So there’s a distinction. I was thinking of the ones that I built where you literally had to enter multiple values and it would calculate it, which I would say is not. It’s just a calculator. It’s no different from an HP 12C that just has an interface built to do some simple math. If the calculator is saying…

0:49:01.9 Moe Kiss: Is a calculator not a product?

0:49:03.6 Tim Wilson: I don’t think that’s a data product. No, it’s just a tool. Right? Because there’s not a data asset in the calculator. That’s the, I am entering some numbers and doing math.

0:49:16.8 Moe Kiss: Did you need specialist data expertise or knowledge of statistics to build it?

0:49:20.8 Tim Wilson: To build it, yes. But I also don’t know how to build a calculator that does summation. But I would put it differently if, Val, what you were describing was somebody picking and saying, these are my criteria, and then it was saying, here’s your list size. Maybe it’s prompting you with a question of what kind of lift do you want, and it’s hooking into an underlying data asset. Like to me, I’m hinging on there has to be an underlying data asset, not just code.

0:49:47.8 Val Kroll: Yeah, not just like, oh, I’m dealing with 10,000. It was saying like, if my audiences have these interests, what type of population size are we talking about? And this is the lift expectation. Yeah. Okay, so that’s a good distinction. Okay, so this rapid fire game.

0:50:02.7 Moe Kiss: Not sold.

0:50:03.9 Eric Sandosham: I like what Tim talked about. I mean, if that calculator, let’s say, comes with an underlying database that’s built as part of the feature of that calculator, then perhaps we can say it’s a data product, right? But today, if with that calculator I bring data to it, but it actually doesn’t come coupled specifically with a set of data assets, then really it’s a tool, it’s a calculator and not a data product, because there’s no underlying data to it, right?
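
As an aside, here is a minimal sketch of the kind of pure-math calculator being described, assuming a standard two-proportion z-test and the 10/10/80 email split Val mentions; the function names and numbers are illustrative, not from the episode.

    from statistics import NormalDist

    def sample_size_per_variant(baseline_rate, lift, alpha=0.05, power=0.80):
        # Approximate recipients needed per test cell to detect an absolute
        # `lift` over `baseline_rate` with a two-sided two-proportion z-test.
        p1, p2 = baseline_rate, baseline_rate + lift
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
        z_beta = NormalDist().inv_cdf(power)
        pooled = (p1 + p2) / 2
        n = ((z_alpha * (2 * pooled * (1 - pooled)) ** 0.5
              + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
             / (p2 - p1) ** 2)
        return int(n) + 1

    def supports_10_10_80(list_size, baseline_rate, lift):
        # A 10/10/80 split works only if each 10% test cell meets the required size.
        return 0.10 * list_size >= sample_size_per_variant(baseline_rate, lift)

    # Hypothetical numbers: a 100,000-person list, 20% baseline open rate,
    # looking for a 2-point absolute lift.
    print(sample_size_per_variant(0.20, 0.02))
    print(supports_10_10_80(100_000, 0.20, 0.02))

Everything here is arithmetic on numbers the user supplies; the same logic hooked up to a governed audience or list-size data asset starts to look more like the data product the group lands on.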

0:50:31.4 Val Kroll: Good one.

0:50:32.6 Tim Wilson: Moe, you convinced? Has Moe…

0:50:36.9 Val Kroll: All right, you give the next one. You throw the next one out, Moe, from the list.

0:50:40.6 Tim Wilson: Which one do you want? You can pick one and then you can declare what you think it is.

0:50:45.5 Moe Kiss: An interactive dashboard containing campaign performance, including targets.

0:50:50.5 Tim Wilson: What do you think? I declared what I thought it was. We still haven’t specified the rules of this game, I realize.

0:50:58.5 Val Kroll: We’re just going with it.

0:50:59.2 Tim Wilson: That has to be a data product.

0:51:00.9 Val Kroll: I think it is. I say yes. If targets are included. If targets are not included, I think there’s an argument that maybe it’s not.

0:51:09.9 Moe Kiss: Why does the target make a difference?

0:51:10.9 Val Kroll: Because it’s helping to make a decision. In my opinion, I think that that’s helping inform action. So that’s what makes it more purpose-built. But I don’t know. Maybe there’s… Eric’s probably like, oh boy.

0:51:24.8 Eric Sandosham: I’m leaning towards not a data product.

0:51:27.7 Val Kroll: Oh, dang! This is going to be a four-hour episode because we still have so much to learn. Okay, so tell us why on this one, Eric.

0:51:36.8 Eric Sandosham: So isn’t the dashboard, the key value proposition is the front-end user interface, right? How do you visualize that information? If I took that away, I mean, the data is still there, right? You always had this stuff anyway. You’re solving a visualization challenge, isn’t it?

0:51:55.7 Tim Wilson: But you’re also guiding them into what they should be thinking about and helping them. So I’m not even sure that targets are required, as much as I want to have targets on it. If there’s a process in place where they say, I am monitoring campaign performance and I’m looking at it and it is showing that my ROAS is low or my conversion rate is low, and I’m looking at this visual thing built from curated underlying data assets, and that is helping me decide that I need to contact the agency, shift channels, shut this off, it feels very much like a data product to me.

0:52:37.3 Eric Sandosham: Okay, I mean, if you’re building sort of an integrative layer or capability into it, that triggers certain red flags and it may trigger certain downstream processes, perhaps, then it becomes almost an engine versus just a visualization layer, right?

0:52:55.1 Tim Wilson: Yeah, this is my fear.

0:52:57.6 Val Kroll: But also, if it’s curated, right? I think this is maybe where you were going a little bit with it, Tim, of what’s in the dashboard is curated to answer the specific questions or actions that that business person is going to take versus just like, whatever is out of the box with whatever tool is collecting or ingesting that data to just show you how it’s performing, I feel like.

0:53:19.7 Tim Wilson: I provided you a visual interface to the data warehouse. So let’s see if we can quickly figure out, because we’re going to do one more. It’s going to be just to try to land the plane. So the chart I described at the beginning of the episode, which is a single chart, it is based on a data asset that is just, I mean, it’s a 250 row data asset that is used for planning. To me, that feels like it is a data product. It’s a very simple data product. And I am the product manager and analyst and business user of it.

0:53:52.4 Val Kroll: What’s the monetary value?

0:53:55.8 Tim Wilson: Yeah, tied to the P&L, since we make nothing. That’s a fair point. But I mean, even knowing we pulled it up in one of our planning sessions for a future episode and said, who should be the co-host? And it was like two people were like, well, that needs to be the combination. I was like, we just made a decision in way faster time. So I’m going to say that that is a very simple data product to me.

0:54:19.1 Moe Kiss: So one thing, though, that has been top of mind for me is if you’re building something for one discrete person or one very discrete problem with no way to kind of… So to me, and we’ve been thinking a lot about this when it comes to some of the dashboarding ones, is like, is there a way we can prep and put together the data and the dashboard that it could serve the needs of multiple teams, right? Because then actually, you’re getting a lot more value from it and you’re also not doing that insane thing where you’re building to one internal person’s specifications and then they leave and it’s never used again. What concerns me about the example that you’re talking about, Tim, is like you’re pretty much the only user of it.

0:55:07.0 Tim Wilson: Except the flip side, you try to build something that serves everybody. You’re starting to try to build a freaking platform and it blows up. And I will say you were 30 minutes late to the meeting, but we actually did have a group use it. So I know some of these things fall on me, but I would…

0:55:25.1 Moe Kiss: I had a conflict.

0:55:29.2 Eric Sandosham: Tim, because you gave a sort of preview of some of these questions prior to this call, and for this specific example, I thought about it, actually. And I thought, as it stands, my answer would be no, it’s not a data product, but it could be. And then I had to sort of, I mean, this was just my first initial gut reaction, and then I asked myself, why did I think like that? And I tend to lean towards what Moe is saying as well, the sense of scaling, repeatability, reusability. So let’s say, now you say I have a scheduling chart, if the scheduling doesn’t come with an underlying data asset, then it’s just a tool. But if it says, look, all you have to do is there’s a way to set up this database and everybody will put it in, load it, APIs, whatever, and then I can lift and shift this as a scheduling wrapped solution beyond just for us, anyone could use it, right? Then it’s, again, with a view to commercialization or generalization; whether you choose to charge for it is beside the point, right? But it’s beyond just a single-use intent because then it’s repeatable, right? And when it’s repeatable, then you have an asset to talk of, because if it’s not, then it’s useful to you, but it’s not really an asset, right?

0:57:00.9 Tim Wilson: Okay, well, so it sounds like we’re not going to totally solve this, but we’ve got a lot of food for thought here and we’re running way over. So we’re going to have to leave it there, but not before we do what we love to do, which is go around and everyone share something, maybe not about data products or it can be about data products, that they found interesting, useful, thought-provoking, silly as a last call. So Eric, you’re our guest. Do you have a last call?

0:57:36.1 Eric Sandosham: I’ve been reading a little bit about these large reasoning models. I mean, when I first heard the term, I’m sure like many of you, I was extremely skeptical. Is it just a relabel of the same stuff? And in some sense, it’s really just the introduction of reinforcement learning into a large language model and then you’re calling it, oh, it has reasoning ability. So in fact, two papers sort of crossed my path in the last couple of weeks. One was, now they sort of have the ability to look at the artificial neural net at the back end and see what nodes sort of light up when you’ve asked the likes of, say, ChatGPT to take a step back and argue through why you came to that conclusion. And what the researchers found in that article was that while it does give a better answer, right? It slows it down and it breaks up the workings. It’s actually lying. That is, the nodes at the back are firing off in a different way, but it’s telling you, I’m solving it in this sequence of steps, because that’s what you want to hear. But actually, the neural nets are not reflecting that same reasoning logic that it’s telling the end user. So what can you trust, right?

0:58:57.5 Eric Sandosham: If you did it as a one-shot thing, if you ask it to do it sequentially and break it down, it may look nice and you think it’s doing something reasonable, but it’s also lying to you because it’s telling you stuff you want to hear. And another paper that sort of complements this was they looked at say a regular large language model, an LLM versus a large reasoning model. And a lot of these metrics that they measure in terms of the accuracy and all that are not measured on a one-shot case. So they will typically, let’s say I run it for 50 times, 100 times, and then what’s the accuracy level or the correct levels that come back as a response based on a 50 run trial or 100 run trial? Typically, that’s how they set it up. And so the authors also commented that this is quite nonsense because in the real world, we don’t run something for 50, 100 times. It’s a one-shot deal. Either you get it right or wrong and can I trust it or not, right? But what was interesting is, okay, fine, if you want to use this metric, then let’s just run it. And they found that if you run it enough times, let’s say from the 50, 100 run, let’s run it at 5,000 times. Let’s run it at 10,000 times. They found that the large language models without this reinforcement learning outperform in accuracy the large reasoning models.

1:00:20.4 Eric Sandosham: So the large reasoning models start off well, and then if you run it enough times, the large language models overtake and on a long-term basis, they all perform better than the large reasoning model, which then the authors conclude that there are two takeaways. So one is that what’s happening with the reinforcement learning is it’s asking it to be accurate on a minimum shot approach. So they are pushing for more immediate specificity, but sacrificing long-term generalizability. So you’re sort of asking it to specialize quickly, right? Two, it also suggests that the data that the large language model has was already sufficient for it to achieve the same outcome as the reasoning model. There was nothing new that was put into the mix because with enough time, all the data was already essentially, the ingredients were always there, right? And I thought this was, well, you know, it’s quite interesting.

1:01:20.1 Val Kroll: It’s deep. That’s good. It’s so interesting.

1:01:23.0 Tim Wilson: They’re going to take over one way or the other. Now they have multiple vectors to come at us through brute force of trials or through reasoning. Wow. Somehow that feels vaguely unsettling, but fascinating. Wow. So Moe, what’s your last call?

1:01:41.5 Moe Kiss: Okay. Well, last week I had a heap of fun at Snowflake Summit in SF. And one of the really cool things I’ve been playing around with, yes, my team are terrified because I’ve been on the tools a bit, is this new thing they have called Cortex, which is basically all of their AI and machine learning capabilities. The really cool bit that I’ve been testing out is that they’ve found a way for you to reference the Cortex functionality in your standard SQL, almost like a normal SQL function, which is super cool because you don’t need to write all this separate Cortex code. It looks basically like a SQL function. I’ve been playing around with images specifically. So you can reference static images at the moment and it will return things like object tags, descriptions, embeddings, all that sort of stuff. And yeah, it’s just been really fun because you can do a lot more image analysis without needing extra infrastructure and kind of just build it into your standard SQL workflow. So I’ve just been having a play with that, and it’s pretty cool and worth checking out.
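For anyone who wants to try the workflow Moe describes, here is a minimal sketch of calling a Cortex-style AI function inline from otherwise ordinary SQL via the Snowflake Python connector. The function name AI_COMPLETE, the model name, and the stage and TO_FILE references are assumptions for illustration rather than details from the episode, so check Snowflake’s Cortex documentation for the current names and signatures.

import snowflake.connector

# Hypothetical connection details; substitute your own account and credentials.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="MEDIA",
    schema="PUBLIC",
)

# A Cortex-style AI call used like a normal SQL function: describe each image in a
# stage directly inside a SELECT, with no separate ML infrastructure.
sql = """
SELECT
    relative_path,
    AI_COMPLETE(                      -- assumed image-capable Cortex function
        'claude-3-5-sonnet',          -- assumed model name
        'Describe this image and list the main objects in it.',
        TO_FILE('@image_stage', relative_path)
    ) AS image_description
FROM DIRECTORY(@image_stage)
LIMIT 10;
"""

with conn.cursor() as cur:
    cur.execute(sql)
    for path, description in cur.fetchall():
        print(path, "->", description)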

1:02:46.0 Tim Wilson: Snowflake, still not a sponsor. Just gonna release cool products. We’re gonna talk about them anyway. That sounds interesting. Val, what’s your last call?

1:02:58.9 Val Kroll: Speaking of interesting. So this is a fun one and it’s been a minute since it was released. But if you didn’t catch it, Josh Silverbauer, who is an absolute gem, launched the Third Party Show. I’m sure you’ve seen that on LinkedIn. And his first guest was none other than Tim Wilson. And it was highly enjoyable. They play like lots of different games and Jason Packer made a couple appearances. And so it’s a good time. So just wanted to give you a shout out, Tim, because you guys did a great job and I can’t wait to watch more episodes.

1:03:38.7 Moe Kiss: Oh, wow.

1:03:40.5 Tim Wilson: And it had lots of canned laughter as well.

1:03:43.2 Val Kroll: It was perfect. All right. How about you, Tim? What’s your last call?

1:03:47.2 Tim Wilson: So mine is literally just a simple data visualization with a bunch of data behind it.

1:03:53.9 Val Kroll: Okay, the upset chart. Forget it, Tim. We’re done with the upset chart.

1:03:57.4 Tim Wilson: Yeah, no. The upset chart is about combinations of sets, but no, this is David McCandless, who I have kind of a weird, well, I have no direct relationship with him, but it’s his Information is Beautiful work. At times, data just for the sake of making it beautiful kind of rubs me wrong, but he does really cool things. And one of them is a piece called Based on a True True Story. It goes through like 18 films that are based on a true story and breaks down, literally scene by scene, whether each scene is true, true-ish, false-ish, false, or unknown. And you get it in a simple bar made up of bands: red is false, blue is true, and gray is unknown. So you get an enormous amount of information, and you can click through to the Google Sheet that’s actually feeding it. I mean, there’s a researcher behind it somewhere, I have no idea who, and compiling this information seems insane.

1:05:02.7 Tim Wilson: So it’s a lot of information, very densely populated. And just because I know you’re asking: depending on how you select the criteria, the most true-to-the-true-story film of the 18 they looked at was Selma, followed by The Big Short and Bridge of Spies. The three that were least true to reality were The Imitation Game, which I love that movie, Hacksaw Ridge, which I never saw, and American Sniper. For those, half or fewer of the scenes were actually true or true-ish. So it’s just very dense. And when I saw it, I could not help but poke around, even though it doesn’t really matter, although I always have that question when something says based on a true story: hmm, how true is that really? But it’s just pop culture.

1:05:56.2 Val Kroll: It’s fun.

1:05:58.0 Moe Kiss: I want to know about the Widow Clicquot movie. That’s the one.

1:06:00.5 Tim Wilson: You know, that is not on there. I bet that one’s got a pretty…

1:06:04.0 Moe Kiss: I assume it’s not. It’s very niche.

1:06:06.0 Tim Wilson: Pretty close. Yeah, but very on brand for you to wonder about it, though.

1:06:11.4 Moe Kiss: It is.

1:06:14.3 Tim Wilson: So we have, yeah, we’ve definitely gone over. But I will therefore thank Josh Crowhurst for the added bit of editing that he will be doing. Shout out to Ken Riverside and Fourth Floor Productions as well for their support in this. Eric, thank you for coming on. I feel like we should get a report from you that we can put in the show notes as to whether you survived the rest of the day since you’re now going to be talking to a clear audience.

1:06:43.1 Val Kroll: Survived this.

1:06:46.0 Eric Sandosham: Thank you. It’s been thoroughly great. I think you’ve managed to shift some of my positions. I mean, I thought I had sort of landed on a position in my mind, but based on this conversation, it’s shifted a bit. Yeah, and that’s good.

1:07:04.5 Tim Wilson: It’s on my bucket list to sometime see one of your Medium posts saying, this guy was completely dead wrong, and here are some thoughts to totally tear it down. I don’t need to be held up as right, I’d just like to show up in one of your posts somewhere. Again, I cannot recommend them enough; they’re the right length. So many people, and this is the pot calling the kettle black, write really long weekly newsletters that I do read, but yours are like the perfect length. And they always make me think and help clarify my own thinking.

1:07:39.5 Eric Sandosham: Thank you. Thank you.

1:07:41.5 Tim Wilson: Bonus last call: we’ll definitely link to your Medium content. So if you’ve enjoyed what you’re hearing, we would love to get a review on whatever your podcasting platform of choice is. We recently updated our website because, now that Google Play no longer exists and iTunes isn’t a thing, our subscription page was a little out of date. But we’d love to get a review. If you’d like to reach out to us directly, you can do that on the Measure Slack, or through LinkedIn, or you can just send an email to contact@analyticshour.io. If you want a sticker, you can head to analyticshour.io, click the Sticker Me link in the global navigation, and fill out a little Google form, and we’ll be happy to get some stickers in the mail to you. But really, if you’re sitting in a chair, maybe polishing an apple, cutting it up so you can dip it in some peanut butter and crunch on it while pondering a dashboard or some other data product, just always make sure you keep analyzing.

1:08:48.9 Announcer : Thanks for listening. Let’s keep the conversation going with your comments, suggestions, and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

1:09:06.8 Speaker 6: So smart guys want to fit in, so they made up a term called analytics. Analytics don’t work.

1:09:12.9 Speaker 7: Do the analytics say go for it no matter who’s going for it? So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.

1:09:27.3 Val Kroll: Rock flag and Eric said the upset chart is not a data product.

1:09:35.3 Tim Wilson: Burn.
