#220: Product Management for Data Products and Data Platforms with Austin Byrne

Data gets accessed and used in an organization through a variety of different tools (be they built, bought, or both). That work can be quick and smooth, or it can be tedious and time-consuming. What can make the difference, in modern parlance, is the specifics of the “data products” and “data platforms” being used for those tasks. Those specifics, in turn, often fall on the shoulders of (data) product managers! In this episode, Austin Byrne, Group Product Lead for Data at Canva, joined us for a discussion about the similarities and differences between typical product management and data product management!

People, Podcasts, and Other Resources Mentioned in the Show

Photo by Kiran K. on Unsplash

Episode Transcript


0:00:05.8 Announcer: Welcome to the Analytics Power Hour, analytics topics covered conversationally and sometimes with explicit language. Here are your hosts, Moe, Michael and Tim.

0:00:22.6 Michael Helbling: Hi, everybody, welcome to the Analytics Power Hour. This is episode 220. You know, we’ve all been sold an aspirational story that we can democratise data, that a business user should be able to self-serve that data and generate insights, well, at least to a certain extent. And product after product enters the data and analytics space to help solve those problems, and then we build data products within our organisations as well, and all this… I mean, it requires a lot of product management. That’s why we’re gonna talk about it. So let me introduce my two co-hosts, Moe Kiss, you’re the Marketing Data Lead at Canva. How you going?

0:01:07.1 Moe Kiss: I’m going great.

0:01:08.6 MH: It’s great to have you back on the show and the whole gang back together.

0:01:14.2 Tim Wilson: So close. It’s the marketing data lead. But you got the how you going, but…

0:01:16.7 MH: Dah-ta, day-ta. Dah-ta. Okay.

0:01:19.8 MK: Almost. Almost.

0:01:20.6 TW: You’re always 50% of the way to fully localised there.


0:01:22.9 MH: Yeah. Well, Tim, I don’t live the life that you lead right now of being just independently wealthy. And…


0:01:35.9 TW: I think I’m actually now inadvertently in training to be a long haul trucker, is what I feel like at the moment.

0:01:39.9 MH: Oh, okay. Well, anyways, I was like, “Oh well, I don’t know what to introduce as your job.” But I’m sure you’ll get one at some point.

0:01:47.3 TW: At some point.

0:01:47.4 MH: But anyways, we’ll just stick with quintessential analyst for now.

0:01:51.3 TW: Oh, that’s gonna be the incentive for me to wind up on a W-2 somewhere.

0:01:55.7 MH: That’ll get you off your butt. [chuckle] All right, I’m Michael Helbling, Managing Partner at Stacked Analytics. But we wanted to bring in an expert to enhance this conversation about product management in the data world. And Austin Byrne is the Group Product Lead for Data at Canva. He’s also held data and product management leadership roles at Atlassian and Facebook. And lucky for us, today, he is our guest. Welcome to the show, Austin.

0:02:20.9 Austin Byrne: Thanks so much for having me.

0:02:23.1 MH: Well, it’s great, and I’m glad that you could do this. And I think what’s a great kick-off point is you’ve straddled both worlds of analytics and product management. And in your role at Canva, you’re kind of combining those two things. But talk a little bit about that journey, maybe to kick us off and get started into the conversation.

0:02:45.8 AB: Yeah, sure. Something you’ll probably hear over this show is that sometimes I can be quite a stubborn individual, and…


0:02:54.5 AB: A lot of my career trajectory is rooted in that stubbornness. So I’m Austin. I actually started my career as a CPA. Still I’m a CPA, still hold the license. That’s Chartered Accountant in Australia. And I did forensic accounting and auditing for one of the big four firms. So I’ve always been… I guess I had a penchant for numbers and trying to understand patterns like puzzles when I was a kid and all of that. And so I spent a lot of time at this big four accounting firm, somewhat drowning away at, “Hey, here’s a checklist of things you need to do and things you need to check, and tick those things off.” And I really enjoyed that ’cause it gave me a lot of exposure to a lot of different companies, but I really desired something that was maybe a bit more in-depth and focused on a single subject matter, and so that’s when I made the switch from proper accounting into analytics, and specifically fraud analytics to start. Eventually that spiralled into marketing analytics and product analytics, a number of other things at Atlassian and Facebook, and I really enjoyed that.

0:03:56.5 AB: Built teams, built organisations, ended up at a point where I really wanted to make big step changes in the way we were doing analytics, and the blocker for me to do that was a lot of the tools that my teams were using, things like, how can we A/B test in days instead of months? Things like, why do I have to write a query every time somebody asks a question? Why can’t we actually simplify or democratise some of the most basic questions in self-serve tools? And so that got me thinking, “What if I went and built those tools?” And so I spent a lot of time, what we call moving crafts, or moving specialties, going from analytics product management… I’m sorry, analytics management into product management. And as I went into product management, a whole new world opened up, changed my way of thinking. I’ll talk probably a bit more about it, I’m sure you all have questions about it, but the net result is, I’m very much working on products that allow analysts to work better and faster and smarter, allow them to collaborate with their PMs and their engineers and their software development teams better, and I’ve really enjoyed that journey because it’s taken me into, I guess, new challenges that I really enjoy chasing up.

0:05:11.3 TW: So when you’re saying building products, I assume you’re saying internally building products?

0:05:18.7 AB: A good question. Yeah.

0:05:19.9 TW: But I could… I feel like the last little bit of what you just said could be cropped out and put over into what there are a bunch of external products that are saying, “That’s what we do.” That feels like this dream that everyone is chasing. So clearly, you’re going with the, we need to build this stuff internally, I think.

0:05:44.7 AB: Yes.

0:05:45.4 TW: So why… Do you see… Is he… [laughter] I guess, like, where… I mean, that seems like… ‘Cause there is just a tsunami of tools. And I would claim that there are a lot of companies that are buying that saying, “Oh yes, that tool said it’s gonna give me exactly what you just described. I don’t need to have an Austin in my organisation to have a team to build it. I can just license it from… Insert my product here.”

0:06:10.6 AB: For sure. Buy-versus-build decisions and how we’re actually gonna solve big problems that companies have have a lot of different variables, from the size of the company to the relative expertise required to actually extract insight to how mature the actual culture is in the company. And so I’ve been lucky enough to be in enterprise, world-class organisations that value using data to understand what their customers need and what they want. And so with that, usually the investment comes in and they say, “Hey, we need an internal team dedicated to building internal data products to help us be better at winning our markets or achieving our growth goals,” or whatever it is. For a lot of companies, that’s not even realistic, or it’s too slow, and they’ll just go buy something off the market. And so while I spend my time focused on internal products, there’s also a flavour of my role in product management that is focused on, “Well, what if we built that internal product so well?” We could actually then go bring that to market and create a whole new value-generating vertical for the company. That’s the thing I’m doing right now, but it’s something I’ve done in the past, so it’s pretty interesting.

0:07:19.0 TW: Interesting. So Moe, are you kind of a client of Austin’s?

0:07:22.2 MK: Oh, so, poor Austin. Like I feel… He’s the person over the years that whenever I had these thoughts that I’m trying to figure out in my head, I’m like, “So Austin, you’re free for a chat? ‘Cause I just need to spitball some ideas,” and now everyone gets to listen to me do that again. But one of the things I’m really curious when you talk about your role now, who do you think the stakeholder… Oh, not who do you think. Who is your stakeholder in this world of data PM-ing?

0:07:54.5 AB: Define stakeholder.

0:07:56.7 MK: Oh.

0:07:56.8 AB: Is stakeholder the same thing as a customer?

0:08:00.1 MK: Sure. Let’s say who is your customer?

0:08:02.1 AB: Our customers right now, for the current role that I’m in, is the entire company. The specific class of customer whose problem we solve with a very specific solution differs wildly. So we may say for most teams, they’re trying to develop a number of insights and it takes them a long time to do that because our query processing times are really slow, or they have to jump through a bunch of privacy constraints, or the tools just take a while to update. We only have batch processing, we don’t have any sort of real-time data infrastructure. And in that case, we’d be working directly with Moe and her team, both as stakeholders and customers of the solutions that we have on our internal product roadmap. But likewise, it’s really important to me to make sure that we’re getting the highest ROI for the engineering work that we put in. And what that means, sometimes the problems that we solve will be bigger when we work with non-data teams, when we help PMs make decisions, maybe without data analysts in the room. That can always be a tricky one because you never know how much you can democratise. You always need expertise to some degree, at least right now where we are. That may change with all the AI things that are happening right now. But kind of carefully treading between who your customer is at a particular time is 90% of my role.

0:09:25.1 TW: But did I catch you just say that product managers… Like the Canva product could be the customers of the data products that have product managers?

0:09:34.8 AB: Exactly.

0:09:36.5 TW: So presumably… So clearly they’re all product managers, they all think exactly the same way and approach everything exactly the same, I’m assuming, I’m assuming now?

0:09:47.4 AB: I think you’re hinting at something.


0:09:49.5 AB: There’s a distinction right now, and it’s a poorly-formed one in the industry. It’s the idea that there are platform product managers and there are product managers. And I could give you my opinion on it. It is a single data point, and it’s just an opinion, but the idea is platform product managers are building things that their colleagues use to then build things to send to their end users, to deliver to their end users. So whether… A good example would be, we are building an analytical warehouse that product teams can use to understand what our customers want and need so they can optimise a particular conversion funnel in order for their users to get closer to the product, deeper in the product, more engaged with the product, whatever it is.

0:10:38.7 MK: So sorry. You said the industry thinks that those two things are different, and you don’t think they are?

0:10:45.5 AB: I do think they are.

0:10:46.7 MK: Oh.

0:10:46.9 AB: Yeah, sorry for the click… Yeah, I do think they are. I think what the industry usually sees is that product managers are product managers. And a lot of the distinction that I create in my job for the teams that I build is platform product management, where your customers are your colleagues, especially in large companies like Canva, is a very different proposition to product management, classic product management, which is understanding what your end users want and need and building something directly for your end users. Often there’s more of like a daisy chain of attribution that has to happen when you go from building something for your colleagues, who then use that to build something for your end users. And that platform product management is where my teams are, it’s where I focus, and it’s a very unique way of product management. Of course there are similarities to the other sorts of product management, but it means a different way of working.

0:11:39.1 MH: And what primarily differentiates, in your view, those two things, if you could say?

0:11:46.4 AB: Yeah, two things off the top of my head, but the first is customer. So in product management, customer is king. The difference between your colleagues needing a piece of technology to get their job done versus your customers asking for a feature, it happens to be very different, the research involved, the discovery processes, how you align on the roadmap, all that kind of stuff. It’s like customers are one. The second is, usually… And I say this with a big caveat. This is a very generic statement, or generalised statement. Usually, platform PMs spend a lot more time on the technical side. They understand the technology that powers what your colleagues want and need more so than on the product management side. I’m sure there’ll be somebody listening to this that goes, “That’s actually complete bullshit.” And in a lot of cases, that is actually complete bullshit, but…

0:12:42.5 TW: Oh no, it’s…

0:12:43.6 MH: Wow.

0:12:44.3 TW: Once this goes out on the podcast, goes out as the truth, ’cause it’s technically on the internet.

0:12:47.3 AB: It’s true. Good. It’s on the internet, it’s true.

0:12:50.5 MH: We’re pretty authoritative here at the Analytics Power Hour, so you said it, there is…

0:12:55.5 AB: Have you heard of Brandolini’s law?

0:12:57.9 MH: Brandolini’s? No. What’s that one?

0:13:00.5 AB: Oh, no, I’m sorry. I was actually talking of Brandolini. That just came up. Cunningham’s law. So the quickest way to get the right answer is to post the wrong answer on the internet. I guess that’s what we do here.


0:13:12.6 MH: Nice. Perfect. Well, and alluding to Tim’s earlier comment about product managers always agreeing with each other…


0:13:20.9 MH: I’m sure that persists throughout the entire industry.

0:13:24.9 MK: Okay, I’m sitting here dying. I’m sitting here dying, like… Okay, because all of the things you’re saying, Austin, like the thing, I guess, that is floating around in my head is whether people that work on… I don’t know if platform PM is the word we wanna use or like… Obviously, specifically for me, I think a lot about data products or products that are heavily focused on data that require deep technical knowledge, and the bit that I keep coming back to is, I’ve worked in product analytics as well. And the PM, there’s such a big part of their job that’s advocating for the customer. For example, the e-commerce checkout side is the simple example of trying to make that checkout funnel easier to use and navigate, and what are the pain points and blah, blah, blah, blah, blah. But when we’re talking about, and this is the thing that Austin and I spend a lot of time talking about, for example, a measurement solution, the knowledge and skill that’s required to PM that space is… It’s such a technical space. And I would say experimentation is another really good example where like I for so long did not know that Chad Sanderson was a PM. I thought that he was a data analyst that worked in experimentation. And over the years, he’s a friend of ours, of the show, I found out he was actually a PM that just was such a specialised PM. He knew a shit-ton about experimentation, and that’s why he PM’d that space.

0:14:52.7 MK: So, the bit that I’m trying to get to is like, in order to PM these spaces, do you have to just be an exceptionally technical person? Do you have to be specialised in that space? Is it something that you can learn, or is it like there is a different relationship that has to happen with those experts that work in that space for the PM to understand the needs and requirements and help build out a roadmap and a strategy? And I have become Tim because I just waffled for a really long time.


0:15:22.5 AB: That all makes sense to me. And it’s a good question.

0:15:25.4 TW: So you’re still not me, ’cause let’s do all of that and then just leave the, was there a question in there? Is what the guest asked. So Moe, you’re not there yet. [chuckle] Get a little more drunk.

0:15:34.9 AB: I think I pulled it out. Let me give you a mental model to frame my answer to this question. Again, I’m an n=1 here, so I can give you my opinion. I’m sure there’s other ways to do it. But when I think of product management, or let’s just call it the product delivery life cycle, I think largely in terms of problems on one side and solutions on the other side. And so a lot of effort is spent on designing the right solution. And I spent a lot of my career on designing the right solution and trying to understand the technicalities of a solution, all the underlying technology, trying to be a subject matter expert, like we talked about with your friend in experimentation, who sounds like a world-class PM. When you understand deeply how to launch a particular solution, especially when that’s technical or requires specialised knowledge like statistics, that’s really awesome, and you need people that can do that. But a lot of product management is less concerned about the solution and more concerned about the problem. And by that I mean defining what the problem space is. So a good example would be experimentation. What is experimentation? Is experimentation just every time you release code? Does it also include deployments to actual servers? Is experimentation just when you A/B test or can it be a measured rollout?

0:16:56.5 AB: And defining that problem space to the unique needs of whoever your company is or your customer is, is 90% of the time where those platform PMs are spending their effort. And then choosing which problems to solve. Maybe experimentation is measured rollouts and A/B testing, but in our particular case, we’re an e-commerce company, it’s really important for us to make data-driven decisions. And so we care mostly about improving experimentation, both the breadth and the depth of experimentation, like how rigorous can we actually be in our statistical inference to make the absolute best decisions.

0:17:34.9 AB: And then how we do that, a lot of that is spent with data analysts or data scientists or engineers going, “What’s our options for actually solving those problems?” And PMs can often be seen as championing a very particular set of solutions or a single solution, but I would almost say, maybe this is contentious, that’s probably not their forte. That’s probably not where they should be leaning in. They should be the arbiter of good ideas and bringing in a lot of different ideas for how you actually solve those problems from the subject matter experts and then choosing whatever the solution is that fits the constraints of their roadmaps. Maybe they only have two weeks to deliver this. Maybe they have two years, that will necessitate a very different roadmap and a very different set of solutions, and those are the constraints that PMs have to actually work between, which problems are we gonna solve, how are we gonna solve them? And then bringing in subject matter experts and saying, what specifically are we gonna do to solve them?

0:18:32.6 TW: I love the phrase problem space ’cause that to me I think… That’s really helpful, I think. And I’m thinking back to 10 or 15 years ago when data warehouse projects kept failing, and maybe we weren’t talking product management as much back then, but there was like the industry lightbulb moment went on, that said, “Oh, we’re taking operational waterfall methodology, development methodologies, define specific features, the use cases, or the customer needs X or Y, get all the requirements, build it and then it will stably sit there for the next five to 10 years before we need to update it.” Whereas data warehouses, they were saying, “Well, we keep trying to get the features, and then as soon as the analyst or the business gets their hands on some data, they have another question, and the data warehouse doesn’t support that.” And so there was this industry aha that in retrospect, it’s like, well, yeah, ’cause the problem space, I think, around the platform-oriented data stuff is you’ve gotta…

0:19:37.3 TW: It does sound like the real secret is to define the problem space not so broadly as “we need all the data with zero latency, readily available and fully documented.” Like, well, that’s a definition, but that’s too broad. If you say, “we need to be able to spin up landing page A/B tests in under a day,” that may be too narrow. Whereas I still think normal product managers with features, say the e-commerce checkout, or probably a little more, their problem spaces are either… Are probably more often a little smaller, unless they say, “No, we need to re-envision the way that e-commerce occurs.”

0:20:17.0 AB: For sure.

0:20:20.5 TW: So am I playing that back? Am I understanding what you’re saying?

0:20:24.1 AB: Yeah, for sure. So what you’re talking about are the breadth of problem spaces at different altitudes. And so oftentimes tenure or leveling of a product manager determines the altitude. So if you’re just starting out, you’re focused on a single conversion flow. That’s a bit more narrow of a problem space. But maybe if you’re just trying to grow your monthly active users, that’s a really broad problem space, of which e-commerce funnel is a single lever that you can pull in order to grow your monthly active users. So depending on what your intended impact is, that kind of determines the altitude. And I think most of that altitude determination is best articulated through a use case, through an actual… Like, you don’t have to get as pedantic as a user story: “As a data analyst, I need X, Y, Z.” It doesn’t need to be like that, but it’s like, what are you actually trying to do here? Are you trying to help people understand within 10 minutes whether or not a feature rollout completely tanked your subscription, right?

0:21:29.9 AB: Or, are you trying to help anybody be able to answer any question that they have about basic financial data in your company? Like how many monthly active users do we have? Those are very different use cases and they require different technology and different infrastructure in order to answer those particular use cases. And so that usually helps me slice the pie a bit better.

0:21:52.1 TW: But is there a valid use case, and I’m back to thinking of Moe’s… And I don’t have enough… You mentioned kind of having an analytics mart or warehouse or something, where the product is really just the enabling infrastructure to then plug a range of different other data products on top of it?

0:22:14.2 AB: It would be awesome… Yeah. So there’s multiple layers in the stack. At the end of the day, like the internet’s a glorified form field or whatever, and everyone uses Amazon or whatever it is, and so you’ll go up in the stack and you’ll start to see there are commonalities in the stack, and there are places where things are differentiated. And so the current big differentiation right now is the difference between batch and streaming infrastructure for analytics. That’s a meaningful one that will actually… That gap will close, I think, my prediction is at least it will close over the next few years. And so depending on which problems you wanna solve, you might get the best bang for your buck by having a single solution. Like, oh, you wanna answer questions and you’re all right with maybe 12 hours lag? Use this analytical warehouse over here. And then you’re like, sweet, all we have to do is make the warehouse better.

0:23:03.2 AB: Maybe you need to improve some processes or whatever. But there are points where a new use case pops up that’s critically important for a company, like, I need to be able to understand within the minute whether or not I really screwed something up. And that may be way different than the warehouse that you have. It may necessitate new infrastructure. And of course, that’s more costly to build, and a product manager’s job is to understand, all right, if I was gonna go build this thing from scratch and then maintain it over a long period of time, what value am I generating from it and am I getting positive ROI? There’s no right answer on that, too.

0:23:37.4 TW: Yeah.

0:23:37.5 AB: It doesn’t have to always be positive ROI. But yeah.

0:23:41.8 MH: I’m sending this episode to so many people the second it launches.


0:23:44.8 MH: This is good. This is really good. So I wanna ask you about the definition of the space then, we’re talking about altitudes. Is that also something that the product manager comes in and does, is go ahead and define the altitudes or help define it across the organization?

0:24:02.6 AB: Yeah, I believe so. So product managers are responsible for what we call a product portfolio, and product portfolios can be, yeah, as big as grow your monthly active users, and as tactical as improve this single conversion flow in this single product. Generally the lead for a product management team, myself included, is responsible for articulating the product portfolio and the strategy for achieving whatever the goals of the product portfolio is. In my particular case, I have a bunch of disparate infrastructure, anything from AI platforms, all the way to analytics infrastructure, all the way to compliance, to consent, to make sure that we’re respecting our users’ choices and how we use their data. And so my job is to pull together a cohesive strategy that makes sense across all of that disparate infrastructure. Because it is all data-related, but that’s almost too broad of a slice. It’s like, what specifically about it are you trying to do?

0:24:57.8 AB: And in my experience, most of those platform specific things are things like improving the productivity across the entire company. Being able to actually deliver things to users faster is critically important for us to win in our markets. Generating completely new revenue lines, maybe from a new data product that you can actually launch to users and users can use it, not just your internal colleagues can use it. Those are the types of things that I’m spending time thinking about and articulating back and getting alignment, and making sure that we’re not just a rogue faction within the company building whatever we want, but it actually is meaningful to the company’s goals.

0:25:34.1 MK: I have a totally different direction to take this conversation. But one of the things that I spend time thinking about is like, in my mind, like the data space is quite technical and specialized and all that sort of stuff, and I do really wanna understand how data and product work well together, but I have this, I guess, in-built bias that data is this like special world, and maybe it actually isn’t. And I suppose what I’m trying to ask is like, are there nuances to how PMs work with data folks that are, I guess, special and unique, or is it like, no, this is just another area that PMs have to get across and manage? Or are there, I guess, differences in the data space that mean that that working relationship would look different?

0:26:24.8 TW: Moe, can I ask you to clarify… Can I ask you to clarify something? ‘Cause you said it twice, and I’ve had the same question both times. When you’re saying more technical and specialized, are you thinking technical and specialized because data, generic data, is different and unique, or are you thinking technical and specialized because inside any organization of significant scale, the actual… The specifics, the natures of the data within an organization, what a monthly active user is, what revenue is, what the different interplays between the systems are, that that’s super unique and specialized? Or is it both?

0:27:07.8 MK: All right, that is a great question, Tim. Can I take column A and B? I actually was thinking about the former, but now I’m like, oh, you raise a really good point, because the second is also true. But my initial intent was that the world of data is complicated and difficult.

0:27:27.3 TW: ‘Cause I guess to me that’s the second part is where you really struggle with a third party tool that says, “We can do it.” They can hire people who understand data, but they can’t really deliver the solution. So, okay.

0:27:40.8 AB: For sure.

0:27:42.6 TW: Back to you, Austin.

0:27:44.1 AB: Yeah, it’s a really good question. And, again, I’m a single opinion on this topic, but the way I see it is, as a data practitioner you can either be, to Moe’s former point, you can be a subject matter expert for a very particular data practice, or you can be acting as someone who’s trying to achieve the same goals as the product manager. In my world, as a platform product manager, I work with data folks as practitioners who help me understand subject matter I do not understand, and help me craft solutions for the rest of the company that are valuable. But on the product side, for the product managers who are just trying to deliver a feature for an end user, optimize a conversion funnel, whatever it is, often the data analyst, the data scientist, has the same goals as the product manager, and that’s a very different way of working together.

0:28:35.7 AB: That’s one where product managers and data scientists should be sitting in a room going, “What are our opportunities? Where should we be going? How are we gonna actually do this? What are you seeing in the data that I’m not seeing?” To tell me about the patterns you’re seeing, and they’re ranking the patterns. Whereas most of my work is, all right, we have a problem, we wanna solve it, but we need subject matter experts to help us understand maybe the statistical library that needs to power this Bayesian inference for a particular experiment platform. And so it’s not quite the same as the product side, which is trying to hit a very particular company goal, it’s more in solution-building and trying to understand how we actually take the subject matter expertise, platformize it, and then we can democratise it to their company because we’ve encoded that person’s brain, effectively, into the actual platform code.
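Austin’s point about encoding a subject matter expert’s brain into platform code can be made concrete with a small sketch. This is purely illustrative and nothing like Canva’s actual implementation is described in the episode; the function name, the flat Beta(1, 1) priors, and the Monte Carlo approach are all assumptions. The idea is that once the statistician’s Beta-Binomial model lives in the platform, any PM can ask “what’s the probability variant B beats variant A?” without an analyst in the room:

```python
import random

def prob_b_beats_a(conv_a, n_a, conv_b, n_b, samples=20000, seed=42):
    """Monte Carlo estimate of P(rate_B > rate_A) under Beta(1, 1) priors.

    Each variant's conversion rate gets a
    Beta(1 + conversions, 1 + non-conversions) posterior; we draw from
    both posteriors and count how often B comes out ahead.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(samples):
        rate_a = rng.betavariate(1 + conv_a, 1 + n_a - conv_a)
        rate_b = rng.betavariate(1 + conv_b, 1 + n_b - conv_b)
        if rate_b > rate_a:
            wins += 1
    return wins / samples

# Hypothetical experiment: 120/1000 conversions on A vs 150/1000 on B.
print(round(prob_b_beats_a(120, 1000, 150, 1000), 2))
```

Sampling is used here instead of the closed-form Beta comparison purely to keep the sketch dependency-free; a real platform would likely lean on a statistics library and far more careful priors.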

0:29:23.8 TW: Platformizing subject matter expertise. That sounds like super consultancy, but…


0:29:29.4 AB: Am I turning into a consultant? Oh no.

0:29:33.3 TW: No, that’s my second note of, I’m like, ooh, defining the problem space and… What was it? Productizing…

0:29:40.1 MH: Platforming.

0:29:40.2 TW: Platforming the subject matter expertise?

0:29:41.5 MK: Okay, so, Austin, that was like a glorious response that I was not anticipating, which is also really interesting that I’m gonna have to think about.

0:29:49.4 MH: How insulting is that?


0:29:51.7 MK: What? What’s wrong with the word glorious?

0:29:53.7 TW: You were like… Well, I thought you were gonna be like… Well, you were expecting him to shit the bed on the response or what?


0:29:57.9 MK: No, I was expecting a totally different response. I suppose in my mind, I was more thinking about different specialties, and is like a PM trying to understand the data specialty different to a PM trying to understand the intricacies of the front-end code or the back-end or the UX that needs to go into something. Like, is there something inherently special about data that is different? That’s… Yeah. I maybe wasn’t very articulate the first time.

0:30:33.5 AB: No. That’s a loosely held opinion. So the way I think about it is data is a proxy for what customers need and want, and PMs need to understand what customers need and want. Front-end code and back-end code is how we deliver what customers need and want. So if you take the problem solution framing, front-end code and back-end code is the solution, the data is the problem. It helps us understand the opportunity of the problem, helps us understand the value of solving that problem. It can also help us understand exactly how to solve the problem. But for the most part, it’s just… It’s not particularly unique, it’s another way of taking what’s at your disposal as a PM, which is all the information you can possibly get, all the ideas that you can possibly get, and making an ambiguous decision with the tools that you have. So if data is right there, you have a data analyst going, “I have a bunch of ideas about how we can actually convert,” that’s a really good thing. Oftentimes, PMs don’t have that. They have a BI tool, or they have their intuition. I think we all have felt that before, a product manager going, “You know what, this sounds like a pretty cool tool or a pretty cool flow, I’m gonna build that.”

0:31:53.2 AB: And it’s not necessarily rooted in any sort of data. And so I think the challenge, I guess, for the PM craft right now, and I’m talking with my product manager hat on, not my data hat on, the challenge for the PM craft is to better use specifically quantitative data, not just talking to three customers, but specifically using quantitative data across your entire customer base and use that to inform how you plan and how you strategize, more so than just validating what you build at the end of the day. Does that… It kind of answers your question, but I wanted to throw in maybe…

0:32:26.3 MK: Yeah, but the funny thing is… The funny thing is you said no at the start, and I would actually argue what you said is that data is different. Because data’s focused on the problem and all the other specialties are focused on how you deliver the solution. But anyway, that’s just like…

0:32:41.9 TW: I feel like we’re managing to model how this is super muddled, and we wind up in these abstractions and then it’s hard to say… We’re trying to abstract what happens. So let me throw out a specific example where, whether or not somebody was specifically in a product management role, I think they were playing the role with data. And you said BI and I kinda got a little triggered, because of the number of times that somebody who’s the BI product owner and doing the product management says, sure, the need is to be able to get to that data, and you need to be able to roll it up and roll it down. Because I think, to Moe’s point, not having the data expertise, they just assume if they pump fairly atomic-level data into the BI platform, then it’s basically just group-bys to roll it up. And that means the really, really critical de-duping doesn’t happen. Another example is that time series data has all these unique characteristics, that if you just pop somebody in who’s thinking customer database and they’re not understanding that time series data is unique. Or my long-time-ago favorite is, do you…

0:34:07.8 TW: When somebody changes the data on a customer record, do you need to still have access to the history of the change? And I can’t remember what the word is that describes that or not. Like whether… It’s a lot bigger and more complicated to store the history, or do you just always keep the new one? And those are like, to me, three examples that if you drop somebody in who’s coming from managing operational, transactional-oriented products, they will screw those up and they will…

0:34:40.0 MK: For sure.

0:34:41.7 TW: And they won’t understand. It will take so many conversations trying to explain it to them, and they will super, super struggle. So I don’t know if, Moe, if those are fair examples of the sorts of things you were thinking about data expertise, but those three popped into my mind as, yeah, that’s why it is really frustrating to work with a product manager who’s just a seasoned product manager but not understanding any of it.

0:35:05.2 AB: Well, and I can play it back to you from… Now I’m actually straddling with it, but I can play it back to you from a product manager’s view. A product manager understands there’s an error. What they primarily care about is what’s the cost of the error? So if I’m in a purely greenfield product, I’m just trying to launch a new conversion flow, I have no numbers yet, de-duplication may be overkill for me. Man, it’s like I just need a number on the board and then we can get a bit better and more rigorous about how we actually count, ’cause counting is hard. If I’m part of a mature product that already has mature flows, product management, a growth product management team or something like that, then de-duplication is a bit more important and the cost of that error is higher because I’m trying to optimize increasingly minute fractions of my funnel, and de-duplication may play a material role in whether or not my experiment succeeded or not. So depending on where you are in the product lifecycle kind of necessitates the depth of, I guess, rigor that’s required. And oftentimes you’ll see data scientists will get frustrated ’cause they’ll be like, “Hold on, but what about this de-duplication thing?”

0:36:20.3 AB: And product managers will say, it’s like, “I acknowledge that, but at this particular time the cost of that error is really low for me. The value from just putting any numbers on the board, or for just getting something out there to users so I can start understanding whether or not they like it, is actually much higher than the cost of that error. So I’m gonna go with just getting something out to users.”

0:36:41.4 MK: I’m like, “Ooh, I’ve got popcorn.”

0:36:43.2 TW: I’m now thinking through a couple of… I’m thinking through examples where it’s like, ah… But there’s actually a couple of ways to get at the data, and the one who goes straight to tool X, it does the de-duping, and the other one doesn’t, and this is now in the hands of some marketer, and they’re putting a number up and the other one is putting it up, and if it’s off by 10%, the infamous analyst saying, yeah, but that’s just gonna happen. There’s the challenge that now the credibility of one or both of the products, as well as the analyst, is undermined, and all of a sudden there’s this whole communication around, no, this is good enough. There were trade-offs, and we did it this way. And that is on the de-duplication, specifically, where the product manager didn’t realize that it was gonna be off and then came back to me with like, “Why is it off?” Well, I can explain to you why it’s off. And it’s off by 10%, 15%. They’re different, and one is more right and one is less right, but they’re probably all gonna go in the same direction, so it’s okay. But boy, all of a sudden, the audience for that “it’s okay” is way broader, with way less tolerance, and has this perception that the data is perfect, that it’s the truth, and why isn’t there just one singular answer? And that causes headaches for a ton of people.

0:38:07.1 AB: Oh, 100%.

0:38:07.2 TW: Including the product manager.

0:38:09.5 AB: But maybe a slightly contentious point is, there’s no counting in the world that is perfectly precise. Let me put on my accountant hat from 15 years ago.

0:38:22.1 TW: Revenue recognition’s a bitch, right?

0:38:24.4 AB: We used to actually put a star next to what we would consider immaterial errors.

0:38:30.2 MH: Like held-to-maturity bonds, for instance, at Silicon Valley Bank?


0:38:38.8 AB: That’s a good pop culture reference, I guess.

0:38:39.0 MH: Yeah. Sorry.

0:38:39.6 AB: For us it was, the bank says you have $1 million in the bank, and you have $1,000,001 in the bank. That one dollar is immaterial to the audit. And so as a public company auditor, I would put a star next to something that we deemed immaterial. Of course we had guidelines for what that immaterial threshold is, but that can get tricky as you start to aggregate accounts and all that kind of stuff. But this is public accounting, where quite literally the nature of the job is precision in accounting, and that’s never perfect. And so for us to go in as data practitioners to our roles, our jobs or whatever, and assume that we need to be 80%, 90%, even 100% precise… I’d say you have to understand there’s gonna be some ambiguity in how you count.

0:39:29.1 TW: I don’t think it’s the data practitioners, I think it’s the business stakeholders, and I think sometimes the data practitioners, they say, “Oh, but they want it to be precise.” And I’m just now connecting that both of you guys worked at the same Big Four accounting firm, so you should both be comfortable with this…

0:39:44.2 MH: Oh. Well, I was on the tax side though, so I didn’t…

0:39:49.0 AB: A completely different world.


0:39:52.1 MH: Yeah. Audit… Those audit people were a whole different… Never mind.

0:39:58.2 TW: But I do think that is a… To me you’ve described the bane of the data practitioner in many cases. I think there are some that say, “I need to chase it down so it perfectly matches up or I’m gonna get creamed by the business.” Or the ones that say, “No, this is fine. I’m still gonna get creamed by the business, I just need to stand my ground and educate them.” But oh, by the way, there’s a tsunami of tools out there that are promising perfect precision. That’s for another episode and my 3:00 AM middle-of-the-night musings.

0:40:31.4 AB: Yeah, and one way to think about it too, is there are specific use cases that may need to be precise and are imprecise, and that creates a whole lot of strife, I guess, between the business stakeholders and the data team or whatever it is. But if the sum of the data product, like the whole of the data product, is adding value above and beyond the imprecision of that particular use case, right? So instead of, here’s a single dashboard; here’s a dashboard with a collection of metrics, and maybe this one metric is slightly off. That may be okay for a product manager. They may say, “You know what, on the whole, I can see more numbers, and that is a good thing for me because I can make a more well-balanced decision, even if that one particular number is off.” This is kind of like the conversation with AI, with Google and ChatGPT, where folks are saying, “Oh well, ChatGPT has the wrong fact, or Google Bard is returning the wrong fact when I ask it a particular question.” And it’s kind of missing the point. The actual net value of conversational AI as a capability, where you can summarize data really quickly, or you can get to just about anything that you want in a matter of seconds, is much greater than whether or not a single fact returned is correct. Does that make sense?

0:41:56.1 TW: I just… I think it goes back to the defining the problem space, because just as you were talking, I’m counting through four completely separate cases where that was kind of the, oh, but we’re giving you a dashboard with this data from different sources, so even if some of it’s a little off, in every one of those, the end users were like, “This is useless.” Like you wound up… Because they headed in thinking, “Well, maybe a thing or two will be off.” And I think fundamentally it was, well, they picked a solution, bought into the idea of this will have all this stuff, and then just kind of ran pell-mell, and just everything was not good. And, oh, the one data source they couldn’t get in was the most critical data source, and nothing matched, and it was difficult to use. So to me I think that’s still a failure of defining the problem space.

0:42:47.4 AB: Platform product managers. [chuckle] That’s the failure of platform product managers. ‘Cause if the key requirement is precision, and the solution is not meeting that key requirement, then we’ve failed to meet the customer need. And platform product managers are responsible for that.

0:43:01.9 TW: Well, or it could be, we need these seven different data sources integrated, and they’re prioritized one through seven, and somebody comes along and says, “Congratulations, we’ve gotten two through seven, one is just not feasible.” But that’s six of seven. And it’s like, but that was literally the most important one. And the second one, they’re off by 25% when I compare it to the source. And I’ve been in the middle of the people saying, “Why aren’t you so excited? ‘Cause look, this thing’s pretty.” It’s like, it’s pretty, but it actually isn’t at all useful.

0:43:34.0 AB: For sure, and that’s like a classic trade-off to say, well, you have to create space for understanding which of the requirements are the top priority. And if we don’t get one in, is two through seven even worth doing, or shall we just spend all of our time on one, and can we have more time, and if we have more money. And that’s when you start to get the negotiations of, how do we actually deliver value? Maybe it’s gonna cost more than we estimated. And that’s 90% of what platform product managers are spending their time on.

0:44:00.1 MK: Austin, I do have to ask, like you have this maturity of thinking about this space, which obviously is from your experience and also your background with accounting I think is a big part of that. Do you think everyone in product is with you? Because I don’t know if I… I don’t necessarily see this across the board in the industry, right, that PMs largely understand that counting is hard, and yes, maybe if we hit two through seven, that’s good enough, and maybe it isn’t. But that willingness to like, it’s not going to be perfect. Especially when it comes to data and counting. ‘Cause that’s the bit, like Tim says, it seems to be this real pain point, and the data practitioners often end up in the middle, being like, this is off by 25%, but this is what we knew going in.

0:44:46.8 AB: Yeah.

0:44:49.2 MK: Is everyone else with you or are you just like… Yeah, on your own?

0:44:52.5 AB: This is a good lesson in how to bring out impostor syndrome really, really quickly.


0:45:00.4 AB: Of like, do you think that all of your peers think like you?


0:45:04.5 AB: So, I wanna say no, to pat myself on the back. [chuckle]

0:45:09.8 TW: We’ve got AI. Every time you said this is an n of one, or this is just one person’s opinion, we’re gonna put the AI in where you said, “This is the way the group… This is the only answer.”

0:45:19.9 MH: That’s right.

0:45:21.4 AB: If I say no, then yeah, maybe I distance myself from my peers, but do I want to do that?

0:45:26.4 MK: I just…

0:45:27.6 AB: I don’t know. I think…

0:45:29.0 MK: But do you know what I mean?

0:45:31.1 AB: I do know what you mean. I think there are mental models and frameworks that I lean on from either my experience or having really, really good mentors and managers in the past that have helped me think about things in different ways. And I think when I talk about things like expanding your product management craft, or your product management tool kit, it’s more than just things like, do you prioritize using the RICE framework? It’s these types of things like, how do we actually think about that? Do you think about things in problems and solutions, or you think about things in terms of requirements and whether or not there’s a top priority, meets-all requirement, and everything else is just kind of nice-to-haves? And I would say probably the biggest challenge I see hiring product managers is everyone comes with a different toolkit, with a different set… Like, some people have a hammer and some people have a wrench and some people have a screwdriver, whatever, and you want to give everybody every tool, but you can’t. Realistically. We live in a world of constraints, we can’t. And so figuring out how everybody can…

0:46:35.7 AB: I don’t know, build a house with a very different toolkit is actually kind of the trick. And so I don’t think everyone shares what I’m saying. I’m an n equals one. I don’t think that I’m right. But I think if I can take what I have in my toolkit and I can munge it with what everybody else has, I can enrich it, to use actual data lingo, with what everybody else has, then we can create real value, because we all have this really diverse set of tools that we can use to build something. Or it could actually fail miserably because nobody has a screwdriver, and we’re trying to build a house and you can’t build a house without a screwdriver. And then I would just say, your manager should probably hire somebody with a screwdriver.

0:47:22.3 TW: I randomly heard somebody say that, in many cases, there’s more than one right answer. And this was like three weeks ago. And I was like… I had not been able to let that go, ’cause I’m like, oh my God… This was kind of in the context of making your decision about life, but I was like, it sounds like the same thing. If you bring in a useful framework or a useful mental model, it doesn’t mean that’s the only useful mental model, right? There may be some that are bad, but there can easily be two or three different approaches that are all totally equally valid. And if you can blend them together, maybe even better? Is that fair?

0:47:58.3 AB: Starting to border on Buddhism, or any sort of like moral philosophy. There is no right, there is no wrong. There is no self.

0:48:10.4 TW: There can be more than one right answer.

0:48:14.7 MH: Okay, so we do have to start to head to wrap up, but I wanna ask a question that I hope will be applicable. A lot of our listeners are subject matter experts in analytics and things like that. From your experience, hopefully this is a little easier because I’m not asking for you to speak for the entire industry…


0:48:30.9 MH: What attributes make for the best relationship or productivity between an SME and you, from your perspective? So when you’re working with a subject matter expert in analytics, do you want a cantankerous type of Tim Wilson character, or a more… No, I’m just kidding.


0:48:51.8 MH: No, but talk about some of those things. ‘Cause I think all of us are engaging with this at different levels, depending on our companies and things like that, and I can see that a platform product manager could be a huge asset to a group of subject matter experts, if they could be able to work effectively. So from your perspective, what does that look like?

0:49:14.0 AB: My number one answer, and this is kind of a trait that I look for generally when I build teams, is curiosity and a willingness to learn. That’s it. Above anything else. And the reason I always look for this trait, especially in subject matter experts, is it’s actually really hard to come by once you really specialize, once you have really specialized knowledge. Because you’re so deep into maybe the weeds or the precision, that you say, “This is the right way.” And the right way is defined, right, by some metric of success that you set for yourself. Whether it’s fidelity to the philosophy of experimentation, or whether it’s right because the person you respect the most said it was right and you read their blog every day, or whatever it is. And so the ability to take the modality of being a subject matter expert and to stretch outside that and say, what are you trying to accomplish, Product Manager? And the product manager doing the same thing: you tell me what you’re seeing, Subject Matter Expert.

0:50:10.2 AB: If both of those folks have a curiosity and a willingness to learn, and then adapt to new information that they’re hearing, that by far is the hallmark of what I would consider a really great partnership. You can’t just throw things over the wall and say, “All right, I’m the product manager, I’ve done this, now you go do this, Subject Matter Expert. Let me know when you’re done and throw it back over.” It has to be kind of this nebulous, deep-in-the-trenches-together kind of partnership, if you’re looking to deliver outsized value in your business. That’s just my opinion.

0:50:41.3 MH: I like it. All right. We do have to wrap up, but this is unfortunate ’cause this has actually been a really awesome conversation. So thank you so much, Austin…

0:50:49.7 AB: Thank you.

0:50:51.8 MH: For taking the time to do that and representing an entire industry of product management, lock step together, authoritatively…

0:50:57.9 AB: I’ll not be checking my LinkedIn for a long time.


0:51:03.8 MH: The title of your new book, Product Management Done Right.

0:51:07.6 TW: Was it Cunningham’s Law? What was it?

0:51:10.3 MH: Yeah, Cunningham’s Law. I wrote that one down. That was awesome. All right, well, one thing we do love to do is just go around the horn and share a last call. Something of interest that you’ve run across recently that might be of interest to our listeners. Austin, you’re our guest, do you have a last call you’d like to share?

0:51:26.1 AB: Yeah. I’m gonna pick somebody who I read very often. His name’s Ben Thompson. He has a blog called Stratechery.

0:51:37.2 TW: Oh. Yeah. Big fan.

0:51:39.3 AB: It is an incredible, incredible way to think about the commercials of very technical subject matter. He did write a post, maybe it was in October or November, called The AI Unbundling, that helped abstract all of this stuff that we’re seeing with AI into really human-level things. Like the creation of ideas, and the substantiation of ideas, and the distribution of those ideas to others. And he talked about it through the course of history: newspapers were able to help us distribute ideas more effectively, but they couldn’t substantiate or create ideas really quickly. And AI can now help us do that with things like image generation based on a single string of text. And I thought that was a really unique way to think about it through a human lens. So yeah, I’d suggest reading Ben Thompson.

0:52:26.6 MH: Nice. Awesome, thank you.

0:52:29.3 AB: Yeah.

0:52:30.2 MH: All right. Moe, what about you? What’s your last call?

0:52:31.3 MK: I am so excited about this. I’ve been telling everyone about this, and I… Anyway, people are genuinely surprised when I tell them I don’t listen to a lot of podcasts. I find… I have a 10-minute walk to work, I can’t click into something and then zone out, so I often listen to the news or genuinely I actually like having a walk that’s like tech-free. So I don’t listen to a lot of podcasts, but this one I am now obsessively listening to. And I think the reason I am is because the episodes are actually like a couple of hours, so I can listen to 15 minutes here, listen to 10 minutes there, and it’s almost like a book, which is why I like it. But this one particular episode was the episode that got me totally obsessed with it. And it is the Acquired Podcast…

0:53:16.6 TW: This could be Joe Rogan? Is this gonna be…


0:53:19.2 MK: No.

0:53:20.4 MH: Stop!

0:53:21.3 MK: You’re not funny.

0:53:21.4 MH: Stop it, Tim.

0:53:22.8 MK: But the Acquired Podcast, which so many people I’m sure will have heard of, because they’ve done like 12 seasons. But they just did an episode on LVMH, which, honestly, I was binge listening to. I was taking my dog for extra walks so I could keep listening. Because it talks so much about luxury brands, which was really fascinating, and the evolution of those brands. I found that side of the house quite compelling and just fun to listen to because it was a different topic to what I’m used to. But then it also talks so much about company takeovers and stock and a bunch of concepts that I probably don’t know as much about as I would like to, but with this framing of these luxury brands. And honestly, I was so hooked. Like I have been telling everyone left, right and center about this podcast, this particular episode about LVMH. Yeah, and now I’m going back and listening to older ones on other companies. But like I said, it’s got me hooked.

0:54:24.9 MH: Cool.

0:54:25.6 TW: Nice.

0:54:27.7 MH: I’ll listen to that, ’cause I sure love luxury items. So… No.


0:54:32.4 MK: No, but one of the things they talk about actually is the difference between luxury and premium, which is not something I’d actually given a lot of thought to, but it’s all about the utility of the item. And then I was like, “Oh, that’s actually… ” Like I don’t know, there’s just the framing, and I think the two hosts of the show are phenomenal. And they had this whole analysis bit at the end, which I also really love. Anyway, I’m gonna shut up.

0:54:53.3 TW: You should listen to the episode on Viv… I can’t even say it.

0:54:56.2 MK: Veuve Clicquot?

0:54:57.4 TW: What’s your…

0:54:58.2 MH: That’s the one.

0:55:01.5 MK: Yeah. I don’t…

0:55:02.0 TW: I don’t think they have an episode on it, but if they did…

0:55:02.3 MK: No, but they’re thinking… One of the hosts wrote his PhD on champagne houses. So I’m actually considering tweeting them to see if they’ll do one on champagne houses, but I am gonna shut up now.

0:55:14.5 MH: Tim, you jumped right into it, and Moe was ready for you.


0:55:18.6 MH: Alright. Tim, what is your last call to share?

0:55:22.7 TW: So I do listen to a ton of podcasts, and I have been driving a lot this year, so I’ve had to actually expand some of the podcasts. So I’m gonna go with one episode from The Axe Files, which is David Axelrod’s. And he talked to Theo Epstein. So even though he is a political wonk, he’s had a couple of talks with Theo Epstein. If you have zero interest whatsoever in baseball at all, then just jump ahead a minute. But if you’re anyone who’s aware of Major League Baseball, they made a bunch of changes this season, like three big changes and some other smaller ones, that literally everyone loves: the players, the managers, the fans. And Theo Epstein, who was kind of like Moneyball V2 with the Red Sox and the Cubs, was actually part of the consulting with MLB to make those changes. And he walks through… If you listen to it with kind of a data or an analytics ear, he talks about the market research they did of the fans. It was kind of a shift: all the Moneyball stuff and The MVP Machine was around optimizing at a team or a player level to drive wins, and what that did is it kind of screwed up the fan experience. The games got longer, what was happening was crappy.

0:56:41.7 TW: So to see… Theo then jumped, and it was MLB saying, “Oh crap, we’ve got to actually improve the game.” They did a bunch of market research. The different ideas they had, they basically played about 8000 games of minor league baseball where they experimented with all of the different changes. He talks about some of them that they didn’t roll out because of those experiments. The episode was recorded… Not this episode, but the episode of The Axe Files recorded at the end of spring training, so the season hadn’t started. He’s rattling off what they’re seeing in the data from spring training, and now that, as we’re recording this, we’re kind of into the Major League season, and the numbers, like the reduced length of the games, it’s like spot on. So it was just crazy. I thought I was gonna be listening to some like political discussions, and I was like, “Oh, Theo Epstein, what’s he talking to him about?” And then I was like, “That is a guy who thinks broadly and deeply about how to apply data, and actually framing the problem really, really well.” So it was a really fun, fun listen.

0:57:43.5 AB: Very cool. MLB needs overall… OECs. Guardrail metrics.

0:57:48.6 TW: OECs? What are OECs? Your fancy product management talk.

0:57:55.3 AB: It’s overall evaluation criteria. I feel like some of our listeners might know that one. But it’s the idea that, yeah, you can have hyper-focus on a single metric, but there might be unintended consequences elsewhere in your conversion funnel or another complex system that is your product. And it sounds like MLB is starting to figure that out.

0:58:14.3 TW: When he even talks about… He was like, “When I was in the front office of a team,” he’s like, “I couldn’t worry about the health of baseball. I was compensated and driven to win as many games as possible.” He’s like, “That wasn’t my role at all.” So… Yeah, but definitely unintended consequences. But then he’s been part of the solution, so very cool.

0:58:36.4 AB: Cool.

0:58:37.3 TW: Michael, what’s your last call?

0:58:38.3 MH: I’m glad you asked, Tim. You may have heard about this AI thing that’s been going around Twitter. And one of the things you probably have heard is that you have to be good at prompting the AI. So there’s even like a new career field called prompt engineering, which is ridiculous, but we’re not talking about that. But it does matter how you talk to the AI a little bit, and so Microsoft actually published a brief sort of tutorial for how to start out with or think about different aspects of prompt engineering. Which I found was actually a good read. If you’re trying to lean into AI a little bit, or use it in some capacity, or starting to integrate it, it’s good to review stuff like that. Our audience is pretty technical, but it’s still very helpful. So we’ll post a link to that. But it was a good read, kinda helps think through, okay, yeah, if you ask it this way, you’ll get a different reaction than if you ask it that way. I don’t think it matters if you’re polite to it or not. I don’t think that’s part of it. But you do see people talking…

0:59:43.0 MK: I think you should be, because that’s what our children hear. So when I talk to Google, I say please and thank you.

0:59:48.7 MH: Oh boy.

0:59:49.6 MK: Yeah.


0:59:51.5 MH: I’m not disagreeing with you, Moe, but an AI is not a person and has no concept of right or wrong. And we’ll leave it at that. Okay, you’ve probably been listening to this episode and like us you’ve been having a blast, and we would love to hear from you. Maybe there’s questions or ideas or things that you’d like to share. You can easily reach out to us. The easiest way to do that is on the Measure Slack group, or on our LinkedIn page or on Twitter. And we haven’t got a Blue Sky invite yet, but we’re working on it. No, I’m just kidding. We’re not working on it, but if it keeps taking off and you have an extra one and you wanna throw it our way, just contact Tim. Anyway…


1:00:35.3 MK: Jesus.

1:00:35.7 MH: Hey, we have this platform, why not ask for a Blue Sky invite? I mean, come on. Anyway, and again, no show would be complete without a huge shout-out to our producer, Josh Crowhurst. Thank you, Josh, for all that you do to make the show possible. We appreciate it. All right, Austin, once again, thank you so much for coming on the show, sharing the opinions shared by all… Okay, I won’t make that joke again. But really loved getting your perspective on some of this problem space and your approach to it. It’s been very informative and helpful. So appreciate that very much.

1:01:10.6 AB: Thanks a ton for having me.

1:01:10.7 MH: And I know, if you’re listening, my two co-hosts would agree with me, no matter if you are a platform PM or a general product manager, one thing is for sure, that you should keep analyzing.


1:01:26.7 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @AnalyticsHour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

1:01:44.6 Charles Barkley: So smart guys want to fit in, so they made up a term called analytics. Analytics don’t work.

1:01:51.2 Kamala Harris: I love Venn diagrams. It’s just something about those three circles and the analysis about where there is the intersection, right?

1:01:58.6 TW: Right now, even though I have zero Hispanic heritage in either myself or my wife, I have been making my own flour tortillas for 15 years, our own guacamole, our own salsa…

1:02:09.0 MH: Tim makes a mean fajita.

1:02:11.2 TW: Yeah.

1:02:11.5 MH: Because… Delicious.

1:02:13.3 MK: But you should be making corn. Corn is like the superior tortilla.

1:02:18.9 MH: It depends on the application, Moe.

1:02:19.6 TW: You know, Moe, it’s been nice doing a podcast with you…


1:02:23.0 MH: We’ll never talk to you again.

1:02:26.0 TW: Rock, flag and Cunningham’s Law.
