#293: Tool Selection and the Unhelpfulness of Feature Comparisons

The one rule about the Analytics Power Hour is that we don’t talk about specific tools. But that doesn’t mean we won’t talk about tool SELECTION! Jason Packer recently released the second edition of Google Analytics Alternatives (also available on Amazon), and his approach in the book is very much not an RFP-like “check which features your tool offers” system. And his rationale for that seems just as applicable (to us, at least!) to any data platform selection, be it a digital/product analytics platform, a BI tool, database or storage infrastructure, or, well, you name it! Ultimately, the challenge is how to go about getting a reasonably strong understanding of the philosophy and historical roots of each platform being considered and then marrying that up with the foundational priorities and needs of the organization. Is that a lot harder than a feature checklist? Yes. But them’s the breaks.

Links to Resources Mentioned in the Show

Photo by Alexander Schimmeck on Unsplash

Episode Transcript

00:00:05.76 [Announcer]: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

00:00:17.05 [Michael Helbling]: Hi everybody, welcome. It’s the Analytics Power Hour. This is episode 293. Okay, listen, we draw a hard line on this show. We don’t talk about tools, but we never said anything about tool selection. And let’s be honest, we have all been there, trying to figure out which vendor to go with after putting tons of effort into our carefully crafted spreadsheet with all the selection criteria, which somehow every vendor says, yes, they can absolutely do all the stuff on there. It’s enough to make a person cynical. And we analysts don’t need help with that. So take a pause from reading the cold sales emails from the latest analytics AI SaaS vendor, and let’s talk about the ins and outs of selecting a tool. But first, let me introduce my co-hosts. Tim Wilson. Or as I like to call you, Tim “Tool Selection” Wilson. No. How are you doing, Tim?

00:01:11.97 [Tim Wilson]: I’m just about ready to select a new podcast recording platform. Oh, perfect.

00:01:18.30 [Michael Helbling]: That’s going to probably trigger a bunch of inbound emails. All right. Moe Kiss, how are you going? I know you do a lot of vendor evaluation and selection in your role.

00:01:31.25 [Moe Kiss]: I certainly do. I’m very pumped to talk about this. I think the only thing you missed is the like, oh, don’t worry. If we can’t do it yet, it’s on our roadmap. Oh, yeah.

00:01:40.86 [Tim Wilson]: That happened yesterday to a client. They turned to the vendor and they’re like, yeah, we can’t do that. And the response from the client, which is a very large company, was like, well, is it on your roadmap? It’s on our “Q n plus one” roadmap.

00:01:57.02 [Moe Kiss]: Yeah.

00:01:59.38 [Michael Helbling]: All right, and I’m Michael Helbling, and we wanted to bring on a guest, and we found a great one. Jason Packer is the founder of Quantable Analytics. It’s an analytics consultancy focused on analytics engineering and implementation. He’s also the author of the book, Google Analytics Alternatives, now in its second edition. And the genius behind the Measure Music channel on the Measure Chat Slack group. And now he is our guest. Welcome to the show, Jason.

00:02:24.65 [Jason Packer]: Thanks, Michael. I’m really happy to be here. It’s a bucket list item to finally make it on the podcast.

00:02:30.44 [Michael Helbling]: Well, it’s awesome to have you. All right, so maybe to kick things off, Jason, just walk us through what was behind the idea of writing the book in the first place.

00:02:42.55 [Jason Packer]: Yeah, so I’ve always been really interested in evaluating software and knowing what’s out there, even back to my early days as a Unix administrator and software developer. I liked looking at all the different tools. And back in the era when the Google Universal Analytics sunset was coming up, there were a lot of people asking these questions. There were a lot of people asking me these questions. And so I thought, well, I may as well start doing this research. Seems like a fun thing to do. I started out thinking, well, maybe I’ll write a series of blog posts. And then someone at the Columbus Web Analytics Wednesday at the time said, well, why don’t you just write a book, Jason? And that seemed like a good idea to me. And so I did it. And now, a few years later, things have changed, there are some new tools I wanted to look at, and I thought I would just, you know, make the same mistake again. So here we are.

00:03:47.46 [Tim Wilson]: Wait, who was it? Who was it at the Web Analytics Wednesday? Who said that?

00:03:51.94 [Jason Packer]: It was Ahmad. Ahmad. Oh, OK. Nice. Which I think I credit him for in the first book, at least for the idea.

00:04:02.15 [Tim Wilson]: Look, I read it. I didn’t memorize the acknowledgments. Jeez. Come on, Tim.

00:04:08.73 [Moe Kiss]: But it sounds, Jason, like a big part of your process and understanding the capabilities of a tool is really playing with it, right? And I think one of the things that I’m often thinking about is, I see folks trying to evaluate tools without getting their hands dirty. So do you think that’s what everyone should be doing, or is that just the thing that’s always worked for you?

00:04:32.16 [Jason Packer]: Well, I think everybody loves to have an opinion about a tool, and it’s very easy to form an opinion. You get in there, you see how it looks and how it feels, and that’s fine. I have opinions about that too, but you really have to balance that against really learning what the tool is about. And for me, the way to do that is to use it, and to use it with real data. Not to watch videos about it, not to be walked through a demo by somebody, but to install it on a website, even if it’s just a trivial website. Install it and use it. That’s how I learn best. That’s how I learn most quickly.

00:05:17.98 [Moe Kiss]: The bit that I’m taking away from that is it helps you understand it. But how do you think it changes the evaluation process itself?

00:05:28.04 [Jason Packer]: I think using real data will show you a lot more about where the issues are. For example, if you’re working with a vendor and they walk you through it, they’re going to show you the highlights. They’re going to show you the things that work well. They’re going to show you a tool that’s completely, perfectly set up. And we all know that’s not how it is. In the book, everything that I evaluate, I used on real websites with real user data. For example, one of those real websites had a terrible bot problem. It was a site that I bought on the secondary market. I didn’t make the website, I just bought it. It had some real traffic, but it was just littered with bots. And so the traffic looked really weird, and there were all kinds of strange hits to pages that weren’t there. But that led me to learn a lot about how these different tools worked in the cases where there are a bunch of 404s or huge amounts of bot traffic. No vendor is ever going to show you a demo where 90% of the traffic is bots. That’d be crazy. And in some ways, it can be challenging to do that, because that might not be your use case. So a lot of what I talk about in the book is use case match. The challenge as a tool evaluator is to match the constraints of your use case to the best match of a tool. Like I said, opinions, everybody’s got them. And there are ways in which some tools are more technically advanced than others, or some tools are faster than others, or whatever. But it’s really about matching use case to tool through the lens of those constraints.

00:07:30.86 [Tim Wilson]: Back to using the actual data: the book was kind of digital analytics, product analytics stuff, but I would put BI platforms in there, data warehouse platforms, all of those, when it’s like, you want to try it with your data. A really high bar, or a real challenge, seems to be: we want to do a bake-off, or we want to do a proof of concept. We want to try it out. I’ve gone through processes where it’s like, we’re going to do the RFPs, we’re going to select some finalists, and we’re then going to do a bake-off. And that does mean you’re fundamentally doing some sort of mini implementation. And that can include getting through some compliance hurdles to say, yeah, we’re using our real data. Or do you say, well, we’re going to dummy up the data, make an effort so it’s kind of like our data, but anonymized to the point that it’s not our data, yet it still mimics our data enough that we could actually try it in this platform? To me, that’s what motivates a lot of the not wanting to go through that process.

00:08:48.93 [Jason Packer]: Ideally, it would be great to use your actual data and to do, like you say, a real mini implementation, but that’s just not feasible in a lot of cases.

00:08:56.60 [Tim Wilson]: I mean, Moe, have you done that?

00:08:59.32 [Moe Kiss]: Yeah. I’m not going to beat around the bush. I do a lot of analysis of different vendors and different tools and that sort of stuff. I would say I definitely lean towards the, we should do multiple POCs. Like, the last major tool selection we did, I think I wanted to do maybe four POCs. And obviously that’s a negotiation with the business and capacity and things like that. We ended up agreeing on two. But I think the thing that I found really hard is, for the folks doing the evaluation and the assessment, I don’t know if the incentives are always there to do multiple POCs. I find that hard to reconcile, because it is really hard to understand how good a feature is, or a particular capability that you’re looking for, without stress testing it. And yeah, I don’t know, maybe I just err too far on the POC side. I think folks internally would probably say I do.

00:10:04.27 [Jason Packer]: I think that’s really challenging, right? Because a POC is great, but even before you get to that POC, you want to feel like you’ve narrowed it down to something that’s worth the effort. And for me, part of that can be not even doing a real POC, but doing a toy test. Oh, let me do it with my podcast website. Let me do it with a personal website or whatever. That’s part of the reason I’m also a big proponent of a free tier, even on enterprise tools. That can be a challenge, right? Not everybody can offer that. Sometimes, if you’re talking about a huge BI platform or something, what would a free tier even mean, if even doing a simple example implementation means putting in 100 hours of work? But the ability to get a little bit into the product before you really start talking about committing company resources to it matters, I think. Because I do love the POC approach, and the more, the better. But it can be hard to get those resources, for sure.

00:11:23.11 [Moe Kiss]: Also, just getting it through security is a really big step. You’re basically doing a procurement process for something that you’re running a POC on. It takes a lot of time and energy, but I obviously am very biased here, because I lean strongly on the side that that’s worth it. That’s my lived experience, but yeah.

00:11:46.77 [Tim Wilson]: Well, Moe, have you run into, because I can see the downside: say it’s only two. You get down to two tools, and you get in, and you’ve got multiple people who are all trying it, and they all have different things they most care about. And then you get to the end of that, and you’re like, all we’ve done is allowed people to dig their heels in further on their preferred tools, because now they have hard evidence that that other tool doesn’t do this thing that I think is really important, and it does this thing. Do you wind up saying, well, we’re hoping that we arrive at a clear winner, but even if you do a POC of four tools, there’s still not one clear winner, and you’re still in kind of a negotiating phase? And you’re also setting up the people who didn’t back the ultimate winner to be able to say, see, we did the POC, and I told you we shouldn’t have picked that one. Sorry, that’s just depressing me. No, no, no.

00:12:45.80 [Moe Kiss]: I can still remember, a few years ago, we were doing a BI tool selection. It must have been like five years ago, and all the data analysts got in a room. And, this was the absolute worst way to do it, I would never ever do this, but we were like, how important is this thing to you, and everyone would go to one side of the room or the other side of the room. And almost every time, I was on the side of the room on my own. So, suffice to say, we did not pick the tool that I wanted, but it is what it is. I think the thing that I find so difficult about data tools in particular, and I know we had Colin from Omni on previously talking about how BI tools especially are trying to be many things to many different people, is that data folks have very strong opinions about the things that they do and don’t want to work with. But their opinions are normally representing what is best for them and not always what is best for the business. And that’s human nature, right? You think about what’s going to make your own job easier. And so I often come with this perspective that a data tool is actually for our stakeholders. So even if it’s a little bit trickier or a little bit harder for us in our day-to-day, is it going to make our stakeholders’ relationship with data better? Because I will upweight that. But I’m not sure that’s necessarily a common view.

00:14:13.11 [Tim Wilson]: Michael, what’s your relationship status with SQL?

00:14:20.42 [Michael Helbling]: Oh, I think you know it’s complicated. It keeps gaslighting me with a “syntax error near FROM,” like, I don’t know where FROM lives.

00:14:30.46 [Tim Wilson]: Well, here’s a healthier relationship: Prism by AskY. You ask in plain English; Prism writes the SQL.

00:14:37.02 [Michael Helbling]: Ooh, like Revenue by Channel week over week, excluding refunds. And instead of me crafting a 47-line query and a three-line apology, Prism just does it?

00:14:46.36 [Tim Wilson]: That’s right. The best part? It doesn’t forget everything the moment you close the tab. Prism’s memory remembers your reality, your definitions, your quirks. I mean, not your personality ones, but, you know, your coding quirks.

00:15:00.86 [Michael Helbling]: Well, but like the BigQuery table is the source of truth and conversion means this and not whatever gets decided by somebody like mid-meeting somewhere.

00:15:11.01 [Tim Wilson]: Exactly. So you don’t have to re-explain your business context like it’s a bedtime story for robots.

00:15:17.69 [Michael Helbling]: Yeah, I have to admit I’m a little tired of starting every session with previously on analytics.

00:15:24.28 [Tim Wilson]: And when Prism generates SQL, you get traceability. You can track changes, see what was created, and follow the logic.

00:15:31.25 [Michael Helbling]: I like that, because when somebody asks me where this number came from, I can stop saying, well, from the number tree.

00:15:38.33 [Tim Wilson]: It’s like version control for your analytics brain.

00:15:41.34 [Michael Helbling]: I like it. A little bit of accountability, but it’s convenient.

00:15:45.58 [Tim Wilson]: That’s right, so do you want in? Go to asky.ai and join the waitlist. That’s ask-the-letter-y.ai and use code APH to go to the top of that waitlist.

00:15:58.78 [Michael Helbling]: I like the idea of letting AI write some of the SQL.

00:16:01.56 [Tim Wilson]: And let your memory do literally anything else.

00:16:06.29 [Jason Packer]: No, I think it’s not. And, you know, everybody also wants to work with the tool that’s good for them. Right? This idea of, hey, I want to work with the new tool, I want to work with the cool tool, I want to work with a tool that’s good for my career, a tool that my LinkedIn posts are going to do well with. And a lot of times that’s not the right fit. It’s really about the whole organization, not just the analysts. A lot of times the analyst isn’t even really the one the tool is for.

00:16:44.96 [Michael Helbling]: Flip it around and people want to work with a tool they’re familiar with. I used this in my last job, so I want to use it here.

00:16:51.57 [Tim Wilson]: Which was good when GA4 came out and Universal Analytics got sunset, then it was like, well, nobody’s familiar with it. So reset, yeah.

00:16:59.78 [Jason Packer]: Yeah, that’s what I was going to say, too: a lot of times a tool switch is not the right answer. We all like to think, hey, there’s a perfect tool out there that’s going to fix my problems, make my personal life better, make my company do better, et cetera, et cetera. But there’s no perfect tool. The grass looks greener, but a lot of times the tool you have now just isn’t implemented correctly, and the new one you get isn’t going to be implemented correctly either. That can be a real challenge too, especially if you’re like, hey, we’re going to do two POCs and put in all these resources, and in the end, we’re going to say, oh, actually, I think the answer is that we stick with what we’ve got and we just spend a little more time trying to improve our implementation. Nobody wants that answer.

00:17:52.95 [Moe Kiss]: In your experience, talk me through: there are trade-offs, right? We’ve all said no tool is going to meet the brief perfectly. How have you approached balancing those trade-offs? What’s your thinking? And when you’re working with businesses, how do you convince them of the trade-offs they should make versus shouldn’t?

00:18:11.59 [Jason Packer]: Yeah, it’s really difficult, because how I evaluate the tools for the book is a totally different mindset than how I think when I’m talking to an organization. A lot of times, I won’t even really be talking about the same things. In the book, I talk about the underlying tracking structure of different tools, the databases that different tools use, how they work with consent, things like that. And when I’m talking to a particular business, I listen for what their real pain points are. Is this an organization that just needs to get off of GA because of compliance issues? Then I focus their selection on solving those pain points as directly as possible, while also trying not to get into the weeds with them about the details of the tools that the people listening to this might find interesting, because they’re not going to find that interesting.

00:19:20.08 [Tim Wilson]: I think you just kind of mixed it, because part of what you did, and what I loved about both editions, because the structure stayed the same, is that the tool-by-tool, blow-by-blow write-ups, and it’s not feature by feature, are the second half of the book. The first half of the book is, you’ve got to have kind of a framework for what matters to you. You admitted throughout, there is no perfect categorization, but you just talked about one of those being the tracking methods. I could see the right company saying, we’ve been getting burned by our current tracking method and we have got to find something else. You’re like, cool, well, let’s think about the philosophical differences between the tools. If somebody else says, we just need something super cheap, it’s like, okay, well, then let’s talk about the nature of your digital experience and the different pricing models. Or somebody says, we just have to get off of GA because of compliance, we actually love everything about it, but our compliance team has said we have to get off of it. I would say the example you just gave is how you approached the book. It’s just understanding what attributes truly matter and then going deeper there, right?

00:20:51.82 [Jason Packer]: Yeah, I think actually that’s fair. One of the things I talk about is how that’s all about constraints, and how price is a constraint. Price is a really important thing for organizations. It’s not the coolest thing to talk about when it comes to tooling. It’s just a question of how you’re engaging with the decision makers, I guess. The things in that first half of the book are a long list of the things that I think about. I might think about a bunch of those when talking to a particular organization about a tool. I might not be talking to them about all those things, but I’m certainly thinking about a lot of them. I think it’s important to understand them to a certain degree. For example, in the new edition, there’s a chapter on server side. Obviously, I’m not going to teach someone everything about server-side analytics in a 3,000-word chapter of a book that’s not primarily about that. But understanding at least enough about it means that when a vendor says, oh yeah, we support server side, it’s easy, this is what you do, you can interpret what they’re saying. You can know, oh, well, really kind of anybody can do server side. It’s not really about the tool, it’s more about the deployment. Oh, are you using server-side GTM to deploy that? And if you are, then this, and perhaps the real underlying problem is tracker blockers or something like that. And then your lens for viewing that is different. So that’s why I think the first half of the book, the guide part, rather than the product evaluations, is the lens through which I look at all product evaluations, and I’m trying to share that viewpoint in the first half. Tim liked it, at least.

00:23:08.83 [Tim Wilson]: Can I ask, and this is probably also a question for Moe: Jason, you said in some of the earlier discussion that you explicitly did not talk to the vendors, even though, especially after the first edition, they knew you were doing the second edition. And they’re like, come on, just let our sales engineer help you out. I think you did that to say, I want a level playing field, and I need to finish this book at some point. And if doing 15 POCs is tough, letting their sales teams get their hooks into you would be absolutely impossible. Whereas, Moe, I feel like if you’re down to a couple, where does sales play? So I don’t know, maybe you can talk through that.

00:23:56.68 [Jason Packer]: I mean, yeah, that’s sort of an unusual choice that I made in the book. I definitely have talked to, and I know, a lot of really great people at a lot of these vendors, especially after the first edition, and a lot of them told me what I got wrong. Not as many as some, though. But it was important to me that I was really, really fair, more than I was particularly making any value judgments or anything like that. The not engaging with them is about leveling the playing field to some degree. It also fits in well with how I learn, like I was describing with learning from doing. Again, getting a demo account, or some kind of account where I can use the product, is the fastest way for me to learn, rather than being on sales engineering calls. But that was my case, writing the book. That’s different from the case of engaging with vendors from a large org that has specific needs. Engaging with vendor reps can be really, really helpful, and it also gives you an idea of the culture fit between the product and your organization, which is a real thing. Something that, when I started the first edition of the book, I didn’t expect to be so important, but is, I think, quite important.

00:25:36.33 [Michael Helbling]: Do you hear that, Tim? Culture is very important. I just wanted to reiterate that point really quickly. Sorry, you’re going to mo.

00:25:45.99 [Moe Kiss]: Just to add to that, I have personally found that engaging with sales, engineering, support, whatever, is a really big part of the process, because I want to make sure that we can learn from their expertise, that we’re not facing challenges that are very easily fixed. And I think part of evening the playing field is making sure that you get that with all the companies that you’re POC-ing. It’s not a favorites game. And you’re so right, Jason. Such a big part of it is about the culture, or the ways of working, that you then get to explore with that other company. And very transparently, I’ve talked about our relationship with Snowflake quite a bit, and a big, big part of our success, I will rail on about implementation for years to come, but a big part of it is we’ve had really close relationships with their product teams, with their product managers, their tech leads. We will have calls testing out new features and new functionality and being able to influence a roadmap. That is a hugely important thing for us when we’re doing vendor selection, because we want to make sure that in a year’s time, we have the kind of relationship where we can push their product if we need to. And so I think that letting those folks in the room so that we can stress test each other is a big part of the evaluation for me.

00:27:12.97 [Jason Packer]: Yeah, I agree with that. Again, it depends on your organization and why you’re buying the thing to start with. The thing that I hate is, you’re a tiny startup and you’re talking to an enterprise software provider, and you get to the point where, okay, we’re ready to actually talk some real prices, and they’re like, okay, well, for your data volume, we’re starting out at $65,000 a month. And you’re like, what are you talking about? That’s my entire yearly budget for all of my analytics. So I love transparency. I make that pretty clear in the book. It’s just a great thing to get people on the same page as quickly as possible, because that’s super important. And when you are engaging with the vendors, being transparent with them helps everybody. Nobody wants to seem like a dummy when they’re talking to a vendor, but if it’s a new tool, I don’t know the tool. They know the tool. They know their immediate competitors far better than I will. I try to be very direct about, hey, the budget is this, and here’s my seemingly very stupid question. And when you gave me an answer I didn’t understand, I’m going to just ask that stupid question again, because it’s important to everybody that we find the best fit in the most direct way possible.

00:29:03.33 [Moe Kiss]: And I do think, obviously, I come from a place of absolute tech privilege. I think a lot about what the hills are that are worth dying on. We say that all the time. Anyway, I just want to be conscious that other folks have very different budget constraints when it comes to tool selection and things like that. But there are things I will die on a hill for, and one of them is: obviously, budget is incredibly important, but if there is a very good tool, and it is not 10x the cost of the other options, and it is a good fit, I personally think that is a fight worth having with the business. Getting support for that extra budget to make the right tool decision, versus being so constrained by it that you make a really, really shitty choice. And again, not everyone’s in that position. But that situation you just described, Jason: knowing the prices much earlier in the process is absolutely something that folks should be doing. You can’t wait till you’ve done a POC to start getting an idea of the pricing, because if it is way out of the realm of possibility, you don’t want to waste your time and energy on it.

00:30:12.54 [Jason Packer]: I guess I haven’t thrown Google under the bus yet. There we go. Here we go. Yeah. One of the things that I think Google made really hard is that, with Universal, they made a pretty darn good product, and they made it free to just an incredible degree. It was technically free up to 10 million hits a month, and in reality, it was quite a bit higher than that. And with GA4, of course, there’s no hard event limit; the one-million-events-per-day export limit to BigQuery is probably the thing that people hit first. But they’re giving away so much for free, and that’s really caused people in the industry to think that analytics should be basically free, that the software should be free. And it’s very distorting. It makes things really hard for new tools to come out there and get a foothold in the market. It makes the case you’re making harder, Moe: hey, we need to understand that even if this tool is a little bit more money, think of the cost in people, the cost in data decisions in the organization. Otherwise you’re really undervaluing analytics, and part of undervaluing analytics started with GA being free. And that’s still happening.

00:31:53.35 [Moe Kiss]: Oh, Jason, I feel like we could sit around and chat for hours, because I think fundamentally one of the biggest mistakes I see is, yes, folks want to work on cool shit to put on their resume or LinkedIn or whatever, but it’s also the open source fallacy, or the free fallacy, which is: oh, this is open source, or it’s free, it’s not going to cost us anything. I will push pretty heavily on, that does not mean it’s free. We need to actually think, we’re talking about a solution here that has five full-time engineers supporting it. That is not free to me. That is actually a huge cost to the business. And if we want to do that because we think that’s the right decision, that’s okay. But that needs to be a line item in our decision as well, not just the on-paper cost of the tool.

00:32:39.25 [Tim Wilson]: That also then extends to support: if we have those five engineers and one of them leaves, what’s the size of the pool of candidates we can draw from to replace them? Which is one of those areas where market leaders tend to have a leg up, and it’s a legitimate leg up. They’ve achieved some critical mass. Nobody got fired for buying Tableau or Power BI. Part of that is because everybody’s been exposed to and is familiar with them, but it’s also legitimate to say, well, if I need a Power BI developer, that’s a much larger pool to draw from, right?

00:33:20.45 [Michael Helbling]: Yeah. Well, Moe, excitingly, get ready for a new wave of that with AI, because now people are going to be like, it’s free, we can just build it with AI. So a question for you, Jason, from your perspective: obviously, we’ve all been through vendor selection processes, and we touched on how Google Analytics is free. As people jump on the AI bandwagon and see how easy it is to prototype things, not necessarily build full-on products yet, but we’re moving in that direction, I would say, do you think the build-versus-buy debate changes a lot in the future?

00:34:08.82 [Jason Packer]: Yeah, I think so. I think that's already happening. I think that it's happened not exactly with AI, but in the simplified realm of these tools. In the book, I call them simplified web analytics tools, including things like Plausible and Fathom. And there are so many of those tools out there now. Every few months, a new one comes out. Some of them are quite good. It's not a hard thing to prototype, and it's even easier with AI. I could go in, especially if you start with the ones that are open source, and be like, hey, build me an Umami clone, here's the Umami GitHub repo. And you could build yourself something like that pretty quickly. And some people are doing that. I think you run into some of the problems that Tim was talking about as far as having expertise with the tool. If it's some internal tool, then the internal people are going to be the only ones that have experience with it. And also, when it comes to some of the more complicated underlying database things, that's, I think, far beyond the complexity of what AI can do a good job with. And things like schema, AI doesn't do a great job with either. I mean, it can do okay, but it needs a lot of human hand-holding. So I think that, ultimately, it's a mistake for most organizations to think that they can build their own when there are so many really great platforms already out there. I think it's maybe fine to think, oh, we're going to extend. We're using whatever. We're using PostHog.
And PostHog is an open source tool. It's one of the broadest tools in the market. They have 34 different apps built into the tool. Whether you want session recording or feature flags or whatever, or LLM analytics, they've probably got it. And if there was something you needed to add to that, then using AI on top of that to extend it, I think, makes sense. But trying to build a foundation for your analytics platform, your analytics practice, without that real strong attention to detail that these platforms that have been out there and tested have? It doesn't make sense when it's humans building it, and it's even worse if AI is building it. I think it makes a lot more sense to build on some of the great tools that are already out there.

00:37:26.55 [Michael Helbling]: That gave me a cool weekend idea though. The Umami clone, coming out.

00:37:30.72 [Jason Packer]: Yeah.

00:37:31.04 [Michael Helbling]: Yeah. Just because you can. I don’t know.

00:37:34.58 [Tim Wilson]: But you brought up PostHog, because that's one that I'm not familiar with, but you mentioned them because, well, actually, I think I sat and watched you talking to a different vendor, and you brought them up a lot as being kind of like, who was the tool built by and for? PostHog is of, by, and for the developer. Google Analytics, Universal Analytics, was kind of intended for the more casual user initially, and they tried to stick with that. They're like, this is easy, this is for the marketer. BI platforms seem similar. If you're in Power BI, you're in the Microsoft stack; you're like, this is built for the enterprise that wants the complete ecosystem and progression of tools. What do you think? You've said it here, and you definitely said it in the book: what is the philosophy? What's in the DNA of the company that's building it? Who do they feel is the user that has primacy? Is that a fair… Absolutely.

00:39:01.22 [Jason Packer]: You and I have talked about that. Product philosophy and outlook is more important than any sort of feature comparison. Probably some people have heard me complain about this before, but feature comparisons are helpful in some ways, but they're also already outdated. By the time you've posted your feature comparison list, it's already out of date. And the one checklist item that says it does X? There's so much to unpack underneath that. Their version of X might not be what you really think you're getting when you get that feature. And people want more features. Like I just talked about PostHog, they've got every feature under the sun. They've got session capture, but maybe you already have session capture. You're already running Microsoft Clarity or Hotjar or something like that. So while that looks like a great thing on a feature comparison list, that's not something you need, or that's not something that's going to help you. It's going to add confusion to the product. The more features you add onto a product, the harder it can be to use. That's just the way these suites work. But it can be really hard at the same time to peel back that layer of marketing a little bit and ask, well, what really is this product's philosophy? Who is this for? When you're an analyst and you get in front of, I don't know, Adobe Analytics Workspace, you're like, oh, okay, I get it. This was written by people that have been listening to people like me. This makes sense to me and is useful to my use case. It's clear a lot of times once you get in and use the tool, like I was saying before. But when you're just looking at the marketing, it can be like, well, both platforms say they have the ability to customize reports, done, they're the same, when they can be wildly different.

00:41:10.19 [Moe Kiss]: Talk to me about this philosophy piece because I find this really interesting, the philosophy of the platforms. I think maybe I was alluding to a similar idea before, but how do people figure that out? Is the philosophy who the user is or the direction they want to take it?

00:41:28.25 [Jason Packer]: I think it’s not easy because, in a lot of ways, I think the vendors themselves don’t know a lot of times. It’s a product of the history of the company. It’s a product of what the target market of that tool is, and it’s a product of the people that built it and who they were listening to when they built it. Let’s take, we’re talking about possible on some of the simplified tools, they have a clear philosophy of this is a simple tool. There’s not going to be vast ability to customize reporting. Everything is going to be on one screen or pretty much everything is going to be on one screen. We’re not going to have a drown you in configuration options. This is designed to be simple and part of that is in that as privacy as well, is that it’s you’re giving up some amount of complexity to make it easier and to perhaps make it more private as well. That’s a philosophy. And I think that philosophy for them is, can be pretty clear, right? When you look at their marketing, you look at the sample, you know, you try out a similar product. It can be pretty easy to understand their philosophy when they communicate it well. And it’s not, you know, a sprawling platform with 27 different components or whatever. versus some of the more complicated tools like I talk about piano in my comprehensive category along with tools like Adobe. If you’re looking at either of those tools, they offer so many different features and functionality. And there’s a much more complicated onboarding process that it can be really hard to understand what that philosophy is until you get much further along in the process. I do think that talking to the vendor and engaging with them before you get too far along can help you understand that, but it also can confuse the process too. I don’t know that I have a real great answer.

00:43:37.60 [Tim Wilson]: I like that, because you said you have to sort of look at where the company came from, the roots of the tool. I don't have a million examples, but I look at, like, the BI space. You had Tableau, which was one of the second generation of tools, and it clearly was like: this shit should be drag and drop, and we should be able to customize it to conform to things that Stephen Few would give a 10 out of 10 to. They were coming at it saying it's got to be a drag-and-drop, WYSIWYG interface that you can highly customize into very, very clean visuals. So they were a BI tool that, philosophically, I think, led on the quality of the visualization. Contrast that with Domo, which comes along a number of years later. And I would say Domo was saying, no, no, no, it's all about the ease of connecting to all of your data sources. And they kind of led with the connectors. Now they're competing with each other, so over time, Domo's sales team is saying we're losing deals because our visualizations are shitty, and Tableau's getting pushback saying we're losing deals because we're not easy to connect to all these different things. Neither is necessarily permanently handicapped, but I would say that's where each of those tools' strengths versus weaknesses are. Which means if I'm looking at a BI platform and those two are in the consideration set, I may be thinking: is most of my stuff going to go into a data warehouse, pretty normalized, where I want to hook into it and only occasionally hook into something else? Or are we going to live in a chaotic world where I'm always going to need to hook into a gazillion different data sources that are all messy, and I'm going to need to be able to do transformation within the tool?
I think it does take a lot of maturity or wisdom or thought to try to map which philosophical or historical underpinnings are most aligned with my company's needs, and then stand up and say, and guess what? That means our visualizations will never be as good as the perfect ideal, because that's a lower…

00:46:09.08 [Jason Packer]: Yeah, that’s where I think I talk a lot about understanding fundamentals, how that can be really helpful to close. Some of that you’re talking about is the gap between the marketing that you see from the vendor and the reality. Part of closing that gap and understanding really what a tool is all about can be understanding the fundamentals of how particular things work. If we’re talking about you know, databases, right? If we’re talking about this product uses MySQL and this product uses Snowflake and this product uses Postgres and this product uses Clickhouse. Knowing just a little bit about the differences between those two tools is going to tell you a lot about the product, you know, like if we’re talking about We’re talking about, say, we’re comparing PivotPro and Matomo. PivotPro uses Clickhouse as the database underlying their product, and Matomo uses MySQL. On the surface, they’re pretty similar products, but they end up working quite differently because of that difference in the underlying database. MySQL is a simpler database. It’s something that’s easy to self-host. It’s something that’s easy to see the raw data from. It’s something that’s not super performant in a lot of more complicated analytical queries. And all those things surface in the products. And if you know that background and you know that It’s not like you need to know how to use those tools, but just knowing a little bit. The same is true for like tracking methods like cookies, tracking with cookies versus tracking with this IP plus user agent method or tracking with browser fingerprinting or whatever. Just knowing a little allows you to sort of see like, oh, the vendor says X. Oh, I think what they mean is this. There’s not as much as the vendor might say, it’s not like there’s a million new things and a million new ways to do things. There’s a limited number of ways.

00:48:12.40 [Michael Helbling]: Before I wrap up, I’m going to give Moe the opportunity to jump in one last time, but we do have to start to wrap up soon. But yeah, go ahead, Moe. I know you want to ask one more.

00:48:23.25 [Moe Kiss]: Jason, we've talked about a lot of different concepts and things you need to think about in this whole tooling decision space. If I'm sitting at my desk and I just take, like, your one absolute, the thing that should be most top of mind from all the things we've chatted about today, what would be the one thing where you would say, just pay attention to this and you'll probably make a slightly better decision?

00:48:49.96 [Jason Packer]: Oh, that's a tough question. I might actually say price. So, I'm disappointed in that. I'd like to say something cool, like the fundamental database schema or something like that. But price is a shortcut that does a lot to put you in the right area. It's not what I want to say, but that's where I would go.

00:49:27.05 [Tim Wilson]: Price is one input to total cost of ownership. I mean, that's, again, maybe another one. Have you ever come at it that way, Moe, with any of your… That's a better one, you know, total cost of ownership.

00:49:38.75 [Jason Packer]: Let's just say that. Say I said total cost of ownership instead of price. That's what I meant.

00:49:42.51 [Moe Kiss]: There you go. That needs to be in version three of your book, because I like that framing. Total cost of ownership sounds way better than… I think I do use total cost of ownership.

00:49:52.13 [Tim Wilson]: I don't know if… maybe it's in there. I mean, I think it makes sense if you're going through it differently. Philosophically, how much am I going to have to invest in added tooling to work around a limitation in their tracking or something? It could be, but yeah.

00:50:07.46 [Michael Helbling]: All right. Well, we do have to start to wrap up. This has been an awesome conversation, and honestly, I think everybody deals with this in some capacity in their analyst career. So Jason, thank you so much for coming on the show and being our guest today. One thing we like to do is go around the horn and share a last call. It could be any topic, anything at all, just something that might be of interest to our listeners. Jason, you're our guest. Do you have a last call you'd like to share?

00:50:38.79 [Jason Packer]: So my last call is something that you already mentioned, Michael, which is Music League. Nice. Michael and I, and I believe your sister as well, are part of this. Music League is a friendly competition where every week there's a theme. Like, this week's theme in the Music League that I'm part of is Beatles covers. So everybody picks a Beatles cover that they like, then a playlist is made automatically from that, whatever, 20 songs, and everybody votes on the ones that they like. It's fun as hell. It's not complicated. It's fun to do with your peers, your friend group, your work. We've been doing it on the Measure Slack for, what, three years now or something like that? It's a lot of fun.

00:51:40.10 [Tim Wilson]: Is it in the Measure… what? For somebody who's interested, they have to be in the Measure Slack, and then in the Measure Music channel they can find it?

00:51:47.46 [Jason Packer]: Yeah, that's where the conversation happens. You don't technically have to be part of that, though. Anybody can start a Music League too. And there's also, like, a free…

00:51:57.09 [Tim Wilson]: You know, like, yeah, but we like to do stuff around, you know, us. Don’t just get people to go out and do their own thing.

00:52:05.14 [Michael Helbling]: They've got to be part of the Measure Slack to do this. So join that first. Obviously, top tier. Yeah. And the group is amazing; that's also great. We have tons of cool, fun, music-based conversations with all your peers in analytics, and in my own personal experience, it's a great time. And I've got a great idea, Jason, because we've been growing, and now we'll get the big Power Hour bump on this. We can start different levels of leagues. So there could be, like, a Premier League with relegation and a Championship league, like British soccer, you know.

00:52:44.59 [Jason Packer]: I think I would be relegated. I’m not sure I would like that. I do not.

00:52:47.60 [Michael Helbling]: Well, I probably would be too. I don’t often score very well, but I have a lot of fun. Anyways, it’s also really cool to get a new playlist every couple weeks or so of songs you might not have ever heard or genres you’re not that into. So it’s nice. I like it.

00:53:02.80 [Tim Wilson]: So we do occasionally get comments from people who are like, you guys mentioned the Measure Slack, where it's like: you literally go to measure.chat, and then you join at join.measure.chat. And we'll also have it on the show notes page. For anybody who's like, you guys keep mentioning it, it's in your outro, and you don't have instructions for how to find it.

00:53:22.46 [Michael Helbling]: So listen, if you’re committed, you’ll find your way in. All right. No, thank you. Yeah, that’s awesome. And Jason, thank you for kind of being the oomph behind that, as I know it’s a ton of work on the back end to make it work.

00:53:36.24 [Jason Packer]: The official commissioner.

00:53:38.54 [Michael Helbling]: Yeah, the commissioner. The ska-loving commissioner of the Measure Music channel. All right, Moe, what about you? What's your last call?

00:53:50.95 [Moe Kiss]: Okay, so my husband has been listening to a podcast for a long time that folks will probably be familiar with. I have noticed it indexes highly to men. I know a lot of men that listen to it. I don’t know a lot of women.

00:54:03.93 [Tim Wilson]: Joe, you’re wrong.

00:54:08.85 [Moe Kiss]: Sorry, the Pivot podcast. My husband listens to it on, like, loudspeaker around the house, and it, like, really drives me nuts. And I have not been the biggest fan of Scott Galloway. However, I have had my opinion changed very significantly. I am now a listener of Pivot. I have been incredibly impressed with how they've talked about AI and, like, tech over the last few months, but particularly the coverage of the Epstein files is something that really impressed me, and that's why I've become a really big listener. Scott also, last month, did this resist-and-unsubscribe initiative, which folks might have seen in the media, which was really cool: encouraging folks to basically use our economic power to let tech companies know that we're not happy with how they're supporting the administration. I felt like they were using their voice to share their perspective on something in a really meaningful way. Also, just for everyone out there: check on the women in your life. The last few months have, like, shaken us to the core. So just check in on your wives, your mums, your daughters, all the women.

00:55:38.89 [Michael Helbling]: Nice. Yeah. Great.

00:55:40.63 [Moe Kiss]: All right.

00:55:41.50 [Tim Wilson]: Yeah, great. Yeah, Tim, what's your last call? What have you got? Well, there was this episode of the Rogan… no. So I'm going to do two. They'll be quick. One: David Epstein, who I'm a big fan of, like his books, like his videos. He did a 15-minute video called Why You Should Fail 15% of the Time. And he talks about desirable difficulties, which is a phrase I don't think I knew, but he kind of breaks down the value of doing hard things the hard way, and specifically what that does for you, which a lot of people are grappling with in the world of vibe coding. It's just a well-done video, and he's delightful to listen to. And then, maybe adjacent to that, there was an article, I don't know, it was in the Metadata Weekly, Mark Dupuis, The AI Analyst Hype Cycle. There were some quotes in it that I thought were gems. Like, quote: if AI can only answer questions that have been preconfigured by the data team in a semantic layer, what have we actually built? An expensive natural language interface to existing dashboards. And he kind of makes the case of, where is this all going? It's narrowing down to where what you actually get is maybe not that great. But there was also this: the analysts who thrive will be those who can translate business problems into the right questions, validate AI output, build the context systems that make AI useful, and provide the judgment and recommendations that AI cannot. Which I think a lot of people are saying, but it's kind of a cheap throwaway thing to say, because when I look at what people are then also saying, I did this thing, it often skips those components. So The AI Analyst Hype Cycle by Mark Dupuis is my second one. Michael, what's your last call?

00:57:43.19 [Michael Helbling]: Well, we did an episode a while back talking about semantic layers with Cindy Howson from ThoughtSpot, which was awesome. And we also did an episode about AI where I remembered something Moe said about how letting AIs leverage how queries are being used in the organization is also a way of training the AI. And I read an article recently from Jacob Matson at MotherDuck about rethinking the semantic layer, kind of challenging the idea that a semantic layer is the only way to go. And I just thought it was a cool counterpoint. I don't know that I've got a strong opinion one way or the other. I very much respect the conversation we had with Cindy, and I thought it was really powerful. But there's some interesting research and discovery going on around letting the AI consume all your SQL queries and using that to help it understand some of the context behind where and how your data is getting pulled together. So anyway, it's a good read, good to kind of think through those things. I don't think we've solved it for our industry, so it's early days on all this. Oh, and what's this breaking news? I'm getting word now, straight from our correspondent: there is a book out there that Jason Packer has written called… what's the name of the book again? Hold on, I have it written down: Google Analytics Alternatives. And for listeners of the Analytics Power Hour, he's going to give you a 20% discount. So that's pretty sweet. If you haven't already bought the book, that is the incentive to do so. Discount code APH. We'll put the link to that in the show notes as well. All right. Well, Jason, once again, thank you so much for coming on the show. This has been a lot of fun and a really good conversation. Appreciate all the work you've done. It's a labor of love, I'm sure, just to do all this.
And so, very much appreciate it. On behalf of a vendor-weary industry, I think you're doing us all a big service. So thank you.

00:59:50.85 [Jason Packer]: Thank you. You’re welcome. Yes.

00:59:52.19 [Michael Helbling]: Great time. All right. Well, we'd love to hear from you too, because you've been listening and you probably have questions or thoughts. So reach out to us. You can do that on the Measure Slack chat group, which we've spoken about on the show, as well as our LinkedIn page, or via email at contact at analyticshour.io. We also love to get your comments and ratings on whatever podcast platform you listen to, so please feel free to do that as well. And I think I speak for both of my co-hosts.

01:00:23.72 [Tim Wilson]: Boy, have you skipped a few things that are a little important. If only our show prep had it. So one, just know that Michael, you and Jason and I will all be at MeasureCamp New York on the 28th of March, if you want to see us.

01:00:39.95 [Michael Helbling]: That is true, we will.

01:00:41.09 [Tim Wilson]: But even more important.

01:00:41.97 [Michael Helbling]: I didn't expect there'd be tickets left by now, so I was leaving that out, because it's too late, you probably can't make it. Well, there's something that's available.

01:00:50.97 [Tim Wilson]: If you can get a ticket. More important from an operational perspective: the Marketing Analytics Summit that we'll be at on April 29th. Yeah. Okay, now you're… Yeah, I did skip that.

01:01:04.63 [Michael Helbling]: I did skip that, yeah. More breaking news, I'm getting… yeah, we're going to be at the Marketing Analytics Summit, and we need your help. We want your questions. We've got a very cool survey, with an Easter egg at the end that I had no part of. You'll have to take the survey and ask a question to see it. But yeah, go to analyticshour.io/listener and submit a question. We'll be recording at the Marketing Analytics Summit on April 29th in Santa Barbara, California, and we hope to see you there. But if you can't make it, you can still ask a question, and we may answer it on the podcast. So please do that if you want to ask us a question. And even if you don't want to, push yourself a little bit and ask one anyway. I highly preference questions that make Tim feel uncomfortable. So, like, you know, asking emotional questions about, you know, the best manager he ever had or…

01:02:09.06 [Tim Wilson]: Yeah, luckily, we have not figured out how we’re sharing access to all the questions with all the co-hosts.

01:02:14.85 [Michael Helbling]: Oh, yeah, that’s the little tricky part of that. All right, well, before I forget anything else about the show wrap-up, let me just say thanks once again, Jason. And I think I speak for both of my co-hosts, Moe and Tim, when I say, no matter what vendor you need to pick, just keep analyzing.

01:02:37.26 [Announcer]: Thanks for listening. Let's keep the conversation going with your comments, suggestions, and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst. Those smart guys wanted to fit in, so they made up a term called analytics. Analytics don't work.

01:03:01.49 [Charles Barkley]: Do the analytics say go for it, no matter who’s going for it? So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.

01:03:15.95 [Michael Helbling]: All right. Well, we do have an editor who we’ve been talking so fondly about. So we can stop and start as needed.

01:03:31.43 [Tim Wilson]: Well, without further ado… well, actually, before we start: so, Moe, are there any… because, I mean, you're kind of often in the midst of vendor selection stuff. So are you comfortable? If there's anything you talk about, you can edit yourself for whatever, named and unnamed.

01:03:49.58 [Moe Kiss]: Yeah. So like we just signed a new BI tool, which I probably can’t. say, but I will just say I’ve been involved in multiple BI tool selections and stuff like that. Okay.

01:04:02.06 [Michael Helbling]: Yeah. All right. All right. Let’s start clackin’ the keyboard and record this thing.

01:04:14.42 [Moe Kiss]: I think I got it. You got it? I think so. I was like, I better do it before you start, because if I do it half a year, it’ll be like…

01:04:23.24 [Michael Helbling]: That was great timing actually.

01:04:26.59 [Moe Kiss]: Pretty sure I got it right.

01:04:33.76 [Michael Helbling]: Here we go in five, four…

01:04:47.01 [Tim Wilson]: Rock flag and an instrumental rock flag rendition by our guest.

01:05:13.64 [Michael Helbling]: Oh my gosh. That’s the permanent one at the end of every show now. That’s incredible.

01:05:24.53 [Tim Wilson]: I don’t know why I was showing that it was going to play that one, and instead it just played like Transition 2, so it’s good.

01:05:32.12 [Moe Kiss]: Fucking Rostat.
