#045: Identifying and Prioritizing Hypotheses

The intro bumper for this podcast says “the occasional guest,” and, yet, the last five episodes have had guests. That’s hardly “occasional,” so Tim and Michael had a choice: either change the intro or do an episode on a topic for which both of them have experience, interest, and, hopefully, at least modest authority. In this show, the guys dig into hypotheses: how to identify and articulate them, the pitfalls involved in *not* clearly stating them, and where they see organizations and analysts get tripped up. They have a hypothesis that you will get some value out of the show, and, if they’re right, that you will share the show with a colleague and maybe even give it a positive rating on iTunes.

People, places, and things referenced in this episode:

Episode Transcript

The following is a straight-up machine transcription. It has not been human-reviewed or human-corrected. However, we did replace the original transcription, produced in 2017, with an updated one produced using OpenAI’s WhisperX in 2025, which, trust us, is much, much better than the original. Still, we apologize on behalf of the machines for any text that winds up being incorrect, nonsensical, or offensive. We have asked the machine to do better, but it simply responds with, “I’m sorry, Dave. I’m afraid I can’t do that.”

00:00:04.00 [Announcer]: Welcome to the Digital Analytics Power Hour. Tim, Michael, and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at facebook.com forward slash analytics hour. And now, the Digital Analytics Power Hour.

00:00:24.53 [Michael Helbling]: Hi everyone, welcome to the Digital Analytics Power Hour. This is Episode 45. As you sit and you look at all of your digital data, sometimes it just washes over you, and you hope that somewhere out of it inspiration will strike: a great idea, a lightning bolt of insight that guides you to a huge ROI. But here’s the question: what if you could make your own weather, so to speak, setting up and knocking down hypotheses in a way that made you the cool kid in an analytics sense? Well, buckle up, because you’re in Tornado Alley tonight. And my fellow storm chaser is none other than Tim Wilson. Hidey-ho! That’s not really what a storm chaser says, Tim.

00:01:15.22 [Tim Wilson]: I know, but I was trying to. I was blanking on Helen… who’s in Twister? What’s her name?

00:01:22.71 [Michael Helbling]: Is it Helen Hunt?

00:01:23.97 [Tim Wilson]: Is it Helen Hunt?

00:01:25.36 [Michael Helbling]: Twister, 1996 movie starring Helen Hunt and Bill Paxton.

00:01:32.45 [Tim Wilson]: So am I Bill Paxton or am I Helen Hunt? You dear listeners can be the judge.

00:01:38.76 [Michael Helbling]: I haven’t watched this movie since, I think, 1996. All right, and I’m Michael Helbling. All right, Tim, well, setting up this topic: that’s something that I think a lot of analysts and a lot of companies struggle with, making this transition into effective analysis and insight generation, because they don’t understand how to take the data that they’ve got and transform it into hypotheses, and how to prioritize them. So that’s what we’re going to spend time on tonight. So let’s kick this off, Tim. How do you help people do this, or what has worked for you?

00:02:16.42 [Tim Wilson]: So I think you hit the nail on the head that part of the challenge is, I think, this raging misconception in the marketplace. Marketers and businesses definitely have it, and unfortunately, I think too many analysts have it as well: that somehow, because we have all this data, more data than we’ve ever had, we just need to dive in, let it wash over us, and wait for insights to emerge. I feel like I’ve talked until I’m blue in the face, and I’ve got two more times that I’m doing that basic pitch just this fall, because the fact is, we do need to have hypotheses before we start doing analysis, in my view. So I think step one is just trying to explain that to people. I use a MythBusters analogy; that seems to be a clear way to articulate it to any group. I say, what if MythBusters was run like many marketing campaigns, or like many channels are invested in? It would literally be Jamie and Adam standing in the workshop saying, wow, we have a bunch of stuff; hey, how about we go blow up that car? And then they go blow up the car, and then they say, is that myth busted or confirmed? And they’d be like, wait a minute, I don’t know. And maybe one of them chimes in and says, well, we instrumented it; we know that the explosion shot debris twenty-two feet up into the air. And they’d be like, great, but what’s the actual myth we were trying to bust? To me, this is an analogy I use a lot, because what happens is we do something and people are excited about it. And it started with some genesis of an idea: this is our strategy, or this is our objective, or this is how it aligns with the message we’re trying to get out to the market. But then it’s kind of, we did it, now give us the learnings. There was no stopping up front and saying, how do we believe the world works? Why do we think this would work? What do we want to learn? Right now, we never hear about
a campaign or an initiative being looked at as saying, yes, we want it to be successful in its own right, but what is it we want to get smarter about? And what we want to get smarter about needs to start with something like, well, I think that mobile video ads will be a fantastic way for us to draw in millennials, and if we’re right, we’re going to double down on that and do more of it. There just isn’t that discussion. So I think step one is absolutely just getting people to separate the volume of data from the fact that you still have to have questions and you have to have ideas. Our last episode, Dennis Mortensen, talking about AI, he actually kind of said the same thing, and I’m like, yes, exactly. We were talking about AI, and he said the successful people of the future are going to be the ones who are asking smart questions. We’re going to get better and better at answering the questions, but the need to ask smart questions isn’t going anywhere. So that’s one thought. Not that I have a strong position on that.

00:05:32.09 [Michael Helbling]: You’re so rarely opinionated. It’s very strange to suddenly encounter this. So I, of course, agree with you. A couple of things that I’ve observed, and sort of lived through even in my own career: first and foremost is this concept of, hey, look at all this data, and now we just watch the insights roll in. And in reality, as an analyst, a lot of times you’re sitting there being like, well, where do I go? And then the other thing, and you kind of referenced this, is this sort of almost spasmodic jerking, trying to find stuff, because here we are in the middle of a storm, and it’s just a fight for survival. But there’s no structure or finesse or bigger approach to how we’re getting into the data and leveraging it so that you can build things. And some of this is just really about lack of planning, lack of thinking about what our objectives are, even writing down the things we want to learn about. Because it would be amazing to say, hey, we’re going to go not only learn about social media and how to invest correctly there, but we have these other objectives that we want to use all of our social media spend to learn about, and so how do we construct our campaigns and our exposure in a way that helps us gain this kind of information? So that kind of thinking. The other thing, and this is backing up a little bit before we start into analysis, is defining some sort of framework that analysis or hypotheses can hang on. In other words, and we’ll get into this a little bit in terms of hypothesis construction, but in essence, giving you some sort of a, hey, what is it that we’re out here on the website to do? What would be a good set of goals? And obviously that takes a lot of organizational support and buy-in, but in a lot of cases, they’re pretty straightforward, at least at a high level, for most businesses.
And obviously some really large businesses have multiple different things the site is there to do, but you can get down to a core distillation of: hey, we sell things on our website, so people buying those things, that’s really core to this website experience. Or we’re a B2B, and we need to get our salespeople in touch with the right contacts at other companies, so how do we generate leads or incorporate data into our CRM system? That’s how we measure that. Or we are a brand that is trying to get information out to a broader public through the website, through coupons and things like that, for CPG. So how do we measure, and how do we have a framework for success around how people are engaging with our content, downloading our coupons, whatever the case may be?

00:08:33.75 [Tim Wilson]: For those of our listeners who are not in the U.S., also FMCG.

00:08:37.84 [Michael Helbling]: Oh, right. Oh, you and your… I’m traveling through Europe, so I’ve got to use… No, it’s fine. And I do appreciate that, because I didn’t know what that meant, but basically consumer goods.

00:08:50.86 [Tim Wilson]: FMCG, fast-moving consumer goods. Is that really what it’s called? Yeah, I think one of the enemies of hypothesis generation and meaningful analysis is the rut that we get into in day-to-day operations. I’m thinking about one of my clients that has a large e-commerce site, and they had somebody new start. He’s kind of over in the paid search and affiliate world, and as the new guy, he just said, we’re spending all this money. And there’s evidence that it’s working; I mean, they can point to it being a good investment of money. But he was coming in with a fresh set of eyes, and as he’s learning the business, he’s asking some of those basic questions: what is our approach? Why is this our approach? And he’s not asking it because he’s saying it’s wrong; he’s saying, I’m trying to understand. But just having that discussion, which nobody who has been at the company for a while would have any reason to start, there’s just no occasion to say, you know what, time out, let’s take a step back and ask, what the hell are we doing, and why are we doing it this way? Well, it turns out, because we’re making some assumptions. Brent Dykes had an article within the last few months on this whole idea of assumptions governance, which I thought was great. He didn’t originate the idea; if you search for assumption governance, you’ll find other writing about it. But we don’t really think about it that way, stopping to consider whether the things that we’re doing, the things that take us a lot of time or cost us a lot of money, are often based on assumptions. And I think Brent even made the point in his article that you have to operate on assumptions; you can’t go and validate everything with the data. But we don’t even stop and think about what our assumptions are. We’re spending on display because we have been.
Like, you watch this sort of insidious migration, from carving off some TV spend to spend on digital, and that became display, and then we’ve just kept doing it. And we’ve been looking at the data, but we’ve never taken a real analytical approach to the assumptions behind why we’re doing this. Because articulating an assumption, to me, means you’re halfway to articulating a hypothesis, if not farther. And I think analysts should play that role. When I use my framework, a hypothesis is nothing more than saying, I believe, fill in the blank. It’s a tentative assumption you’re making about the world, and that’s all it takes to state a hypothesis. But the thing is, we don’t want to just go validate hypotheses that aren’t going to lead to a decision. So we’ve got this second fill-in-the-blank of, if we’re right, then this is what we’ll do. And I use that all the time now. Even if somebody just says, go analyze the website, and I’m not going to be able to have the discussion I’d like to have, well, I’ve been around the block a few times: I will sit down, I will poke around the website, and I will generate hypotheses. And ideally, and probably 75% of the time, I take those and basically clean them up in that framework and say, OK, I think this might be going on, and if that’s the case, then this is what we’ll do. And I think this other thing might be going on. And that can spark the discussion. So I know I jumped right into my framework for doing that, but that all comes before, you said something earlier about diving into the data, and I think it is so easy to become horrendously inefficient and ineffective by diving into the data prematurely and not spending some time doing the harder, more organic work of just thinking about what those assumptions are, what those hypotheses are, and what you might do depending on how they panned out.
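Tim’s two-part fill-in-the-blank template can be captured as a lightweight record in a hypothesis backlog. This is purely an illustrative sketch; the field names and the example entry are our own, not anything from the show:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Hypothesis:
    """One backlog entry in the two-blank form discussed in the episode."""
    belief: str                        # "I believe ___" (a tentative assumption)
    action_if_true: str                # "If we're right, then we will ___"
    validated: Optional[bool] = None   # None until the analysis has actually been done
    spawned: list = field(default_factory=list)  # follow-on hypotheses that crop up

# The mobile-video example from the conversation, phrased in the template
h = Hypothesis(
    belief="Mobile video ads are a fantastic way to draw in millennials",
    action_if_true="Double down and shift more budget into mobile video",
)
print(h.belief)
print(h.action_if_true)
```

Per the discussion, an entry with no meaningful `action_if_true` is a signal that the hypothesis may not be worth validating yet.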

00:12:49.93 [Michael Helbling]: Yeah, absolutely. And no, I don’t think that was the direction we were headed anyway. It’s sort of, if this, then what do we do about it? And some of those core things about how we operate the business obviously guide, well, what is the action, should whatever we’re researching come true? So, no, I think that’s appropriate.

00:13:14.64 [Tim Wilson]: Which, to me, is fine. That’s the other thing: there’s another big misperception that every analysis is supposed to yield an actionable insight, which I think is a horribly unrealistic expectation. If I have an assumption, chances are 80% of the time that assumption is correct. So there’s still value in me saying, that assumption is based on this hypothesis, and I’m going to go validate it. And you know what? That assumption is largely true. Two things will happen from that. One, I can move on to other assumptions and not have that uncertainty. And two, chances are, as I’ve been digging in to validate the hypothesis underlying that assumption, other things will crop up, and I will say, oh, you know what, I just generated three or four more hypotheses. I’m not necessarily going to run them all to ground; I’m going to put them in that same world where I want to sit down and float them out to people. And that kind of happens. I love when that happens: you’re saying, you know what, your assumption is effectively true, but we noticed this other weird thing about your paid search traffic. Does this make sense to you? It seems kind of odd to me, and it’s not what we were trying to chase down, but if this is true, let’s talk about what that would mean. I can go and validate whether that really is the case, but let’s talk about what you’d do if it is. So actually, validating hypotheses that don’t lead to action is okay, and it’s probably going to happen more often than not, because you have to. If every analysis you’re doing is turning up some horrendous surprise, well, somebody needs to be smacked upside the head about how consumers behave, because you’re not going to find a million golden nuggets when you turn over a million stones.

00:15:14.10 [Michael Helbling]: Yeah. And I think sometimes there is a lack of patience in organizations for analytics to do its work, because we think, oh, every time you turn over one of those stones, you’ve got somebody coming up with big incremental gains. And sometimes we just learn something fascinating that we sock away, and it doesn’t have an application for maybe years. And if you’re somebody who’s managing an analytics team, that skill of how to direct analysis, how to know when to say, okay, I think we’re headed in the right direction, or we probably need to go this way or that way, it’s a skill that very, very few people do well. And sometimes as an analyst, even when you’re not being directed, you don’t know exactly when to cut bait and move to another topic. Sometimes as analysts we get really razor-focused on something that we find pretty interesting, but our time could be better spent. And that prioritization, I think, is also a really big deal: how do we make sure we drive the most value? Because, again, some of this is very scientific, but there’s also an art to picking. Does that make sense?

00:16:39.36 [Tim Wilson]: Oh, yeah, absolutely.

00:16:41.61 [Michael Helbling]: And you know how there’s always an article every so often about how analytics is ruining creative? In a certain sense, if you think about it, the demand for results is ruining analytics. That’s something some people have probably never said that way, but it’s probably a feeling they have: let me do the analysis I want to do, because it’s interesting, and I think it might have something meaningful to it, but I don’t want to… So again, it’s this balancing act of how do we serve the overall goals of the company while at the same time having enough leeway to do research, or do analysis around hypotheses that may or may not yield incredible dollar signs. Does that make sense?

00:17:28.99 [Tim Wilson]: Yeah, I mean, I am a fan of trying to discretely articulate every hypothesis. And, to me, this seems like an unnecessarily challenging leap for some people to make: that there’s not a one-to-one-to-one relationship of hypothesis to analysis to presented results. I may do one analysis that actually has two or three or five hypotheses in it, and that may be one or two or three deliverables. I do try as hard as I possibly can to bucket out each hypothesis. And I’m fine with little small hypotheses, and kind of the idea of sub-hypotheses as well. Because I think there’s a twofold value. One, you can start to show people: look, we started with legitimate hypotheses, and 60% or 80% didn’t turn anything up. So that starts managing expectations as to how much we have to invest. And we can also point to these hypotheses and say, we just validated them, or we weren’t able to, it was inconclusive. But you know what we did figure out? We figured out we had this other data, or we built a foundation to do these future analyses, or we generated these other hypotheses. So it doesn’t mean there’s zero value. And we can start to pivot the group: don’t just tell me to go dig into your email data and find insights; we’ve got to break it down into these discrete things. And I feel like, for analysts, that just doesn’t click. We’re such pleasers that we want to have one conversation and then just go and do a ton of analysis. To me, it’s not that hard. Everybody likes to talk about their business; business users love to talk about what they’re doing and why they’re doing it. So if I dive into the analysis and say, you know what, I found some stuff, I’m getting to a certain level, I don’t think there’s anything super surprising here, I’m basically validating the hypotheses, then rather than polishing up the final deliverable, I want to go back and meet with the stakeholder.
And I’m going to walk her through the current state of the deliverable and see if it sparks some other hypotheses, some other investigation points. They know the business better than I do. And I’m not trying to deliver the big aha in one big final reveal. So I don’t know if that was actually responding to your… No, but I mean, the nice thing is we tripped merrily along.

00:20:10.40 [Michael Helbling]: We stubbed our toe on all of our little bugaboos, or bugbears, as Simon Rumble taught us. Bugbears. I’m starting to adopt that. So, no, you’re right. And actually, that process of integration is one that is really difficult, both externally and internally. And this is where soft skills in analytics, I think, are so critical, and probably undervalued: you have to be friends with people and get their buy-in and their trust, so they show you and are honest with you about what it is they’re trying to do and give you feedback, and so that you can show them, here’s some stuff that I’ve started to put together; it’s still kind of molten and hasn’t taken shape. Because that’s the challenge, and certainly you and I, Tim, both now live on the consulting side of the world, where the typical model is: we’ve got all this brain power, we’ll ask you some deep, probing questions, we’ll walk away for three weeks, and then return with a magnificent statue that is now… And in reality, the best thing to do is probably say, great conversation; we’re going to dig for a week, and then we’re going to come back with all kinds of dirty guts and fish, and we’re going to look at it all together, and it’s going to be smelly, so put your boots on. And then we’re going to go back, and eventually we’re going to get to some really great things, but we’re going to do this process together. And I think there’s such a lack of understanding, or willingness, around that process of: okay, yeah, that’s great, but that’s going to take way too long to prove out; or, hey, we could do something with that right now, let’s put that closer to the top of the list, because I think there’s real value in what we could learn there; or, okay, see, we studied that four times last year, and basically we know that inside and out; you didn’t know that, so let’s chuck that out the window.
So, before you then make that the centerpiece of your massive dog and pony show at the end, right? I’ve seen that kind of thing happen tons of times. So again, it’s that back and forth. It’s the combination of the analytics expertise and the domain expertise, or the business expertise, and it’s that collaboration that is so vital. And that’s why it’s important that your leadership, or your sponsor, or whoever it is that’s enabling this, if it’s marketing, the CMO or the director or whoever’s in charge of what you’re doing, believes in, trusts, and understands: okay, here’s how we integrate with what we’re going to be doing, and we’ve got buy-in from the organization.

00:23:08.29 [Tim Wilson]: That reminds me of another challenge in this area: when you’re the sole analyst, or a small team, and you’re reporting up to somebody who maybe has misperceptions. I’ve watched that. And it’s actually easier for a consultant, because usually they’re like, hey, we’re paying you. So I can say, look, dude, it doesn’t work that way. If you’re saying that in this really murky world, you want somebody to come in and just magically put together all the pieces of your entire cross-channel strategy and present glorious insights, when they’re external or they’re brand new? You’re smoking crack; it’s not going to happen. But even as an external person, and maybe I’ve just gotten more comfortable with this, which probably comes more with age than anything else, I’ll say no. Like, we all need to sit down and look at this messy data, so put on your hip waders, because it’s going to get deep. Not deep in bullshit; there’s just a lot of stuff to wade into. I still will do everything I can to make it understandable. This is not, oh, here’s a massive spreadsheet. But let’s sit collectively with this pivot table and slice it a few ways, or let’s look at it these two or three ways. And I’m not going to pull you in for a half day; I’m not asking you to go through all of this. I’ve just generated questions: well, this seemed surprising to me; can we explain that? I can’t explain it; can you? Asking the subject matter expert whether they can explain something that looks odd to me, the analyst, is about 12,000 times more efficient than sitting here and digging deeper and deeper and deeper into the data, when somebody may be able to look at it and totally explain it. Right. And that’s the application of context.

00:25:00.20 [Michael Helbling]: Right. And that’s so vital because as an analyst, you might actually come up with the answer that is that piece of context through your analysis and a lot of travail and hard work. But then the person is like, why did you spend so long trying to figure that out? We know this is how our business works. You’re like, well, yeah, now I know. But you know, you could have just asked.

00:25:23.13 [Tim Wilson]: But I found a really expensive way to get to something that you already knew.

00:25:26.18 [Michael Helbling]: Yeah, something that we just take for granted around here, or that we understand is sort of fundamental. And so, yeah, for organizations, that’s a big hurdle. And you’re right: if you get hired into an organization, you’re kind of the junior person, and you’re in a structure that is… honestly, I don’t know what advice to give at that moment, except to try to push people towards this kind of a construct. But it is difficult, I think, when there’s not a mutual understanding coming from the other side.

00:26:11.33 [Tim Wilson]: Yeah. And I think that’s when, if there’s a manager saying, no, you can’t go back and talk to that person again until you’ve got the final result, that’s the killer. And I would say, I mean, this would get into organizational stuff, but any time a manager is saying you can’t go talk to a stakeholder… Right. I mean, I think at a fairly low level, you can say, look, I legitimately have three or four things that all look kind of interesting, but the last thing I want to do is present to 12 people something where they’re going to say, yeah, we already knew that, when I could just go talk to this one person and say, help me understand your business more. Because that’s that other weird thing: we feel like, oh, we’re asking people to explain their business, and they know how their business works, so I’m wasting their time. No, it turns out, hopefully they’re doing their job because they like it, and they are thrilled to have somebody asking them to explain more. Well, what about this? Well, what about that? I mean, I’m envious, on that front, of analysts who are internal, because when I was internal, I got to build all of that up. I had the handful of people who I knew were so fun to work with, and I’d learn from them about the business, and we could have so much fun on different projects because they were asking smart questions. And there were the other people where maybe it was a little rocky at first, but I could kind of help guide them, and we got to where we were working together well. And that takes being full-time internal, listening to things over cubicle walls, even listening in casual conversations. I think that’s another skill to learn. That construct, the “I believe…, and if we’re right, we will…”, that’s not something where you say, I’m going to put a Google Form up and send it out for my business users to fill in. It is something that an analyst can totally work with, to say, I just had a great conversation, and they seemed so full of ideas, and I jotted notes down. Let me go look at my notes and see if I can take these random three pages of notes and translate them into hypotheses. That’s active listening of a pretty high order, because it takes them just kind of barfing out thoughts and questions and ideas, and says, I’m going to do the work of putting that into a hypothesis that points to an action that I can play back to them. And they’re not going to have a problem with it; they’re going to like it. They’re going to read it and say, oh, shit, yeah, I told you all that. And, oh, by the way, that’s not really what I meant; you kind of misinterpreted that. That’s not a fail. That’s why you do it: to make sure that you’re clear.

00:28:51.20 [Michael Helbling]: Yeah, deeper understanding, and then alignment on what to go after. So we touched a little bit on prioritization, right? Low effort, high impact. How do you construct this for people? How do you guide people through what to go after and when?

00:29:13.59 [Tim Wilson]: Oh, it’s very simple: I have a two-by-two matrix, and along the x-axis is the level of effort, and the y-axis is the impact. I’ve seen that fucking two-by-two matrix at so many conferences. A/B testing gurus love it, which I just think is… Is that CRO gurus?

00:29:31.80 [Michael Helbling]: CRO gurus, yeah. That’s one of my bugbears.

00:29:35.19 [Tim Wilson]: I love that. They’re like, oh yeah, just find the tests that are going to be low effort and are going to generate high impact. And I’m like, well, okay, if I knew what was going to generate high impact, then why am I doing the test? We’re living in the real world, not some theoretical, oversimplified construct. So the reality is, if you plot every one of those with your best guess at what kind of impact it might have and your best estimate of how much effort it’s going to be, guess how many you have that are low effort and high impact? Like, one.

00:30:06.30 [Michael Helbling]: Yeah, or two or something like that.

00:30:08.30 [Tim Wilson]: Yeah, that’s not the quadrant things live in. So I think it’s a lot grayer than that. And whether it’s done sort of organically or not, the reality is, you’ve got the highest paid person’s opinion. And I think we talked about doing an “in defense of the HiPPO,” or has that been a blog post? I know I’ve heard “in defense of the HiPPO.” The highest paid person’s opinion should not be immediately discounted.
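If you do force every hypothesis onto that effort/impact grid, one honest way to order the backlog, when almost nothing lands in the mythical low-effort/high-impact corner, is to rank by the ratio of estimated impact to estimated effort, with HiPPO interest as a tiebreaker, per the discussion above. This is a hypothetical sketch; the example hypotheses and the 1–5 scales are invented for illustration:

```python
# Rank hypotheses by estimated impact per unit of effort (both on 1-5 scales).
# HiPPO interest breaks ties, since those findings are most likely to drive action.
hypotheses = [
    {"belief": "Checkout copy confuses new users",   "impact": 4, "effort": 2, "hippo": True},
    {"belief": "Instagram drives qualified traffic", "impact": 3, "effort": 4, "hippo": False},
    {"belief": "Email send time affects open rates", "impact": 2, "effort": 1, "hippo": False},
]

ranked = sorted(
    hypotheses,
    key=lambda h: (h["impact"] / h["effort"], h["hippo"]),
    reverse=True,
)

for h in ranked:
    print(f'{h["impact"] / h["effort"]:.2f}  {h["belief"]}')
```

The point of the ratio is exactly Tim’s: with honest estimates, you are trading off effort against impact across the whole list, rather than pretending a low-effort/high-impact quadrant is full of candidates.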

00:30:36.99 [Michael Helbling]: The HiPPO comes in and out of style, depending on who you talk to. But at the end of the day, in Kevin Hillstrom’s speak, that’s the person who’s going to get fired if this is all wrong.

00:30:50.66 [Tim Wilson]: Right. And it’s the person who’s most likely to be able to drive action. They have budget, they have people, and their ass is on the line. So, the fact is, I had a client, a little short engagement; she was the CMO of this organization. I sat down with her, and she was an analyst’s dream, because she mapped it out. She would sketch it, and she had clearly drawn this thing so many times: this is how our business works, and this is how our marketing works within the business, and what I need to be able to do is confirm this, and specifically this arrow here I’m the most uncertain about, and this arrow here I’m the most uncertain about. And I went back to the analysts saying, oh my God, she is one of the best CMOs you could ever have, because she has an analytical mind and she’s asking smart questions. Now, you may also have CMOs who say, you’ve got all that data, you’ve got that fancy web analytics platform, go give me some insights, tell me the answers. And that’s a problem, because they haven’t asked a question. So, definitely, the higher up you are on the food chain, the more attention you’re going to get. That’s one factor. I think that second part, the “if we’re right, we will do X,” the more crystallized that is, the more it’s: if this is what the validation of the hypothesis turns up, then we’ll do action A, and if it’s this, it will be B. And because of the way we’re going to approach this, it’s going to be pretty crystal clear. I think that bubbles up in the priority, because it’s more likely to lead to action if it’s clear what the action should be. Meanwhile, trying to do ones that are going to prove somebody wrong, to me, that’s a huge red flag. If you’re like, yeah, that
If it's, "John Smith has bugged me for years, and he's been going around saying Instagram is the be-all and end-all, and I'm going to show him, we're going to make him look like an idiot," be careful. Even if you're staying objective and the data is what it is, making anybody look black-and-white bad is not ideal. It may be the right analysis to do, but you need to figure out a way to do it where nobody loses. You want them to be the ones championing it: "Hey, I've been championing this, but it turned out, when I worked with Tim on the analysis, that I'm thinking we should change to X or Y." My overused buzzword in this area is alignment.

00:33:41.31 [Michael Helbling]: It is along the lines of what you just described, which is, hey, the CMO or VP of Marketing cares about these five things. Guess what's really high on our list of things to hypothesize about? Obviously, it's going to be the things that they're really betting on. And that makes a lot of sense. And also what you just said about the politics of data, and how organizations can fall into that: it's very damaging, frankly, to analytics, and probably to the organization itself. When you see that, that is a red flag: somebody going after somebody else with data because they want to prove them wrong. And actually, hypotheses and testing fall prey to this a lot, I think, in different organizations, when you get two different sides basically using them as a cudgel against each other. We just used the word cudgel.

00:34:36.56 [Tim Wilson]: Sometimes a cudgel can really help drive some alignment.

00:34:39.13 [Michael Helbling]: Yeah, we drive some alignment.

00:34:40.75 [Tim Wilson]: We’re saying you get those two people in a room and smack them with a cudgel.

00:34:44.81 [Michael Helbling]: Yeah, box their ears is what you do. Two marketers enter, one marketer leaves. No, but the reality is that there's a common enemy of every organization, which is, I don't know, entropy, failure, competitors taking away all your customers. And getting together and viewing the common enemy gives you a framework for taking on all the analysis and getting the insights and hypotheses that you need. Using it that way is way more virtuous, in terms of the pursuit of analysis, than going after somebody cross-departmentally or something like that. And no matter how you feel about social media or something else new that just came up... it's hard, because you see people chasing, I don't know, I call it shiny object syndrome, right? Where it's, "Hey, Snapchat's really popular. You can advertise on Snapchat. Maybe we should make Snapchat filters for all of our retail locations, so people can take pictures with our logo on there, and that'll be big brand reach to millennials." And that's probably a strategy discussion that is happening on the regular right now, and that's okay, because it might not be wrong. But let's figure out how that serves us, or how that helps us, and all those things. And I might have just gone way off on a sidetrack.

00:36:17.35 [Tim Wilson]: Well, no, that has me thinking. That actually reminds me of... I hate that I keep using this one, but I've got a client, and it's kind of a tricky space, without a hard online conversion. There's a social media manager, and you kind of hear from the analysts, "Oh, she's difficult. She just has a million questions, and she just wants this and that." And she's maybe a mildly challenging personality. But you know what she's really passionate about? Social media. You know what she's done a ton of reading about? Social media. And it turns out, you know what? She has a lot of smart thinking about social media. She doesn't know how to use the data; she may have woeful misperceptions about how the data works. But for multiple years now, every time I have a conversation with her, it's fun, because I'm listening as she's trying to explain to me how social media works, and it turns out she's kind of right. I can explain to her what the data can or can't tell us, and then we're going to jointly decide what's the right way to validate this hypothesis. And I watch the analysts get frustrated with her, because sometimes the stuff she's looking for requires some weird data pulls that are kind of challenging to figure out where to get. But if you think about the data, it should be there. It's a legitimate hypothesis where she has already articulated a strong action. Why are you frustrated? Because it's hard? Give these people a chance. They may be challenging personalities, they may be in an area, social media, that you may not like personally, but man, you need to work with them and say, "We're trying to do what's right for the organization; let me help you."
And gee, if they're on board and they have a hypothesis that you think is the dumbest thing, that there's no way it's gonna pan out, embrace it and do it really, really well, and say, "I'm gonna validate it anyway, or fail to validate it, and take them right along with me." Because now we've worked on this together, and it's not "I was right, you were wrong." It's "we agreed that this was a legitimate question, and we partnered to answer it." And now we can turn to the rest of the organization and say, we partnered and did X, and, oh, by the way, the way we did this was by working together to be really clear about what we thought was happening, what our assumptions were, what the hypothesis was. And then we validated it, because we knew we could take this action. You know what? We're gonna shut off paid media on Facebook, for instance.

00:38:57.52 [Michael Helbling]: So yeah, no, Tim, I think this is a really good discussion. And it’s sort of an underserved area of analytics, in my view, generally, in that I don’t think people give attention and intentionality to this aspect. They definitely spend a lot of time focused on the tools, like let’s get the right tools. Let’s do a ton of analysis to figure out what the right tool is for us. We spend a lot of time thinking about what it is we’re gonna measure and how we’re gonna get that data, right? So how do we build our data storage? How do we integrate our data together? This is all good stuff, but then we just fall right off a cliff in terms of our intentionality when it comes time to start using that data to actually do something. And that’s where it turns into a swamp and not a river, right? Or something, I don’t know.

00:39:48.75 [Tim Wilson]: Yeah, well, and I think, I mean, I've got theories about why that is. There's a hard dollar cost on the technology, and there are a lot of vendors, and I'll say all technology vendors are out over-promising: for instance, "You could bring in the weather data and put it right next to your traffic data." And I can list multiple vendors that talk about being able to do that, and then, if you were selling bicycles, sure, you'd be able to see something. I'm like, well, yeah, but I'm selling toilet paper, and people shit as much when it's raining as when it's dry, and as much when it's cold as when it's hot.

00:40:30.55 [Michael Helbling]: Is that your hypothesis? Because, you know, we could research that. We could. I'm sorry. You like to pick on this weather data example, because I've heard you do that before. But in reality, there are a couple of scenarios where, if you can do real-time segmentation based on certain specific weather data, that would be a good thing to do. No, absolutely. Okay, just wanted to make that clear.

00:40:58.19 [Tim Wilson]: look at what the vendors can do is they can say across our entire install base we’ve generated five highly impactful case studies in five totally different industries that are these five little snowflake oddball things and we can tell those stories but I mean hell today I had a I had a vendor contact me about one of my clients because I know the vendor and he’s like, Hey, the client’s blowing me off and you know, I’d really like to get this moving. I’m like, I know you would because you’d like to hit your quota, but this has no value whatsoever for. The client, but you’re telling everybody you can, by our technology, you’ll be able to do these amazing things. I’m like, yeah, but not without a common key. Oh, you’re going to integrate with your offline data. Well, not in your CPG slash FMCG. you’re not getting the purchase so shut the hell up you know that’s that’s a lot of crap so i think that’s part of it is there and then it the time lag right you’ve got the sales process where they’re saying all you gotta do is get our stuff installed it’s gonna take two weeks and then you’ll have glorious insights well it takes three months to negotiate the contract it takes nine months to get it implemented the whole time you’re just hanging on to what the sales guy said, which was, and the glorious insights will emerge, you finally get it rolling, and people are exhausted, so they’re saying, oh my god, we now have all this data, and we’re just gonna jump into the swamp, instead of saying, holy crap, I don’t have a process, I still need to have rigor, I still need to have the same diligence I had when I had my lesser, crappier, free, poorly implemented tool, You know that I had then you know that hasn’t that hasn’t changed yeah and I will say the other side is I think that the business as well as analysts will completely confuse the looking at the looking at the past and measuring performance you know KPIs. 
KPIs are relevant to hypothesis validation from the alignment perspective: what do we care about at the end of the day? But when you're using KPIs, you're just measuring: are we hitting our goals? Are we achieving what we expect to achieve with our plan? We're generating hypotheses to help us do that, but they're kind of a separate thing. Hypotheses are: let's make an assumption about what's going on, because we want to change our actions in the future. And yes, if we do this well, we will continue to meet our performance objectives. But that, too, gets muddled with the "give me the campaign report, and I want my actionable insights from the campaign report." It's like, well, if you're delivering a campaign report, part of that should be: did the campaign deliver what it was supposed to? And the other part is: what were the hypotheses that we were validating through the data from that campaign that are going to influence what we do in the future? Because just saying the campaign fell below target doesn't tell me what I should do. But that's all the KPI does. It says, nope, you didn't do what you were expecting to do. The KPI just tells you where to start asking questions. Because even if you exceeded the target, that doesn't mean you couldn't exceed it further. But the fact is, you laid out the things that you wanted to do, and the ones where you missed are the ones where you'd better start trying to figure out why, so that you don't do that again.
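(Editor's aside: Tim's "if we are right, we will do X" framing lends itself to a lightweight data structure. The sketch below is purely our own illustration of that idea, not anything described on the show; the field names, the example hypotheses, and the stakeholder-weight-times-action-clarity scoring heuristic are all assumptions we made up for the sketch.)

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    """One 'we believe X; if we're right, we'll do Y' record (illustrative only)."""
    belief: str              # the assumption being tested
    validation: str          # how we plan to check it with data
    action_if_true: str      # what we'll change if it holds
    action_if_false: str     # what we'll change if it doesn't
    stakeholder_weight: int  # 1-5: how senior/invested the sponsor is
    action_clarity: int      # 1-5: how crystallized the resulting action is

    def priority(self) -> int:
        # Hedged heuristic: hypotheses with a clear follow-on action and an
        # invested stakeholder bubble up the backlog, per the discussion above.
        return self.stakeholder_weight * self.action_clarity

# A toy backlog of two hypotheses (contents invented for illustration).
backlog = [
    Hypothesis("Mobile coupon flow is broken", "funnel analysis by device",
               "fix the flow, keep spend", "shift spend to desktop", 4, 5),
    Hypothesis("Snapchat filters reach millennials", "geo-filter engagement test",
               "expand to all locations", "drop the tactic", 2, 2),
]

# Work the highest-priority hypothesis first.
backlog.sort(key=Hypothesis.priority, reverse=True)
print(backlog[0].belief)
```

The point of writing both `action_if_true` and `action_if_false` down before running the analysis is exactly the crystallization Tim describes: if neither outcome would change anything, the hypothesis probably should not be at the top of the list.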

00:44:29.51 [Michael Helbling]: Right. How do we fix what's broken? And how do we make the big things bigger?

00:44:33.43 [Tim Wilson]: Yeah, or maybe, in your hypothesis, you know what, we were just ridiculously unrealistic about that, and we don't think we're ever gonna hit it, and so we will never set that target again. Which is still fine, because that means it's influencing your future investment. If you say, yeah, we are not gonna have 10% of the people who come to the site share it socially, that will never happen, then the next time we're looking at that next campaign, let's be realistic about what we might do. And then we may say, yeah, maybe we don't wanna do that campaign. That tactic doesn't make so much sense.

00:45:05.49 [Michael Helbling]: Absolutely. I mean, I'm glad we picked a topic that both you and I have so little interest or passion about. But actually, it makes a good segue, because, unfortunately, we do have to wrap up. But as you're listening: if you're a marketer, well, first, how did you end up here with all of us cool data nerds? And second, start taking an analyst with you everywhere in your business. Take them to all your meetings. Let them see what's going on. Let them hear the questions being asked, so they can learn how to do analysis that brings back real value to you. And if you're an analyst, start trying to horn your way into all the meetings.

00:45:50.74 [Tim Wilson]: Well, I will say, if you can just drag the analyst along, maybe they're not asking questions in the meeting, but man, you'd better hope they're grabbing you afterwards. So maybe this is more to the analysts as well, because I watch analysts sometimes come into meetings and say, "Uh-huh. Yeah. Uh-huh." And I'm like, you don't know the business that well. There is no way this is 100% making sense to you. And again, I feel like I've played the "Hey, I'm just the dumb analyst, I don't understand the business, let me ask the stupid question." And I swear, people are like, "Oh my God, that's a great question," or "I hadn't really thought about it that way." And, oh, by the way, I'm now more equipped to do the analysis, because I asked a question that was a legitimate question. So you've got to actually actively listen and process and ask: does this make sense, or do I need to probe more?

00:46:45.11 [Michael Helbling]: Well, and there's a certain amount of humility required to be a good analyst, because to ask a good question, or to formulate a good hypothesis, means putting yourself in a position of not necessarily knowing the answers to everything. And we're lucky, Tim, right? Because we can raise our hand and be like, well, you know, I have an industry reputation, so I can ask a dumb question and people will give me the benefit of the doubt and be like, "Oh, that's interesting," because we're not unproven in that context. But maybe someone else might feel differently. So I do feel the tension of that for people, as they feel the pressure of, "I can't look dumb in front of this room and ask this question."

00:47:30.74 [Tim Wilson]: Yeah, but they've got to get over that, because that is an internal misperception. Possibly? I think it absolutely is. We tell ourselves that story: "Oh, if I ask that, I'm gonna look dumb." And 95 times out of a hundred, if you ask the question, it's because you've literally sat there and thought about it and said, "I don't quite understand," or even, "Man, you're throwing buzzwords around, and I just don't get those buzzwords." And you can hedge it. The humility is a good way to put it: say, "Look, I know this may be a dumb question; everybody else seems to understand this." I mean, you've got to be careful to not be snarky, like, "I know this may be a dumb question, but really I think it's a smart question." I've done that. Yes, there are some people who know of me or know of my organization, but I am much, much more often working with people who've never heard of me.

00:48:29.17 [Michael Helbling]: Like, I feel like I’m starting from- Oh, it’s the first thing I tell people. Like, I’m kind of a big deal here, guys.

00:48:35.86 [Tim Wilson]: I'm kind of a... Hi! I have a podcast. I had a blog, and now I have a podcast.

00:48:42.24 [Michael Helbling]: That’s right. It’s hard to let that one slip out naturally. Yeah. We just talked about this on my podcast. What were you? Oh, yeah. How’d you know?

00:48:53.63 [Tim Wilson]: Who told you? You must realize... no, but I think that's a personality trait of analysts: we want to be right. We're not naturally comfortable saying, "I don't know." But it's not that hard. Looking back on my career... of all the things, I mean, I am racked with insecurities, just not about this one topic, apparently. My entire career, I'm like, "Please explain that more." And you do it twice and realize that, oh my God, not only are people not looking at me like I'm dumb, they're so excited that somebody wants to hear it that they'll find more time. They'll show up late for the next meeting so they can draw it out for me. And now I am so much better equipped. So you just have to get over yourself.

00:49:45.92 [Michael Helbling]: I mean, I guess what I would caveat that with is if you’re in an organization where people are actively trying to destroy each other with data, then probably don’t ask the questions. But other than that, I’m totally in agreement with you, Tim. All right, we’ve got a transition to the part of the show where we do our last call, something we find interesting, some of that’s going on in the world today, something we’ve come across. Tim, what’s yours?

00:50:11.22 [Tim Wilson]: So, last episode, I plugged a specific podcast episode. This time, and it actually ties in well to this episode, I'm going to plug an entire podcast. It's relatively new, although it sort of transitioned from a different format. It's called Science Vs.

00:50:29.90 [Michael Helbling]: Oh my gosh. I was sitting here... I have not heard of that, so I'm very interested. However, my last call is also podcast-related. So I think we're getting a little too insular here, Tim. We're in this podcast world now, and all our friends are podcasters.

00:50:49.07 [Tim Wilson]: No, so I am a huge Gimlet Media fan, and I've run into other people who are into it; it's one of theirs. So, Science Vs. The premise is this: Wendy, and I'm gonna butcher it, it's like Zukerman, but I want to say on one of the first episodes, her boss, actually on the recording, was like, "I'm mispronouncing your last name." And she said, "Yes, you are," and she pronounced it correctly, and now I forget exactly what it is. She's Australian. But Science Vs, basically, they'll take on attachment parenting, or fracking, or gun control, and they dig in and say, okay, it's a polarizing issue, let's dig into the facts. So I sort of throw it out to our listeners, because it is one of those where, go figure, on all of these polarizing issues, it's actually not black and white. Nobody is 100% right or wrong. And it's kind of interesting. I would actually say she's not the most awesome journalist in the few episodes that I've listened to so far; she gets there, and maybe it's the production and the editing. That's tough. Glass houses, I guess. But it's kind of interesting, because to me it resonates with analytics, where people want to have the "give me the answer." And it's like, well, it's kind of this, it's kind of that. It's not totally definitive. There are sort of things on both sides. So that's mine. What's your podcast-related last call?

00:52:15.94 [Michael Helbling]: So I was recently introduced to a podcast that I kind of had heard of before. It’s called Invisibilia.

00:52:22.42 [Tim Wilson]: Oh, yeah. Well, now it's got Hanna Rosin on it. She's like the new... yeah.

00:52:27.77 [Michael Helbling]: Tim, obviously, is aware. Big fan. Yeah, it was recommended to me by my colleague Stuart Schilling. He's pretty smart, so I've given it a listen. Specifically, there's an episode they did on June 17th called The New Norm, about how normative behaviors and things like that kind of define what we do. So it's really interesting; I'm really enjoying it. There's so much of what analytics people do that is in this part of science, in terms of how to influence and change people's opinions, and how to understand the forces that are working under the surface in terms of how decision-making happens.

00:53:13.36 [Tim Wilson]: And that's the Invisibilia: it's the invisible forces that are shaping the world that we live in.

00:53:19.25 [Michael Helbling]: I should have probably started with that, but that's exactly right. So, anyways, I'm really enjoying it, and I think, if you're in for another podcast, you'll definitely want to add it to your Stitcher. All right, well, if you have been listening, and you have been crying out in the car, like, yes, or no, or whatever, give us a shout on the Measure Slack, or on Twitter, or on our Facebook page. We want to hear from you. This is obviously a topic Tim and I both share a great deal of passion about. So if you've found a great way to use hypotheses, and you've found ways to incorporate these ideas into your organization effectively, what a great conversation to share with your colleagues and peers on social media. We'd love to hear from you. And as you're working, as you're thinking about it, keep digging for those insights. And for my co-host Tim Wilson,

00:54:18.30 [Announcer]: Keep Analyzing!

00:54:45.50 [Michael Helbling]: Yeah, I just saw Start Recording 2 tomorrow. Oh, yeah. Wow, classin’ it up tonight.

00:54:57.72 [Tim Wilson]: Gotta get rid of them.

00:54:58.88 [Michael Helbling]: Yeah, no, I kill ya. It’s, uh, slow calorie.

00:55:02.86 [Tim Wilson]: It’s my, uh, it’s my special diet. Drink, drink low calorie slow.

00:55:09.31 [Michael Helbling]: Can your last name be Winslow?

00:55:11.87 [Tim Wilson]: Sure.

00:55:12.53 [Michael Helbling]: Alright. Whatever you want. Fascinating. Proper baby buggy bumpers.

00:55:25.32 [Tim Wilson]: Always gonna throw you off your game.

00:55:29.20 [Michael Helbling]: See, I figured that was those little smart cars. They’re just little fast moving. You could like just tump, tump.

00:55:35.81 [Tim Wilson]: Let’s take all that… I’m taking it all back.

00:55:41.19 [Michael Helbling]: Tuck myself right into a hole. Let’s keep that.

00:55:47.49 [Tim Wilson]: Let's hand it off to Charlie from the social side. "Well, we've been using Facebook Canvas. And I know that every number that we keep showing you shows that it's not working for shit. And, you know, Facebook Canvas is limited to mobile. And, oh, by the way, the coupon process on mobile for this site is completely broken. So we're driving some registrations at a higher cost per registration, and we know these people probably aren't getting the coupons, because they actually have to get to a desktop computer in order to download their coupon."

00:56:29.13 [Michael Helbling]: Should we, we probably should not tell the story about our own Facebook. Well, and after our little debacle with Facebook.

00:56:40.21 [Tim Wilson]: Oh yeah, that was fan-fucking-tastic. Oh my god.

00:56:45.74 [Michael Helbling]: It’s just incredible.

00:56:48.36 [Tim Wilson]: It's like, if you're a small business, I hope you do an experiment, spend $20, and then say, hey, don't convince yourself that these random-ass people actually give two shits about your plumbing service. Oh my god.

00:57:03.02 [Michael Helbling]: No, ours was crazy successful. I mean, basically, for a dollar a like, we can have as many likers as we want. We were targeting people who lived in the United States, age 18 to 65, both genders, with detailed targeting that matched at least one of the following: interests in technology, web analytics, entrepreneurship, marketing, and digital marketing.

00:57:36.62 [Tim Wilson]: Technology might have been a killer.

00:57:38.56 [Michael Helbling]: Well, or entrepreneurship, or just marketing. So I might have gone… All broad. Yeah. Oh, and 95% of our likes came from mobile.

00:57:52.41 [Tim Wilson]: I mean, it’s clearly not working, so are we gonna, like, double down and run it for another six months?

00:58:02.77 [Michael Helbling]: This is the steepest!

00:58:07.87 [Tim Wilson]: Oh, man, I'm so glad that was on tape. That's the best thing I've ever heard. Rock flag and hypothesis!
