#045: Identifying and Prioritizing Hypotheses

The intro bumper for this podcast says “the occasional guest,” and, yet, the last five episodes have had guests. That’s hardly “occasional,” so Tim and Michael had a choice: either change the intro or do an episode on a topic for which both of them have experience, interest, and, hopefully, at least modest authority. In this show, the guys dig into hypotheses: how to identify and articulate them, the pitfalls involved in *not* clearly stating them, and where they see organizations and analysts get tripped up. They have a hypothesis that you will get some value out of the show, and, if they’re right, that you will share the show with a colleague and maybe even give it a positive rating on iTunes.

People, places, and things referenced in this episode:

 

Episode Transcript

The following is a straight-up machine transcription. It has not been human-reviewed or human-corrected. We apologize on behalf of the machines for any text that winds up being incorrect, nonsensical, or offensive. We have asked the machine to do better, but it simply responds with, “I’m sorry, Dave. I’m afraid I can’t do that.”

[00:00:24] Hi everyone. Welcome to the Digital Analytics Power Hour.

[00:00:28] This is episode 45. As you sit and look at all of your digital data, sometimes it just washes over you, and you hope that somewhere out of it inspiration will strike: a great idea, a lightning bolt of insight that guides you to a huge ROI. But here's the question: what if you could make your own weather, so to speak, setting up and knocking down hypotheses in a way that made you the cool kid, in an analytics sense?

[00:01:03] Well, buckle up, because you're in Tornado Alley tonight, and my fellow storm chaser is none other than Tim Wilson.

[00:01:11] Howdy! Whoa, that's not really what a storm chaser says, Tim. I know; I was trying to... I was blanking on Helen what's-her-name in Twister. What's her name? That's it: Helen Hunt.

[00:01:23] Is it Helen Hunt? Twister: 1996 movie, starring Helen Hunt and Bill Paxton.

[00:01:32] Am I Bill Paxton, or am I Helen Hunt? You, dear listeners, can be the judge.

[00:01:38] I haven't watched that movie since, I think, 1996. All right. And I'm Michael Helbling. All right, Tim, in setting up this topic: this is something I think a lot of analysts and a lot of companies struggle with. They can't make the transition into effective analysis and insight generation because they don't understand how to take the data they've got, transform it into hypotheses, and prioritize those hypotheses. So that's where we're going to spend our time tonight. So let's turn to you, Tim: how do you help people do this, and what has worked for you?

[00:02:16] So I think you hit the nail on the head. Part of the challenge is that there is this raging misconception in the marketplace (marketers and businesses definitely have it, and unfortunately I think too many analysts have it as well) that somehow, because we have all this data, more data than we've ever had, we just need to dive in, let it wash over us, and wait for insights to emerge. I feel like I've talked until I'm blue in the face on this, and I've got two more times that I'm doing that basic pitch just this fall, because the fact is, we do need to have hypotheses before we start doing analysis, in my view. So step one is just trying to explain that to people. I use a MythBusters analogy; that seems to be the quickest way to articulate it to any group: what if MythBusters was run like many marketing campaigns, or like the way many channels are invested in? It would literally be Jamie and Adam standing in the workshop and saying, "Wow, we have a bunch of stuff."

[00:03:26] "Hey, how about if we go blow up that car?" And then they go blow up the car, and then they ask, "Is that myth busted or confirmed?" And they'd be like, "Oh, wait a minute, I don't know." And maybe one of them chimes in and says, "Well, we had it instrumented; we know that the explosion shot debris 22 feet up into the air." Great. But what was the actual myth we were trying to bust? That's an analogy I use a lot, because what happens is we do something and people are excited about it, and it started with some genesis of an idea (this is our strategy, or this is our objective, or this is how it aligns with the message we're trying to get out to the market), but then it's kind of, "We did it. Now give us the learnings." There was no stopping up front and asking: how do we believe the world works? Why do we think this would work? What do we want to learn? We never hear about a campaign or an initiative being looked at as, "Yes, we want it to be successful in its own right, but what is it we want to get smarter about?" And what we want to get smarter about needs to start with something like, "Well, I think that mobile video ads will be a fantastic way for us to draw in millennials, and if we're right, we're going to double down on that and do more of it."

[00:04:53] But there just isn't that discussion. So I think step one is absolutely getting people to separate the volume of data from the fact that you still have to have questions, and you have to have ideas. On our last episode, Dennis Mortensen was talking about AI, and he actually said kind of the same thing, so I'm like, yes, exactly: we're talking about AI, and he said the successful people of the future are going to be the ones asking smart questions. We're going to get better and better at answering the questions, but the need to ask smart questions isn't going anywhere.

[00:05:26] So that's one thought. Not that I have a strong position on it. You're so rarely opinionated; it's very strange when suddenly I encounter this. No, I of course agree with you. A couple of things that I've observed, and have sort of lived through even in my own career: first and foremost, there's this concept of, "Just look at all this data, and now we just watch the insights roll in," when in reality, as an analyst, a lot of the time you're left asking, "Where do I go?" And then the other thing (you kind of referenced this) is this almost spasmodic jerking around trying to find stuff, because here we are in the middle of a storm, and it's sort of a fight for survival, but there's no structure or framework or bigger approach to how we're getting into the data and leveraging it so that you can build things.

[00:06:29] And some of this is really about a lack of planning, a lack of thinking about what our objectives are, of writing down even the things we want to learn about. Because it would be amazing to say, "Hey, we're going to go not only learn about social media and how to invest correctly there, but we have these other objectives that we want to use all of our social media spend to learn about. So how do we construct our campaigns and our exposure in a way that helps us gain this kind of information?" That kind of thinking.

[00:07:05] The other thing, and this is backing up a little bit, before you even start into analysis: defining some sort of framework that analysis or hypotheses can hang on. In other words (and we'll get into this a little bit in terms of hypothesis construction), in essence giving yourself some sort of, "Hey, what is it that we're out here on the website to do? What would be a good set of goals?" Obviously that takes a lot of organizational support and buy-in, but in a lot of cases it's pretty straightforward, at least at a high level, for most businesses. Obviously, some really large businesses have multiple different things the site is there to do, but you can get down to a core distillation of, "Hey, we sell things on our website, so: are people buying those things?"

[00:07:57] That's really kind of core to this website experience. Or, "We're a B2B company, and we need to get our salespeople in touch with the right contacts at other companies, so how do we generate leads or incorporate data into our CRM system? That's how we measure it." Or, "We are a brand that is trying to get information out to a broader public through the website, through coupons and things like that, for CPG." So how do we measure, and how do we have a framework for success around how people are engaging with our content, downloading our coupons, whatever the case may be? For those of our listeners who are not in the US: also FMCG. Right now I'm traveling through Europe, so I've got to use it. No, it's fine, and I do appreciate that, because I didn't know what that meant.

[00:08:48] But basically, consumer goods: FMCG, fast-moving consumer goods, is really what it's called. Yeah. I think one of the enemies of hypothesis generation and meaningful analysis is the rut that we get into in day-to-day operations. I'm thinking about one of my clients that has a large e-commerce site, and they had somebody new start. He's over in the paid search and affiliate world, and as the new guy, he just said, "We're spending all this money, and there's evidence that it's working." I mean, they can point to it as a good investment of money.

[00:09:30] But he was coming in with a fresh set of eyes, and as he's learning the business, he's asking some of those basic questions: What is our approach? Why is this our approach? He's not asking because he thinks it's wrong; he's saying, "I'm trying to understand." But just having that discussion: there would be no reason for anybody who has been at the company to ask that question. There's just no occasion to say, "You know what? Time out. Let's take a step back and ask what the hell we're doing and why we're doing it this way," when it turns out we're operating on some assumption. Brent Dykes had an article within the last few months on this whole idea of assumption governance. He didn't originate the idea; if you search for "assumption governance," you'll find other writing about it. But we don't really think about it: have we stopped and considered that the things we're doing that take us a lot of time or cost us a lot of money are often based on assumptions? And I think Brent even made the point in his article that you have to operate on assumptions; you can't go and validate everything with the data. But we don't even stop and think about what our assumptions are.

[00:10:43] You know, "We're spending on display because...": you watch this sort of insidious migration from carving out some TV spend to spend on digital, and that became display, and then we've been doing it, and we've been looking at the data, but we've never taken a real analytical approach to the assumptions behind why we're doing it. Because articulating an assumption, to me, gets you halfway to articulating a hypothesis, if not farther, and I think analysts should play that role. When I use my framework, a hypothesis is nothing more than saying, "I believe [fill in the blank]": the tentative assumption you're making about the world.

[00:11:30] And that's all it takes to state a hypothesis. But the thing is, we don't want to just go validate hypotheses that aren't going to lead to a decision. So we've got this second fill-in-the-blank: "If we're right, then this is what we'll do." And I use that all the time now. Even if somebody just says, "Go analyze the website," and I'm not going to be able to have the discussion I'd like to have (I've been around the block a few times), I will sit down, I will poke around the website, and I will generate hypotheses. Probably 75 percent of the time, I take those and basically clean them up in that framework: "OK, I think this might be going on, and if that's the case, this is what we'll do. And I think this might be going on..." And that can spark the discussion. So, you know, I jumped right into my framework for doing that, but it all comes back to what you said earlier about diving into the data: I think it is so easy to become horrendously inefficient and ineffective by diving into the data prematurely and not spending some time doing, honestly, the harder, more organic work of just thinking about what those assumptions are, what those hypotheses are, and what you might do depending on how they pan out.
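To make that two-part template concrete, here is a minimal sketch in Python. It is not something from the episode; the class, field names, and the example hypothesis are all hypothetical illustrations of the "I believe... / if we're right, we will..." framing, as a structure an analyst might keep in a backlog:

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    # "I believe ..." -- the tentative assumption about the world
    belief: str
    # "If we're right, we will ..." -- the decision the hypothesis informs
    action_if_true: str

    def is_actionable(self) -> bool:
        # A hypothesis with no stated action is a candidate to deprioritize,
        # not necessarily to discard.
        return bool(self.action_if_true.strip())

# Hypothetical backlog entry, echoing the mobile-video example from the show.
backlog = [
    Hypothesis(
        belief="mobile video ads are a fantastic way to draw in millennials",
        action_if_true="double down and shift more spend into mobile video",
    ),
]

for h in backlog:
    print(f"I believe {h.belief}. If we're right, we will {h.action_if_true}.")
```

The point of the structure is simply that the second field is required: a blank `action_if_true` is the flag that the hypothesis, as stated, won't lead to a decision.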

[00:12:49] Yeah, absolutely. That was the direction we were headed anyway: "If this, then what do we do about it?" And some of those core things about how we operate the business obviously guide what the action should be, should whatever we're researching come true. So, no, I think that's an appropriate question. The other thing, to me, is that there's another big misperception: that every analysis is supposed to yield an actionable insight, which I think is a

[00:13:27] horribly unrealistic expectation. If I have an assumption, chances are 80 percent of the time that assumption is correct. So there's still value in me saying, "That assumption is based on this hypothesis, and I'm going to go validate it," and finding, "You know what? That assumption is largely true." Two things will happen from that. One, I can move on to other assumptions and no longer have that uncertainty. Two, chances are that while I've been digging in to validate the hypothesis underlying that assumption, other things will crop up, and I'll say, "Oh, you know what? I just generated three or four more hypotheses." I'm not necessarily going to run them to ground; I'm going to put them in that same pool, where I want to sit down and float them out to people. I love when that happens: you're saying, "You know what? Your assumption is effectively true. But we noticed this other weird thing about your paid search traffic. Does this make sense to you? It seems kind of odd to me, and it's not what we were trying to chase down, but if it's true, let's talk about what it would mean. I can go and validate whether it really is the case, but let's talk about what you would do if it is." So, actually, validating hypotheses that don't lead to action is okay, and it's probably going to happen more often than not, because otherwise you'd have to believe that every analysis you're doing is turning up some horrendous surprise.

[00:14:59] And if that were true, well, somebody would need to be smacked upside the head, because you do know how consumers behave. You're not going to find a million gold nuggets when you turn over a million stones.

[00:15:14] Yeah, and I think sometimes there's a lack of patience in organizations for analytics to do its work, because we think, "Oh, well, every time you turn over one of those stones, you've got to be coming up with big incremental gains." Sometimes you just learn something fascinating that you sock away, and it doesn't have an application for maybe years. And if you are somebody who's managing an analytics team, I think that (how to direct analysis, and how to know when to say, "OK, I think we're headed in the right direction," or, "We probably need to go this way or that way") is a skill that very, very few people do well. And sometimes, as an analyst, even when you're not being directed, you don't know exactly when to cut bait and move to another topic. Sometimes analysts get really focused on something they find pretty interesting, but their time could be spent better. And that prioritization, I think, is also a really big deal: how do we make sure we drive the most value? Because, again, some of this is very scientific, but there's also an art to picking. Does that make sense? Oh yes, absolutely.

[00:16:41] I mean, you know how there's always an article every so often about how analytics is ruining creative? In a certain sense, if you think about it, "the demand for results is ruining analytics" is a feeling some people probably have, even if they've never said it that way: "Let me do the analysis I want to do, because it's interesting, and I think it might have something meaningful in it." So again, it's this balancing act: how do we serve the overall goals of the company while at the same time having enough leeway to do research, or do analysis around hypotheses, that may or may not yield incredible dollar signs? Does that make sense?

[00:17:28] Yeah. I mean, I am a fan of trying to discretely articulate every hypothesis, and I try to. It seems to be unnecessarily challenging for some people that there's not a one-to-one-to-one relationship of hypothesis to analysis to presented results.

[00:17:51] I may do one analysis that actually covers two or three or five hypotheses, and that may turn into one or two or three deliverables. But I do try as hard as I possibly can to bucket out each hypothesis, and I'm fine with little, small hypotheses, kind of the idea of sub-hypotheses, as well, because I think there's a twofold value. One, you can start to show people the work:

[00:18:25] "We started with legitimate hypotheses, and 60 percent or 80 percent of them didn't turn anything up."

[00:18:32] So that's, one, managing expectations as to how much we have to invest. And, two, we can also point and say, "Now, these hypotheses we just validated (or we weren't able to; it was not conclusive), but we did figure out that we had this other data, or we built a foundation to do these future analyses, or we generated these other hypotheses." So it doesn't mean there's zero value, but we can start to pivot the group away from, "Just go dig into your email data and find insights"; we've got to break it down into these discrete things. And I feel like, for analysts, it just doesn't click. We are such pleasers that we want to have one conversation and then just go do a ton of analysis. And to me, it's not that hard: everybody likes to talk about their business. Users love to talk about what they're doing and why they're doing it. So if I dive into the analysis and find some stuff, and I'm getting to a certain level and I don't think there's anything super surprising here (I'm basically validating the hypotheses), I'm not going to polish up that final deliverable. I want to go back and meet with the stakeholder, walk through the current state of the deliverable, and see if it sparks some other hypotheses, some other investigation points. They know the business better than I do, and I'm not trying to deliver the big "aha" in one big final reveal. I don't know if that was actually responding to your question.

[00:20:06] No, but that's the nice thing: we trip merrily along and stub our toes on all of our little bugaboos, or "bugbears," as Simon Rumble says. Bugbears! I'm starting to adopt that. So, no, you're right. And actually, that part of the process, the iteration, is one that is really difficult both externally and internally. This is where soft skills in analytics are so critical, and probably undervalued: you have to be friends with people and get their buy-in and their trust, so they'll show you and be honest with you about what it is they're trying to do and give you feedback, and so that you can show them, "Here's some stuff I've started to put together. It's still kind of molten; it hasn't taken shape." Because that's the challenge. Certainly, you and I, Tim, both now live on the consulting side of the world, where the typical model is, "We've got all this brain power. We'll ask you some deep, probing questions, we'll walk away for three weeks, and then we'll return with a magnificent statue." In reality, the best thing is probably to say, "Great conversation. We're going to dig for a week, and then we're going to come back with all kinds of dirt and fish, and we're going to look at it all together, and it's going to be smelly, so put your boots on. And then we're going to go back, and eventually we're going to get to some really great things, but we're going to do this process together."

[00:21:56] Yeah. And I think there's such a lack of understanding of, or willingness for, that process of:

[00:22:03] "Okay, yeah, that's great, but it's going to take way too long to prove out," or, "Hey, we could do something with that right now. Let's put that closer to the top of the list, because I think there's real value in what we can learn there," or, "Okay, we studied that four times last year, and basically we know it inside out. Now, you didn't know that, so let's chuck it out the window before you make it the centerpiece of your massive dog-and-pony show at the end." Right. I've seen that kind of thing. Yeah, umpteen times.

[00:22:31] So again, it's that back and forth. It's the combination of the analytics expertise and the domain expertise, the business expertise, and it's that collaboration that is so vital. That's why it's important that your leadership, or your sponsor, whoever it is that's enabling this (if it's marketing, the CMO, or the director, or whoever's in charge of what you're doing), believes and trusts and understands, "OK, here's how we integrate with what they're doing," and that you've got buy-in from the organization. That reminds me of another challenge in this area: when you're the sole analyst, or on a small team, and you're reporting up to somebody who maybe has misperceptions.

[00:23:22] I've watched that, and it's actually easier for a consultant, because usually they're like, "Hey, we're paying you," so I can say, "Look, dude, it doesn't work that way." If you're expecting, in this really murky world, somebody to come in and just magically put together all the pieces of your entire cross-channel strategy and present glorious insights when they're external or brand new: you're smoking crack. It's not going to happen.

[00:23:52] But even as an external person (and maybe I've just gotten more comfortable with this; it probably comes more with age than anything else), I'll say, "No. We're all going to sit down and look at this messy data. Put on your hip waders, because it's going to get deep." Not deep in bullshit; there's just a lot of stuff to wade into. I still will do everything I can to make it understandable.

[00:24:18] This is not, "Oh, here's a massive spreadsheet." Yeah, definitely. But, you know, "Let's sit collectively with this pivot table and slice it a few ways," or, "Let's look at it these two or three ways. And I'm not going to pull you in for a half day; I'm not asking you to go through all of this. I've just generated questions: this seemed surprising to me. Can we explain that? I can't explain it; can you?" Asking the subject matter expert whether they can explain something that looks odd to me, the analyst, is about 12,000 times more efficient than sitting here digging deeper and deeper and deeper into the data when somebody else can look at it and just explain it.

[00:24:57] Right, and that's the application of context, right? That's so vital, because as an analyst, you might actually come up with the answer, that piece of context, through your analysis and a lot of travail and hard work, but then the person is like, "Why did you spend so long trying to figure that out?"

[00:25:14] "We know this is how our business works." And you're like, "Well, yeah, now I know it too." But that was a really expensive way to get to something they already knew, something they just take for granted or understand as fundamental.

[00:25:32] And so, yeah, in a big organization, that's a big hurdle. And you're right: if you get hired into an organization and you're kind of the junior person in a structure like that, I honestly don't know what advice to give at that moment, except to try to push people toward this kind of construct. It is difficult when there's not a mutual understanding coming from the other side.

[00:26:11] Yeah, I think that's when it's a killer: if there's a manager saying, "No, you can't go back and talk to that person again until you've got the final result." This would get into organizational stuff, but any time a manager is saying you can't go talk to a stakeholder...

[00:26:27] Right. Even at a fairly low level, you can say, "Look, I legitimately have three or four things that all look kind of interesting, but the last thing I want to do is present twelve people with something where they're going to say, 'Yeah, we already knew that,' when I could just go talk to this one person and say, 'Help me understand your business more.'" Because that's that other weird thing: we feel like, "Oh, we're asking people to explain their business, and they already know how their business works, so I'm wasting their time." Turns out, hopefully they're doing their job because they like it, and they're thrilled to have somebody asking them to explain more: "Well, what about this? What about that?" I'm envious, on that front, of analysts who are internal, because when I was internal, I got to build all of that up, and I had the handful of people who I knew were so fun to work with. I learned from them about the business, and we could have so much fun on different projects, because they were asking smart questions. And there were other people where maybe it was a little rocky at first, but I could come to help sort of guide them until we got to where we were working together well. And that takes the full-time internal stuff: listening to things over cubicle walls, and even listening in on casual conversations. I think that's another skill to learn. And I think that construct, the "I believe...; if we're right, we will..." template,

[00:27:52] is not something where you just put up a Google Form and send all your business users to fill it in. It's something an analyst can totally work with by saying, "I just had a great conversation, and they seemed so full of ideas, and I jotted notes down. Let me go look at my notes and see if I can take these three rambling pages and translate them into hypotheses." That's active listening of a pretty high order, because it takes them just barfing out thoughts and questions and ideas, and says, "I'm going to do the work of putting that into a hypothesis that points to an action, and that I can play back to them." They're not going to have a problem with it; they're going to like it.

[00:28:39] They're going to read it and say, "Oh, shit, yeah, I told you all that. And, oh, by the way, that's not really what I meant; you kind of misinterpreted that." That's not a fail. That's why you do it: to make sure that you're clear.

[00:28:51] Yeah: deeper understanding, and alignment on what to go after. So, we touched a little bit on prioritization, right? The "low effort, high impact" thing. How do you construct this for people, or how do you guide people through what to go after and when?

[00:29:13] Oh, it's easy: I have a 2x2 matrix, and along the x-axis is the level of effort, and the y-axis is impact. I've seen that fucking 2x2 matrix so many times. E-commerce A/B testing gurus love that, which I just think is... Gurus. "Goo-roos." Yeah.

[00:29:33] One of my bugbears. I love it. They're like:

[00:29:37] "Just find the tests that are going to be low effort and are going to generate high impact." And I'm like, OK, if I knew what was going to generate high impact, then why am I doing the test?

[00:29:48] And we live in the real world, not in a theoretical, oversimplified construct. The reality is, if you plot every one of those with your best guess of what kind of impact it might have and your best estimate of how much effort it's going to take, guess how many you have that are low effort and high impact? Like, one or two. That's not the quadrant things live in. So I think it's a lot messier than that. And, you know, whether it's done sort of organically or not, the reality is you've got the highest paid person's opinion. I think we've talked about doing an "in defense of the HiPPO" episode, or maybe I've seen it as a blog post; I know I've heard the defense of the HiPPO: the highest paid person's opinion should not be immediately discounted, even as the HiPPO goes in and out of style depending on who you talk to.
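As an aside that is not in the transcript: that effort-versus-impact plotting is easy to sketch, and doing so tends to make Tim's point visible. A minimal, hypothetical Python illustration follows (the hypotheses, scores, and thresholds are all invented for the example):

```python
# Hypothetical hypothesis backlog with guessed scores on a 1-5 scale.
hypotheses = {
    "Mobile video draws in millennials": {"effort": 4, "impact": 4},
    "Broken mobile coupon flow suppresses registrations": {"effort": 2, "impact": 3},
    "Weather drives our product's sales": {"effort": 5, "impact": 1},
}

def quadrant(scores: dict) -> str:
    # Thresholds are arbitrary; they just split the 2x2.
    low_effort = scores["effort"] <= 2
    high_impact = scores["impact"] >= 4
    if low_effort and high_impact:
        return "low effort / high impact (rare!)"
    if low_effort:
        return "low effort / low impact"
    if high_impact:
        return "high effort / high impact"
    return "high effort / low impact"

for name, scores in hypotheses.items():
    print(f"{name}: {quadrant(scores)}")
```

Treated as a conversation aid rather than an answer machine, the output mostly shows how few items honestly land in the low-effort/high-impact quadrant, and how uncertain the impact guesses are, which is the complaint above.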

[00:30:41] OK. But at the end of the day, as Kevin Hillstrom would say, that's the person who's going to get fired if this is all wrong. Right. So it's the person who's most likely to

[00:30:54] be able to drive action. They have budget; they have people; their asses are on the line. The fact is, I had a client, a little short engagement; she was the CMO of the organization, and I sat down with her, and she mapped it out. She was an analyst's dream, because she had it mapped out, and she would sketch it (she had clearly drawn this thing so many times): "This is how our business works."

[00:31:27] "And this is how our marketing works within the business. And what I need to be able to do is confirm this, and specifically this arrow here. This arrow here is what I'm the most uncertain about." And as the analyst, you're saying, "Oh my god," because she is one of the best CMOs you could ever have: she has an analytical mind, and she's asking smart questions. Now, you may also have CMOs who say, "You've got all that data, you've got that fancy web analytics platform, go give me some insights, tell me the answers." And that's a problem, because they haven't asked a question. But definitely, the higher up on the food chain it comes from, the more attention it's going to get; that's one factor. The second part, the "if we are right, we will do X": the more crystallized that is, the more it's, "If the validation turns up this, then we'll do action A, and if it turns up that, we'll do action B, and because of the way we're going to approach this, it's going to be pretty crystal clear."

[00:32:32] I think that bubbles up in the priority, because it's more likely to lead to action if it's clear what the action should be. Whereas trying to do the ones that are going to prove somebody wrong: to me, that's a huge red flag.

[00:32:48] You know: "John Smith has bugged me for years. He's been going around saying Instagram is the be-all and end-all, and I'm going to show him; we're going to make him look like an idiot." Be careful. Even if you're saying you're objective and the data is what it is, making anybody look black-and-white bad is not ideal. It may be the right analysis to do, but you need to figure out a way to do it where nobody loses. You want them to be the ones championing it: "Hey, I've been championing this, but I worked with someone to do the analysis, and I'm thinking we should change to X or Y." My overused buzzword in this area is alignment.

[00:33:41] It is alignment, like what you just described: the CMO or VP of marketing cares about these five things, so guess what's really high on our list of things to hypothesize about? Obviously, it's going to be the things they're really betting on, and that makes a lot of sense. And also what you just said about the politics of data: that's very damaging, frankly, to analytics, and probably to the organization itself. When you see somebody going after somebody else with data because they want to prove them wrong, that is a red flag. And actually, hypotheses and testing fall prey to this a lot, I think, in different organizations, when you get two different sides basically using it as a cudgel against each other. You just used the word "cudgel."

[00:34:36] Sometimes a cudgel can really help drive some alignment. Picture the alignment you could get with people in a room with a cudgel. Yeah, yeah, box their ears; that's what you do to marketers. No, but the reality is that there's a common enemy, right, for every organization, which is, I don't know...

[00:35:00] Entropy. Failure. Competitors taking away all your customers. Getting together around that common enemy then gives you a framework for taking on all the analysis and getting the insights and hypotheses that you need. Using it that way is way more virtuous, in terms of the pursuit of analysis, than, say, going after somebody cross-departmentally. And it's hard, because you see people chasing the new thing. I call it shiny object syndrome: "Hey, Snapchat's really popular. You can advertise on Snapchat. Maybe we should make Snapchat filters for all of our retail locations so people can take pictures with our logo on them, and that'll be big brand reach to millennials." That's probably a strategy discussion that is happening on the regular right now, and that's OK, because it might not be wrong. But let's figure out what that serves, how it helps us, and all those things. I might have just gone way off on a side track.

[00:36:17] Well, no, that has me thinking. That actually reminds me of, and I hate to keep using client examples, but:

[00:36:23] I've got a client, and the client is in kind of a tricky space without a hard online conversion. There's a social media manager there, and from the analysts you kind of hear, "She is difficult. She just has a million questions; she just wants this and that." She's maybe a mildly challenging personality, but you know what? She's really passionate about social media. She's done a ton of reading about social media. Turns out she has done some smart thinking about social media. She just doesn't know how to use the data, and she may have woeful misperceptions about how the data works.

[00:37:03] But for multiple years now, every time I have a conversation with her, it's fun, because I'm listening and saying, "You're trying to explain to me how social media works, and it turns out you're kind of right. I can explain to you what the data can or can't tell you, but now we're going to jointly decide the right way to validate this hypothesis." And I've watched analysts get frustrated with her, because sometimes the stuff she's looking for requires some weird data pulls that are challenging to figure out how to get. But if you think about the data, it should be there; it's a legitimate hypothesis, and she has already articulated a strong action. Why are you frustrated? Because it's hard? Give these people a chance. They may be challenging personalities; they may be in an area, social media, that you may not personally like. But maybe you need to work with them and say, "We're trying to do what's right for the organization. Let me help you," and see if they're on board. And if they have a hypothesis that you think is the dumbest thing, that there's no way it's going to pan out: embrace it and do it really, really well. Say, "I'm going to validate it, or fail to validate it, and take them right along with me." Because now we've worked on this together, and it's not, "I was right; you were wrong."

[00:38:27] "We agreed that this was a legitimate question, and we partnered to answer it, and now we can turn to the rest of the organization and say we partnered and did X. And, oh, by the way, the way we did this was by working together to be really clear about what we thought was happening, what our assumptions were, what the hypothesis was, and then we validated it, because we knew we could take this action." "We're going to shut off paid media on Facebook," for instance. So, yeah.

[00:38:58] Tim, I think this is a really good discussion. And it's sort of an underserved area of analytics, in my view, in that I don't think people give attention and intentionality to this aspect. They definitely spend a lot of time focused on the tools: "Let's get the right tools; let's do a ton of analysis to figure out what the right tool is for us." We spend a lot of time thinking about what it is we're going to measure and how we're going to get that data right: how we build our data storage, how we integrate our data together. This is all good stuff, but then we just fall right off a cliff in terms of our intentionality when it comes time to start using that data to actually do something. That's where it turns into a swamp and not a river, right? Or something.

[00:39:47] I don't know. Yeah. And I've got theories about why that is, because there's a hard dollar cost on the technology, and there are a lot of vendors, and I'll say a lot of technology vendors are out there overpromising: "Get our tool and, for instance, you could bring in the weather data."

[00:40:12] "Put it right next to your traffic data." I can list multiple vendors that talk about being able to do that, and the idea is that if you were selling bicycles, you would be able to see... "Michael, yeah, but what if I'm selling toilet paper? People shit just as much when it's raining as when it's cold as when it's hot."

[00:40:30] Is that your hypothesis? Because we could kill that with research. I'm sorry; you like to pick on this weather data example, and I've heard you do it before, but in reality there are a couple of scenarios where, if you can do real-time segmentation based on certain specific weather data, that would be a good thing to do. Absolutely. Absolutely. OK, I just wanted to make that clear.

[00:40:58] But look, what the vendors can do is say, "Across our entire install base, we generated five highly impactful case studies in five totally different industries," and those are five little snowflake, oddball things, and they can tell those stories. I mean, just today I had a vendor contact me about one of my clients, because I know the vendor, and he's like, "Hey, the client blew me off, and I would love to get this moving." I'm like, "I know you would, because you'd like to hit your quota, but this has no value whatsoever for the client."

[00:41:34] "But you're telling everybody you can buy technology to do these amazing things." I'm like, yeah, but without a common key, how are you going to integrate it with your online data? And in your CPG, slash, FMCG world, you're not getting the purchase data. So shut the hell up; that's a load of crap. So I think that's part of it. And there's the time lag, right? You've got the sales process, where they're saying, "We have to get our stuff installed; it's going to take two weeks, and then you'll have glorious insights." Well, it takes three months to negotiate the contract and nine months to get it implemented, and the whole time you're just hanging on to what the sales guy said, which was that the glorious insights will emerge. You finally get it rolling, and people are exhausted. They're saying, "Oh my god, we now have all this data," and they just jump into the swamp, instead of saying, "Holy crap, I don't have a process. I still need to have rigor. I still need the same diligence I had when I had my lesser, crappier, free, poorly implemented tool." That hasn't changed. And I will say the other side of it is, I think the business, as well as analysts, will completely confuse looking at the past and measuring performance, that is, KPIs, with hypothesis validation. KPIs are relevant from your alignment perspective: what is it we care about at the end of the day?

[00:43:04] But when you're using KPIs, you're just measuring: are we hitting our goals? Are we achieving what we expected to achieve with our plan? We're generating hypotheses to help us do that, but they're kind of a separate thing. Hypotheses are: let's make an assumption about what's going on, because we want to change our actions in the future. And yes, if we do this well, we will continue to meet our performance objectives. But the two get conflated in the "give me the campaign report, and I want my actionable insights from the campaign report." If you're delivering a campaign report, part of it should be: did the campaign deliver what it was supposed to? And the other part is: what were the hypotheses we were validating through the data from that campaign that are going to influence what we do in the future? Because just saying the campaign fell below target doesn't tell me what I should do. That's right; that's what the KPI does. It says, "No, you didn't do what you were expecting to do."

[00:44:07] The KPI just tells you where to start asking questions, where you need to dig. Because even if you exceeded it...

[00:44:13] It doesn't mean you couldn't have exceeded it by more. But the fact is, you've laid out the things you wanted to do, and the ones where you missed are the ones where you'd better start trying to figure out why, so that you don't do it again. Right: how do we fix what's broken, and how do we make the big things better? Yeah. Or maybe the conclusion on your hypothesis is, "You know what? We were just ridiculously unrealistic about that, and we don't think we're ever going to hit it, so we will never set that target again." That's still fine, because it's influencing your future investment. If you say, "Yeah, we are not going to have 10 percent of the people who come to the site share it socially; that will never happen," then the next time, when we're looking at that next campaign, let's be realistic about what we might do, and then we may decide we don't want to do that campaign or that tactic. That makes so much sense.
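A small, hypothetical sketch of that split (again, not from the episode; the metric names, targets, and actuals are invented): the KPI check below only flags where to start asking questions, and the hypothesis work has to pick up from there:

```python
# Hypothetical campaign KPIs: target vs. actual.
campaign_kpis = {
    "registrations": {"target": 5000, "actual": 3800},
    "cost_per_registration": {"target": 12.0, "actual": 17.5, "lower_is_better": True},
    "social_share_rate": {"target": 0.10, "actual": 0.02},
}

for name, kpi in campaign_kpis.items():
    # Direction of "good" depends on the metric.
    if kpi.get("lower_is_better"):
        missed = kpi["actual"] > kpi["target"]
    else:
        missed = kpi["actual"] < kpi["target"]
    if missed:
        # The KPI stops here. "Why did we miss, and what do we change?"
        # is the hypothesis-generation work that has to follow.
        print(f"{name}: missed target; start asking questions")
    else:
        print(f"{name}: on target (could we have exceeded it by more?)")
```

Nothing in that loop says what to do about a miss; per the discussion, that is exactly the gap hypotheses fill.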

[00:45:05] Absolutely. I'm glad we picked a topic that both you and I have so little interest or passion about. But actually, it makes a good segue, because unfortunately we do have to wrap up. As you're listening, if you're a marketer: well, first, how did you end up here with all of us cool data nerds? You can start taking an analyst with you everywhere in your business. Take them to all your meetings. Let them see what's going on. Let them hear the questions being asked, so they can learn how to do analysis that brings back real value to you.

[00:45:43] And if you're an analyst, start trying to horn your way into all those meetings. And if you can't, just try the "I dragged them along" approach. Maybe they're not asking questions in the meeting, but, man, you'd better hope they're grabbing you afterward.

[00:46:01] So maybe this is more for the analysts as well, because I watch analysts sometimes come into meetings and just say, "Uh-huh. Yeah."

[00:46:10] And I'm like, how? You don't know the business that well; there's no way this is 100 percent making sense to you. And again, I feel like I've played the "Hey, I'm just the dumb analyst; I don't understand the business; let me ask the stupid question" card, and invariably people are like, "Oh my god, that's a great question," or, "I hadn't really thought about it that way." And, oh, by the way, I'm now more equipped to do the analysis, because I asked a legitimate question. So, analysts: you've got to actually actively listen and process and ask, "Does this make sense, or do I need to probe more?"

[00:46:45] Well, there is a certain amount... I always say there's a certain amount of humility required to be a good analyst, because to ask a good question, or to formulate a good hypothesis, means putting yourself in a position of not necessarily knowing the answers to everything. And we're lucky, Tim, right? Because we can raise our hands and be like, "Well, I have an industry reputation, so I can ask a dumb question," and people will give us the benefit of the doubt and think, "Oh, that's interesting," because we're not unproven in that context. But someone else might feel differently. So I do feel the tension of that for people: they feel pressure, like, "I can't look dumb in front of this room and ask this question."

[00:47:30] Yeah, they've got to get over that, because that is an internal misperception. Possibly. I think it absolutely is. We tell ourselves the story that, "Oh, if I ask that, I'm going to look dumb." And 95 times out of 100, if you ask the question because you've literally sat there and thought about it and said, "I don't quite understand," or, "Maybe they're throwing buzzwords around and I just don't get those buzzwords," you won't. And you can hedge it; I mean, humility is a good way to put it: "Look, I know this may be a dumb question; everybody else seems to understand this." You've got to be careful not to be snarky about it, saying, "I know this may be a dumb question," while clearly meaning, "I think it's a smart question." I've done that. Yes, there are some people who know of me or know of my organization, but I'm much, much more often working with people who've never heard of me.

[00:48:29] I feel like I'm starting from scratch. Oh, it's the first thing I tell people: "I'm kind of a big deal here, guys. I have a podcast."

[00:48:39] I had a blog and now I have a podcast.

[00:48:42] It's hard to let that one slip out and have them think, "Yeah, yeah, right."

[00:48:47] "We just talked about this on my podcast." "What? Were you...?" "Oh, yeah. You know who told you. You must realize that..."

[00:48:57] No, but I think that's a personality trait of analysts: we want to be right. We're not naturally comfortable saying, "I don't know." But it's not that hard. I have a hard time looking back in my career to a time when I wasn't doing this. Of all the things I am wracked with insecurities about, this one topic apparently isn't one of them; my entire career, I've been saying, "Please explain that more." You do it twice and realize:

[00:49:26] Oh my god. Not only are people not looking at me like I'm dumb, they're so excited that somebody wants to hear it that they'll find more time; they'll show up late for their next meeting so they can draw it out for me. And now I am so much better equipped. I mean, you just have to get over yourself.

[00:49:44] Yeah. I guess the one caveat I would add is: if you're in an organization where people are actively trying to destroy each other with data, then probably don't ask the questions. But other than that, I'm totally in agreement with you, Tim. All right, we've got to transition to the part of the show where we do our last call: something we find interesting, something that's going on in the world today, something we've come across. Tim, what's yours?

[00:50:11] So, last episode I plugged a specific podcast episode. This time, and it actually ties in well to this topic, I'm going to plug an entire podcast. It's relatively new, although it's sort of a transition from a different format. It's called Science Vs.

[00:50:31] I was sitting here thinking, "I have not heard of that," so I'm very interested. However, my last call is also podcast-related.

[00:50:37] So things are getting a little too insular here in this podcast world.

[00:50:45] All our friends are podcasters now. So, I am a huge Gimlet Media fan, and Science Vs is one of theirs. The premise is this: the host, and I'm going to butcher her name, is Wendy Zukerman. On one of the first episodes, her boss, actually on the recording, was like, "I'm mispronouncing your last name," and she said, "Yes, you are," and pronounced it correctly; she's Australian. But on Science Vs, they'll basically take on attachment parenting, or fracking, or gun control, and they dig in and say, "OK, it's a polarizing issue; let's dig into the facts." So I throw it out to our listeners, because it's one of those where, go figure, on all of these polarizing issues it's actually not black and white; nobody is 100 percent right or wrong. And it's kind of interesting. I would actually say she's not the most awesome journalist in the few episodes I've listened to so far; she gets there, just maybe... and maybe it's the production and the editing. That's tough; glass houses, I guess. But it's kind of interesting, because to me it resonates with analytics, where people want the "oh, give me the answer," and it's like, well, it's kind of this and kind of that; it's not totally definitive; there are things on both sides. So that's mine. What's your podcast-related last call?

[00:52:15] So, I was recently introduced to a podcast that I had not heard of before. It's called Invisibilia. Yeah.

[00:52:22] Yeah. Well, now it's got Hanna Rosin on it. She's like the new... yeah.

[00:52:27] So, OK, Tim obviously is aware of it. It was recommended to me by my colleague Stuart Schilling, and he's pretty smart, so I gave it a listen. Specifically, there's an episode they did on June 17th called "The New Norm," about how normative behaviors and things like that define what we do. Really interesting; I'm really enjoying it. So much of what analytics people do sits in this part of science: how to influence and change people's opinions, and how to understand the forces working under the surface of how decision-making happens.

[00:53:12] So that's the genesis of "Invisibilia": the invisible forces that are shaping the world that we live in.

[00:53:19] It probably started with that, but that's exactly right. So, anyway, I'm really enjoying it, and if you're up for another podcast, it's definitely one to add to your list.

[00:53:31] All right. Well, if you have been listening, and you have been crying out in the car, like, "Yes!" or "No!" or whatever, give us a shout on the Measure Slack, or on Twitter, or on our Facebook page. We want to hear from you. This is obviously a topic Tim and I both share a great deal of passion about. So if you've found a great way to use hypotheses, and you've found ways to incorporate these ideas into your organization effectively, what a great conversation to share with your colleagues and peers on social media.

[00:54:08] We'd love to hear from you as you're working and thinking about it. Keep digging for those insights. And for my cohost, Tim Wilson: keep analyzing.

[00:54:22] Thanks for listening, and don't forget to join the conversation on Facebook, Twitter, or the Measure Slack group. We welcome your comments and questions: facebook.com/analyticshour, or @AnalyticsHour on Twitter.

[00:54:38] Smart guy wants to play ball. Yeah I just saw start recording today.

[00:54:53] Oh yeah. Wow. Classily lap tonight. Got to get rid of Yeah. Yeah. So slow calorie fire is my special diet drink drink low calorie swill. Can your last name me Winslow. Sure. All right. Well every arm.

[00:55:16] Fascinating. Proper baby buggy bumpers. Always throw off your game. See I figured that was those little smart cars there just a little fast moving. You could like just Tom Tom. Take all that up I’ll take. Tuck my stuff right and off. Let’s keep that.

[00:55:47] One got handed off to Charlie from the social side: "Well, we've been using Facebook Canvas, and I know that every number that we keep showing shows that it's not working for shit. And, you know, Facebook Canvas is limited to mobile. And, oh, by the way, the coupon process on mobile for our site is completely broken. So we're driving some registrations at a higher cost per registration, and we probably are getting..."

[00:56:18] "How do you get to a computer?" Oh, right, yes. Should

[00:56:31] we publish, or not, the story about our own Facebook ads? Well, after our little debacle with Facebook... oh yeah, that was fucking nasty. I mean, it's just incredible. It's like, if you're a small business, I hope you do an experiment and spend 20 dollars, and then don't convince yourself that these random-ass people actually give two shits about your charming service. Oh my god. No, ours was crazy successful. I mean, basically, for a dollar, we can have as many likes as we want. We were targeting people who lived in the United States, age 18 to 65, both genders, with detailed targeting matching at least one of the following: interest in technology, web analytics, entrepreneurship, marketing, and digital marketing. Technology might have been the killer, or entrepreneurship, or just marketing, so I might have gone a bit broad. Yeah. Oh, and 95 percent of our likes came from mobile. I mean, it's clearly not working. So are we going to, like, double down and run it for another six months?

[00:58:07] Man, I'm so glad that was on tape. That's the best thing I've ever heard. Rock, flag, and hypotheses!

 
