Have you ever thought it would be a great idea to have a drink or two, grab a microphone, and then air your grievances in a public forum? Well, we did! This episode of the show was recorded in front of a live audience (No laugh tracks! No canned applause!) at the Marketing Analytics Summit (MAS) in Las Vegas. Moe, Michael, and Tim used a “What Grinds Our Gears?” application to discuss a range of challenges and frustrations that analysts face. They (well, Moe and Tim, of course) disagreed on a few of them, but they occasionally even proposed some ways to address the challenges, too. To more effectively simulate the experience, we recommend pairing this episode with a nice Japanese whiskey, which is what the live audience did!
[music]
00:04 Announcer: Welcome to the Digital Analytics Power Hour. Tim, Michael, Moe and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at facebook.com/analyticshour and their website analyticshour.io. And now, the Digital Analytics Power Hour.
[music]
00:27 Michael Helbling: Well hi, everyone. Welcome to the Digital Analytics Power Hour. This is episode 119. I have with me my co-hosts but we are recording with a live audience at Marketing Analytics Summit. Las Vegas, let’s make some noise.
[applause]
00:51 MH: Awesome.
00:52 Audience Member: Put your shirt back on.
00:54 MH: Now, most of us were just here to hear Gary Angel, and he talked through a lot of really uplifting and really amazing insights. And it’s not just words by the way, Jim. We also drink on this show. So if you need a beverage, I think there’s some floating around. And yeah, it looks like somebody’s got it right there. So just help yourself. There are little cups and everything. It’s part of the show. But we wanted to go in a different direction, none of this positivity and tips and tricks you can take home. We wanted to air out some of the grievances. There’s a lot of problems in our industry and it’s about time we talked about them. So we’re gonna go through a little shiny app that Tim built and we’re gonna walk through some gripes, and we’ll see how it goes. And we’ll need some from you. So, from time to time, we need the audience to chip in a little gripe we can talk about as well. So let’s get started and you guys ready?
01:53 Moe Kiss: Yeah.
01:53 Tim Wilson: I’m ready.
01:54 MH: Tim. Alright. I’m in a foul mood but I’ve got this little flask so I’m ready to rock and roll. Gripe number one: “You should A/B test every decision so you can be sure.” Moe?
02:09 MK: Well, I just think… Maybe it’s because, at the moment, I’m going to too much conversion stuff, but I just think there are times where you understand, say, through customer service tickets, that an experience is bad for a customer, that it leads to some kind of really shitty experience, and you just fix it. Sometimes, we’re getting a little bit too crazy, we want to test every single thing. And sometimes, the testing is to prove the value to the business, not to improve the experience for the customer. And that’s why I’m on my high horse. I actually feel like I’m on one, ’cause we’re up on a stage too.
02:41 MH: But is there a need to go through the rationale that, are we gonna need to quantify the impact of this? Is that a valid input if it’s clearly bad but we don’t know how bad?
02:52 MK: Yeah, I guess, I think to some degree.
02:55 MH: Our website is this much marginally better than the terrible website it was before. You really wanna measure that change or you just wanna make the change ’cause it’s the right thing to do?
03:04 MK: Especially when you think about the fact that by A/B testing, there’s gonna be a proportion of your customers that continue to have the shitty experience that you haven’t fixed. So, anyway.
03:13 MH: Alright, gripe number one. [chuckle]
03:14 TW: Gripe number one, it was a doozy.
03:16 MK: Tick.
03:16 TW: Let’s go.
03:18 MH: “We need to be doing machine learning.” You better take this one, Tim.
03:23 TW: Yeah, this one was mine. So I think that all three of us have drunk from the fountain of Matt Gershoff, who has preached ad nauseam not to chase the technology. So I feel like we’ve gone through buzz words and we need to be doing bit dat… [chuckle] Big data.
03:42 MK: Data?
03:42 TW: Big data.
03:43 MH: Big data.
03:43 MK: Data.
03:44 TW: This is a year in of podcasting with Moe and I’m slowly picking up an Australian accent. But even looking at this conference with the colocated conferences like Predictive Analytics World and Deep Learning World, which… Isn’t to say there’s not a whole lot of value there but I feel like Matt is the one who really brought it home and said, “We really need to get better at problem formulation and then worry about how we’re solving it.” And, certainly, machine learning is an arrow in our quiver but we don’t necessarily need to always use that one arrow.
04:16 MK: Yeah, to be fair… Sorry. Matt is probably the person that most made me learn this really hard lesson. We were sitting at SUPERWEEK a while back, which is actually how the Matt Gershoff thing happened, where I came to him and I was like, “Oh my God, Matt. I’ve tried this model. It was so cool. I’m really excited. This is the first time I’ve used a random forest model. I’ve answered this question.” And he’s like, “Moe, what was the question again?” And I told him the question. He’s like, “You could’ve just used a linear regression.” But I was trying to use this really cool model because I wanted to find a way to use it, and that’s happening a lot in our industry at the moment. You wanna do the shiny, cool, fun stuff, but the truth is, what’s the easiest solution that’s gonna answer your question?
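For the curious, here is a minimal sketch of the kind of bake-off Moe describes, on purely synthetic data (nothing here comes from the show; the dataset and parameters are made up): fit both models and check whether the fancy one actually buys you anything.

```python
# Sketch: does a random forest beat plain linear regression enough to
# justify the extra complexity? Synthetic data for illustration only.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=1000, n_features=10, noise=10.0, random_state=42)

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=200, random_state=42))]:
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 = {scores.mean():.3f}")
# If the simple model scores comparably, ship the simple model.
```

On this synthetic, mostly linear data, the plain regression scores on par with (or better than) the forest, which is the Gershoff point: formulate the problem first, then pick the tool.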
05:00 MH: I thought this was your gripe because it didn’t say AI. So we have to be doing…
05:03 TW: That’s a separate gripe.
05:05 MH: We’ll get to that one.
05:06 TW: We’ll just repeat ourselves.
05:08 MH: Alright.
05:09 TW: These are in random order ’cause it’s AI-driven.
05:12 MH: Being asked to analyze the most common click paths through the site or the pages where people drop off.
05:19 TW: I feel like how many people, show of hands, how many people will still get some version of this?
05:24 MH: Matt. Okay, good.
05:25 TW: Man, that has been around. That’s like…
05:26 MH: The dream is alive, people.
05:27 TW: Hold on, hold on, keep your hands up, keep your hands… That’s about 250 people, give or take. Okay.
05:32 MH: Yeah. Just endless masses.
05:35 TW: And I feel like this is one that I’ve… It’s crazy how long I have heard that. Going back to… And I’ll admit, the first time I pulled up NetGenesis, I thought I was gonna do the same thing. So it’s an understandable question, but then you start doing the math of how many unique click paths there actually are through the site, and how many should I care about?
05:57 MH: This question, for me becomes… ‘Cause I got a chance to talk to a lot of different companies, this one is a really good qualifying discussion point. If somebody says, “We really need to understand every path,” then that tells me a lot about where we are and what we need to do together to start fixing that.
06:16 MK: The thing is… So when you do actually… I had done this analysis and, actually, it wasn’t that painful. But pretty much like Tim was saying, when you actually look at how many different variations there are of navigating a particular website, which is thousands, even if you look at the top 10 paths, which is what I did, it was like less than 0.4% of our user base. Everyone thinks it’s gonna magically answer these questions, particularly about UX, like, “What should we fix?” And actually, there are some places on your website you’re meant to drop off. I don’t know, when you get to the checkout, you should probably finish. I’d be concerned… Well, sometimes I do go back but, anyway, I just think that customers are so unique, and I don’t think this actually answers the question that the business has, which is: where does our website suck and how do we fix it? And that’s why I find this question difficult.
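As a rough illustration of the math Tim and Moe are doing, here is a sketch that measures how much traffic the top 10 click paths actually cover. The hit-level export and column names (session_id, timestamp, page) are hypothetical:

```python
# Sketch: what share of sessions do the top 10 click paths represent?
# Assumes a hit-level export with hypothetical columns:
# session_id, timestamp, page.
import pandas as pd

hits = pd.read_csv("hits.csv")  # hypothetical export
paths = (
    hits.sort_values("timestamp")
        .groupby("session_id")["page"]
        .agg(" > ".join)
)
top10_share = paths.value_counts(normalize=True).head(10).sum()
print(f"Top 10 paths cover {top10_share:.1%} of sessions")
# On most sites this number is tiny (Moe's 0.4%) because paths
# explode combinatorially.
```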
07:05 MH: Alright. You’ve seen how the game is played now we turn to you, the audience. Who out there has a gripe that we can talk about together? I see one.
07:16 Speaker 5: I think one of the biggest gripes I experience is people who want a lot of pie charts or contribution without the absolute value of what’s happening. So getting a lot of analysis that’s very kind of one-sided or it doesn’t tell the whole picture.
07:33 MH: Yeah, why have a pie chart when you could have a 3-D exploding pie chart? [chuckle]
07:39 TW: But is that just asking for data as opposed to asking for the… Asking the actual business question?
07:45 S5: Yeah… I think especially with more entry-level analysts, or even some of the stakeholders, they think that pie charts, because they’re so used to seeing them, are easy to interpret. So certain visualizations maybe don’t resonate as much with them, and they look for those. Bar charts, obviously, we do a good amount, but pie charts definitely are one that we see requested from stakeholders.
08:10 MK: People ask for it? That’s awful.
08:13 TW: But it’s the tyranny of the status quo, which I think is a challenge. If you imagine that, say, 80% of companies are doing analytics quite well and 20% are not, my theory is that those 20% who are not are like poison, because they’re repeating what they used to do, and that means they’re training everybody who receives what they’re providing that that’s what they should be getting. And those people move on to other jobs as they move up in their careers, and they turn to a more junior analyst and say, “Hey, at my last company, I had X. I want you to give me the pie charts.” And it’s a vicious cycle. Like the number of times that I’ve heard a, “Well, this is how we did it at my last company,” without really being objective about, “But was it effective?” Are you separating what you did and were used to from what was actually useful and could be valuable?
09:08 MK: I confess, I had a stakeholder who used to like line graphs; that was the only format she could digest, and I don’t know what that weird thing was, but that was it. Now, I don’t know if you’ve seen a dashboard with 10 metrics that are all line charts; it looks awful. And you’re also like, “Which one do I look at first? Which is the most important?” So she basically drew for me kind of what she wanted, which was 10 line charts, and I gave it to her. And then slowly over time… It wasn’t like, here’s a bunch of other stuff, but slowly I’d go back each week, and I’d tweak it. And I’d be like, “Oh, I put this bar at the top so we know how we track against our target. Oh, I added this scorecard so that other people could easily see what the number is. Oh, I added a year-on-year so people didn’t have to cal… ” And I did it very gradually over many, many months, but the end result was completely different to what she asked for, and she loved it. But I think sometimes…
10:03 TW: But in her mind she’d drawn it up for you, that… You were just giving her what she wanted. [chuckle]
10:06 MK: Yeah, apparently. But sometimes it is just about holding hands until you’re like, “Oh, maybe a better way to visualize this one pie chart,” and then you can kinda hopefully lead them down the garden path.
10:18 MH: Oh, we have another gripe incoming. Go ahead.
10:21 Speaker 6: My recent-ish one that seems to keep coming up is any time anybody is looking at any data or wants to dive deeper into X, Y, Z, “You know what would be interesting?”
10:35 MH: Oh, that one is somewhere in here. [chuckle]
10:36 S6: Not interesting, tell me it’s useful. Tell me it’s critical. Tell me you need this information.
10:40 MK: Full disclosure.
10:42 S6: This is not word of the day.
[overlapping conversation]
10:43 MK: She read all of these before.
10:45 S6: That was mine.
10:45 MK: So she knows that she’s jumping the gun.
10:48 S6: That was mine.
10:49 MH: Ladies and gentlemen, the Kiss sisters.
[laughter]
10:51 TW: That’s true. Interesting. Interesting. It’s terrible.
10:58 MH: That’s so good. Yeah, interesting, yes. I would love to work this weekend to pull together something that piques your fancy. That is what I delight to do. Yes.
11:07 MK: I actually had a stakeholder who did that. He’s like, “I want you to change all the variables in your model because I have a hunch that this thing is gonna be true if we do this.” I’m like, “Okay, so I’m building you a pretty complex model here, which I can either finish or I can figure out if your hunch is right. So which one should I do first?” And he’s like, “The hunch. The hunch one.” So, that’s when I left that company.
11:30 MH: My most favorite. Alright, we’re gonna go back to our board, but we’ll come back for some gripes from the audience in just a bit. “Visualizations that rely on the audience to be able to distinguish between red and green.” I don’t know who put this one in our list, Tim. Do you know?
11:46 TW: [chuckle] I did not make it to… None of the sessions I went to were visualization this year. Did anybody go to a data visualization-oriented session workshop?
[background conversation]
11:57 TW: Okay then. Let the record reflect that no hands were raised. But actually… I now use Sim Daltonism on the Mac, and people will send me something and I’ll fire it up and say, “How’s it do for deuteranomaly?” And I’ve definitely been able to send people screen captures back saying, “This is not helpful to, possibly, your manager.”
12:20 MK: Well, controversial: I do use red and green sometimes, but there’s a very strong caveat. You’ll have a little arrow, especially if you’re doing month-on-month or year-on-year comparisons, and if the arrow is up, I’ll make it green; if the arrow’s down, I’ll make it red. Which means it’s not only red or green.
12:38 TW: Right, right, it’s supplemental.
12:39 MK: So yeah, so people can still understand it if they’re color blind, but I would never use a line graph with…
12:44 MH: I thought you’d introduced a really cool segmented way of reporting where 8% of your audience is gonna see something different.
12:50 TW: Yeah, you just personalize your reports to the… Yeah, ’cause you shouldn’t…
12:53 MK: That would be cool.
12:54 TW: You shouldn’t not use it and punish the other 92% by not having that added. So yeah, as long as it’s not the only thing you’re relying on. Oh, but it… Oh, it pisses me off. I feel like we’re not working up to the gripe level.
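If you would rather script Tim’s check than eyeball it in Sim Daltonism, here is a sketch using the colorspacious Python package (our example, not a tool mentioned on the show); the palette values are arbitrary:

```python
# Sketch: simulate how a red/green "up/down" palette reads with
# deuteranomaly. Palette values are arbitrary examples.
import numpy as np
from colorspacious import cspace_convert

palette = np.array([
    [0.8, 0.1, 0.1],  # "down" red, sRGB in 0-1
    [0.1, 0.6, 0.1],  # "up" green
])
cvd_space = {"name": "sRGB1+CVD", "cvd_type": "deuteranomaly", "severity": 50}
simulated = np.clip(cspace_convert(palette, cvd_space, "sRGB1"), 0, 1)
print(simulated)  # if the two rows converge, the color needs a backup cue
```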
13:07 MH: Oh, gripes. Oh man, gripes: “That you shouldn’t spend time making data pretty and that it doesn’t matter what a dashboard looks like as long as it has the right numbers on it.”
13:16 MK: I’m actually melting down right now. [chuckle] Like, melting. So I had a stakeholder who got really annoyed with me ’cause in my time estimations I’d always include, quite rightly, some time for data visualization and how I’m gonna represent it. And this particular stakeholder liked things pretty much straight as you’d punched out a query, like, give me the number. Which put a lot of pressure on what can be the most important piece of that process. And so she said to me, she’s like, “Oh, I don’t need it to look pretty, I just need the right numbers. So you can just send me the spreadsheet.” And I was like, “Sure.” Anyway, I’ve sent raw data to people before, but this is not one of those people that you should send raw data to. And I straight away got an email back being like, “What the hell does this mean? Does this mean that May was better than June or worse? Are we doing better than last year or is this year better?” And I’m like, “And this is exactly why people spend time on data visualization, because you don’t understand what the heck it means unless I do that step.” I think my sister actually says it really nicely: “It’s not about making it pretty, it’s about being understood.”
14:30 TW: I swear I was gonna go to the audience and say, “How exactly do you say that, Michelle?” Because I definitely use it as well. That’s the misperception when I talk about data visualization: people think, “I’m not artistic, I don’t have a design background, I can’t make it pretty.” And I definitely quote Michelle Kiss regularly, because that’s the best way to put it: “It’s not about making it pretty, it’s about making it understood.” And oh, by the way, there are plenty of very tactical skills; even if it’s not about design, you can learn some pretty simple rules and make stuff that is effective. It’s a false thing when people say, “Oh, data visualization is about making it pretty.”
15:05 MH: Yeah. And I think we also have to recognize, as analysts, we play on different playing fields from this perspective. When it’s me and one person and we’re kinda going back and forth on an idea, we can do things very quickly, and I don’t necessarily have to go through all the different things I might do to polish it. But if I operationalize this for the whole organization, there’s a much higher bar we should be held to. So it’s about recognizing, “Well, where are we at in this process and what should this be?” ’Cause if the person’s coming to ask for this report for the organization, and you do it like the quick version, then what will come out at the end is a gripe. Yeah.
15:40 TW: Well, and related to that, when analysts have that perspective, sometimes they’re like, “Look, I’ve been working with the data, I can look at the spreadsheet and I can see that May was better than April, so clearly my stakeholder can as well.” And they completely miss that they pulled the data; they knew exactly what they pulled, they had a mental image of what they were pulling, why they were pulling it, what they were looking for. The analyst has a ton of context already built in when the data’s still in a pretty raw format, but then you drop that on somebody else who hasn’t lived through that part of the experience. So they’re like, “Well, gee, I, as an analyst, can understand it in raw data form, why can’t my stakeholder?” And it’s like, “Well, to be fair, you cheated. You spent four hours getting to the raw data form, and yes, you can understand it. They didn’t have that.” And you don’t want them to have to spend four hours re-doing it, ’cause what value are you adding if that’s the case?
16:26 MH: Way to gripe on behalf of the business, way to go. All right, let’s turn it back over to the audience. I think we’ve got a gripe in waiting.
16:35 S7: So my question is, why focus on operationalizing reporting when you can operationalize the data to integrate into other systems? Why focus on just making your report better and more consumable, why not take the data and get it into your systems and improve things by the data itself?
16:51 MK: Oh me, me, me. I think they’re kind of both the same thing. If I was building a report, it would also be about how do I operationalize this data, so I wouldn’t necessarily separate them. And I wanted to talk about ETL today, and Tim was like, “Moe, there’s gonna be some people here that are not data engineery-types, a.k.a. a lot.” But I think part of being a good report builder is actually ETL-ing that data into a format so that hopefully someone else can use it as well. So you’re finding ways to aggregate it or pivot it so that other people in the business can then use it, and also your reports don’t take 50,000 years to run and then people get really annoyed and think the tool is to blame. But more on that later.
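A minimal sketch of the pre-aggregation step Moe is describing, with pandas and entirely hypothetical file and column names: roll raw hits up once, so every report reads a few hundred rows instead of millions.

```python
# Sketch: aggregate raw hit-level data into a daily summary table.
# File and column names are hypothetical.
import pandas as pd

hits = pd.read_parquet("raw_hits.parquet")
daily = (
    hits.assign(date=pd.to_datetime(hits["timestamp"]).dt.date)
        .groupby(["date", "channel"])
        .agg(sessions=("session_id", "nunique"),
             orders=("order_id", "nunique"),
             revenue=("revenue", "sum"))
        .reset_index()
)
daily.to_parquet("daily_summary.parquet")  # reports read this, not raw hits
```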
17:34 TW: So I had an item that grinds my gears. So basically, a virus started spreading through our organization over the past few years. And the virus is people coming to my desk or Slacking me saying, “Quick question.” Did it get into your organizations as well? It’s anything but quick, ever.
17:53 MH: Yeah. No, it just shows how in tune we are with the industry. That’s a really good gripe.
17:58 TW: Zero relationship between the length of the question and the effort to actually solve it. I feel like I got to where I just respond to people when they would stop… ‘Cause they’re like… They’re gonna help me out and not even put it in an email, they’ll just drop by my desk and say, “Hey I have a quick question.” I’m like, “Let me stop you there. Just so you know, the length of the question, in my experience, has zero relationship with the level of effort required to answer it.”
18:22 MK: You actually say that?
18:23 TW: Yeah. Well, they don’t stop by my desk now ’cause I work from home and that would be… That would be a little awkward. But, yeah.
18:29 MK: See, I go with a different method, and my method tends to be… ‘Cause often, I’m working with particular stakeholders in the building. And I’ll probably give them 20 minutes, half an hour, and we’ll talk it through and we’ll talk about some strategies to answer it. And most of the time, there might actually be other resources in the company that can help do that. It’s more just about, like, “Hey, Jordana, who works in the merchandising team, is all over this and this is how she could tackle it,” which is a good way I’ve found to handle it, ’cause then they go bug the person in their team who actually knows about it. Or the other thing I’ve done is kind of a bit of stakeholdery, managery things, where basically I’m like, “I can answer that question, but it’s gonna take eight hours of analysis work. Is that worth it for this quick question?” And often, they’ll be like, “Oh, well, actually my meeting’s in like two hours.” And you’re like, “Okay, well, nice chatting. Good luck with it.”
19:22 TW: Moe really wants to help people. I just wanna piss and moan.
19:24 MK: I do wanna… Aww.
19:24 TW: We only have a finite amount of time.
19:26 MK: But apparently…
19:26 TW: It’s just not gonna be cathartic if you keep solving problems.
19:28 MK: Apparently, you’re the genuine one.
19:31 MH: Oh, you’ve got a quick question? Let me just look at your calendar where we can set up an all-day session for us to dive into that. That’s a good humble way.
19:38 TW: I actually… ‘Cause with humor, I think, immediately saying, “Just so you know, I wanna hear the question. I just feel like you should know,” because… It was at the agency where, really, the virus spread. That’s a good way to describe it, ’cause people are like, “Oh, I know they’re really busy, I know they have a lot on their plate, but I’m just asking a quick question.” So there were people who came by and said, “Look, I know I have no idea what it’s gonna take to answer this. I can articulate the question really quickly; I’d like to chat with you a little bit about it.” But man, that’s a killer.
20:09 MH: It’s also one where competency is its own curse too, right? Because the more well-known you are for doing a great job, the more people are like, “Uh, quick question.” Alright, we’ve got another quick gripe. Let’s see: “The tagging requirements are simple. Just track everything.” I’m certainly a fan of tools that track everything, but it’s usually a distinctive mark of an organization that has no idea what it really wants to do. And as such, drowning ourselves in all the possible data then becomes the thing to do in itself, and it’s a good two years. And then by then, you should be able to find a new job and repeat the process in a new organization. No harm done, except continuing to set back our industry, which desperately needs credibility in this time of emerging accountability for data and its actions. Now there’s the gripe.
21:00 TW: This is actually the cousin to the “quick question,” right? ‘Cause this is the one where they’re like, “Look, I don’t have to give you a bunch of requirements. I’ve made it really easy for you.” And I’m like, “Well, by ‘you,’ you mean me. And you want to tag everything. You want tooltip impressions.” Like, I literally saw that in requirements for a website: we’d like to track tooltip impressions. But it comes in as just, “Hey, quick meeting. Only takes 15 minutes, just tag everything.”
21:23 MK: Well, I also… I haven’t tried this yet, ’cause I’ve gone down a different route previously. But when I hear this, my straight-away thought is the user. Like, to me as a user, if a company was like, “Let’s just track everything they do. We don’t know why we might need it, but let’s just do it anyway,” that terrifies the shit out of me. So, I don’t know. I think that if you had stakeholders in the business saying this, with GDPR and everything that’s coming up with privacy and compliance, you could potentially flip it a little bit and be like, “We’re gonna have to make a justification at some point for why we track this thing, and we need to have a really good business use case.” Okay, Tim’s furrowing his brow.
22:03 TW: I feel like that may be a tough one, ’cause the reality is you’re tracking the user’s core behavior; making the case that tracking their mouse-over on the global navigation is bad feels tougher. The reality is, if you track everything into a traditional digital analytics tool, your data becomes incredibly messy. If we had developers who say, “Yeah, we went ahead and put this random identifier on the end of everything so that now nothing is aggregated in your analytics platform,” it logically makes sense: you have super raw data, you can roll it up to anything. The problem is no business user actually knows how to roll it up to anything, so I’d make that case instead.
22:44 MK: It does also… Well, yeah but your data set being so big and complicated and messy that you can’t work with it, I don’t know if that’s necessarily gonna get your stakeholders concerned ’cause they’re like, “I don’t really give a shit how much data you’ve got. What I care about is get me the answer to the question. And if you tag everything then, I’m gonna have the answer to my question.”
23:00 TW: That spreadsheet of raw data is gonna be a lot longer when you… [chuckle]
23:02 MK: Well, spreadsheets, that’s a whole ‘nother…
23:04 MH: Let me go ahead and setup that all day meeting.
23:05 TW: We don’t need to solve these problems, we’re just pissing and moaning.
23:07 MH: Alright, next gripe. “The belief that with enough data, the truth is knowable.”
23:16 TW: Matt Gershoff is in the room. I feel like before we started recording, Jim asked how many people had not heard of the Digital Analytics Power Hour. So I feel like we could ask, a show of hands, how many people have not heard of Matt Gershoff? See, he’s got almost better, he’s got…
23:32 MK: He has way better traction than us.
23:32 TW: Dramastically.
23:33 MH: Way better brand recognition.
23:34 MK: It’s definitely the SEO strategy.
23:35 MH: Way to go, Matt Gershoff.
23:36 TW: But to me, this is actually where the big “aha” for me has been in the last few years: that we have been conditioned by the web analytics vendors, by Google and Adobe and Webtrends, and even the technology vendors out there today saying, “Throw your data in with AI; with enough data, you’ll get to the truth.” And what’s, to me, really intriguing is that the deeper you get into statistics and data science and machine learning, you realize everything’s probabilistic and that you’re not trying to get to the truth. You can’t get to the truth. You can increase your confidence, you can increase the likelihood of something, but there’s a reason there’s not a 100% confidence level: it’s not attainable. But we have been conditioned, and marketers have been conditioned, to just want the answer. That’s why we’ve always gotten frustrated with statisticians: “Just give me the answer. Is it A or is it B?” And it’s like, well…
24:29 MK: Or, “Don’t give me a range, that’s way too hard.”
24:32 TW: Yeah. I think that’s a huge… I think it’s a challenge for marketing, and it’s on analysts to actually help educate, ’cause it’s nuanced.
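To make the “range, not answer” point concrete, here is a sketch with made-up numbers: the honest output of a conversion-rate question is an interval, not the truth.

```python
# Sketch: a conversion rate comes with an interval, not a single truth.
# Counts are made up for illustration.
from statsmodels.stats.proportion import proportion_confint

conversions, visitors = 230, 4800
low, high = proportion_confint(conversions, visitors, alpha=0.05, method="wilson")
print(f"Conversion rate: {conversions / visitors:.2%} "
      f"(95% CI: {low:.2%} to {high:.2%})")
```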
24:41 MK: Actually, on this, it’s really funny. We’re probably gonna talk a little bit about when you have multiple analytics platforms and you’re trying to transition the business from one to the other. The really common thing that people talk about is “the source of truth”: “We’re gonna use this analytics tool ’cause it’s the source of truth.” And I worked with a data engineer who spent his entire… He actually kind of reminds me of Tim, like, “Oh,” hands up in the air, really frustrated, and he’s like, “It’s not a source of truth, it’s a source of facts. It has a particular set of facts in that database and we are gonna use that to make our decisions. But it’s never a source of truth.” Shout out to Adelson, my main man. He never listens.
25:20 TW: But I like that guy.
25:22 MH: But actually more troubling are organizations who’ve never even conceived of the concept of a source of truth, and so have multiple systems everywhere running their business in multiple directions, with no idea that they should maybe be trying to centralize that.
25:36 TW: But there’s a human nature piece; I mean, it’s understandable. Human nature wants simplicity, wants clarity; people want things to be black and white, and stuff just isn’t. It isn’t black and white.
25:50 MH: We need buckets to put things in. Alright, next gripe: the word “insights.”
26:00 TW: Actionable insights.
26:01 MH: Well, now we know those are dead, those are no more, right?
26:03 TW: Yeah.
26:04 MH: Gary Angel killed actionable insights. You saw it, here at Marketing Analytics Summit.
26:09 MK: Or in a previous podcast.
26:11 TW: We did a whole episode with Christopher Berry on what’s an insight.
26:13 MH: I forget most of what we talk about.
[chuckle]
26:16 TW: I always forget the definition. I don’t know, that word gets thrown around so much, like, “I want insights in my daily report,” and I just don’t use it. There are people who have very, very strong definitions for what an insight is, which generally I think are great. The problem is nobody else is using them, so I just try to avoid the word.
26:40 MK: So, what do you… Okay, I’ve been asked for a weekly, monthly, or you promise clients quarterly insights deck, whatever that means. So what do you use?
26:49 TW: Well, in that case, I’m like, “Well, let’s talk about that. You’ve changed nothing.” I mean, they want insights and recommendations, and I’m like, “Well, the first time I produced that monthly report, I came up with some things that you maybe didn’t know, and I made some recommendations, and we talked about them, and you implemented none of them. You actually didn’t really do anything, except you ran a couple of campaigns. But, a month later, you’re expecting insights.” So I’ve had that discussion, saying, “What do you want? You haven’t changed anything, or if you have changed anything, you haven’t brought us in to actually understand what you’re changing, why you’re changing it, what result you’re expecting.” So I’ve probably turned into the cranky bastard, and I’m like, “No, we don’t have it. Let’s talk about what your hypotheses are; we can then go and figure out how we’re gonna validate those, and then that will give you something that you could act upon.” And I totally steer clear of it.
27:42 MK: See, but I think this actually comes up way more in reporting. You’re often, as an analyst, doing regular reports, and people are like, “Oh, you need to provide commentary,” which is a list of insights. One person even said to me, “I don’t want recommendations from you, I want insights. The product team will come up with the recommendations.” And you’re like, “Okay, sure,” whatever.
28:03 TW: But that’s the…
28:04 MH: The insight in this case would be, the product team will be coming up with recommendations.
[chuckle]
28:08 TW: Well, that’s… Yeah. [chuckle] There are tools out there that will basically take your digital analytics data and convert it into prose, like a little fill-in-the-blank, and they’re like, “Now people don’t wanna see visualizations of the data, they want to read text.” I’m like, “Who made that pitch to your investors? Are you kidding me? People don’t like to read. 280 characters is too much.”
28:33 MK: I think we… Some bullet points, yeah.
28:36 TW: No, but the auto-generated one. But I have… Well, no, I have definitely railed about why, ’cause here’s the thing: if they want that, let’s just say that there are insights to be found in the data. They’re likely not going to be automatically generated; they’re gonna take time. So say we want increased velocity: you’re doing a weekly report, and the data is available first thing Monday morning. Instead of just having the report with the facts, well-designed, easy to scan, just monitoring the business… Is anything out of whack that we should freak out about? Maybe the data’s out of whack and it is something you should freak out about; maybe it isn’t. But if you say no, that’s gonna come to the analyst first. Now the analyst is gonna spend between two hours and two days trying to dig in and find something. They’re gonna package it up and they’re gonna send it out. Then whoever gets it has to get out of meetings and get through their inbox.
29:27 TW: So it’s like Thursday before they actually can read it. And then, what are the chances they’re gonna act on it that week? As opposed to if you educate… I went through that. I had inherited a weekly report where the client said, “We’re getting this from this other agency, they’re delivering it to us on Thursday afternoon, and it’s got a few little bullets,” that over time, by the way, weekly, goes from something really interesting to “I’m just regurgitating something that the chart already shows.” And slowly I’m like, if we just take that out, you can have data so that you just know how the business is doing, and then if something looks odd, we all can think about what might have caused that. Let’s get the information into people’s hands. And, yes, I, the analyst, may jump in on Monday morning and try to figure out the why, but surely others have…
30:16 TW: I’ve done the, “Wow, traffic went down, conversions went down, revenue went down. Why did it go down? Well, let me look by device category. Let me look by this. Wow, paid search went down. Oh my God.” It takes a little while to narrow everything down to paid search. “Oh my God, paid search dipped horribly for the last half of last week, and that’s taken me… ” And then I ask the paid search person, and they’re like, “Oh, yeah, we hit our budget cap and paid search got shut off.” I’m like, “Well, I’m glad I spent those four hours.” What if the paid search person had the same data, stuff went down, and it was like, “Hey, any ideas why this might have happened?” And the paid search person says, “Yeah, our budget cap got hit, it went down.” We’d have saved an enormous amount of time and energy, ’cause we’d have tapped into the domain expertise of the other people. So I’m not saying we don’t wanna deliver them; I just think scheduling them is a disaster, it slows everything down.
31:03 MH: I think it’s also about the expectation of what an insight should be, or when it should be delivered, and at what cadence. I think an insight is something you’ve gotta come up with, and it’s hard to predict, sometimes, how that will work.
31:18 MK: So, just back on that point, let’s say… Okay, the report goes out Monday morning automatically to everyone, yeah? Which is… To be honest, both of the setups you’ve described, I’ve experienced a lot. It’s painful. But the one that does sometimes concern me is when it goes out without any commentary or context: the first thing that happens is you get 10 questions fired at you if any metric has gone down. If any metric has gone up, all the CEO is gonna be talking about in the weekly stand-up is how awesome it is, but no one gives a shit about why it happened. So then you spend your life digging into why did this number go down, why did that number go down. And, I don’t know… Sometimes it seems just giving it… But if…
32:00 TW: But if you send it out with commentary, you don’t think they’re still gonna ask you? They’re just gonna ask you three or four days later.
32:04 MK: Yeah, but at least you’re on the front foot, and you’re like, “I can see that metric has gone down. I know they’re gonna ask why. Great, I’m gonna look at that first thing, and that’s gonna be the first line of my email.”
32:12 TW: I would admit that in practice, in the most extreme example of this on a weekly report… And it was actually automated, to the point that, due to, shocker, Adobe Analytics not quite doing everything it was supposed to, it was waiting for me on Monday morning, and I made sure that by 8:00 AM… But I literally looked at it; there was no room for the commentary, and I manually sent it out. If there was a KPI that really moved, I would still send it. I would say, “Hey, this thing went down. I don’t know why, but I’m looking into it.” So I would actually just include that. I wouldn’t try to provide answers; I would say, “I noticed X or Y. I don’t have the answers. Hey, if any of you guys actually have the answers… ” ‘Cause they could reply all, or they could reply just to me. But I think some of that goes back to… There is a culture. That same report, like you were describing earlier, the sea of data, and slowly over time… I did the exact same thing. It was a sea of numbers that they initially got, and it took about a year to slowly morph it into something that they could take in at a glance. And I was helping steer them from a data visualization perspective; they were only looking at four KPIs.
33:17 TW: There was more data on the dashboard, but they were the right KPIs, they were looking at them, and I would still get it into their hands by 8:00 AM, and they loved it. I mean, they were like, “This is great,” and it saved me time from when I was trying to explain something that… Also, I was external, so there may be something that everybody knew. If I had waited for three days and then told them that, you know, it looked like they had a huge inventory problem, it would have been like, “Yeah, we all knew that. I guess we forgot to tell you.” The old “Oh yeah, we forgot to tell the analyst,” that’s not in the list. Anybody ever had that? “Oops, sorry, I forgot to tell you that campaign launched.”
33:52 MK: “Oh, we did testing.” Yeah, we did testing on the website and we fired like 50,000 million hits and we didn’t tell you.
33:57 TW: I needed to. That’s our shortest gripe and I think it went for the longest period of time.
34:00 MH: Yeah, it went on for a long time.
34:01 S1: Wait, can I…
34:01 TW: And we’re not done yet.
34:03 MK: No, I just… I’m sitting here thinking about this, right? So, the alternative to sending this out weekly is you build a dashboard, and hypothetically everyone goes and looks at it first thing Monday morning. That sounds great. But then… I’m gonna…
34:17 TW: But they don’t. If they had to go through it…
34:19 MK: Yes, they don’t and they’re lazy and they want someone to just give it to them and tell them what they should care about, and then suddenly the culture in the company is like, “I don’t know what’s going on with our numbers” like, “No one tells me what’s happening” and it’s like, “I build you this thing that you never go and look at.” So…
34:32 TW: But there is nothing stopping the analyst from going and looking at it and then…
34:36 MK: Oh sure.
34:36 TW: But then you’re not on the… You’re not… There’s not an expectation that you’re gonna do something that may or may not be right or valuable.
34:42 MH: Yeah. And what I would recommend Moe is that you build in… Like, journeys into your dashboards and also forecasts. Yeah.
[laughter]
34:52 TW: That’s good. Good throwback to the previous unrecorded sessions that the podcast listeners will not…
34:54 MH: Yeah, and the Gary Angel [34:57] __. Alright, we’re moving on: “The belief that machine learning and AI will ultimately just do analysis seamlessly with minimal human involvement.”
35:06 TW: The exhibit halls are now broken down. All those vendors are now gone, correct?
[chuckle]
35:12 MH: I mean, you know, it would be great to allow for some of the quick-questions stuff. You know, “Hey, quick question.” But you know, I was at a McDonald’s recently where they had one of those new automated ordering systems. It took me two tries.
[chuckle]
35:27 TW: So is the Dynamic Yield-powered…
35:29 MH: Yeah. That’s either not good for me or not good for their UI. I’m not… It’s probably just not good for me.
35:36 MK: Well, actually we talked about this a lot in our last episode, about ethics in AI, and it’s one of my favorite topics. So Finn Lattimore was on the show, and one of the things she talks about is that when we build anything in AI, it’s not just like you shove in the data and it gives you an answer. Whoever is the person that’s putting that data in, that’s prepping it, that’s figuring out how to treat every single variable that’s working on it, they come with their own bias and sometimes when you just shove data into a model, it’s actually gonna reproduce the bias that we have in our own society. So you think about something like bank loans or… Anyway, there’s a whole bunch of examples, and I don’t know, I just… It’s depressing.
36:17 TW: Well, but it also is… It’s a misunderstanding of the concept of feature engineering, which… I’ll admit, I didn’t quite realize this until I was starting to dive in and try to do some of this as well. But if you think that you’ve got just raw hit-level data… You’ve got the raw data, you throw it into the magical machine and it’ll find things. Well, there are transformations that require human thought. Maybe I want a variable that might be: how many unique weeks has this person come to the website in the last three months? It’s there, that’s captured in the raw data, but the AI is not so magical that it’s gonna actually be able to figure that out. We need to come up with these features; we need to actually say, “How would we flag the users this way?” So that’s the other piece that I think is undervalued. And there’s talk of, “Well, there’s automated feature engineering.” I’m like, “Well, that’s the pitch to the venture capitalists in Silicon Valley. Good luck to you.”
37:13 TW: The reality is there’s still a real need for humans to think about it and understand the results, because this ties back to people thinking that the AI is gonna give them the answer. It’s gonna tell me who will buy. No, it’s not gonna tell you who will buy; it’s gonna give you who is most likely to buy, and it will give you some confidence, but it’s always gonna be probabilistic. And you really ought to understand that before you think the machine is just gonna, like, “Oh, we have our drag and drop,” and insights are gonna emerge automagically.
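Tim’s “unique weeks in the last three months” example, sketched in pandas (column names hypothetical): the transformation is trivial for a human to write, but invisible to a model fed raw hits.

```python
# Sketch: engineer "unique weeks visited in the last 90 days" per user.
# Hypothetical columns: user_id, timestamp.
import pandas as pd

hits = pd.read_csv("hits.csv", parse_dates=["timestamp"])
cutoff = hits["timestamp"].max() - pd.Timedelta(days=90)
recent = hits[hits["timestamp"] >= cutoff]
weeks_visited = (
    recent.assign(week=recent["timestamp"].dt.to_period("W"))
          .groupby("user_id")["week"]
          .nunique()
          .rename("unique_weeks_last_90d")
)
print(weeks_visited.head())
```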
37:41 MH: So this is a bad time to announce my new AI-based company that does attribution with crypto-currency underlying it. Is that what you’re saying?
37:50 TW: Is it on the Libra platform? Is it the Libra Crypto? The non-blockchain blockchain?
37:53 MH: Well, of course. Yes.
37:54 MK: But it’s funny, like, you even think about shopping. If we were to just do a normal predictive CLV model. So female shoppers typically shop with us like every month and male shoppers might shop two or three times a year, but they do really…
38:09 TW: We’re talking about the podcast online store by the way.
38:11 MK: Oh yes.
38:11 TW: We have extensive demographics on the…
38:14 MK: But yeah, the guys do really big shops when they come. So when you’re talking about like, even just predicting something little like that, like, the kind of little nuances that you need to understand about your customers in order to build an effective model.
38:28 MH: Alright, we’re gonna go to our lightning round and you guys, I’m putting a new rule. You can only respond with like one or two words max. Ready? Go.
38:39 MK: Useless.
38:40 MH: The media analyst who doesn’t know the proper format of a URL?
38:44 TW: Oh, my god.
38:46 MK: You used your two words!
38:47 MH: That’s it. We’re moving on. You don’t need anyone to do any ETL-ing as everyone should do their own. All data scientists should ETL their own data for their own purposes.
38:55 MK: Dangerous.
38:56 MH: Moe’s question.
[chuckle]
38:58 MH: Media analysts who don’t know how to check whether a pixel is firing.
39:03 TW: Good fucking lord.
[chuckle]
39:05 MK: Three. Again, three.
39:07 MH: No coding required. Just a drag and drop. One line of JavaScript.
39:11 TW: Every vendor.
39:13 MK: Every vendor. Ever.
39:15 MH: Multi-touch attribution.
39:19 MK: Still a fan.
39:20 TW: Usually bullshit.
39:22 MH: Canada test requiring tests… Okay, I’m skipping that one too long.
39:25 MK: Oh, that one was the best one.
39:26 MH: Doing a podcast with someone who doesn’t like StrengthsFinder 2.0 or emotional intelligence.
39:30 TW: Bogus.
39:31 MH: Ruling.
39:31 MK: All looking at Tim.
39:33 MH: Yeah. That’s right. The belief that machine learning is just a fancy way of saying statistics with computers, which has been around for ages.
39:39 TW: GPUs.
39:40 MH: There you go. Alright, we’re not gonna do any more. We’ve gotta wrap up. Thank you all at Marketing Analytics Summit Las Vegas for letting us be here with you and do the show with you, and thank you for your gripes. While we have a lot of gripes, we still love our industry, and we certainly are super happy to meet you guys. For my two co-hosts, Moe Kiss and Tim Wilson, I’m Michael Helbling. And despite the road being long as analysts, and thus having a lot of gripes along the way, I wanna encourage you: keep analyzing.
[music]
40:24 Announcer: Thanks for listening, and don’t forget to join the conversation on Facebook, Twitter or Measure Slack group. We welcome your comments and questions, visit us on the web at analyticshour.io, facebook.com/analyticshour or at Analytics Hour on Twitter.
40:44 Charles Barkley: So smart guys want to fit in. So they’ve made up a term called analytic. Analytics don’t work.
40:50 Tom Hammerschmidt: Analytics. Oh my God. What the fuck does that even mean?
40:56 Jim Sterne: This is… So, question: Digital Analytics Power Hour, how many people have not listened to this podcast? Okay, you’re totally in for a treat. And Gary Angel, far be it from me to presume to disagree with you about anything in this industry. However, I have a bone to pick. You spent a few minutes talking about how talking never changed culture. After 117 podcast episodes of these people talking, they have had a serious impact on our culture. So, I’ll just… That little bit.
41:44 JS: Well, they’re getting ready, ’cause they’re gonna plug in and there’s chairs up there. They’re getting their water, and they’ve gotta get a laptop. I would like to introduce them. You are about to witness the wonderfulness, the insightfulness, the irreverence of the dream team of digital analytics podcasting. The three magi of measure, the Three Musketeers of metrics. And I could go on, and so I will. The Harry, Ron and Hermione of hits. The Superman, Batman and Wonder Woman of WebbedIN [42:26] __. The Luke, Han and Leia of analytics. The Kirk, Spock and McCoy of monetization. The Edward, Jacob and Bella of bounce rates, for you Twilight fans. The longer this takes, the worse it’s gonna get. The id, ego and superego of engagement. The bacon, lettuce and tomato of tagging. The snap, crackle and pop of page views. The Nina, Pinta and Santa Maria of mobile optimization. The three-ring circus of segmentation. The extract, transform and load of data logistics. The Father, Son and Holy Ghost of social sentiment. The wink, and blink and nod of not provided. The three stooges of stickiness.
43:12 JS: These are the Groucho, Chico and Harpo of HTML. They are the faith, hope and charity of omnichannel. They are the hear no evil, speak no evil, see no evil of eVars. Now we’re just gonna have to go to the animals. They’re the three French hens of server side cookies. They’re the Huey, Dewey, and Louie of latency. The Alvin, Simon and Theodore of third-party cookies. The momma bear, papa bear, and baby bear of bots. The three blind mice of multivariate testing. I could go on, and I shall.
43:50 JS: The three little pigs of path analysis, and the three little kittens who’ve lost their mittens. They are the gold, frankincense and myrrh of metadata. They are the rock, paper and scissors of sessions. They’re the Three Mile Island of eye tracking. They are the reading, writing and arithmetic of ROI. The executive, judicial and legislative of visitors. The Kentucky Derby, Preakness and Belmont of the back button. The red, green and blue of the buy button. They are the small, medium and large of landing pages. The three-card monte of the Monty Hall problem. They are the two sides and the hypotenuse of hypotheses. And don’t make me get all the way to the good, bad and ugly of UTM.
44:33 JS: They are the truth, the whole truth and nothing but the truth. So I give you Michael Helbling, Moe Kiss, Tim Wilson. For the love of God, take it away.
[applause]
44:51 MH: Rock, flag, and Grinds My Gears. Gotta get it all.