Who would have thought that we’d get to 2020 and still be debating whether recurring reports should include “insights”? As it turns out, Tim did an ad hoc analysis back in 2015 where he predicted exactly that! Unfortunately, the evidence is buried in the outbox of his email account at a previous employer. So, instead, we’ve opted to just tackle the topic head-on: what is a report, anyway? What are the different types of reports? What should they include? What should they leave out? And where does “analysis” fall in all of this? We have so many opinions on the subject that we didn’t even bring on a guest for this episode! So, pop in your earbuds, pull out your notebook, and start taking notes, as we’ll expect a *report* on what you think of the show once you’re done giving it a listen!
00:04 Announcer: Welcome to the Digital Analytics Power Hour. Tim, Michael, Moe and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at facebook.com/analyticshour, and their website, analyticshour.io. And now the Digital Analytics Power Hour.
00:27 Michael Helbling: Hi, everyone, welcome to the Digital Analytics Power Hour. This is episode 132: Don’t Just Spit Numbers at Me; I Want Insights. It’s passing strange that a document with numbers and charts in it somehow becomes a vehicle filled with the expectation of magical tomorrows full of deep understanding of the Complete Customer Journey™. The truth is, some part of the analyst job is about getting the numbers out there, and maybe that’s the boring part, and some part of the job is about figuring out something new that will create or expand an opportunity or minimize a risk or a challenge. But we need to talk about it: it’s reporting versus analysis. Tim, what do you think? Do you reserve a text block at the top of every report for your insights?
01:18 Tim Wilson: Does it even exist if you haven’t put in insights that are really just regurgitation of the chart below?
01:25 MH: Especially in daily reports, I find it is super useful.
01:29 TW: That sometimes slows things down, so the daily report comes out four days later, but that’s what you gotta do for insights.
01:35 MH: And Moe, I know you’ve probably drawn many an insight from an exploding pie chart, huh?
01:40 Moe Kiss: Yeah, I really, really have. Sometimes the donut chart too. They’re my ultimate favorite.
01:46 MH: Exploding donut charts. Yeah.
01:47 MK: Yeah.
01:48 MH: High in carbs but still very satisfying.
01:51 MH: And I’m Michael Helbling, and I’m basically an insights master. LOL. Okay, let’s dive in because I think there is a continuing and persistent challenge in our industry as it pertains to generating data, putting it into something that someone else will carry, and then what our job is in terms of helping that along. And so Tim, get us started. Define the problem.
02:23 TW: I think the problem goes back… you can trace it all the way back to when data started getting presented, and it’s been this steady evolution of what data was available. There was a time when, if you printed out all of the available data a company had, which meant it being written up by hand, it would be like three pages, and somebody could look at it, digest it, and draw whatever conclusions they needed from it. Whereas now, we’re never presenting, never delivering, more than one millionth of the data that’s available to us. And throughout all of that, there was this word, “report.” People want reports, and that’s a very simple English word, but to me, if you ask ten different people what a report is, you’ll get ten completely different answers. And depending on where you’re at, it’s become shorthand for “I want stuff of value in a deliverable,” although even online BI tools call what they deliver “reports.” So it seems like a real simple idea, but it’s not.
03:32 MK: I don’t even think that it’s… I think that they want to have something that they understand. And the assumption is that by having a report, they will understand the numbers. And that’s where I completely lose my shit because I’m like… That just doesn’t happen. Like, giving you the numbers doesn’t mean that you understand it.
03:51 MH: Yeah, for sure.
03:52 MK: But what’s your definition of a report then?
03:55 TW: So, I actually try to either qualify or avoid the term. I will talk about a performance measurement report, and try to be very clear that a report, to me, is generally something that is on a recurring cadence, which immediately takes it out of the world of analysis. So I have fought for years to very tightly define a report as a very concise way to see how we are doing against our goals and objectives. I lose that battle. I’m constantly fighting and losing that battle. But that’s where I wish we would put it: reports are concise, they’re simple, they’re automated, and they absolutely don’t have a human being adding in commentary labeled “insights.”
04:46 MH: Well, and this is definitely something that predates digital analytics for sure. BI tools have been around for 20 more years than digital analytics has, and I think we’re inheriting a lot of this stuff from the reports being generated there. But somewhere along the line, we got tricked into thinking that when we generated the report, something else was gonna happen too. Like something magic was gonna happen about knowing something. And maybe, Moe, that’s kind of what you were talking about?
05:17 MK: Which is the understanding.
05:18 MH: Yeah, like I suddenly get it and I know something I didn’t know yesterday and you need to tell me that.
05:23 MK: But… And that’s the thing I’m really struggling with at the moment. It seems like there is this thing going on in data where everyone who’s on a data team is like, “The problem is that we just need to give everyone the dashboards that they want, and as soon as we do, they’ll leave us alone to do the important work.” And I’m not gonna say where I heard this, but I did hear someone say the other day, “Oh, here’s a dashboard I built four months ago. I don’t even remember how I built it, or the logic, or anything.” And I just cringed, like I cringed, because, yes, you can pump out a dashboard, and you can automate the shit out of it, and you can build it exactly how they need it, and then occasionally make the odd refinement. But if an analyst isn’t looking at it, I don’t actually think your stakeholders are gonna get a ton of value out of it.
06:15 TW: I disagree.
06:17 MH: So I think, and this is actually where my definition of report… There’s a word that really pulls it in for me, which is “context.” So a report, to me, is something that is providing context around an ongoing activity or operational aspect of the business. But to your point, Moe, people need to be trained in how to understand what the data is actually referring to in the first place. What does it mean if my Facebook ad spend has gone up, but conversion has gone up too in that channel? What should I as a marketer take away from that, right? That’s not something that someone just naturally knows the answer to. Experience teaches you how to understand it. But by showing you those two numbers, I have now given you context to understand that my ongoing operation of advertising on Facebook is giving me an opportunity to ask a deeper question or go prove out a hypothesis, which I could then take as the next step.
07:20 MK: But I think that’s coaching. You need to coach your stakeholders to get to that point, and that’s a really long-term relationship.
07:30 TW: But there’s a counter to that, and it winds up getting back to effectively defining KPIs and who the audience of the report is. If it’s somebody who is managing the Facebook spend, they should have been engaged with: what am I supposed to be doing with Facebook, what outcome am I trying to deliver, and possibly at what cost? And if that has been really well established, and the report is well designed, then they actually do understand it. It’s a problem if you’re giving them something and they say, “I don’t know what this metric means.” Side note on recurring reports: the insidiousness of automated reports is that they cost nothing to continue to produce, so there’s no feedback loop as to whether anyone is looking at them or getting any value out of them. But that’s separate. Even if you’ve defined something tightly, reports get longer and longer and longer, ’cause somebody says, “I’m confused,” or they ask a deeper question.
08:22 MK: “But I wanna delve into one little thing,” yeah.
08:25 TW: And then somebody goes and adds that to the report and the thing grows over time.
08:29 MH: And that’s the mistake though, you can’t add stuff to a report. And so this is where…
08:35 TW: But you also don’t need to look at a report and actually have a deeper question, right? You don’t.
08:41 MH: You don’t have to, but it’s good to know where it’s coming from.
08:44 TW: But it’s conditioned in that once you start cranking that out, people are like, “Well, I need to get my…” This is where the “I need my weekly insights,” or, “I see the report, but tell me how we delivered these results,” comes in. It’s like it doesn’t matter. If you planned to do X, and you did X, then stop looking. Go and have a different idea. The report can be there as a check: did something go off the rails? Is something surprising? If not, it’s saying, “There’s nothing to see here.” I got value because I looked at it; there’s nothing to see here; let me put it aside. We suck at seeing that, analysts and business alike, people all over.
09:28 MK: I totally agree, but I also think that one of the issues with reports… I mean, I worked for a CTO where I got to the stage where I completely anticipated that any time there was even a one or two percentage point drop in any metric, the first question would be why. Why did that thing drop? And conversely, when it went up, there were never any questions about why it went up. But I trained myself, even though those reports were automated, and I don’t know how often other people looked at them, to look at them. And every time that happened, I would do a bit of digging to find out the answer and anticipate that question. And I think that’s what the role is: when things do change. I agree, when things are steady, the point is that we should be saying, “There’s nothing to see here.” Everything is ticking along, it’s fine; we don’t need to do an email of weekly insights when there’s nothing to report and things are performing as expected. But conversely, when things go well or they don’t go well, we as analysts have a role in helping the business understand, and in doing further analysis.
10:33 TW: Yeah, why would you pull out an insight of something that’s not changing? So I’ve gotta ask a question. So, with that, if I’m hearing right or I’m guessing, did you wait until you could answer the question before those numbers got to the CTO? Or did the CTO get them, basically, at the same time as you did and you were just… Knew that that question was coming and you needed to either say, “Here’s the answer,” or “I’m working on it already.”
10:57 MK: Yeah. I wouldn’t delay giving the numbers, but I would often be like, “Hey, I noticed that this thing has gone up or down. Just FYI, I’m looking into it.” And then later that day I would send an update and be like, “Hey, I’ve done some analysis. This is the outcome.”
11:10 TW: Which I love, and I feel like that’s another area.
11:15 MK: I thought you were gonna say the opposite. Oh, God.
11:17 TW: No, no, no.
11:18 MH: I know, I thought for a second it’s like, “Which I hate,” and I was like…
11:25 MK: I was like, which way is this gonna go? [chuckle]
11:27 TW: No, ’cause that’s another one. ‘Cause I hear it so many times, “Well, I can’t… They’re gonna expect me to have all the knowledge and all the… I don’t wanna deliver it to them until I can explain it,” and actually you kinda went one step further, which is the other, delivering it and saying, “Hey, heads up I already saw this.”
11:45 MH: Yes.
11:46 TW: Because the opportunity right then is that they may say, “Well, did you hear about the outage over there?” They may have that answer, and you don’t need to go spend half your day tracking it down. But, man, as analysts, we so wanna know everything.
12:01 MK: Totally agree, and I used to often say that to my product managers. It’s like, “You guys know the product better than I do, and you guys know the tiny little tweaks that you’ve done, the little releases.” So part of that communication is like, “Hey, I saw this weird thing. I’m looking into it. If you can think of anything that might have driven it, can you give me a heads up so I can look into it?” Because you’re gonna save yourself a shit ton of time just talking to your stakeholders, who know what’s happening better than you do, versus trying to poke around and figure out what’s going on yourself.
12:14 MH: And I think, Moe, what you described is a great cadence for an analyst. You were proactive, you understood what to do, you were in tune with the business context of those numbers. The only thing that could be improved there would be your executive having better intelligence about when to ask for more information versus not asking. But is that on you, or where does that sit? The point is, I think that’s actually an excellent model. Anyone listening could take those steps and replicate them in their own company, and that would be a good way to proceed.
13:04 MK: That’s not how it started, though. That’s the thing, is like that came from many months of me sending stuff and her straight away coming back to me and being like, “Why? Why, though?” And then over time… We were talking about this with Nancy Duarte a few episodes back. Same thing where you learn to anticipate the needs of your stakeholders. But that takes time. You have to develop that relationship and understand what are they gonna care about when they see this report?
13:30 MH: And you as the analyst need time to build an understanding of the underlying dataset too. Like, what do these numbers mean? The only reason you knew that these numbers might be important is because you’ve been spending time in the data, understanding its nuances and those kinds of things. So there is a ramp-up, both from a personal and an operational perspective, that I think is really important for getting a report right. But everything we’re talking about is basically just circling around data literacy or numeracy, right? People understanding what is possible and what can be understood from the data that they’re getting.
14:05 TW: I agree it’s around data literacy. The CTO could have said, “Moe, this is silly. Why do you keep delivering me your report, and then I ask you why, and I have to have a second interaction with you? Why don’t you figure out the why before you come to me?” Some stakeholders will have that sort of perspective, so you have to be deliberate: “Hey, I’m gonna give you this so that you have the information in your hands in a very timely manner. I, in parallel, will be taking a look at it as well.” I had a client recently say they produce their reports through a very, very manual and tedious process, and they said, “Hey, one of the benefits of that is that it forces us to actually look at the data.” And my head about exploded. I’m like, “You need to be forced?” You’re woefully inefficient on a biweekly basis because…
15:02 MH: There’s the curiosity.
15:04 TW: Yeah, I’m like, “How about you automate the shit out of it, and then you have a little bit of discipline, and then you look at it, and you deliver it.” And I’ve had it with stakeholders, I’ve had it with clients before where… There’s the infamous, Michael knows about this one, the weekly report that got delivered on Thursdays, ’cause it had to go through a bunch of data quality checks, and it’s just like…
15:24 MH: Oh, my gosh.
15:25 TW: The timing kind of doesn’t… It doesn’t work. I’m like, “Let’s get it on Monday. I’ll look at it, they’ll look at it, we’ll talk on Monday afternoon, and then there might be a snowball’s chance in hell that we could actually react if something needs to happen.” But even that we’d have the discussion and say, “Look, there’s nothing to see here.” And they had to kinda get used to saying, “We’re all looking at it, but lots of these things aren’t moving.”
15:47 MH: A classic case of trying to accomplish too much with one thing. The other thing, Tim, you mentioned earlier: yeah, you were automating the report, but then that leads to this laziness of people just jamming stuff in there that doesn’t matter. I think if you’re gonna automate reporting, you also need to establish a process or cadence for re-examining what data is actually helpful, and recreating that report automation on some kind of ongoing basis. ’Cause the other thing that happens, if you stay with a company for a significant length of time, is that the things that matter to the leadership will actually shift or moderate over time. So, for instance, companies that sell things to people go through phases of looking for customer acquisition, and then phases where they are looking more towards customer loyalty. And so the metrics that bubble to the surface, or are of most importance, will actually shift around a little bit. And if your reports aren’t reflecting that, your reports start becoming less and less relevant to the people who are trying to make decisions based on the go-to-market strategy, or whatever strategy marketing or whoever has.
16:49 MH: So that’s the other piece of that is I think I’m a huge fan of automation, but you also have to build a process of going back into being like, “You know what, we’ve never talked about this metric in the last quarter. Let’s take it out of here. You have asked about this and I’ve found a metric that I think will give you insight to that on a more regular basis that I’m gonna add in.”
17:08 MK: What do you guys think about one-in-one-out? Like, if people wanna add metrics, they’ve got to get rid of something else.
17:13 MH: Generally, I love the idea of it. I think you can’t set that in stone, ’cause there may be stuff where you’re just like, “You know what? We just need to have this ’cause we didn’t think of it before, but it’s really good.” That’s the thing: I don’t really believe in getting it perfect the first time. I know Tim gets it perfect the first time ’cause he’s the quintessential analyst, but I believe in iterating. Because as I discuss it, as I have that conversation about why this happened, I come back with, “You know what, let me get this in here, because actually that will help us get a better understanding of what’s gonna happen.” So there should be some flexibility in that. And the only place that doesn’t really work is more in the agency model, where you’re kinda like, “I have to be perfect and buttoned up and be the expert all the time.”
17:54 TW: But you don’t…
17:55 MH: No, you don’t. And that’s the reality.
17:57 TW: You don’t. There are people sending those signals, but you don’t.
18:01 MH: Well, yeah. That’s when you get a report on Thursday from your agency, because you had to be perfect.
18:09 TW: With iteration, you gotta be careful, ’cause you’ll see cases where literally every version of the report is an iteration. So I like your note of saying: kind of plan it, communicate it. Say, “We’re gonna live with this one for a quarter…” And then, on the one in, one out: my feelings about appendices have evolved over time, but… so, one in, one out. One in, one into the appendix if you really want it. Just because… Nancy Duarte, Stephen Few, Tufte, all of them talk about what we can take in at one time. If you have one really focused one-pager, you can say, “Sure, there are pages two through ten, but man, the first page is really focused on what we think matters.” Shockingly, yes, people want to dig into the data, but most of them wanna have something they can look at in a glance. And then page two, three, four… but on top of that, still no insights. That’s the…
19:09 MH: Yeah.
19:09 TW: That the insights are kind of a separate…
19:12 MH: Okay. Well, the show’s going great, but we do need to step aside for a multi-touch moment. Hey, Josh, have you heard of this service? It’s a vendor translator. It’s called Vendolator. They have a free version of their product where you can stick any website’s URL into the tool, and whatever website you’re looking at, it’ll automatically use advanced AI and machine learning to scan the page and translate all the marketing copy on the site to help you understand what’s really going on with that platform. And when you pay for the paid service, you can upload an audio file of a vendor call, or even a recorded demo, and the platform will provide a transcription of the reality rather than just the claims that they made in the vendor pitch. So here’s an example: vendors will tell you you can deploy something with just one line of code, and that gets translated to, “Technically, there’s some code that has been added to your website that will actually load a much larger set of code that will get injected into the DOM. And for that code to actually work, to deliver on most of the promised features, other updates to the code on the site will be needed. Oh, and the code has been placed in a very specific location, which required weighing various trade-offs of performance and user experience and tool capabilities.”
20:26 Josh Crowhurst: Wow, that’s great. Hey, let me try one. Let’s try this one: “advanced AI.” I’m gonna punch that into the algorithm. “Advanced AI” gets translated to… oh: “statistical methods that have existed for a few decades.”
20:42 MH: Isn’t that amazing? And actually, when I type in actionable insights and Next Gen insights I get…
20:50 JC: That’s amazing.
20:53 MH: That’s so great.
20:55 JC: How about this one? “Data scientists of all skill levels will find sophisticated, easy-to-use tools that support rapid development, training, and tuning of machine learning recipes. All the benefits of technology without the complexity.” What the fuck? Okay, here we go: “This copy was generated by a marketer who searched for hot topics in analytics in 2020, because he met with the development team’s data scientist multiple times and never could quite understand what they were saying.” I totally get it.
21:25 MH: You know, with over 8000 MarTech tools in the landscape today, a tool like Vendolator becomes almost a necessity. It’s time to start making AI and machine learning work for you. Check out Vendolator. Alright, let’s get back to the show.
Let’s talk about a different kind of report because I think we’re all talking about sort of general business reports and giving a whole review of everything that’s happening. What about specific reports that are driven by maybe a one-off campaign or those kinds of things?
22:00 TW: I was thinking to say…
22:00 MH: I imagined you had an opinion on that.
22:02 TW: Well, that’s one of those uses of the word, if somebody says, “Well, give me the campaign report.” And there’s two aspects of that. One is, I’m monitoring the campaign over time, which absolutely, if it’s a two-month campaign, you damn well better not be waiting for two weeks to do a weekly report, so there’s that piece, but then I think more of where you’re going, which is another great area to explore is, we finish the campaign, give me the campaign report.
22:29 MK: I think of it as a campaign write-up.
22:31 TW: Well, yeah, so I think that’s where it gets called a report, and in that, to me, I put very distinct sections. If we did our planning upfront, then the first part of it is: how did the campaign perform? It’s literally, how did we do against our KPIs? It’s objective, it’s unambiguous. Ideally, when we run campaigns well, there were ad hoc analyses, optimizations, and adjustments made during the campaign that worked or did not work. And if we ran it well, we kept a record of those, and that’s another section saying: these are the two or three things that we actually changed during the campaign that positively impacted it or course-corrected. And then, when people say, “Okay, we’re done with the campaign, go analyze the results,” it kind of infuriates me. It’s like: we set the KPIs upfront, so, great, we involved you, you built us a dashboard, now we’re done with the campaign, go analyze the results. I’m like, “Well, I shouldn’t be doing some big analysis.”
23:30 TW: Unless I had a hypothesis upfront, that, hey, one of the things we wanna learn from this campaign is: if we touch people with a higher frequency than we have historically on our campaigns, will they become multi-channel consumers? Something that has to have the full set of data run through it, and now I need to do that analysis. But even that should be defined upfront. So, when it is working well with a stakeholder or a client, I’m not sitting at the end of the campaign saying, “I have to produce the campaign report. Gotta find something interesting, gotta find me some insights.” That’s a fail if there isn’t already a rich bit of information to read back out on the campaign. But what do you guys think? I haven’t put a whole lot of thought into this, I’m just riffing off the top of my head.
24:16 MH: Yeah, you’re just off the cuff there.
24:19 MH: What do you think, Moe?
24:21 MK: I don’t know, I’m really just perplexed by this one.
24:26 TW: Well, talk about it. Were you doing at The Iconic? Or are you on the hook for campaign write-ups or reports, or no?
24:35 MK: Yeah, I was a little bit at The Iconic, less so here, but I suppose that’s because at the moment, our performance marketing, I don’t know, it’s more of an always-on kind of model. It’s less like, here’s a two-month campaign, now go away and report on that, if that makes sense.
24:54 TW: It’s like a regular… You said it is kinda always on. In some way I feel like I get my Canva emails on a fairly regular cadence.
25:00 MK: Yeah.
25:01 TW: Not helping your click-through rate.
25:03 MK: No, no, not at all. But, yeah, it doesn’t really seem to be as campaign-analysis driven, and I do also just wonder if that’s because of the maturity of our marketing efforts. For example, performance marketing has really only been done since the start of this year, and really only ramped up partway through the year. So it’s not like it’s a six- or seven-year-old performance marketing team; things are just starting. So, I think you get to that more tactical, campaign level once you’re more mature, if that makes sense. At the moment, it’s more trying to optimize keywords and strategies and that sort of stuff. Does that make sense?
25:42 MH: Yeah, it does. I think there is a place for reporting of some kind in certain contexts, but…
25:49 TW: When we’re talking about campaigns, are you talking about during the campaign, or after it?
25:53 MH: Well, honestly, part of the campaign planning process should be what you said, Tim: set up some kind of upfront objective or target, a KPI, whatever it is. Same thing with any test or hypothesis you’re trying to validate, ’cause really, what a campaign is, on a certain level, especially in digital, is basically a hypothesis: this creative, with this treatment, with these channels, is going to produce a specific outcome that we have planned out and thought through. Now, of course, nobody ever does that work, ’cause it’s marketers and it’s really hard, and we feel bad for you. But that’s what should be happening. In analytics, analysts sit there and plan out how we’re gonna let you know where you’re at on that journey in some way, shape, or form. So keep it simple. And this is where I find the waste of paper in our industry mostly happening: all these daily reports with too many data points about something in flight, because somebody’s curious. The demand for information, as opposed to the demand for use, is really heavily skewed in this context, I think.
27:04 TW: Which is… That brings up the… Because we have these BI platforms and they’re so much lower cost, more integrated, more data can be pulled together, and that’s always a fun one, when people really just want information and you’re like, “Look, you know what? I can show you… Maybe it’s not the most beautiful user interface, but I can show you how to get these basic things,” and that way I’m not in the middle. Like, “No, no, no. I just want it in a report.” I had a client that during their peak season, I was like, “Okay, well, I’ll set up one basic automated thing that’s really just informational, send it Monday through Friday,” and they’re like, “We need it on Saturday and Sunday too.” I’m like, “Really? Do you really?”
27:44 TW: And then they would still come back and ask questions, and I’m like, “This is a simple spreadsheet with some dropdowns,” and they would say, “Can you pull a report on this?” I’m like, “It literally is in the thing that you were sent.” And you can’t say that; you have to gently point it out and say, “You didn’t have to wait two hours to get this, because it’s already in something in your inbox.” If there are stakeholders who say, “I’m curious enough about the data that I would like to be able to do some self-service,” that’s great, as long as it’s set up right. Like, “Look, I’m not telling you to go talk to the hand, that I’m not gonna support you; I’d like to help you help yourself, and let’s figure out how that works.” And I’ve seen business stakeholders who are like, “That’s awesome.” And occasionally maybe they steer themselves a little wrong, but generally we say, “Look, if you’re doing a whole bunch of slicing and dicing and find something amazing, trust me, run it by me first before you shout it from the rooftops.” And if they don’t, then it bites them in the ass, and then they learn, and then we move forward.
28:47 MH: Well, I’ve even observed people who take pride in how much of other people’s time they waste by having this activity done on their behalf.
28:57 MH: They’re like, “I’ve got all these people pulling these reports for me.” It’s so dumb and dangerous. It’s troublesome. So if you ever find yourself in a position where someone is proud of how much of someone else’s time they’re wasting, you should be concerned.
29:12 MK: I was just gonna say, what about the silence? You often hear this from stakeholders, where it’s like, “I just need a report that has all of these things, and if I had all of these things, then my life would be perfect and I’d know all the answers.” And you’re like, “Yes, wait.” So you go, “Right, KPIs, yada, yada, yada,” you design the perfect report, automate the shit out of it, let’s say on a weekly cadence, you send it out… and you know they’re not looking at it. You never get a single question, a single comment; it just goes into the void. I’m curious to hear how you would handle that, because, as much as the CTO’s questions were always tough, at least you know she’s reading it, right?
29:52 MH: Absolutely.
29:52 MK: Because if someone comes back with questions, they’re looking at it, which is exactly what you want, and it means they’re interested in what the data’s saying. But if you just get nothing and there’s no discussion… I don’t know, I had one case where I just stopped sending it and no one noticed, and then that was the end of that report.
30:10 TW: I’ve totally done that, yup.
30:12 MH: I look at questions as the starting point of me getting to do my job as an analyst. If someone asks a question, better if it’s a great question, but any question is the start of us getting on a journey of figuring out something to do that’s not just spitting numbers at people. And so that’s awesome. And so if that CTO is asking me these questions, then she’s really engaged with the data, and I’m just gonna be like, “Hey, I’m your analyst now, let’s go figure some stuff out. How do we make you a winner?” And then other people learn from that.
30:44 TW: But for the flip side, there’s never enough hours in the day, and I feel like I’ve learned a long time ago that the people who aren’t asking questions, I stopped a long time ago trying to make the horse drink after I’ve led it to water. Because it goes to Michael’s point. All you need… If you have a few people who are asking good or great questions and you’re working with them, and their peers are just crickets, if they’re sucking down your time… And I did have a client there where they were sucking down all of my time producing ridiculous volumes of stuff. And that’s a problem.
31:23 TW: But if you’ve got it where you’re like, “This is what you said you wanted. You didn’t wanna engage with me. You didn’t wanna have… I automated it. It’s out there. You never asked a question. Not my problem.” I’m gonna go work with the people who are fun, who have clarity of thought around the work that they’re doing, who I can support, and then I have the opportunity organizationally to champion those people, and do it without trying to drag the other people along. Celebrate the ones where you’re doing great stuff, and show how, “Wow, well, they were looking at this and they noticed this, and they asked this question, and we translated that into a hypothesis, and we talked about how, if they validated it, they could act on it.” And there’s your nice story, and that was human interaction with good analysis and great collaboration. And the people who were just crickets, it’s like, “I don’t… I hate to say it: don’t waste your time.”
31:32 MK: Yeah. That’s cool. Except for the case where, for the people who are crickets, you’re responsible for, say, reporting back to the business on performance and helping make budgeting decisions, or whatever the case is for that team. And you end up getting in this really sticky place where they just don’t give a shit about the numbers, but it’s your responsibility to help the business understand the numbers which are relevant to the performance of that team. I don’t know, I just feel like it’s hard to… Just wash your hands of it and be like, “Oh, well, that’s on them.”
32:49 TW: And I do a lot of work trying to do sort of data literacy training and explanation. In a different scenario, because I’m kind of more in the consulting world, we can talk to the higher-ups and say, “Look, we think this entire team needs some outside education on how to think about the numbers.” That is tougher to do internally, to say, “I’m the analyst, I need to come in and just talk to you guys about how to use data.”
33:18 MK: Yeah, and I do wonder, if you have that case with those stakeholders who are crickets, whether that is the time where you then engage the level up, and it’s like, that’s the time that you go basically to their managers and talk about data culture. Because if there are big chunks of the team that just don’t give a shit, I actually think that’s a cultural issue.
33:39 MH: Yeah, absolutely.
33:39 MK: And then that’s also somewhere where their manager can be your ally in helping… Getting them to understand that their KPIs and the metrics that matter to the business are what they should care about, and if they don’t, I don’t know, maybe there’s not a place for them on the team.
33:54 MH: Yeah, but I definitely believe in engagement as well. If they’re not asking questions, they’re not engaging; it might be that they don’t know what to ask. There’s lots of reasons why, but this is where… And, again, I’m going back also, Moe, to the episode with Nancy Duarte, where she talked a lot about how people receive information and consume it, and had some examples of people who only look at it one way. Well, jeez, that’s hard for us, but that’s the reality. So it’s like, well, let’s get a little engagement going and just ask the question, like, “Hey, you get this report every day…” And, again, that’s hard to do, ’cause that’s nothing to do with analyzing data, and everything to do with sort of the touchy-feely stuff, which is not super… Our forte. Generally speaking, Tim, I know as a quintessential analyst, that is your forte.
34:43 TW: No I was more thinking it’s your forte, touchy feely and I like her.
34:47 MH: We’re all good at it together. But I think it is problematic, because we absolutely need… And, Moe, I loved your idea of, hey, how do we elevate the conversation and talk more broadly about data culture, ’cause certainly we’re all saying so externally, so why not do so internally when it’s time to actually look at numbers and utilize this data.
35:09 TW: Which is… I’ve recently come across a case, and this wasn’t a client, it’s an analyst I know who left one retailer and went to another one. I’ve known him for years, the whole time he’s been at the first retailer. And, fundamentally, it’s weird in 2020 to imagine a retailer that doesn’t have a C-suite that is really, really interested in using digital data effectively. And that retailer, they didn’t. Any time we would say, “Wow, if you invested in this, then…” They were so lean that you’d say, “What if you added this little tool, or increased your skills a little bit,” or, “What if you kind of cut over to this,” and it’s like, “Yeah, no, we’re so lean.” They were literally looking at how to cut the costs of analytics, and it’s like, “Get out.” He finally got out, he went to a company where they said, “We wanna get more value out of analytics, so we’re bringing you on because we want to get more,” and they really meant it.
36:13 MH: Alright. So we talked a lot about when we do analysis, it can sometimes stem from operational reports or other performance measurement reporting to kinda use your vernacular, Tim, but let’s talk a little bit about sort of that next step, in doing the analysis, and bringing that back, because is that reporting or is that something else? I’d like to hear your thoughts on that.
36:38 MK: I think one of the issues that I find is that the automating of the dashboards and building that all out, and even in the case of my team now where we do a lot of ETL, that takes away from the analysis, and I do actually really struggle with… I don’t think a lot of teams have that balance right, of how much you’re building versus how much deep diving you’re doing. And to be honest, we’ve just been through goal setting for 2020 and our next season, and one of the things that keeps happening is people will be like, “Oh, I want a dashboard which will give me the answer to this question,” and straightaway, I start a conversation of like, “That’s not actually a dashboard. It sounds like what you really need is someone to deep dive into the data and give you an answer.”
37:20 TW: How many times do you need to answer that question? [chuckle]
37:23 MK: Yeah, it’s like you need an answer to that, which is the result of a really in-depth piece of analysis, and then, a couple of months later, we can re-run the analysis and see if there’s any changes, but that’s not a dashboard. A dashboard is not gonna give you that answer about, I don’t know, how someone’s using the product, or a really painful friction point, or… And I think there’s a lot that we need to do about educating stakeholders on the difference between what a report or a dashboard can do and what in-depth analysis can do, and part of that is even… I don’t think they understand that if your analyst is spending all their time automating dashboards, they’re not doing analysis for you.
38:02 TW: But I think it is seductive, both the ETL and building automated stuff. I feel like our industry has way more people who are really interested in the data collection: tagging, ETL-ing, munging. It’s very tangible. If you look on the Measure Slack and you go through some of the channels, it’s all about, how do I capture this, how do I collect this data, how do I put it somewhere? And that is an unfairly broad generalization, but I get drawn into that, like, “Hey, I’ve got this cool idea to track this thing in this video.” I think of it as kind of three buckets: basically operational support, which is your data collection, maintaining your tools, getting your data infrastructure in place. Then performance measurement, that’s kind of your recurring automated reports; it requires investment upfront to establish really meaningful KPIs with targets and good data viz, and to automate stuff. And then there’s analysis, which is so fundamentally different, because analysis is inherently ad hoc. You don’t pick an analysis and say, “I’m gonna do my four-hour analysis every week.” You have an analysis that may take two months, you have an analysis that may take three minutes.
39:19 TW: And I think there’s just… It’s hard. And if you think of those three buckets as being your investment portfolio in analytics, you’ve got to get some real share of time and energy spent on the analysis. If you’re spending all your time building the data collection, you may need to do that upfront, but you can’t get sucked into that being all you do. And if you’re spending all your time doing recurring reporting, or building it, you’re not really forward-looking. But that idea, for the business and for teams to grasp recurring reporting versus analysis (outside of a one-time campaign report, where it gets murky): you can’t shove analysis onto a schedule. You may have updates on which analyses are in flight, but trying to actually fit analyses into a schedule really hems you into a ridiculous box.
40:12 MH: So what’s the right mix? Reporting is necessary, we gotta do it, but so if we were running the optimal program in a company, how much analyst time would be on reporting and how much would be on analysis?
40:26 MK: One of the women I’m coaching at the moment, we go through every session and we talk about the percentage of her time that she is either unblocking other people, in meetings, working on data migration stuff, and then analysis. And, yeah, it’s probably been five or six weeks now. It’s really interesting, because you don’t think about it a lot, but actually having that touch point once a week to be like, “Actually, 60% of my time this week was in meetings, or unblocking other people by reviewing their code, or stuff like that.” And you’re like, “Oh wow, that balance doesn’t sound right at all.” And so then you get to have this really great discussion about what is the right balance. I don’t know, I kind of think that maybe analysis should be sort of 50% or 60% of your job.
41:15 MH: Yes, I was thinking the same. Yeah.
41:17 MK: It should be a big part of your job. And I don’t know anyone that’s at that level.
41:24 MH: A great way to think about this is, where does the value creation for an analyst start occurring, and then, how do you as a company maximize that value creation.
41:36 MK: Nice.
41:36 MH: It’s all on the analysis side. The hypothesis and validation side of it. None of the value creation or very little… I wanna call it none, but very little is on the reporting side.
41:50 MK: Well, if they have reports, it’s not, but if they have zero reports, then you could argue that they add a lot of value.
41:58 MH: Yeah, okay. That’s fine, as a starting point, and that’s why there’s some value. And, again, it does go up and down, because, “Oh, we’re gonna start a new initiative or program, we need to update our reports,” where you’re gonna spend more time on reporting or enabling tagging or whatever it is. But getting into… And this is actually the funnest work, too. So that piece of it… I will tell you this: when I worked at Lands’ End, we secretly got all of our reporting down to about 20 minutes a week, and then we spent all the rest of our time doing projects that drove things forward, or planning A/B tests, those kinds of things.
42:35 TW: Which is key, because I will tell you, if I ever meet anyone who’s listening at a conference… The list of things that will set me off is probably a longer list, and in my mind I think it is, but one of the things that will set me off is an analyst saying, “My problem is that I spend all my time doing reporting and I’m just not given time to do analysis, and I need more time…” That is…
42:55 MH: No one’s gonna give it to you.
42:57 TW: That’s the thing, it is a…
42:58 MK: So, sorry, why does it drive you mad?
43:00 TW: It drives me nuts because it is an analyst blaming the organization, and, in many cases, the people who have complained about that are people who aren’t capable of doing meaningful and deep analysis. So it is a little bit of a way for them to… I’m not saying… When they complain about it, there is a chance it could be happening, but it is on the analyst to aggressively be trying to figure out how to work around that.
43:29 MK: My issue with that comment though is that… And it’s something that we’re talking a lot about here right now, and I’m trying to help a lot of people in the team with managing stakeholder expectations, is that, in order to have that discussion, or in order to have that analyst really advocate for, “I’m gonna add the most value by doing this work. I know you’ve asked for this other stuff, but I am good enough at my job to know that I’m gonna add more value to the business by doing this other stuff. So how about I somehow pare back this request over here so I can do more of this request over here.” I think that takes a lot of time, a lot of experience and a lot of confidence, and in my team, for example, I have a lot of young people who are not at that level yet. They have no way of pushing back to their stakeholders, and I wanna help them develop that skill, because I think that’s really important, and they do know, they know where they add value, they’re just not at the stage where they know how to have those conversations with stakeholders. So I don’t think it’s always just a cop out. I think it’s that we need to equip our analysts to have those conversations.
44:33 TW: But, again, I’m being clear, I don’t wanna hear anybody say it to me. Seriously, I will be like, “Fuck you, go away.” It’s really not gonna be solved even by… If you’re having the explicit conversation with the stakeholder, “I wanna spend less time doing your reporting so I can spend more time doing analysis,” you’re right, you’re not mature enough, you’re not developed enough to actually have a meaningful conversation with the stakeholder. Yeah, it does take some time and some experience, I just… It drives me nuts, and I’ve run into many analysts, and you will see them tweeting. It is a cliché to complain, “My company doesn’t give me enough time to do analysis,” and it’s like, fuck you, adjust your processes and figure it out. Maybe that means you have to do a little bit of extra work in the near term to show some value, maybe you have to do some stuff on the side, but as soon as they’re like, “It’s out of my hands,” I’m like, “Welcome to the world of analytics. It’s always out of your hands.”
45:38 MK: Yeah, that’s true, that’s true.
45:39 MH: And at the same time, there are definitely examples where people are pulling themselves up by their bootstraps, so to speak, and the organization itself is creating the environment where they can’t succeed, and then, Tim, I think you’d agree, it’s really like, well, you go find a company that’s gonna be excited about what you’re bringing to the table. And, Moe, what I’ll say is your analysts are very lucky to have a manager who understands what you just said, and who is hopefully providing a demonstration of how to build that context awareness and negotiating capability within the organization, to manage expectations appropriately and bring that to the table. ’Cause that’s literally what I think an analytics manager’s job is: to clear the road so analysts can do analysis and provide it in coherent ways to the organization.
46:25 TW: I agree completely with everything Michael just said, even though I was… Right, you’ve got the advocating and the [46:31] __ happening. And you’re wrestling with it, right. Yeah, people are really lucky to have a manager who’s wrestling with that, as opposed to a manager who’s not. So, yes, I concur. Here to validate you.
46:46 MK: Thanks guys, yeah, that’s… You’re alright dude.
46:50 MH: I mean, obviously attributable to spending time with Tim and me on a podcast, of course.
46:58 TW: What else would you like us to mansplain for you?
47:00 MH: Let me just swoop in and take some credit there? No, I’m just kidding. No, it’s awesome. Okay, we have to start to wrap up, and I think we’ve done a good complete lap around the topics, so, great job. I think these guests really slow us down [chuckle] and… I’m kidding, it’s awesome. And we’re in a brand new year. In a couple of weeks, we’ll be at Superweek, and we’ve got an amazing audio live stream: an unprecedented, activity-filled day of Superweek Digital Analytics Power Hour, a 12-hour live stream. No matter where you are in the world, you can experience what Superweek is like through our podcast. We will have some streaming that’s available on YouTube, and superweek.hu and analyticshour.io, so when Superweek is going on, there’ll be ways to engage via Twitter as well, once we get the hashtag locked down. Okay, before we jump into last calls: we go through and we read all of our iTunes reviews, and we’re very thankful for them, and we wanted to give a little shout out to a couple of them. We might do this from time to time throughout the course of 2020; it’s just our way to show some appreciation for our folks who took the time to give us a review and stuff like that. So this one came from Soul-sis, and I won’t reveal her name, but she knows who she is. Thanks for listening.
48:18 MH: And, for serious: “Digital Analytics Power Hour is easily one of the best professional-ish” (and we certainly appreciate that “ish”) “podcasts. Their last call knowledge sharing is full of nuggets that are relevant and thought-provoking. Together, Michael, Tim, Moe, and the occasional guest. Rock flag and analyze.” Thank you so much, Soul-sis, we really appreciate it. And this one is a short one from Preston Radol, who says: “Very funny, super intriguing, highly relevant. Makes my day when a new episode pops up.” Well, Preston, it makes our day that you take the time to listen. So thank you so much. Okay, let’s jump into last calls. Moe, what do you got?
48:58 MK: Yeah, so I’m gonna be spending the next week or so building out a power curve. And when you talk to someone super smart on your team who has a PhD in maths about a problem, and then suddenly they’re throwing formulas at you, you’re like, “Oh shit, I’m really gonna have to do some reading on this one.” But he did point me to this site that just suddenly made everything click a little bit. And it’s called Wolfram Alpha, wolframalpha.com, and basically you can put an equation in there. So what he was showing me was how to put a power function in there and then see how that gets visualized in a graph. And I just thought that it was really cool, ’cause we played around with some of the calculations and it just helped me make sense of it. And I still have a bit of work to do on this, so if anyone listening has done some stuff on this, feel free to ping me, because I’m just working through learning some stuff.
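[For listeners who want to play along with Moe’s last call: assuming “power curve” here means statistical power plotted against sample size (Moe doesn’t spell out which kind she means), a minimal sketch is below. The function name `power_two_sample` and the normal-approximation formula are our own illustration, not the exact formulas Moe’s colleague used.]

```python
from math import sqrt
from statistics import NormalDist


def power_two_sample(effect_size, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test.

    effect_size: Cohen's d (difference in means / pooled standard deviation)
    n_per_group: number of observations in each group
    alpha: two-sided significance level
    """
    # Critical value for the two-sided test
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    # Noncentrality: how far the alternative shifts the test statistic
    shift = effect_size * sqrt(n_per_group / 2)
    # Probability the statistic clears the critical value (one-tail approximation)
    return NormalDist().cdf(shift - z_crit)


# Sweep sample sizes to trace out the power curve for a medium effect (d = 0.5)
curve = {n: round(power_two_sample(0.5, n), 3) for n in (16, 32, 64, 128)}
print(curve)
```

Plugging the same power function into Wolfram Alpha, as Moe describes, is a quick way to sanity-check the shape: power climbs steeply at small sample sizes and flattens out as it approaches 1.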
49:53 MH: That’s awesome. It’s weird that the immediate thing that popped to my mind when you said “power curve,” Moe, was the Offspring song called “Killboy Powerhead.” But to each their own. Yeah, nothing?
50:04 TW: No, no.
50:05 MH: Anything, Moe? Does it ring a bell with you? No? No, this is a very old song, it’s from the ’90s, so. Okay, Tim, what about you?
50:16 TW: I’ll start the year with a quick two-fer. So, one: every time Simo rolls out a new site, whether it be whimsical, or content-rich, or just awesome, it should be a last call. So, cookiestatus.com, cookiestatus.com for our listeners who have not found it yet. Simo basically said we really need to just track this for the major browsers; so much stuff is changing so fast. It’s kind of a technical site, it’s mainly one page, but you can kind of dig into the details. It’s trying to track all the ins and outs of ITP and ETP in each different browser, so that is gonna be a big deal for 2020, how much that’s gonna change our world.
51:00 TW: That’s a super handy reference. But also, because I don’t want it to get too, too old, there was a 99% Invisible podcast episode that covered the ELIZA effect, which was a chatbot… I’d read about it in the past. It was a chatbot that was developed at MIT back in the 1960s, and the guy who developed it became very, very concerned about what AI might do to replace human interaction. But basically it was kind of a dumb chatbot, and he watched people, starting with his secretary, actually start sharing a lot of stuff with it. So the episode basically runs all the way from one of those original, early-on natural language processing attempts, trying to pass the Turing test, all the way through GPT-2 and OpenAI’s kind of too-dangerous-to-release text generation. So it’s a good little podcast episode that covers kind of the arc of natural language processing and the potential and the pitfalls of it. What about you, Michael?
51:57 MH: Well, I took to reading today, and this is an article that came out a few weeks ago, but there’s gonna be more in the series that hadn’t been released at the time we recorded. But basically The New York Times got a hold of a very large data set of more than 12 million people with their location data as they moved around different cities in the US, and they were able to leverage that data to basically show how you could pinpoint an individual person using their data, in a way that’s a fingerprint that is extremely suspect and dangerous. And so location data on cell phones is sort of a little bit of a wild west from a data perspective. So it was a very eye-opening article. And so basically I think if you have any apps that are recording location data on your phone, you should probably delete them right now. And…
52:45 MK: I don’t… Honestly, I don’t know, but some very large companies are gathering this data and selling it to third parties without any oversight or anything like that, so we really have some pretty big data challenges in front of us. And certainly there’s more interest in identifying some solutions, but this is one that just sort of lives there, and there’s no real good answers for it. More articles on this data set will be published by the time this comes out. So watch this space. There’s more to come.
53:15 TW: Awesome.
53:16 MH: Okay. You’ve probably been listening and you’re like, “Well, you just said something exactly how my job is as it pertains to reporting or analysis.” Or, “You’ve really hit the mark again, Tim.” Or, “Moe, I identify with you and not these other idiots who are on this podcast with you.” So we would love to hear from you. Reach out to us on the Measure Slack or on Twitter or on LinkedIn. We’d love to hear from you.
53:39 MK: And also, when Tim mentioned those people that love to track stuff and think about how to track stuff and metrics and yada yada yada: if you like that stuff and live in Sydney, please let me know.
53:51 MK: Because I need to hire a bunch of people.
53:53 MH: Really? Oh, Moe is hiring, so just go to canva.com/careers. No, I don’t know if that’s really where it is, but that’s just a rehash of an old joke we used to do.
54:05 MK: Just message me.
54:06 MH: That’s awesome. Are you taking anybody from the United States? ‘Cause I’d come work for you.
54:10 MK: I’ll take anyone.
54:11 MH: I don’t have any… You know I don’t have any SQL skills though.
54:15 MK: Well, SQL and Python are a pre-requisite.
54:19 TW: What’s funny is the canva.com/careers page actually does have the video playing in the background, which is exactly what the old Search Discovery jobs page did. So the joke goes back three or four years; it just had a different video.
54:32 MH: Nice. Alright. Well, so we would love to hear from you, whether it is to get a job in Sydney at Canva working for Moe, which would be amazing, or about this podcast. Either of those things are great ways to reach out. As always, we deeply appreciate the hard work of our producer, Josh Crowhurst, who helps us make the show possible. And I know I speak for my two co-hosts, Tim Wilson and Moe Kiss. And I tell you, no matter where your time gets spent every day, one thing you can never afford to stop doing is analyzing.
55:08 Announcer: Thanks for listening and don’t forget to join the conversation on Facebook, Twitter, or Measure Slack group. We welcome your comments and questions. Visit us on the web at analyticshour.io, facebook.com/analyticshour, or @AnalyticsHour on Twitter.
55:27 Charles Barkley: So smart guys want to fit in. So they made up a term called analytics. Analytics don’t work.
55:35 Thom Hammerschmidt: Analytics. Oh, my god. What the fuck does that even mean?
55:43 MH: Oh. Hey, Moe, you should get a Holden. One of those cool truck, race trucks they make down there in Australia.
55:51 MK: So the only people that drive those are tradies.
55:55 MH: Who are tradies?
55:56 MK: Tradies are trades people. So like carpenters and electricians, and…
56:00 MH: But they’re like race cars.
56:03 MK: Huh? Are you talking about utes?
56:05 MK: That are low to the ground?
56:06 MH: Yeah.
56:08 MK: Okay, yeah. So… And then the only people who drive them are douches like my ex-boyfriend.
56:12 MH: Oh, well. Sorry, he ruined it for you ’cause they’re sweet.
56:17 TW: No, they aren’t.
56:20 MH: They’re not. They’re not.
56:23 MH: And it’s perfect that we now have “I Shat Myself” as a clip but now, unfortunately, we have it for me, too. Oh, no.
56:33 MK: Professional-ish fuckers.
56:34 MH: Get ready to disagree with Tim Wilson about everything. He did ESQL. Is it even a report? Okay…
56:42 MH: And I know that I speak for both of my co-hosts, Tim Wilson and Moe Kiss. And I say, no matter what percentage of your job you spend on a reporting or analysis, the most important thing is never give up on it. Just keep going with analysing.
57:07 TW: You wanna try that again?
57:07 MH: Yeah. I’m probably gonna try that again.
57:07 MH: I always did it like… I jumped it.
57:07 TW: You’re usually not in the right spot. But that one you were a down… You were in a cul-de-sac and sort of they put a barricade up behind you and, yeah.
57:07 MH: My horse stepped in a gopher hole and now I’m like “Oi, go.” Okay.
57:07 MH: Rock flag and reporting.