#083: KPIs in the Absence of a Clear Conversion with Amy Sample

Raise your hand if you work for a company that sells exclusively low-consideration products and only sells them online. Anyone? Anyone? We only see a couple of hands out there. For all the rest of you, this episode might be of interest. We sat down with Amy Sample — Senior Director of Consumer Insights and Strategy at PBS by day, president of the DAA board by night — to discuss approaches for effective digital measurement in the absence of a clear online conversion. That challenge doesn’t get much bigger than in the mission-driven, not-for-profit world of public television! After listening to this episode, you may actually feel like you have it easy!

References Made in the Show

As discussed in the episode:

- The McKinsey article on The Nature Conservancy and measuring mission (the “bucks and acres” example)
- Eric Peterson’s white paper on measuring visitor engagement
- The Introvert Advantage by Marti Olsen Laney
- “How to Tackle Imposter Syndrome” by Melinda Gates
- The Big Book of Dashboards by Steve Wexler, Jeffrey Shaffer, and Andy Cotgreave
- Google’s published research and guides on what makes a great manager

Episode Transcript

[music]

00:04 Speaker 1: Welcome to the Digital Analytics Power Hour. Tim, Michael, Moe, and the occasional guest discussing digital analytics issues of the day. Find them on Facebook at Facebook.com/analyticshour and their website, analyticshour.io. And now, the Digital Analytics Power Hour.

[music]

00:28 Michael Helbling: Hi, everyone. Welcome to the Digital Analytics Power Hour. This is Episode 83. You know it’s a sunny day, sweeping the clouds away, on your way to where you can analyze sweet, sweet goals and conversions on your website. Isn’t it nice how it feels when all your metrics and your tools point towards a nice, easy-to-use conversion goal? But you know what? It isn’t always that easy. Sometimes there isn’t a clear conversion. You still need to measure, and certainly establish key performance indicators. Take, for example, the measurement of this podcast. We don’t have clear KPIs or easy-to-measure conversions. How does that make you feel, fellow co-host Moe Kiss?

01:19 Moe Kiss: Okay, I guess. Good. Perfect.

[laughter]

01:24 MH: Okay. Well, good then. That’s great. Moe, it’s a pleasure to have you on the show again. Moe Kiss, for those listening, is the Manager of Analytics at The Iconic. So that’s a slight change for you. Congratulations.

01:42 MK: Thank you.

01:43 MH: And we also have another co-host and regular on the show, who I’m pleased to introduce, Tim Wilson.

01:51 Tim Wilson: Hey, Michael. [chuckle]

01:52 MH: Oh, I just, I was gonna let you… You always interrupt me anyway, so I was gonna let you…

01:58 TW: I’m just keeping you on your toes. I was gonna behave myself now that it can show up on my performance review.

[chuckle]

02:03 MH: Tim is my colleague and close personal friend. He’s also the Senior Director of Analytics at Search Discovery. And I am, as always, Michael Helbling, and I also work at Search Discovery. And we’re still figuring some of that out. Alright. But who can help us sound out the words that are the KPIs, in the absence of clear conversions? Well, we went straight to the top for our guest, Amy Sample. She’s the Senior Director of Strategic Insights at PBS, and she’s also the current Board President of the Digital Analytics Association. She’s also held consumer research roles at AOL. And so welcome to the show, Amy.

02:48 Amy Sample: Thanks everybody.

02:49 MH: It’s awesome to have you. I always, I just have that picture in my head, on Sesame Street, where they would paint the number eight on that bald guy’s head. It was very disrespectful…

[chuckle]

03:00 MH: But it began my love of numbers. Anyways, enough of my personal stories.

03:05 TW: Let’s move on to Oscar the Grouch.

03:07 AS: [03:07] ____ Nice.

[laughter]

03:10 MH: Yeah, I always liked Snuffleupagus. I don’t know why. Anyway, let’s get started. I think maybe it’d be a good place to start maybe is just, Amy, if you don’t mind giving a quick overview of PBS.

03:23 AS: Sure. So we are kind of like a three-headed monster of an organization. We’re first and foremost, a media company. So we deliver television programs, both for adults and for kids. But on top of that, we’re also a membership organization. And our members are the 350 local stations around the country. They’re the members of PBS. And then we’re also a mission-based organization. So we have a mission to entertain, educate, and inspire the American people, and deliver on that through content across platforms. Does that answer most of it, I think?

04:01 MH: You tell me. So your major competitors then, would be like the BBC and NPR? I’m just kidding. [laughter] I’m kidding.

04:11 AS: I should say, we are not NPR. NPR is a separate company, although we do share some of the local stations. So a station in your market might be both a PBS and an NPR station, since we share some members. And the other big misconception is when viewers donate, they’re actually donating to their local station, they’re not donating to PBS. So you support your local station, who then buys membership into the PBS family.

04:37 MK: And for Australian listeners, my understanding is that it’s not the same but it’s similar to… We have an organization here called the ABC, the Australian Broadcasting Corporation, which is government funded. So that would be, I guess, the key difference. But it’s similar in that vein.

04:53 AS: Very similar with the mission slant. But the difference is we’re technically not government funded. So government funding goes to the Corporation for Public Broadcasting, which then doles out money to the member stations and grants to producers and grants to PBS.

05:09 MK: So, Amy, I’ve spoken to you before. You’ve been with PBS for some time. Can you just talk us through a little bit about what kind of roles and work you’ve been doing there?

05:20 AS: Sure. So I’ve been here for 10 years. I relaunched the digital analytics function when I started. I took the company through a tool transition, from at the time, we were using Visual Sciences, for all the old-school analysts out there, and moved to Google Analytics as our tool of record. We support both local member stations and TV producers, so the people who create the shows, we provide both of those constituents with digital analytics data. And we built a digital analytics team, and then over time, we’ve merged that with the TV research team, so I’m now leading a team of analysts that are both digital analysts and TV research analysts, as well as primary research, to kinda put all the data into one organization. And then I have a colleague who is running data strategy and operations, so that’s how we collect data, report on data, and the governance rules around that.

06:12 TW: What’s the relative, not dollars, but the relative when it comes to data usage by I guess, the organization? ‘Cause I think you guys do actually support the stations somewhat with the digital analytics data, and presumably you feed results of TV as well as research data out to the stations. Is that correct?

06:36 AS: Correct. We directly syndicate digital analytics data to the 300 stations that are using our video platforms, with all video-based data. And then we also syndicate that data to approximately 150 content producers as well. And then, on the TV side, we do more of producing reports. So the Nielsen model is slightly different. So we analyze and do presentations and reports for stations.

07:04 TW: And so, which report is it that actually gives you your week-over-week tracking of delivery against the PBS mission?

[laughter]

07:18 AS: Bringing us back to the topic?

07:20 MH: Yeah…

07:22 TW: No, because it seems like… ‘Cause that’s one, I think a lot of your media company and… The Nielsen stuff has always been off on the periphery of anything that I’ve done. So I could [07:34] ____ you for hours with questions about the quality, and the caveats, and the cost, and the reliability of Nielsen data, but then we wouldn’t be talking about the topic at hand. [chuckle] But I guess, are you tasked with reporting out against the mission and other of those squishy goals?

07:55 AS: Yeah. So one of my roles in strategic insights is I actually lead the corporate metrics process for PBS. We do that: we set those KPIs every year, and we report on them to the PBS board. And so, these have to be really high level.

08:11 TW: But they’re only reported on once a year? I mean they’re… Or are they annual?

08:15 AS: No, I report quarterly. We report progress quarterly, and we set a target for the year. And they’re very high level. So you have to figure out how those map to the mission and the organizational goals. And then I work with my team to set the cascading goals out of that. So we might have some very tactical things that we’re doing that don’t get reported to the board, but we’re working on those internally. And then we help stations with defining what their digital KPIs might be. So there’s several layers to the onion in setting the goals. Certainly, the first thing, if you’re a media organization, the biggest thing is reach. How many people are we reaching? And so we’re trying to get a total cross-platform view of that.

09:01 AS: But the measurement tools don’t really allow you to deduplicate digital viewers from TV viewers. So the way that we’ve solved it for right now, is reporting those side by side, and setting a target for how many people we reach each month on TV and how many people we reach each month on our digital platforms. And then, we cascade into a couple of other things. So, one might be how well we support stations. There’s the broadcast operations. So we have a… Is the satellite up and running, to send the content to stations? And then there’s a financial goal too. So it’s really high level proxies for meeting the mission.
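
To make that side-by-side reporting concrete, here’s a minimal sketch in Python; the platforms are real, but every number, including the targets, is invented for illustration:

```python
# Hypothetical monthly reach report, side by side, since TV and
# digital audiences can't be deduplicated. All numbers are invented.
reach_millions = {
    "TV":      {"actual": 81.0, "target": 85.0},
    "Digital": {"actual": 24.5, "target": 22.0},
}

for platform, r in reach_millions.items():
    pct_of_target = 100 * r["actual"] / r["target"]
    print(f"{platform:8} {r['actual']:5.1f}M reached "
          f"(target {r['target']:.1f}M, {pct_of_target:.0f}% of target)")
```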

09:36 TW: The financial goal for a for-profit is revenue or profit or margin. So how are you guys doing…

09:44 AS: It’s a balanced budget type of goal.

09:47 TW: Okay. Got it.

09:49 MK: So one question that I had, which probably is a touch off topic but I’d still weirdly like to touch on it. When I worked in government, obviously similar situation where revenue or profitability isn’t the main metric, we often had a really difficult time. It’s really difficult to get stakeholders on board. Like when you’re presenting a business case at a company like I work at now, it’s really easy to be like, “Hey, let’s do this, we’re gonna raise $4 million. We’re gonna… ” Like it’s so tangible that people will buy-in. When you’re talking about the KPIs that you’re setting, how do you get people to buy-in when there isn’t that tangible “Hey, we’re making more money” component?

10:40 AS: It is a process. And we’re starting it right now for the next fiscal year. And it takes me a couple of months of going around and around, usually to end up in the same place where we started. [laughter] Surprise, surprise. And every year, people wanna debate the metric we have, and have a thirst for something new that’s more tangible. And yet, we always end up back at the same place. Because there’s so very little that you can actually count or report on. So it would be great for me to be able to say, “The initiatives we did with stations reached 450 million Americans.” But the only way I can do that is for each station to manually count those and then self-report them back, which doesn’t make for a good reporting metric, if you will. So we end up with what we can measure and what can stand in for that behavior. But it does take a lot of meetings and conversations. And there’s a lot of argument, a lot of like, “Well, maybe that’s not what that really means,” even though the stakeholder might have agreed to it two weeks before. And so you do have a lot of back and forth until you settle on something.

11:49 MH: It’s interesting though, I think one thing you described was this process of having these top-level metrics that are the ones that the board is really using to understand how things are performing. And then I think teams are probably taking their metrics and aligning them to that. How did that process take shape? ’Cause I imagine, day one, when you walked into PBS, this was not the same as it is now. How did it evolve, and how did you manage that evolving process? Because I think as we incorporate… Like probably 15 years ago, PBS didn’t do digital content or streaming video to the web as much as they do now, if at all. And so obviously, these changes have impacted you greatly. So what was that process like over the years? I’m just really curious about…

12:38 AS: Yeah. It started with being tied to a strategic planning process, as PBS goes through an update of the strategic plan every three years. And what really drove it is we ended up with a COO who joined the company, who came from a financial services background. And he was very much wanting the company to be much more metric driven, which is not a bad thing. I think when I came here, we were much more mission driven. So you might do an initiative because it served an underserved population of 10 people, but it really had no return. But you said, “Okay, we have a mission to do this.” Whereas he brought some discipline to it, so that you could evaluate projects and focus resources on places that mattered. So when we started out, we had something like 20 board KPIs, and over the years, the board gave us guidance and said, “This is too much. This is too much, this is too much.” And so we narrowed it down. Over the last three or four years, we’ve had about seven board level KPIs, with the goal of trying to balance mission and media, and membership. We call it the three Ms. So we’re trying to balance all three of them, which can be difficult. That’s probably the one complication that we have that other even non-profits don’t have, is you’re trying to balance all three. And so you end up in a lot of conflicts, [14:00] ____ are we maximizing for reach or are we maximizing for revenue? Well, we’re kind of going down the middle. [chuckle]

14:06 TW: But that should be, that’s ideally… That’s not at all what the balanced scorecard has: it’s four different buckets, but specific buckets. But it does make sense that you’ve got KPIs that maybe are sometimes a little bit at odds with each other, so it at least forces that discussion. How formal is the linkage from, say, those seven KPIs down to specific teams and groups? Are they saying, “We’re really about supporting these four KPIs”? And maybe there’s tension between the ones that they’re looking at. Do they have an eye to that when they’re setting things at a more tactical level? Are they going through a causal modeling, causal mapping type exercise?

14:47 AS: No. It’s a lot less formal than that. I would say on the digital organization, we were more aligned. So we had four or five KPIs that would cascade out of that, compared to television programming where they’re really just maximizing for that reach number, for ratings as a separate target. But we were probably more cascading on the digital side. But it’s, I think, a discipline that exists more on digital than it does on the linear TV side, too.

15:16 TW: ’Cause we’ve been trying to figure out what’s the purpose, what’s our purpose here, for a shorter amount of time.

15:21 AS: Yeah. And they just accept the purpose. [chuckle]

15:24 TW: Yeah. [chuckle]

15:26 MH: Well, it is interesting, ’cause it’s like, in digital, we’re just questioning all of our metrics still, and I think it’s a maturity thing. Whereas in television, the metrics are all so awful, but nobody’s really questioning them. Like it’s just we understand why they’re there.

15:42 AS: Yeah. I think, on TV, your ratings is your currency. And what I have found is, most people who work on the TV side could not explain to you how a rating is calculated, but they know whether a 0.2 is good or bad.

15:57 MK: So, just for context, because I’m not a person who’s ever worked in TV. When it comes to TV, am I hearing correctly, that the main two metrics that you would normally look at would be reach and ratings?

16:14 AS: Typically, you use ratings, and ratings are a function of reach, frequency, and time spent viewing.
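
As a rough illustration of how those three components can combine, here’s a simplified average-audience formulation in Python; this is a sketch of the general idea, not Nielsen’s actual methodology:

```python
def rating_points(reach, avg_minutes_viewed, period_minutes, universe):
    """One simplified average-audience formulation of a TV rating.

    reach: people who watched at least some of the program
    avg_minutes_viewed: mean minutes viewed among those reached
    period_minutes: length of the program or daypart
    universe: total population the rating is measured against
    """
    # Reach scaled by share of time viewed gives the average audience;
    # a rating point is 1% of the universe watching at the average minute.
    avg_audience = reach * (avg_minutes_viewed / period_minutes)
    return 100 * avg_audience / universe

# 2.4M viewers averaging 30 of 60 minutes, in a 120M-person universe:
print(rating_points(2_400_000, 30, 60, 120_000_000))  # -> 1.0
```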

16:19 MK: Okay. And so are you relying… ‘Cause it sounds like for some of the data that you need, you actually have to go back to the local stations, and some of it you have yourself. That, in and of itself can be quite difficult, I imagine.

16:33 AS: It can be. I think there’s often times where we talk about, “What does this initiative do to raise individual giving at a station level?” I don’t have any insight into that data. The membership data at a station is completely separate. Our stations are their own businesses, they’re independently owned, and there’s real history between us. And so they wanna keep that data to themselves. So there is no centralized membership database or dollars, which can make it difficult when we’re working on things like… We have a product called PBS Passport, which is sort of like Netflix for members of your local station, and we’d like to try to optimize retention or churn, but we don’t have the data to be able to do that.

17:20 TW: It sounds like you are sort of struggling with that tension. I wanna make sure that we get… There was an example that you’d shared with us before, a pretty old article from McKinsey about the Nature Conservancy, and I’m wondering if… Because it has an example in it where they marched a little too far towards the clear, tangible, and understandable, and then sort of woke up to realize that they were hitting that out of the park, but it actually wasn’t particularly well-aligned. So I wonder if it’s worth having you sorta walk us through that bucks-and-acres example.

17:57 AS: Yeah, so it’s an example from The Nature Conservancy where they were counting dollars raised and numbers of acres of land that they acquired; so they’re protecting land by acquiring it. And they were really optimizing for that and doing well at it, but eventually, they got to a point where they realized that they weren’t having an impact on the biodiversity of the ecosystem they were trying to protect. At its core, that’s what their mission was. Their mission wasn’t actually to acquire land. Acquiring land was just a tactic to deliver on the mission. And so they changed their metric to focus on some research data around the effect on the diversity of the ecosystem, and… I can’t remember exactly the other metric they were using, but it was sort of like the extinction of animals or something. To get back to the mission. I think that can often be hard for us as non-profits (and sometimes government falls into this as well), where you’re forced to pick a proxy: you go to the things that you can count, and suddenly you’re just reporting on things that you count, but you forgot to go back and say, “Does this thing actually… Is it actually a proxy for what I’m trying to accomplish?”

19:05 TW: Yeah. That was the other thing I really liked about that article: it basically says you have three options. It frames it as measuring mission, but it’s almost really for measuring when you’re in a pickle, where there’s not an easy and obvious measurement. One, they said, “Define the mission narrowly.” That seems really dangerous, ’cause you’re letting what you can measure drive what your actual mission is, or what your goals, or vision, or strategy are. So that sorta seems like that’s not ideal. Two, they said, “Invest in research,” which you guys do have… You have enough scale that you do have data sources that can collect basically panel data or qualitative data that sort of helps, but it seems like a lot of organizations don’t really have that. And then the third, they said, “Make micro-level goals,” which is kind of where they landed. And that’s the really tricky part: how do you figure out those goals to say, “Look, we’re pretty confident that if we move this, there’s not gonna be a way for us to kinda manipulate or make poor decisions to drive that goal; that goal should be supporting what that larger mission is.” Right?

20:19 AS: Yeah. I think that’s the point, is to find the micro things that ladder back to the mission. But the challenge can be you can get distracted by the things, the micro things, and maybe those micro things aren’t necessarily moving the needle on the mission either.

20:35 TW: Well, that’s where I’ve done this exercise of sort of causal mapping: “Let me write what I’m doing on one side, and let me write those high-level, really-hard-to-measure things on the other, and now let me force myself to draw boxes where I link from this activity or this tactic to this…” Somewhere along the way, I get to a micro-level goal that’s hopefully sitting in the middle of this diagram, that says, “I understand the really tactical stuff I’m doing and how it supports that goal, but I’ve also kind of mapped how that goal supports the larger mission.” And the handful of times I’ve done that, it’s been really useful, partly just for the discussion that it drives. And you can find that, “Oh, this micro-level goal’s good,” but it’s not… I never called it that. But it’s really off on the fringe: it’s probably having a positive impact on the mission, but it’s a tiny little thing. It’s like trying to measure the impact on brand from a tweet. You’re so far removed that it’s a bit of a stretch.
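
That kind of causal map is easy to mock up; here’s a toy Python sketch in which every tactic, micro-level goal, and mission node is hypothetical:

```python
# A toy causal map linking tactics to micro-level goals to a
# mission-level outcome. Every node name here is hypothetical.
causal_map = {
    "promote full episodes on social": ["video starts"],
    "improve mobile app onboarding": ["video starts", "return visits"],
    "video starts": ["monthly video viewers"],
    "return visits": ["monthly video viewers"],
    "monthly video viewers": ["audience reach (mission)"],
}

def paths_to_mission(node, path=()):
    """Print every path from a node up to a terminal (mission) outcome."""
    path = path + (node,)
    children = causal_map.get(node, [])
    if not children:  # no outgoing links: we've reached the mission level
        print(" -> ".join(path))
    for child in children:
        paths_to_mission(child, path)

paths_to_mission("improve mobile app onboarding")
# improve mobile app onboarding -> video starts -> monthly video viewers -> audience reach (mission)
# improve mobile app onboarding -> return visits -> monthly video viewers -> audience reach (mission)
```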

21:43 MH: I think there’s another side to this too, which is the risk of people getting excited about metrics they think should be KPIs, but actually have nothing to do with your business. And my most favorite recollection is bounce rate, of course, because there was a good year of my life where that metric was reported all the way to the board level of the organization I worked at, and it didn’t have any place there and we kinda knew it. But how do you keep those metrics out of the mix? How do you kinda guard the gates, if you will?

22:22 AS: I don’t know. I mean I always try to get us back to what we always talk about is, “What are we trying to accomplish here?” And if you can get the conversation back there, I think you can kind of talk to the stakeholders about, “Okay, so if we knew what we’re trying to accomplish, how does bounce rate tell us whether we’re accomplishing that or not?” But it can be difficult and we do end up often with things that are suggested or recommended, strongly recommended, that maybe we all would look at each other and go, “I don’t know if that means anything.” I think you have to build the case of like, “Let me show you why this doesn’t mean anything.”

22:57 MH: As a digital analyst, I have all these cool data points in my ecosystem that I’m trying to kind of ladder up into kind of objectives or KPIs of the business, but there’s gaps. There are gaps in my ability to connect those with the meaningful metrics of how the business operates. And it’s almost true… That’s true almost in any business, some are more able to connect than others. But I think we all agree, the connection from, “Okay, look, I’m seeing something in my digital data, and I wanna take that up and see if it’s a predictor of something in how our business performs, but I have a gap in my ability to connect it.” And I feel like that’s something that’s pretty common.

23:41 MK: I definitely think it’s common. I don’t even think that’s something that just happens for not-for-profits, where you have this metric and then you’re trying to kind of connect it back to, what are the inputs and the outputs for that, and how can we actually impact it? And I mean, that’s an issue…

24:01 MH: The way I think about it is, “What if I show up to work one day and none of the primary drivers of the business are in my digital world?” In theory, that could be true. I can contribute to how the business runs, but the digital space that I am a part of may not have a metric or a key performance indicator in it that, without translation, actually matters. Most of the time, you’ll find stuff that’s important, but I’m just curious, what do you think of that?

24:31 AS: Let me give you an example. I did some analysis at PBS, and we look at… We obviously, over the course of my time here, launched a video ecosystem. You can watch full-episode videos on the web, on [24:47] ____ devices, which are like… Or through an Apple TV. You can watch on a mobile app, and so video became really important for us, but we were trying to connect how important video is to driving station membership. And the way that we actually did it is to figure out, through surveys, whether people who watch video are more likely to donate or less likely to donate. And so, we were able to look at those people who were more likely to donate and those people who watched more video: “Oh wait, they’re more likely to donate than people who don’t watch video.” And what’s different about them? Well, they watch a lot more video, so how do we get people to watch more video? So we aren’t able to get, through our analytics, all the way to the impact outcome, which is giving to your local station, but we were able to kinda connect the dots a little bit and then optimize for a KPI that’s further upstream. That got the whole organization really rallied around video and how do we put video in front of more viewers?
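
A minimal sketch of that segment comparison, with entirely invented survey data; this shows the shape of the analysis, not PBS’s actual numbers:

```python
import pandas as pd

# Invented survey responses: whether the respondent watches video on
# the site/apps, and whether they report donating to their station.
survey = pd.DataFrame({
    "watches_video": [True, True, True, True, False, False, False, False],
    "donated":       [True, True, False, True, False, True, False, False],
})

# Donation rate by viewing segment: the comparison described above.
print(survey.groupby("watches_video")["donated"].mean())
# watches_video
# False    0.25
# True     0.75
```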

25:40 TW: Of course, that’s the dangerous… I feel that that runs into the classic social media claim: people who like our Facebook page are more likely to purchase. Or some flavor of that, where it turns around and it’s like, “No, your most loyal customers are the most likely to like your Facebook page.” But still, I think starting with the correlation… And I guess what I like about that, and maybe back to where Michael was, and I don’t know that I’ve really tried it, is that if you have sufficiently granular data on that disconnected front (quarterly may be tough to use, but maybe you do have some monthly data), even if it’s completely disconnected, that now arms the digital analyst with the opportunity to say, “Well, what are my 15 possible metrics? Can I find a relationship? Is it causal?” Don’t cross that bridge just yet.

26:34 TW: If they at least track together, then maybe it doesn’t matter; maybe you’ve got a way to track what’s likely happening with that other metric. And then, in setting that, you’re opening up the opportunity to say, “Now I can start trying to figure out: is it a causal relationship? Can I model it? Do I have to go and do some primary research to actually say, ‘Hey, what were you giving before you were watching videos?’ A before and after.” And I’m sure that was not in any way meant to say that approach was assuming causality, ’cause you probably had data where you could compare pre/post type stuff. But that did have me kind of heading down the path of saying, “Yeah, if you’re able to measure that outcome, even if you can’t tie it back to the digital data, you’ve got a starting point,” which is back to the thing the industry’s been talking about for five years now: that we’ve gotta link data sets. We don’t necessarily have to link them at the person level, with this person watching video, and this person clicking on this site, and this person watching TV and donating. But we can at least overlay those two data sets together and say, “Are they moving? Are they correlated?”
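
A sketch of that overlay idea, assuming two monthly series pulled from disconnected systems; the numbers are invented, and a correlation here says nothing about causation:

```python
import pandas as pd

# Two monthly series that live in disconnected systems: video minutes
# from the analytics tool, donations from station reports. Invented data.
monthly = pd.DataFrame({
    "video_minutes_mm": [1.2, 1.4, 1.3, 1.8, 2.1, 2.0],
    "donations_k":      [310, 335, 320, 390, 430, 425],
}, index=pd.period_range("2018-01", periods=6, freq="M"))

# Do the two series at least track together? (Correlation only;
# causality is a separate question.)
print(monthly["video_minutes_mm"].corr(monthly["donations_k"]))
```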

27:48 AS: Are they giving me a signpost that I should head down this path?

27:51 TW: Yeah, as opposed to, if there’s no relationship, then not only is it not causal, it’s not even correlated. It’s kinda like bounce rate. In a lot of cases, it’s a good metric, it just probably shouldn’t be your KPI.

28:05 MK: I swear I have a newfound admiration for analysts now who work in the not-for-profit sector. Even just wrapping my head around this, it sounds incredibly tough. I think our arguments here about KPIs have probably got nothing on some of the discussions that you guys have. But it sounds like belief in the mission and understanding it has to be that much more important for the company, because you don’t have those, I don’t wanna say easy metrics, but you really have to constantly bring the conversation back to, “Yeah, does this KPI actually help support our mission?” And to have that, my assumption is you need to have a greater cultural understanding of and belief in your company’s mission. Do you think that’s accurate?

29:02 AS: Yeah, I think so. I think the thing that you have to watch out for is, do you rely too much on the mission? And get yourself back into, “Well, it wasn’t a success from a KPI but it was a success for the mission.” Are you using that as an excuse?

29:19 MH: Oh my gosh. ’Cause yeah, I worked for a company for a period of time where it was all brand, so if it didn’t work, it was for the brand.

[laughter]

29:27 MH: Brand building. It’s a brand building thing.

29:28 AS: And there’s legitimate times… We do a lot of symphony programming on TV that doesn’t get high ratings but you do it for a reason, there is a mission-based reason to give access to people who wouldn’t ordinarily be able to see this symphony. So it’s appropriate, but you have to make sure it doesn’t become a crutch that, “We did it for the mission.”

29:47 MH: Is there a mission-based reason for The Lawrence Welk Show?

29:50 AS: I don’t know.

[laughter]

29:54 AS: That does not technically come from PBS.

29:57 S?: I know the member stations inside, what they’re gonna air… No, I’m just kidding.

30:06 TW: I worked with a large philanthropic foundation, different kind of nonprofit, but they also had a time-component, where their mission was… They were trying to shift things that were literally gonna take decades to change. Part of their… Their core is, “Make Americans Healthier”. Okay, that’s pretty broad. Well, they could boil that down to say, “Well, let’s reduce smoking.” Well, how long does it take to move… Two challenges. One, how long does it take to actually move the needle to reduce smoking? There’s data that says that’s happened, but now how much of that can be… How do you tie that back to one foundation? Even if it’s a large foundation doing a lot of good stuff, damn near impossible.

30:49 TW: And the movement of the data just… It’s not gonna be fast enough for them to tie those two things together. Although I think they then wind up back essentially in that micro-level world of saying, “Look, if we can expose people to a campaign where they learn about the negative side effects from smoking, and we can survey teenagers more frequently and say how likely… Have you been exposed or not exposed? How likely are you to start smoking?” Or whatever the questions are. So I guess that laddering still works, it just can be… It does, it takes some work. Something going way back, you sort of mentioned in passing that people don’t even question if it’s… And I know nothing about Nielsen ratings, so I think you said, “0.2, they just accept it as being good.” Have you had experience…

31:39 TW: I think we tend to not like these things that are just these indices that are derived metrics from a combination of ratings… What was it? Ratings, reach, and frequency combined together, but is there sometimes a benefit if you say, “Yeah, this thing we wanna measure with our digital data, we do have to stitch together traffic and pages per session, and the percent that reached this one micro-conversion, and we’re gonna roll it into a number that’s between one and 10, and we’re gonna call it traffic quality.” It tends to be a tall lift to get people to just kinda buy in to whatever the intuition behind that is, even if it’s right. So that’s maybe a question for everybody. Does that ever work?
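
As one possible reading of what Tim describes, here’s a hedged Python sketch of rolling a few metrics into a single one-to-ten “traffic quality” score; the metrics, baselines, and weights are all invented:

```python
# Roll several normalized metrics into one 1-10 "traffic quality"
# score. Every input, baseline, and weight here is invented.
observed = {"visits": 48_000, "pages_per_session": 3.4, "micro_conv_rate": 0.021}
baseline = {"visits": 40_000, "pages_per_session": 3.0, "micro_conv_rate": 0.020}
weights  = {"visits": 0.3, "pages_per_session": 0.3, "micro_conv_rate": 0.4}

# Index each metric against its baseline, cap at 2x, then combine.
score = 0.0
for metric, weight in weights.items():
    index = min(observed[metric] / baseline[metric], 2.0)
    score += weight * index

# Map the combined 0-2 index onto a 1-10 scale.
traffic_quality = 1 + 4.5 * score
print(f"traffic quality: {traffic_quality:.1f} / 10")  # -> 6.0 / 10
```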

32:27 AS: I had an engagement metric that I used at PBS for a long time, which was sort of a compound metric, sort of inspired by an article that Eric Peterson wrote a decade ago. And we used that for a while. Everybody thought it was cool, ’cause at that time engagement was all the thing: we need to stop measuring pages, we need to measure engagement. But nobody could ever understand it. And so we used it for a while, and the hope was that, if I could get people to understand it, you could apply that metric to particular pieces of content and say, “Oh, look! This graphic of photosynthesis on the NOVA website has much higher engagement. What are they doing, such that a person on Nature might wanna try and do something like that?” But we could never get the conversation even to that level. It was always around, “Well now, tell me again, how is this calculated, and what does that mean?”

33:17 TW: How did that differ from the Nielsen one that they didn’t understand either?

33:21 AS: Well, Nielsen ratings is a currency. That is the currency of television. And so, whether you’re a buyer or a TV network, everybody is using the same metric. And that’s the part that media companies get upset about with digital is, there’s no common currency in digital. And that’s what they’re always sort of looking for, is how to make the digital numbers look like the TV numbers.

33:44 TW: That engagement, that got some traction and there were people using it. If it had managed to continue on that trajectory, maybe it could’ve gotten to be a currency of…

33:53 AS: Maybe, I think the challenge for that was, it was too complicated to explain. And then over some time, we kinda tore it apart and realized that one metric in the compound metric was actually driving everything, which was frequency. So I’ve actually gotten people to really focus on how do you increase the number of times people come to the site rather than focusing on all the other things, ’cause the other things didn’t drive any behavior.

34:18 MH: Interesting.

34:20 AS: So simplify, actually.

34:21 MH: I remember when that white paper came out and I was reading through it. And I also remember having beers with Eric Peterson and talking about that model and why I didn’t think it would work. [laughter] And then you guys did it, and I was like “Oh, wow!”

34:39 AS: Yeah, but I’ll tell you…

34:40 TW: Well, there was… For years, there were people who were still using it.

34:44 AS: Yeah, it died out for a while, and I’ll tell you what, at the Digital Analytics Hub this past fall, it was a big topic of conversation. There are a lot of people who are doing it again. And so, I was a little bit surprised. I was like, “Really? This is a thing, it’s back?”

[laughter]

35:00 MK: So sorry, for the newbies in the room, can you catch us up?

35:02 TW: Imagine a formula that had to use the formula editor in Word, ’cause it was a fraction with Greek symbols. It had something like five or six or seven different components. It was very thought through of what are all the components, what’s the numerator, what’s the denominator, what’s additive, what’s… And I guess maybe I shouldn’t have answered the question, ’cause I can’t remember what a single one of the components actually was.

35:30 AS: For mine, it was recency, frequency, page depth, and time spent. And so we set benchmarks for those, and then anybody who came in above all of them was in one group. So you ended up with a segment. And then we expressed it as a percentage, so it would be like, 12% of the site traffic exceeded all four of those benchmarks.
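
A minimal sketch of that compound approach; the four components are the ones Amy names, but the benchmark values and visitor data are invented:

```python
import pandas as pd

# Invented per-visitor monthly metrics; the four components are the
# ones described above, but the benchmark values are made up.
visitors = pd.DataFrame({
    "recency_days":        [2, 30, 5, 1, 14],
    "sessions":            [6, 1, 4, 9, 2],
    "pages_per_session":   [4.0, 1.5, 3.2, 5.1, 2.0],
    "minutes_per_session": [7.0, 0.8, 5.5, 9.0, 2.2],
})

# A visitor counts as "engaged" only if they clear every benchmark.
engaged = (
    (visitors["recency_days"] <= 7)
    & (visitors["sessions"] >= 4)
    & (visitors["pages_per_session"] >= 3.0)
    & (visitors["minutes_per_session"] >= 5.0)
)

# Expressed as a percentage, like "12% of traffic exceeded all four."
print(f"{100 * engaged.mean():.0f}% met all four benchmarks")  # -> 60%
```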

35:50 TW: So frequency, I’m just thinking of the Google Analytics view of recency and frequency, where it winds up being a bar chart, or a horizontal bar chart, where to me, that’s one that’s actually really hard to interpret and make into a metric. I wind up going to the point of saying, “Well, I wanna go with the median frequency, and I want my median frequency to be above X in order to turn it into a single metric.” That’s way off in the weeds, except it’s coming up for something else I’m working on that is a similar type challenge. So how do you measure? Is it an average frequency? Or is it a…

36:26 AS: I usually do average frequency, so the average number of sessions per month, usually for us. And then look at where somebody’s a frequent visitor, are they visiting on average four or more times a month, which gives me an idea about weekly.
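
A quick sketch of that frequency calculation, including the median Tim asked about; the data is invented and the four-plus threshold is Amy’s rule of thumb:

```python
import pandas as pd

# Invented sessions-per-user counts for one month. "Frequent" here
# means 4+ sessions a month, roughly weekly, per the rule of thumb.
sessions_per_user = pd.Series([1, 1, 2, 5, 8, 3, 4, 1, 6, 2])

print(f"average frequency: {sessions_per_user.mean():.1f} sessions/month")
print(f"median frequency:  {sessions_per_user.median():.1f} sessions/month")
print(f"frequent visitors (4+): {100 * (sessions_per_user >= 4).mean():.0f}%")
```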

36:43 TW: Okay.

36:44 MK: And I’m assuming that you have to log in to access the online digital content?

36:51 AS: No, you do not.

36:53 MK: Oh wow! That’s a whole ‘nother level of complexity you’ve got there.

36:56 AS: Yep.

37:00 MK: Whoa! If it were up to me, the whole world would have to log in.

37:02 TW: Now Moe’s gonna ask, how are you stitching users across mobile platforms to desktop, ’cause Moe says everybody has to do that, ’cause it’s critical.

37:12 AS: We try, but it’s difficult.

[laughter]

37:16 AS: I’ll just stop right here.

37:17 MK: That’s why the whole world should have to log in, it would make my job so much easier.

[laughter]

37:21 TW: There, we’ve solved it.

37:22 MH: Oh, Moe! Then how do we guarantee people’s privacy? Okay, we gotta start to wrap up. Again, this is great. I heard so many really good things, although I’ve also heard a really hard thing in this, which is, “Hey, to make this work, first set up an objective-setting process at the top levels of your organization.” And so that part I think maybe is a whole show by itself. But Amy this is really awesome. One thing we love to do on the show is go around and just do a last call, sort of a way for us to share things we’ve seen that are interesting lately. You’re our guest, Amy. I don’t know if you have a last call to share?

38:02 AS: I do. I do listen, so I know that news is coming.

[laughter]

38:07 TW: Oh, excellent. You’re ready. That’s better than most of us over here.

38:12 AS: I had three, but I picked one. I picked one. So my last call… You guys have been talking a lot about personality types and Myers-Briggs over the last number of episodes.

38:22 MH: Tim is learning to love that right now.

[chuckle]

38:27 AS: I know. My last call is to recommend for all of the introverts out there a book called The Introvert Advantage. It’s by Marti Olsen Laney, and a friend of…

38:36 TW: Oh, my wife [38:36] ____ was reading that.

38:37 AS: A friend of mine gave me that book, and I am an introvert. And I will say that one thing that I took away from that was a technique that we introverts can use in meetings, which is, you know, you walk away from that meeting and you’re like, “Oh, I should have said this.” The technique is to follow up with an e-mail that says something to the effect of, “Dear whatever, I was thinking more about the discussion we had, and here are the things that I could recommend.” Or, “Here’s this analysis that I did.” So leverage the tools to help you out. [laughter]

39:06 MH: Oh, man.

39:08 MK: Nice! I really like that.

39:09 MH: That’s awesome. Yeah, that’s really good. I need to go check that one out.

39:13 AS: It’s a quick read too.

39:14 MH: Well, Moe. Just going around in a circle, what about you?

39:17 MK: I have a bit of a strange one. A few weeks ago, I shared a blog post by Melinda Gates. And I really wanna encourage everyone to read it, because I feel like Imposter Syndrome is something that everyone, or most people, kinda suffer from at some point in their career. And I really like how she frames kinda challenging that and overcoming it. So we’ll share it in the show notes. It’s called “How to Tackle Imposter Syndrome.”

39:46 TW: That’s funny. Dylan Lewis private messaged me that, that specific post, was like, “Here, Tim. You should read this.”

39:56 MH: Yeah. I’ve spent all my years in analytics befriending all of the people like me with Imposter Syndrome. Like, “You, me? Yeah. Okay.” Anyway. Alright, Tim? What’s your last call?

40:08 TW: I will go with a book recommendation. I’ve actually had this for a while, I just haven’t had a chance to use it. So it’s called The Big Book of Dashboards. You guys seen this?

40:19 MH: No.

40:21 MK: Wow!

40:21 TW: The premise of the book… It was at MeasureCamp Cincinnati that one of the co-authors, Jeffrey Shaffer, presented; he’s actually an adjunct professor at the University of Cincinnati, and he works in Cincinnati. But the premise was, “Hey, we’ve got all these books and all these people who talk about bad data viz, and bad dashboards, and show those.” So they basically sat down and said, “We’re gonna compile… ” And it’s huge. It’s probably over an inch thick, or two and a half centimeters for the international audience that’s not stuck on crappy inches and feet. It is actually cool. He’s a heavy, heavy, heavy Tableau user, so he’s kinda inherently sort of studied at the [41:03] ____ foot of Stephen Few. I haven’t read it, but it’s definitely the “Oh, let me flip through and get a little bit of inspiration” kind of book. And they really are some nicely done dashboards, with some analysis of what makes them work.

41:16 MH: Nice.

41:18 TW: Big Book of Dashboards. What about you?

41:20 MH: It’s weird that it wasn’t a podcast for you, Tim, somehow.

[laughter]

41:27 MH: Alright.

41:27 TW: Oh, wait. I have seven more last calls.

41:29 MH: Okay, yeah. Well, I think everyone who listens to this show knows that Tim is the only host who keeps a list of last calls ready to go at all times.

41:41 TW: I did have an experience when I was onboarding in my new company, where I realized a fellow coworker and I got into such an intense discussion that my new manager and another new hire just kinda stopped and watched us geek out about podcasts and podcast trivia.

41:57 MH: It was pretty funny. Yeah, anyway. My last call is for Moe… No, it’s not for Moe. It’s actually something I found for me, but I think it might be good for you too, unless you probably already know this stuff. Turns out, Google has been doing a lot of research over the years about HR-related things. And a while back, they actually did a big research project to prove that managers aren’t important, or don’t matter to success, and actually, the research immediately went the opposite direction: that managers are extremely important to people’s success. And so over the years, they kept studying that and coming up with the key things that really good managers do to help their teams, and things like that. And so there’s actually a whole set of guides and things that Google has published from this research, with case studies, and blog posts, and tools. And it’s really cool. And I’ve been using it, so if you’re in that sort of nexus of managing analytics people, and/or managing teams or whatever, it’s from Google, and it’s pretty good stuff. [43:03] ____ What it’s called…

43:03 TW: Did you hear about the follow-on, where they took their homepage testing and sort of applied it? So they just take managers. They’ll have them cut their hair a little bit. They’ll have them come in with glasses, without glasses. They just started testing like crazy. Did you not see that?

43:17 MH: No.

43:18 TW: They have like a thousand managerial experiments going on at once.

[laughter]

43:21 MH: A, B.

[laughter]

43:26 MH: Yeah. Doing…

43:27 TW: [43:27] ____ Michael taking his glasses on and off…

[overlapping conversation]

43:28 MH: Doing visual jokes on a podcast, really good. Okay.

[laughter]

43:34 MH: This is excellent and actually a really great topic. Amy, thank you so much for coming on to kind of discuss and answer all of our questions, some of which were related, some of which were just curiosity about PBS generally, which actually is okay too. But really appreciate you coming on the show.

43:53 AS: Thanks for having me.

43:54 MH: We, as a group, while we may be at times introverted, we do certainly love to hear from you who are listening. So if you are listening and you heard something you wanna ask more about, we would love to hear from you. Probably the best place to do that is on the Measure Slack, or you can reach out to us on our Facebook page, or Twitter, or our website. But we would love to hear from you. I’m gonna skip an episode where I shill for iTunes ratings. Except I did. [laughter] See what I did there? But…

44:28 TW: Nicely done. Maybe this will be the one where we…

44:29 MH: If we could just get 15 more people in the next half hour to the $30 pledge… No, I’m just kidding.

[laughter]

44:37 AS: I’ll give you this DA.

[laughter]

44:42 MH: Those drives have gotta play havoc with your numbers, because people have just gotta be tuning out as soon as they see that, in droves. The people who aren’t members of the stations, and stuff like that. Anyways, sorry. We should have brought that up way earlier.

45:00 TW: Another topic.

[laughter]

45:01 MH: Anyway. Yeah. It’s okay. I’m doing the same thing to our listeners right now, but we…

[laughter]

45:11 MH: Just like, “Michael, stop.” I can’t. I can’t get off! [laughter] It just keeps going. No. We’d love to hear from you. For my co-hosts and our guests, for all of you out there, just remember: Keep analyzing.

[music]

45:30 S1: Thanks for listening. And don’t forget to join the conversation on Facebook, Twitter, or Measure Slack group. We welcome your comments and questions. Visit us on the web at analyticshour.io, facebook.com/analyticshour or @AnalyticsHour on Twitter.

45:51 S?: So smart guys want to fit in, so they’ve made up a term for analytics. Analytics don’t work.

[music]

46:00 MK: I’ve been listening to all of the old episodes. Look at Tim’s face. Why would you listen to the old ones?

46:05 TW: Why would you listen to the new ones?

46:07 S?: Somebody else can visit her at the corporate [46:10] ____ core and go.

46:12 TW: Oh, that’s right. She was working for you. [laughter]

46:15 S?: Yeah. What is it? SMART goals. SMART goals.

46:19 MK: SMART goals. Wait, isn’t that in one of your presentation decks?

46:24 S?: No. Not in one of mine.

46:25 S?: I’m allergic to acronyms.

46:27 MK: Oh no. What’s the one you had with the keynote, with all the words flying and it has the… Oh, it’s ADAPT, isn’t it?

46:33 S?: Yes. So I actually made that up. It’s a forgettable thing.

46:36 MK: Which I clearly remembered. And I actually have a history of not remembering stuff.

46:43 MH: I tried to sneak in everything, even though I had less time. It did not go well.

46:51 S?: Although, I was just this team of me, in the first part but… [chuckle]

46:54 S?: That’s sometimes the hardest person to manage, isn’t it? No…

[laughter]

47:05 TW: I can’t tell if he’s still up.

47:07 MH: I’m still here.

47:09 MK: I think he’s still working.

47:10 MH: No, I’m not. I’m taking notes.

47:14 MK: Oh, jeez.

47:17 MH: Sorry, my phone started ringing. Is that vibrating my microphone?

[laughter]

47:21 MK: Yes.

47:22 TW: Yeah.

47:24 MH: All right. I’m holding it away from my desk now.

47:28 TW: Yeah, I have no idea where I’m going with that. Cut that.

47:33 MH: Hold on, Tim. I’m reaching out here. Let’s see. ’Cause what I was thinking while you were saying that was, “So sensitive!” Example, somebody’s streaming a PBS show… Nevermind. I’m gonna skip the example, ’cause I don’t know where I’m going with that. Usually, I have somewhere I’m trying to go and I…

[laughter]
[overlapping conversation]

48:02 MH: I was out there, and it was sort of dusk in the desert, and I’m looking around. I don’t know which direction… [laughter]
[music]

48:12 S?: Rock flag and tricky measurement.

[music]
