#046: Measuring Podcasts at NPR with Steve Mulder

Do you listen to podcasts? Well, of course you do! Are you working in or involved with analytics? If you listen to this podcast, you almost certainly are! Where do those two interests intersect? On this episode! Steve Mulder, Senior Director of Audience Insights at National Public Radio (NPR), joins Michael and Tim to discuss podcast measurement…and audience measurement…and the evolution of analytics…and standards (well…guidelines)…and more! Tim fanboys out in a way that would be embarrassing if he was sufficiently self-aware to be embarrassed. In other words, it’s a rollicking good romp through public media.

Resources and the like mentioned in this episode are many and varied:

 

Episode Transcript

The following is a straight-up machine transcription. It has not been human-reviewed or human-corrected. We apologize on behalf of the machines for any text that winds up being incorrect, nonsensical, or offensive. We have asked the machine to do better, but it simply responds with, “I’m sorry, Dave. I’m afraid I can’t do that.”

[00:00:23] Hi everyone. Welcome to the Digital Analytics Power Hour. This is Episode 46.

[00:00:30] You know, when we started this podcasting journey more than a year and a half ago, one of the things we attempted to do was develop a measurement strategy and key performance indicators, or KPIs, am I right, for the show. And just to review, our goals were to publish 26 episodes, get at least 300 downloads per show, have a growing trend of listeners, and do a podcast about making the podcast. Well, we hit all but the last one, and we aim to rectify that right here and now. This might initially strike you as a bit indulgent and definitely meta, but if you're nerds like us, I think you're going to enjoy it too. Tim, Senior Partner at Analytics Demystified, is my cohost. Hello, sir.

[00:01:18] WCBE and WOSU would be my public radio stations that I listen to. Thank you very much.

[00:01:24] That doesn't make sense in context, and it doesn't matter, but I know what we're talking about, so I'm making a joke. Anybody who's read the title... I'm telling the people on this podcast that context matters. Anyways, it doesn't matter.

[00:01:37] I'm Michael Helbling. I'm the analytics practice leader at Search Discovery. So naturally we wanted to draw on the experience of someone in the big leagues of podcasting, and so I'm excited to introduce our guest, Steve Mulder. Steve is the Senior Director of Audience Insights at NPR. Prior to that he was the VP of experience strategy at Isobar, and a long, long time ago he led user experience at Lycos, a search engine you might have heard of. He also wrote a book about user experience called The User Is Always Right: A Practical Guide to Creating and Using Personas for the Web. We are delighted to have him on the show to talk about podcasts. Welcome, Steve.

[00:02:23] Hey thank you very much. Great to be here.

[00:02:26] I think you have a perfect NPR voice so this seems like it’s working really well.

[00:02:32] You are very very kind. I think I’m surrounded by people that I work with who would never say such a thing.

[00:02:38] OK, well, I also have no idea. So anyway, welcome to the podcast; we're glad you're here. I think as we kick things off we want to talk about both podcast measurement as a practice, but also kind of how NPR is doing it. And maybe a great way for us to start is just for folks to understand sort of what it is you do at NPR and kind of how that intersects with all the different podcasts and audio and measurement that NPR is doing.

[00:03:08] Yes, so our team, the audience insights team, is really the hub of, you know, audience intelligence within NPR and across the entire public radio system of NPR stations. So we run basically digital analytics, broadcast analytics, custom research, third-party research. We're really trying to bring an understanding of audience, and how to act on that understanding, across the entire organization, right, whether it's programming decisions, digital decisions, business decisions, you name it, across the board. And of course part of that umbrella is podcasting, the wonderful world of podcast measurement. Is that an entire team based in Boston, where you are, or scattered across different offices, scattered across the galaxy? It's primarily in DC with a few folks in Boston as well.

[00:03:58] OK. So what has the podcast ramp-up been at NPR? I make no secret of the fact that I listen to multiple NPR podcasts, and I was excited when, like, Fresh Air became a podcast, because it seemed like it was a laggard for a while. I think that's WHYY out of Philadelphia, is that right? Yeah, that's right. That's right. Yeah, we'll edit out all my... all the fanboy stuff. Maybe, maybe not. We'll keep it in there.

[00:04:24] Keep it there.

[00:04:26] Yeah, I mean, I'm sure we will get to analytics here.

[00:04:30] To me it seems like NPR is such a natural, because you guys already do audio content, and kind of journalism-based audio content, so podcasts seem like a natural shift. But how much can you talk about the evolution over the last three or four years, of somewhere a lightbulb going on, and kind of where did measurement fit? Because it seems like when you're running through affiliate stations, basically, and everybody is, you know, hopefully a member, and there are funds being raised, and you have that local tie-in, I would think that measuring media as radio is kind of radically different from measuring audio media as the on-demand radio of podcasts. There's like either six questions or no questions in there.

[00:05:20] Yeah, you know, the weird thing about what we're doing right now is measurement across all of these platforms that NPR takes advantage of. So traditional broadcast radio comes with, you know, long-existing standards.

[00:05:35] First through Arbitron, now through Nielsen. You know, that's kind of an established science, right, like measuring traditional radio. But then you get into other forms of listening, whether it's a live audio stream, which pretty much all the NPR stations have now, or on-demand audio such as podcasts. It's all vastly different, which is what complicates our life every single day. You're totally right.

[00:06:00] So let me ask: I've always had a theory about measuring TV audiences, that it is actually fundamentally crap, but it's just long-standing, totally accepted crap. Is live broadcast radio a... everybody accepts it and just kind of takes the numbers at face value and doesn't question the validity? It seems like there would be very similar challenges to measuring TV, effectively.

[00:06:27] Yeah, you know, it depends who you talk to, how much credibility there is in those numbers. But it speaks to something I really believe in, which is that whenever we talk about a standard in measurement of anything, we're talking about a good-enough, mutually agreed-upon thing we're all going to point to and say, yeah, that's a good yardstick, that's good enough, let's just use that.

[00:06:49] So, you know, while any kind of measurement is never going to be perfect, as we all know, we live and breathe this stuff, you've got to start somewhere. So you started with TV, you start with radio, and there are a lot of technologies in radio that are getting better and better at measuring broadcast signals, but they're not perfect. But it's something the entire industry has pointed to and said, you know what, that thing that Nielsen is doing, and the ratings they're coming out with every month, every year, that's good enough. Let's just move on with our lives and rely on it as a standard. And that's what we need for every listening platform and every tech platform. We have it on the web; we need it in the world of podcasts. That's something that's been evolving, I know we're going to talk about it, but that's absolutely true in the broadcast radio world as well. It's just, you know, Nielsen's decided this is the way to measure it, and we're all sort of on board with that and rely on it as pretty good. They've gotten really good at a lot of things. Is it going to be perfect? No, never going to be perfect.

[00:07:45] So it's like bond rating agencies, right? I mean, those are totally reliable and never wrong.

[00:07:51] I mean, there's possibly nothing that ever goes wrong. I did not even...

[00:07:56] It had not occurred to me.

[00:07:57] The live streaming, which occasionally I dip into when I'm, like, surprised, like, oh, on the Sonos, look, I can click on, you know, Radio by TuneIn, I'm not sure which one I click on, just to check and say, oh my god, I can listen to my local station. So do you have a sense, when you talk about audience, that X percent of your audience is only traditional broadcast radio, which I do because it's my wake-up in the morning on the radio, and then X percent is live streaming, and X percent is more on demand, and these are the ones who are across all channels? Is that some of the stuff that you try to wrap your heads around? Yes.

[00:08:37] We constantly try to wrap our heads around that, and we do, with mixed success. You know, the hardest thing about my job right now, one of the hardest things, is getting a true sense of audience, because while I have these amazing data silos, it turns out to be really hard to measure someone like you who might be on the radio and on podcasts. To me and my systems, you look like two different people.

[00:09:03] Right. You're a podcast listener over here, you're a broadcast listener over here, and I can't necessarily combine that and see you as one person. So I end up with, right, I end up with exaggerated audience reach numbers, and that's not something I can fix right now in a really easy way. There are folks working on this, but it's absolutely a key problem for a lot of organizations, and media organizations like us, that are struggling with how to get a much truer sense of audience reach when we've got all these platforms out there.

[00:09:34] You know for somebody who claims to be such a huge fan Tim it’s pretty bogus to do that.

[00:09:41] I'm telling Steve right now. Just write it down. I'm giving you the... what do you need?

[00:09:48] Just know that 30 percent or so of the traffic you're getting from northern Columbus is probably Tim. Write that down.

[00:09:59] I’ll put that into my analytics tools. Yeah that’s perfect.

[00:10:03] Well, I mean, the crazy thing is it runs into the same thing; it's more challenging than a website. Literally, my daughter, my youngest, it's kind of a common thing she wants to do: when I pick her up for gymnastics she's like, can we listen to a podcast? And I'd say half of the time it's an NPR podcast, and you're back to the same audience issue. Now you've got a household listening to it. Now, she's probably a couple of years away from being a donor, but, you know, she's the future. I don't know where I was going, but it's one of the key ways where, you know, we think about podcast measurement as an evolving thing.

[00:10:39] Like all measurement is, right. But it's kind of a couple of steps behind where we are with web measurement, when you think about it. So I like to think about this as sort of an evolution. On the web we began with hit counters, oh, we remember them fondly, the counters of our youth. So you start with hit counters, you start talking about hits, then we started talking about page views, and finally we started talking about audience, the people who matter. Talking about people, who would have thought, right? And you get this evolution of what you can measure, this evolution of what's important to measure, and how we're going to use that information. Similarly, in the world of podcasts, initially all we had was that a file was requested, that particular episode, and we don't know what happened after that. Then you got to talking about downloads: how many downloads did a podcast get? And now, finally, we're getting around to talking about what's important, which is audience, talking about the people that are downloading these things or listening to them via streaming or whatever, and we can start treating that data in a much more evolved way than just talking about hits and page views and downloads in a very raw sense. So that would be the way we think about it, and even in the last three years it has really evolved that way.

[00:11:50] So, not to take us too far off course, but there's sort of a subject that I wonder if you could shed some light on, which is just what you have observed in the growth of podcasting generally. You can speak from the perspective of NPR, naturally. But, you know, I don't know how long podcasts have been around; they've been around a long time, but they've grown in popularity over the last, say, five to seven years maybe. And I don't know, have you seen anything in that growth, or what have you observed?

[00:12:22] Yeah, I think there's a few things going on. One is you've got an entire society that is now drawn to self-curated ways of consuming media, right? I mean, think about our consumption of video and TV and movies. It's become much more prolific across all types of channels and platforms, but it's also much more about selection, right, like choosing what we want. And similarly now in audio, in spoken word, you're seeing that same evolution, where people are realizing there is so much amazing content out there, this podcast included, of course; there's so much amazing content out there that I have so much at my fingertips. We merely have to expose people to all those rich offerings and let them choose. And part of why you see podcasts just exploding is that level of choice, and the kind of just amazing diversity of content and voices and perspectives that are out there right now, much of which you just can't get on traditional broadcast airwaves. I think that's a big part of it, and you're also seeing, you know, organizations, sponsors, catching on to just how much value there is in this industry. So here's a stat that we like to talk about recently at NPR: compared to three years ago, NPR's revenue from podcasts is now ten times what it was. Ten times in three years. And that's not just because we're producing ten times as many shows, right.

[00:13:47] That's a lot of our stalwart programs, like TED Radio Hour, and, right, although we've been talking about Fresh Air, as well as new programs like Embedded and Hidden Brain and Invisibilia. So it really all adds up to a pretty profound growth in terms of the number of people listening and the number of sponsors who are interested in taking advantage of it.

[00:14:09] Two questions. So, one observation: the other thing is, I think, the time shifting. Everyone got DVRs and everybody got conditioned to TV on demand. Because I think there was kind of an early heyday of podcasts and then it dipped. I remember being made fun of when I was listening to a podcast, because, people still listen to podcasts? That was a few years ago, and now they've kind of exploded, and people point to Serial. But I think part of it is also the time-shifting nature, that I can listen to it literally on demand, when I want to listen to it. But when you say the revenue growth...

[00:14:45] I haven't listened to a pledge drive in probably three years, because I've shifted. I know, I'm aware when they're on, because the radio comes on in the morning, but for the most part I'm kicking over to a podcast. So is the revenue, because they're not called advertisers, they're called, what, underwriters or whatever at NPR, is it because you've got those "Hidden Brain is supported by..." spots? Or is it listeners? There are occasional, I'm trying to think whether it's more just NPR that kind of occasionally says, go here. I remember trying to go and donate to Planet Money, and it was hard; like, I wanted to go give money to support Planet Money, and I think, I don't know if I want to give to WNYC or whatever it was. So where does that land when you say the revenue growth? I guess I'm asking, what's the model by which that happens?

[00:15:38] Yeah. Really what I'm talking about there is classic underwriting and sponsorship. So the "Hidden Brain is brought to you by..." That's really where the direct revenue comes from. In addition to that, obviously, NPR gets a lot of funding from the member stations themselves paying for programming.

[00:15:53] So, you know, indirectly you could say that there's some revenue coming from that direction as well, and institutional giving and that kind of thing. But what I'm talking about really is just looking at underwriting and sponsorship itself. You know, it's growing tremendously, and I know a lot of podcasts are finding that to be the case, and you guys clearly need to get on it.

[00:16:10] I don't think we're on the underwriting train. You know, we've seen our revenue growth also triple in the last year and a half; it's actually increased infinitely, because, yeah, that's how the math calculations work when you start at zero. That's awesome. Yeah.

[00:16:29] But is part of that the specific nature of the programs? You're providing evidence to the underwriters to say, wow, these people are interested in neuroscience and psychology, these are people that you want to target, so we can kind of guarantee that, and we can also give you some sort of data about the scale of how many people are listening to Hidden Brain?

[00:16:56] Yeah. So, you know, the advantages that we have are in a lot of ways similar to the advantages on the air with sponsors. We have a lot of evidence, decades going back now, about how listener perception of the people who advertise on public radio, who sponsor public radio, and the perception of those brands and the likelihood to consider those brands, is significantly higher than on commercial radio.

[00:17:19] The downside is we're all hippie liberals who are just being brainwashed by the left-wing media. But I don't know which podcast you're listening to.

[00:17:30] OK, so I think I cut you off, but you were saying that you've got credibility, which is a desirable target. I feel like even for for-profit podcasts, there's kind of a sense that it is pretty easy, and my perception is relatively cheap, to become an advertiser on a podcast, and you have a pretty good sense of the audience.

[00:17:55] But is there also defining the audience? You're the audience guy. On other podcasts, and I could have sworn I heard this on NPR, though when I went back and tried to find it and googled for it I couldn't, there's the old "fill out our listener survey, tell us about yourself." Are you into all those?

[00:18:14] Yeah. So that's part of what we do for the various shows, the various podcasts: constantly doing customer research, or listener or audience surveys, really to understand how people are reacting to the shows and to understand, of course, their demographics. We test things like how much people remember the sponsors they're hearing in the broadcast as well.

[00:18:35] Right. So we can go back to the sponsor and say, hey, this is how many people, unaided, were remembering your particular underwriting spot, and the kind of reach that you're getting. We gather that kind of data through surveys for sponsorship purposes, for content purposes, to understand what's working, and for just overall strategy purposes, so we know where we're connecting with our audience in a very real way. We'll do cross-show surveys, where we get a bunch of shows together, and we're doing one of those right now that's out in the market, and we'll do deep dives. We did a survey on just Invisibilia's audience, for example, just recently, when that team really wanted to immerse themselves in how the listeners are feeling about the show they just finished listening to, and whether people can tell the two original hosts apart.

[00:19:24] Audibly that yes some people claim to but we think that’s a myth.

[00:19:30] This actually just sparked something, and you guys are gonna kill me, so I know.

[00:19:35] So now I'm remembering, I've forgotten which episode, but Planet Money did one where they talked about, I think, a dance record. I don't think they mentioned Optimizely, but they talked about A/B testing, and A/B testing through the NPR One app, and actually pushing different intros, different bumpers, or different titles and seeing what hit.

[00:19:51] So are you guys also, because you have the NPR One app, regularly A/B testing? Or are you involved in that? Is that another element of it?

[00:20:02] Yeah, it is. Let me lay it out like this.

[00:20:04] There are two fundamentally different types of podcast measurement we're doing right now. Most of the NPR podcast listening that's happening out there in the wide world is happening via players, all manner of audio players, that we don't control, that are way outside of NPR's influence and so forth. It's only a fairly small percentage of listening that's happening within environments like the NPR One app or the NPR.org website or what have you, where we control the platform. Of course, where we control the platform we can do a lot more measurement and do a lot more with data and testing, as you said. So yes, for example, when the NPR Politics Podcast was getting started late last fall, one of the first things they did was put a couple of sample episodes on NPR One as a testbed, before anybody else heard it, as a way to start gauging interest. Did people stick through it? Did they skip? You can learn a lot by just looking at a sample audience in a platform like that, and unannounced, right, it just starts appearing in the flow of NPR One audio, and you see what works. They learned a tremendous amount about the promise of that show and the level of interest as well. And then we back that up: very often our research group does concept testing of shows, so we'll get together either a focus group or just a bunch of individuals and have them listen to part of a new show and give us their opinion, right?

[00:21:28] Existing listeners, potential new listeners. And so we'll use all this data in making decisions about, you know, which shows get brought to fruition, whether a focus needs to change or things need to pivot. We can learn a lot early, before an official launch. So yeah, NPR One is a great platform for that kind of testing, because we can roll things out, we've got a ready audience, and we can see all the detailed metrics, much more than we can get for the vast majority of NPR podcast listening that's happening out there.
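
For the technically curious, here is a minimal Python sketch of the general pattern being described in this exchange: deterministically split listeners into variants (different intros, bumpers, or titles) and compare completion rates. The function names, the event format, and the 90% "completed" threshold are our assumptions for illustration; this is not NPR One's actual implementation.

```python
import hashlib

def assign_variant(listener_id, experiment, variants=("A", "B")):
    """Hash listener + experiment so each listener always gets the same variant."""
    digest = hashlib.sha256(f"{experiment}:{listener_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def completion_rate(events):
    """events: iterable of (variant, seconds_listened, episode_length) tuples."""
    totals, completed = {}, {}
    for variant, listened, length in events:
        totals[variant] = totals.get(variant, 0) + 1
        if listened >= 0.9 * length:  # arbitrary 90% "completed" threshold
            completed[variant] = completed.get(variant, 0) + 1
    return {v: completed.get(v, 0) / totals[v] for v in totals}

# Example: compare how two hypothetical intros performed.
events = [("A", 1700, 1800), ("A", 400, 1800), ("B", 1750, 1800), ("B", 1650, 1800)]
print(completion_rate(events))  # e.g. {'A': 0.5, 'B': 1.0}
```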

[00:21:59] So is there a weekly podcast performance measurement dashboard or something like that internally?

[00:22:06] Are you guys saying, yeah, that one started off strong but it's kind of slipped? I guess, even backing out from that, do you have consistent KPIs for all podcasts, basically, that people are aligned to? And if so, how good or bad do you feel they are?

[00:22:30] Yeah, well, let me lay out the way we really think about it, which is that there are different ways of measuring podcasts for different purposes right now, and in fact that's a good thing. So let me back up: think about the websites that we've grown to love over the years. When you think about the web, we really have at least three different measures of the web for different purposes. We have tools like good old Google Analytics, our number of record, our internal number of record, with the most comprehensive data, where we can do the deepest kind of analysis. That's one source. We have another source for doing industry comparisons: we have the comScores of the world, where we can do true apples-to-apples industry-wide comparisons that we can't really get through GA, obviously. And then finally, think about sponsorship and underwriting and the kind of metrics that they need on the web, and you've got DFP. It's not like we're using Google Analytics to provide data for our sponsors for banner ads and so forth. So this is exactly the world that podcasts are growing up into right now. Same kind of format: you've got the number of record, our internal number of record; I'll come back to that in a minute, but for that we are doing raw server log analysis using Splunk, which isn't originally made for this purpose, but we're using it and it actually works out really well, using Splunk to parse raw log files and generate reports and dashboards as we see fit.

[00:23:51] That's our number of record internally, and our number of record externally as well, for, like, the number of official users and downloads for our shows. For industry comparisons, you've got a couple of sources. iTunes was the only one for a while, but honestly their top 20 is a bit dicey. People who have tried to analyze the iTunes ranking kind of throw up their hands, because it's not really a top 20; it's more like, here are the top movers and shakers as Apple sees them right now, but not necessarily a reliable top list. But Podtrac, recently, in the last few months, has come out with their ranking of the top podcast producers in the country, and it's becoming sort of the de facto, I think, reliable metric for that.

[00:24:35] And then finally you've got... well, where is Podtrac's data coming from? Is that a panel? Is Podtrac sort of like FeedBurner used to be?

[00:24:48] So essentially an organization works with Podtrac and routes, you know, every file request through Podtrac down to the CDN. So Podtrac is seeing all the file requests and can therefore look at all the raw data almost as well as you could if you were looking at your own raw log files. That's how they do it. You have to actually sign on for their service and then redirect through them, but once you do, you're in business, and it's free. And then finally, for advertisers and underwriters and sponsors, similarly, we're using ad server numbers for podcasts as well. So with NPR, our podcasts are on Triton Digital; the media is hosted there, we use them as our ad server, and they provide the sort of third-party metrics for underwriters for billing purposes.
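
As a hedged illustration of how redirect-based measurement like this generally works: the publisher prefixes each enclosure URL in the RSS feed so every file request passes through the measurement service on its way to the CDN. The prefix below is the commonly documented Podtrac pattern, and the helper function and example URL are our own; treat this as a sketch, not setup instructions.

```python
# Prepend a measurement redirect to an episode enclosure URL so requests are
# counted by the measurement service before being passed through to the CDN.
MEASUREMENT_PREFIX = "https://dts.podtrac.com/redirect.mp3/"  # commonly documented pattern

def wrap_enclosure(url):
    """Strip the scheme and prefix the URL with the measurement redirect."""
    bare = url.replace("https://", "").replace("http://", "")
    return MEASUREMENT_PREFIX + bare

print(wrap_enclosure("https://media.example.org/shows/episode46.mp3"))
# -> https://dts.podtrac.com/redirect.mp3/media.example.org/shows/episode46.mp3
```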

[00:25:33] So just like with the web, you've got three different ways of measuring, all equally valid, with different advantages, and all really important to our business. You asked whether we have dashboards: for those internal numbers I talked about, we're actually parsing raw log files using this tool called Splunk to generate reports and dashboards. That's really the intelligence that, internally, we're looking at day in and day out, week in and week out.
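
For readers who haven't parsed raw server logs, here is a rough Python equivalent of the first step Steve describes doing in Splunk: turning raw combined-log-format lines into structured request fields that downstream counting rules can use. The regex, field names, and sample line are illustrative assumptions, not NPR's actual configuration.

```python
import re

# Fields of the common "combined" access log format: IP, timestamp, request,
# status, bytes, referrer, and user agent.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of request fields, or None if the line doesn't match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

sample = ('203.0.113.9 - - [12/Sep/2016:06:15:02 +0000] '
          '"GET /podcasts/hiddenbrain/episode12.mp3 HTTP/1.1" 200 28311552 '
          '"-" "AppleCoreMedia/1.0"')
print(parse_line(sample)["path"])  # -> /podcasts/hiddenbrain/episode12.mp3
```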

[00:26:01] But it does seem like, in the for-profit world, where you've got the Panoply network, which is, I think, kind of media-driven but more for-profit, and there's Gimlet Media, which essentially spun out from NPR because most of the founders, you know, came from NPR, I feel like you've got probably more clarity of thought around some of this measurement stuff than maybe some of them do. Because it seems like in the for-profit world they're almost exclusively playing to the advertisers, whereas with the public media focus you might have a little heavier stake in the audience, maybe in content quality. I don't know, I'm extrapolating, or riffing.

[00:26:44] Yeah, it's a good question. I mean, I can tell you that obviously serving our audience and creating amazing content is way up there on the priority list, but obviously it doesn't happen without sponsorship, right? So you've got at least a three-legged stool going on there, and the focus is just really high on all of the above.

[00:27:02] So you were talking about Podtrac, which is interesting, because that's sort of giving you that external view, and that's sort of being provided in terms of competitive intelligence. But you've also done quite a bit of work inside of NPR on standardization of that work.

[00:27:18] Can you talk about that, and how does that track with or align with what you're seeing happening with tools like Podtrac? Are there conflicts emerging? Is there good unity in terms of that? Is it because there's just so little other data that everybody has to basically follow the same path?

[00:27:37] Yeah, I'll sometimes talk about podcast measurement as being kind of still the Wild West, insofar as there have been, for a decade now, a lot of really smart sheriffs all around the country doing great work independently on measurement and getting a lot better at it. But what we haven't had is a true law of the land. We haven't had a true industry-wide standard, like we were talking about earlier, where everyone can point to that thing and say, yep, that's the standard, that's the yardstick we're going to use, when we talk about defining what a download is, what a user is, in the world of podcasts. It turns out there are a lot of little things in the world of podcasts that can really result in vastly different numbers. So, questions for you guys. Here's a scenario. Imagine that you've got one person, and you know they're one person because they have an IP address and user agent combination, right, because you don't have cookies in the world of podcasts; all you have is IP and user agent. So you've got an IP and user agent combo that downloads one of your episodes three times, the same episode three times, in a single day. Does that count as one download, or three? And who decides?

[00:28:46] So here's another scenario. Let's say that that person downloads a third of this particular episode we're making right now, this is very meta. If they download a third of this episode, you don't know if they played it or not; all you know is that they downloaded it. Does that count as a download? Like, would you count that in your metrics? All these decisions come to the fore really quickly, which is why we needed the yardstick. So what we did was, last year, NPR got together with about a dozen other public radio organizations. Some of them were stations, some of them were big public radio producers like PRX and APM and so forth, and all sorts of other alphabet soup, and we got together and started talking about this problem, realizing that we needed to create a yardstick of our own. We needed to create a common understanding, guidelines if you will, about how we as public radio are thinking about this space, thinking about how we count, what we count, and why a standard is important. So we worked on these public radio podcast measurement guidelines, and they came out in February, so earlier this year. It's been great. We've been seeing vendors start to use them, and it's been influencing larger conversations at the IAB; the Interactive Advertising Bureau has a working group working on this issue of podcast measurement and technologies as well. So please tell me that they're not going to be the ones who wind up as the final say. Well, we're, you know, we're at that table, having this same conversation, right?

[00:30:13] With a much larger group across the industry than just public radio, as you can imagine at that table. But it's really important, because we want to make sure that when we are talking, not only internally but also with potential sponsors and with the press, in terms of the coverage that we get, we use the same language, all of us, and we want to use, as much as we can, the same yardsticks. Then, when we're talking about how many listeners a podcast has or how many downloads an episode has, we can be confident that we're reliably and credibly talking about it in the same way and we can compare apples to apples. So yeah, big believer in creating industry consensus around this.
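
To make the counting scenarios Steve just walked through concrete, here is a minimal Python sketch of the kind of rules the guidelines describe: treat IP plus user agent as a "user," drop bot traffic and trivial 206 byte-range-0-0 probes, and count at most one download per user per episode per day. The field names, bot markers, and data shape are assumptions for illustration, not the guidelines' exact specification.

```python
from collections import defaultdict

# Illustrative only: markers like these catch many crawlers, not all of them.
BOT_MARKERS = ("bot", "spider", "crawler")

def count_downloads(log_rows):
    """log_rows: iterable of dicts with ip, user_agent, episode, date,
    status, and byte_range keys (field names are assumptions)."""
    seen = set()
    downloads = defaultdict(int)
    for row in log_rows:
        ua = row["user_agent"].lower()
        if any(marker in ua for marker in BOT_MARKERS):
            continue  # drop requests from known bots
        if row["status"] == "206" and row["byte_range"] == "0-0":
            continue  # drop tiny range probes that aren't real listening
        # One "user" = IP + user agent; count one download per episode per day.
        key = (row["ip"], row["user_agent"], row["episode"], row["date"])
        if key in seen:
            continue
        seen.add(key)
        downloads[row["episode"]] += 1
    return downloads
```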

[00:30:52] Ideally, eventually, a real standard like we have for broadcast television, et cetera. And on the web you could even argue we don't have a standard per se, but at least we've all agreed that, yes, there's this thing called a page view, there's a thing called a user, and generally how we measure it. We trust our tools as the de facto standards creators, right, in that regard.

[00:31:13] We don't have that for podcasts right now, but we really need it. Just to be clear, when you say there are standards, Steve, because you shared the link beforehand, there's literally a Google Doc of guidelines, which I'm assuming we can just post in our show notes for anyone who's curious as to what those are.

[00:31:30] Yeah, yeah. If you were to post it, it's bit.ly slash podcast guidelines; feel free to post it. And it is just guidelines. I mean, NPR and the world of public radio, we are not a standards body, nor do we want to be one when we grow up. We are not in a place to audit, police, deal with compliance, create a real standard. So they are merely guidelines, but we hope useful ones.

[00:31:54] There's a W3C joke in there somewhere. Yeah, right.

[00:31:58] I think this standards conversation is one that's very applicable across any analytics pursuit, so that's really fascinating, and certainly the overarching governance from an industry perspective is an interesting one, right? You know, to your point, NPR is not looking to take that on, but just for yourselves to have that is such a great step.

[00:32:18] It's pretty cool, and so I read through it, and it was just fascinating, because it started off pretty normal and then all of a sudden it was getting into how 206 return codes count, and I was like, OK, we're back to a real technical document.

[00:32:34] Actually, a 206 plus byte range 0-0. I was like, oh, right. So this has got some very specific things to it.

[00:32:45] And I think that's what you were kind of alluding to maybe before, around, hey, for all these ten years people were doing a great job, just doing it a different way, with not a lot of consistency on a user or show-by-show basis for measurement.

[00:33:03] So creating a standard like that is really good. So, Tim, we're going to have to go back. Let's say now, let's subscribe to the public radio podcast measurement guidelines and try to refactor our numbers. We don't really talk about our numbers, so we won't.

[00:33:24] Well, but I guess there's a question in that, because we, you know, we use a third-party host, and it sounds like... where do the hosts, kind of the cloud-based hosts, fit in this? They're not on the list, which, I think it's fantastic that they're not contributing to what the definitions are.

[00:33:42] But it sounds like you were saying that potentially the providers are the ones saying, we're running the stats.

[00:33:48] It's funny looking at different hosting platforms, you know, like, oh, if you upgrade you'll get more advanced analytics, and I'm like, I know what's basically in the log file; you can't give me much more. But there is a part of me that thinks, when do they start saying, we are following the public radio guidelines? Whether that is "our default is we follow those guidelines," or whether it's an option for you to view, you know, what does it really look like? If you're feeling good about yourself, you know, you switch it the other way and you'll realize that you and your extended family are half of your audience.

[00:34:24] No, that's exactly right. And that's starting to happen. It's been really exciting to see that start to happen, both on the side of vendors and hosting companies, as well as measurement companies starting to take on this issue. You know, a lot of them have a seat at the table at this IAB group that I mentioned, so the conversation is happening, it's really fruitful, and I think everyone recognizes that we can't just be out there measuring based on whatever yardstick we want. We've got to be talking about it in the same way if we want this industry to evolve to its full potential.

[00:34:56] I mean, Splunk is a pretty serious tool. I've never worked with it, but I'm familiar with it. You've got your log files, you're cranking them through Splunk, you're doing some real stuff there. As you guys were talking through some of this, what's the cross-section of the contributors to this document who were like, "never really thought about it, we just counted every request"? I mean, are there some whose brains are hurting because they realized they've been really simplistic, or is it kind of, by definition, the ones who are involved are ones that have, you know, gotten their hands dirty with a log file?

[00:35:32] Yeah, it's a mix. So, you know, you've got some of the larger organizations that have their own capability for this, rolling their own numbers using tools like Splunk or Elasticsearch or what have you, based on log files. And then you've got, you know, a lot of orgs that are part of that group as well that are smaller stations, for example, who are much more reliant on third parties, whether it's Podtrac or their own hosting provider, to give them the metrics. But now they can go back to these hosting providers, for example, and make sure that they're getting the kinds of metrics that they want. So they can ask, you know, hey, how are you guys calculating downloads right now? Are you doing this, X, Y, and Z? Making sure you're getting audience metrics, talking about measuring users and not just measuring downloads. Are you filtering out bots? How are you filtering out downloads that we kind of know aren't real listening? How are you dealing with, you know, the same user downloading the same file three times in a particular day? So you can be much more intelligent about it. And we're all trying to up our game in this department of just being more intelligent about the metrics that we do get, whether we're generating them ourselves or getting them from any kind of third party.

[00:36:40] And that's true across everything, right, not just the world of podcasts, which obviously I care way too much about. You can't care too much about it. It takes me back to early in your career, when you would do something and say, hey, technically our data quality is really poor because we are tracking, to your point, all these bots, so let's start filtering them out, and people would be like, well, wait a second, don't go and make our numbers go down.

[00:37:10] Great. And so it's just interesting. It's sort of like, all right, so, yeah, I get it, baby steps. So maybe the real big closer of a question is, are we smarter this time around with podcasting analytics than we were with digital analytics 15 years ago? Will we get up that curve faster, right?

[00:37:30] Well, will we at least go through that maturation process more quickly?

[00:37:35] I mean, it does seem like it happens faster and faster, right? Going way back, thinking about radio and TV measurement, it took decades to really get better and better at that. And, you know, we're never going to hit perfect; like, the curve never reaches the asymptote. I mean, my geometry just got lost in my brain, but you know what I mean.

[00:37:55] That’s right. Right.

[00:37:56] We're never going to get perfect, but we're going to get close. Yeah, I think we're getting faster at getting better, you know. It started with the web, and it's carrying over to app measurement and podcast measurement and, you know, you name it.

[00:38:10] I wonder when the light bulbs go on.

[00:38:12] Right now I feel like podcast advertising is comparatively incredibly cheap, that even without great measurement, with all of the gaps, I still feel like, just because of the weird mix of organizations that are advertising and the lack of really big mainstream advertisers, they could just own seven podcasts for a year for the cost of what they spend on, you know, two days of banner ads or 30 seconds on TV. I kind of wonder, if podcast adoption keeps going up, and there are more and more brands kind of clicking to that, we may run into kind of a messier world. It still seems like there's a limited cost to make podcasts, like you can get a high margin. We're not making a business out of this, and you guys are public media, so maybe the three of us are the worst to talk about it, but it seems like, from a business model perspective, it's going to get better for the podcasters and it's going to get more competitive. Right now I feel like brands are missing an opportunity: if they have a key audience, a demographic that is likely to listen to podcasts, they could spend a little bit of time figuring out where those podcasts are and get some decent, conservative or aggressive, range of what the reach is.

[00:39:35] And it's a captive audience. And, to your point, there are some podcasts where they probably have credibility, not necessarily NPR ones, but other ones, where "these guys are putting out great content" gives me a better boost than if I'm just running on CBS. There's potential there. That's tangent number 72.

[00:39:53] Well yeah. And the potential is it has yet to really reach its max I’m sure.

[00:39:59] Right. So you look at Edison Research data now saying that 57 million Americans are listening to podcasts monthly. 57 million, that's like 21 percent of the 12-plus population. So you can look at it either way: you can look at that as, wow, that's gotten a lot bigger than it was 10 years before. But you could also look at it as, that's got a long way to go to get huge, truly huge, in a way that the really large companies are going to think about it as a primary channel for sponsorship and not just kind of a minor secondary thing that they do.

[00:40:34] When paid search was kicking off, it was kind of like this one person's little part-time job for a pretty large company, just throwing some money at AdWords. So, yeah, those were the days.

[00:40:46] All this has been really awesome, but unfortunately we are going to have to start to wrap up. I think this conversation will be interesting to more than just the three of us.

[00:40:56] I hope so.

[00:40:58] So I'm sure, we know our audience happens to be podcast listeners, given that this is a podcast. We're talking to our own type of people for sure with this show.

[00:41:10] But yeah, as you've been listening, if you do have thoughts or questions or things like that, please definitely reach out on our Facebook page or the Measure Slack. Well, one of the things that we love to do, or like to do, at the tail end of the show is our last call. We go around the horn and talk about something that we find interesting or useful in our job, career, or life. And so, Steve, why don't we kick it off with you?

[00:41:39] Yes, sure. So I came across this article last week in MediaShift. The title of it is kind of fun, I like to say it: the title is "Bulgarian analytics startup aims to fix how publishers use data," and I just like seeing that phrase. So if you search MediaShift for "Bulgarian analytics startup" you'll find it.

[00:41:59] So I feel like the content has to be a letdown, like, you know, how could it possibly deliver on that?

[00:42:05] Well, I would say, you know, there are a ton of articles and perspectives across the years about how publishers and media companies use analytics, right? There are a lot of ways to do that: you know, how do you integrate analytics into your newsroom and get your content folks really embedded in the numbers? I mean, this article is kind of interesting, but what really stood out for me is it jumpstarted yet another conversation, of many that we have at NPR, around how we expose folks in our newsroom and our programming groups to all the numbers we have in a way that's simple and understandable, right? So they talked about this company that does something they call a CPI, a content performance indicator, where for every particular story they'll basically compute an index, a calculated score, around that story's performance. It's a combination of page views and shares and likes, a calculated score that they've made up. And we have this conversation all the time within NPR as well: there's a certain benefit to creating a nice simple index score for evaluating content, but at the same time it hides all the really interesting complexity about how individual pieces can be better or worse in certain ways. So is that better? Is that worse? This article, you know, it's not a crazy detailed article, and I don't think it pushes things in a lot of new directions, but it jumpstarted a conversation within our group that's been really healthy over the last couple of days.
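
As a toy version of the "content performance indicator" idea Steve mentions, here is a short sketch that rolls page views, shares, and likes into one weighted index. The specific metrics, weights, and baselines are invented for illustration; they are not the startup's actual formula, and the trade-off Steve describes (simplicity versus hidden nuance) applies to any score like this.

```python
def content_performance_index(page_views, shares, likes,
                              weights=(0.6, 0.25, 0.15),
                              baselines=(10_000, 200, 500)):
    """Score a story relative to "typical" performance, where 100 = typical.
    Weights and baselines are illustrative assumptions only."""
    metrics = (page_views, shares, likes)
    return 100 * sum(w * (m / b) for w, m, b in zip(weights, metrics, baselines))

# Example: a story that over-performed on page views and likes but not shares.
print(round(content_performance_index(page_views=24_000, shares=150, likes=900)))
```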

[00:43:32] All right. So, yeah, I've got the last call. I just recently started reading a new book called Smart Choices. It's a practical guide to making better decisions, and I'm really enjoying it. It's written by John Hammond and Ralph Keeney and another guy whose name I'm totally blanking on. But, you know, what we do in analytics is help people make better decisions, so I'm trying to learn how to do that. So I'll go with a book recommendation.

[00:44:02] OK, I will continue the downward slide of the quality of the last call. I believe I made a podcast recommendation on the last one. There is no flipping way that we could actually talk to NPR and not have me just totally logroll for NPR. I'm going to do a dual one, and I'm breaking, I'm sure, some internal style thing, and both of these have been mentioned while we've been talking, but because it came back to me: the Planet Money episode on "A or B," not super deep on A/B testing, it was back in December, episode 669. But if you are in the optimization space and you're sometimes trying to talk to a relative or a friend who says, "what do you do?", that's not a bad way to do it. Say, hey, you know, listeners to NPR think that whoever is being talked about has credibility, as Steve told us, and point them to that episode. And then, second, I'm also going to last-call the NPR Politics Podcast, because if you enjoy this podcast, if you're at all a political junkie, and that is another one of my many, many vices, the politics... Slate, I mean, that was bad, that was bad. The Slate Political Gabfest has been around longer than the NPR Politics Podcast, but the Gabfest, without saying it, pretty much said that the NPR Politics Podcast overtook them by some measure at some point.

[00:45:39] I think the quality of having smart journalists talking, you know, everyone from Tamara Keith, who apparently everyone adores, to Ron Elving, who is just like the old man of the sea, to Sam Sanders. So I kind of think the format is great, because I think there are five to seven different people and they just kind of rotate through, and it's in theory kind of a weekly roundup, but they've been doing three or four a week. And it's a little mind-boggling, being on the inside of this podcast and knowing how long it takes us to get from recording to standing it up, that they turn stuff around inside 10 hours. So I'm going to do a dual last call, and they're both podcast- and NPR-related. Preach.

[00:46:25] Well, that is outstanding. Steve Mulder, thank you so much for coming on the show.

[00:46:34] It’s been a pleasure. Great to be here thanks.

[00:46:37] Awesome. Well, as you've been hearing, podcasts can be interesting to measure, so don't hesitate to reach out on Slack or Facebook to talk about it some more. If you're not on the Measure Slack, then you probably ought to be. And since we're talking about podcasts, maybe you want to jump over to iTunes and rate the show.

[00:46:58] Thanks for that, we'd appreciate it. And in the end, you know, give a couple of listens to some NPR podcasts. I think Steve will know exactly where that bump in traffic is coming from.

[00:47:10] That's right. So, for Tim Wilson, my cohost, I'm Michael Helbling. Thanks, you guys.

[00:47:26] Thanks for listening, and don't forget to join the conversation on Facebook, Twitter, or the Measure Slack. We welcome your comments and questions on facebook.com/analyticshour or @analyticshour on Twitter. So smart guys wanted to fit in, so they made up a term called analytics.

[00:47:47] Analytics don't work. Don't worry, I'm either anonymous lemur or anonymous wolverine. I'd like to think of myself as anonymous wolverine, to be honest. But... I like podcasts. I hear they're going places. Yeah, podcasts are something, you know, we should keep in mind. Only that, with better content. Yeah. Serial's second season kind of... So did True Detective's. So, hey, nobody bats a thousand. Yeah, but The Wrath of Khan was the best Star Trek, and that was number two. Good point. The Empire Strikes Back. Just saying, number twos can be good. Maybe get your brother to come back down to Columbus. I haven't seen him in three years. He spent a little time in Africa recently and now he's back here. That's good. You have a similar relationship with him to the one I have with my sister. We talk sometimes; mostly I just watch her Facebook posts. So... that was like, right, like an older sister thing. Oh, I could go down search engine history. Also, my dog has decided right now is a good time to bark. That does not happen in NPR's DC studios. Exactly. Although there was like a family day, a Bring Your Daughter to Work Day or something, and somebody went into the studio, not my kid, wrong, but a kid did, and all sorts of mischief happened. So, you know, it happened. Definitely. Whose kid was it? Actually, I never heard whose kid it was; maybe they were trying to keep it a secret. There was too much embarrassment. But the engineers acted like it.

[00:49:35] It was really Bob Edwards who snuck in, and he did it better. I'm a borderline unhealthy NPR fanboy, so... I don't know, that sentence makes no sense to me. So I'm looking here at Podtrac, and, Tim, we have a little bit of work to do to crack the top two. We call ourselves the number one explicit analytics podcast. That's what we are, the digital analytics one. I think all you really need is This American Life to, like, link to a sample episode one weekend and then you'll be golden. Well, if you're doing all this testing on the NPR One app, I mean, we should probably just sort of chuck one of our episodes in there and just see how it does. This may break out ahead of time. Castro is just...

[00:50:41] WAGNER I can’t measure.

 
