#198: Live from Marketing Analytics Summit

We’ve always said that the genesis of this podcast was the lobby bar of analytics conferences across multiple continents, and this year’s Marketing Analytics Summit in Las Vegas was a reminder of our roots on that front. All three co-hosts made the trip to Caesars Palace for the event. Moe presented on bringing a product mindset to analytics (by “presented on,” we mean “workshopped content for a future podcast episode”), and the closing keynote was a recording of the show in front of a live (and thoughtful and engaged) audience. Give it a listen, and it will almost be like you were there! 


Episode Transcript

[music]

0:00:05.7 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language. Here are your hosts: Moe, Michael, and Tim.

[music]

0:00:22.1 Michael Helbling: Hello, everyone.

[overlapping conversation]
[laughter]

0:00:24.6 MH: Welcome to the Analytics Power Hour. This is Episode 198. Okay. It’s been amazing to be back in person with analytics people, although I feel like I haven’t been ready for it, and I feel super tired from all the human interaction, but it’s been awesome. So let’s get this started. What a pleasure to be here in Las Vegas at Marketing Analytics Summit, recording a podcast live with all of you. I’m very excited. There may be prizes. I’m looking forward to everyone’s participation. So let’s get started. I’m Michael Helbling, I’m the Managing Partner of Stacked Analytics, and please, join me in welcoming to the stage, my two co-hosts. First up, Moe Kiss, Marketing Data Lead at Canva. Come on up, Moe.

[applause]

0:01:19.8 MH: And of course, the Senior Director of Analytics at Search Discovery, the quintessential analyst, Tim Wilson.

[applause]

0:01:31.6 MH: Alright. Moe and Tim, what a pleasure to be back in-person together.

0:01:36.1 Tim Wilson: Can we hug?

0:01:37.2 MH: Let’s do a podcast.

[laughter]

0:01:40.8 MH: There’s a recurring theme in the classic Australian movie “The Castle”, where Darryl Kerrigan is complimenting something his wife has done, and she always brushes it off as commonplace, and he always replies, “But it’s what you’ve done with it.” And often the work we do in analytics is anything but mundane, but how and when we communicate those results, the insights, the recommendations, really matters. It’s what we’ve done with it. So, live from Las Vegas, we’re gonna dig into this topic a little bit. And we have been gathering input all week from conference attendees, from social media, from other ideas and conversations, and so we’ve got a number of topics to bring forward, but we also wanna hear from you as well. As we go through this, we’d love to hear you chime in, ask questions, it’s a participatory event…

0:02:38.6 Moe Kiss: Heckle… Heckle is… Heckling is encouraged?

[chuckle]

0:02:41.8 MH: Unless you are a sibling of a co-host, then that’s not allowed.

[laughter]

0:02:47.5 MH: So let’s get started. Alright. Tim, why don’t you kick us off with our first topic?

0:02:52.4 TW: Oh, really? [laughter]

0:02:52.9 MH: We’re delivering results. Yeah. You’ve got this whole spreadsheet there, you can…

0:02:57.3 TW: I was not prepared for that.

0:02:58.7 MH: Oh. Well, just pick one. Oh, you wanna start?

0:03:00.6 MK: I’ll start. So…

0:03:01.5 MH: Moe will start. Moe is more prepared.

0:03:03.7 MK: I had a situation recently where I was sharing the results of a particular experiment we’d run. It was honestly the best feedback I’ve ever had, it was glorious. It was like, “Oh my gosh, team, this looks amazing, I love how you’ve communicated the results, I agree with all the recommendations, full steam ahead.” And I was so pumped. And it wasn’t till I was walking home later that I was like, “Huh, I just told that person exactly what they already thought was true, I told them what they expected.” So that’s a really nice situation, but what the heck am I gonna do if the next experiment doesn’t align with their personal opinion or their preferences or their worldview? And to be honest, that’s why we have this podcast. I have not figured it out yet.

[chuckle]

0:04:04.8 TW: Have you had to do that yet with that person?

0:04:07.0 MK: No. But it’s coming, I know that it’s coming and it’s gonna be tough.

0:04:12.5 TW: Don’t you feel like you’ve got credibility though, because you got to give them the win?

0:04:17.1 MK: I’m hoping that yes. But it doesn’t always work out that way, right? Sometimes the first time you’re presenting results is you’re telling them the thing that they don’t expect and that’s really freaking hard.

0:04:27.9 TW: True.

[laughter]

0:04:30.4 MK: You gonna weigh in with any advice here, Tim?

0:04:32.7 TW: Well, it is actually making me realise that… I think it was… and if I’m gonna go off memory, I’ll find it later, but Frederick Warner, one of the… He’s not here, but he had some great thoughts on Twitter. I think it ties into the… Well, certainly don’t do it in front of an audience. If you’re gonna surprise somebody, talk to them first… I don’t know. I did this a few times and it bit me in the ass, and then I’ve not done it since, because you really don’t want the key stakeholder to be surprised in front of their peers, whether it’s good or bad. ‘Cause really, you wanna go in as a… And I think it was Frederick who’d made a comment about a couple of these, and I’m about to blow through a couple, ’cause one was like, “Don’t surprise them, talk to them ahead of time,” and then, “Treat it as… ” I loved his comment about making it our result. So it’s not like you’re telling that person it’s the result…

0:05:23.9 MK: Actually, he said, “Make it their result.” And what I found, which is really interesting, so we have this amazing woman who works with us at Kantar and does a lot of our ground research, and it’s funny because I noticed when she was presenting back to Canva with results, she always said the word “our,” and it just is this really small tweak of words, but it created this shared commonality about the research they were presenting. And both me and the head of brand marketing really picked up on it, that communicating it as our result, something we’re sharing, we’re doing together, versus “Your result” or “My result” was a really nice way to frame it. So yeah.

0:06:03.3 TW: Which has come up in some of the… I think two different sessions talked about the whole partnership. And being on the agency and consulting side for a while now, it does start to crawl under… get under my skin when the biz dev people are saying, “We wanna be your partners.” And I’m like, “But you’re saying we wanna be your partners ’cause we wanna be able to sell you more.” Whereas, I’ve got some clients where I do feel like I’ve been able to… I think of myself as an extension of their team. And when we’re using “we” language, when I’m using “we” language talking about the client and us, and they’re using “we” language talking about it, I’m like, “Yeah, this is great.” Unfortunately, the client I’m most thinking about right now has had high turnover and they’ve moved on to better things. They’re still a client; it’s just the people that we were so “we” with have moved on.

0:06:54.3 MK: We’ve really reached a low point early in our conversation.

0:06:56.3 MH: Yeah.

0:06:56.9 TW: That’s right. That can bring us down.

0:06:58.3 MK: Like a resignation stage.

[laughter]

0:07:01.6 MH: One of the drawbacks of the high turnover rate in our industry. Alright. What about in situations where you’ve got a recommendation and you just have to finally accept that the business isn’t ready for what you’re recommending? What about stuff like that? Because as analysts, we see a broad cross-section of data and information, but sometimes the organization isn’t there.

0:07:30.2 MK: Yeah, it’s really funny. This one I think a lot about, because I think particularly earlier in my career, I would have fought that fight, I would have died on that hill, because I was like, “But the data says this, this is the right thing to do.” And meanwhile, the business is like, “Moe, we’re trying to hit a profitability target. No, we’re not gonna go and invest $20 million in this thing over there, we’re not gonna hit our target. It’s a silly recommendation.” I’d be like, “But the data says.” And the funny thing is, there’s actually a really amazing book by Adam Grant called “Originals” that first got me… I read it probably four years ago now, and it made me really reassess my thinking on this. He tells this anecdote of an analyst at the Central Intelligence Agency, who had this amazing idea that she wanted to revolutionize the way that they worked. And she kept pushing shit uphill and realizing that no one was listening.

0:08:22.6 MK: And she finally just stopped, and was like, “You know what, I’m just gonna put this to bed.” She waited like three or four years, she got a bunch of promotions. And then she was like, “Now, everyone’s talking about all this stuff, the climate is right, I’m going to recommend this suggestion again.” And it was completely adopted. And I do think there are times when the business isn’t ready for it, or it’s not the right thing to recommend at that point in time. And you just need to suck it up and pat yourself on the back for knowing better. But just… It’s not there yet.

0:08:53.9 TW: The other way to look at that is that introducing it takes people time… Especially if it’s getting them to shift. I’ve got a couple of specific scenarios where it’s like, “This doesn’t make sense, you can’t actually track people that way, so let’s look at it this other way.” And they’re not ready to do it. They’re not ready to do a media experiment at a DMA level, so it goes nowhere. And then it comes up again and again, just taking what the media agency is saying, and it’s not really what you want. So it may take a year or a year and a half, and then you start hearing the stakeholder say it back as well. So there’s a part of me that feels like there’s that balance of not digging your heels in, like, “If you don’t do it now, I’m gonna take my toys and go home and pitch a fit.” I definitely pitch fits. But…

0:09:48.9 MK: So, you’re good at wearing people down, that’s the summary here?

0:09:51.7 TW: Well…

[laughter]

0:09:52.8 TW: I think there is the… I’ve had cases where the client, we’ve been on a call, and they’re like, “Oh, yeah, Tim, you’re gonna remind me that that’s the case.” And we’d just chuckle about it and move on. So it’s like, I can show that, “No, I’m not gonna push you, but I’m glad that you’re recognizing that now.”

0:10:11.1 MH: And I think related, there are times when maybe they do wanna do the thing, but they… It’s not the right time. I have a client right now who… They need to hire their own internal analyst now. We’ve got all kinds of work this analyst could do, it’ll help their organization immensely. But they’re a startup. And they’re about to go and ask venture capital for more money, and right now is not a good time to be adding headcount in tech. That is not a good suggestion right now. That’s just not happening. So, while we all agree that’s a great idea, A, if you can find an analyst in the first place, right?

[chuckle]

0:10:49.7 MH: But B, sometimes there’s things that creep in that are external to the great ideas we have or the right timing for things that make you have to wait. And that’s tricky too because you have to re-evaluate or adjust then what is possible in that scenario a lot of times, because it’s sort of like, “Well, we all agree we wanna do it, but maybe it’s not the time for it, or we have to wait for the right moment.”

0:11:15.1 MK: The thing that does make me wonder though, particularly if you’re someone that’s managing people, how do you manage the team’s expectations in that situation of like, “Hey, you need to let this thing go, we all know that we should be doing this thing but we’re not gonna do it. We all need to just suck it up and revisit it in six months.”

0:11:36.1 MH: Yeah. I don’t know. For me, it’s always like the old Bezos disagree and commit kind of idea. It gets a little overplayed, but…

0:11:44.9 TW: Wait. That’s… Say that again. Disagree and commit?

0:11:46.6 MK: Disagree and commit.

0:11:47.5 MH: Yeah.

0:11:47.9 TW: Yeah.

0:11:48.4 TW: Okay. Oh, okay.

0:11:48.8 MH: So we all gotta just ride it out. It’s like, “Hey, maybe I’m the manager of the team, guess what I don’t control? The purse strings of the company, and I’ve gotta go through the right channels to get the right headcount, this is where we are, so let’s all pull together and make this happen… ”

0:12:05.8 TW: Is there a fine line to tread that you’re not trashing the people who do control it?

0:12:11.2 MK: Yeah.

0:12:12.3 MH: I would say I went through a transition in my career where I was definitely… Our team, us versus them, like, “Leadership doesn’t understand us, we’re gonna be our own guerilla unit in here, and all we got is each other.” What I realized as a leader was that that was increasingly unproductive.

0:12:30.5 TW: Moe, maybe we should stop talking about Helbs behind his back so much.

[laughter]

0:12:35.7 MH: As if you were manageable at any point in time, Tim.

[laughter]

0:12:41.7 TW: There was… This is coming to me right now, ’cause of one of the earlier sessions. So, everybody who’s just listening to this later, you missed an awesome one from Wil Reynolds, whose entire session was around asking questions. And I want that list. I want to refer to it.

0:13:01.8 MH: Can I have the list as well? Thank you. Woo-hoo.

0:13:04.5 TW: Moe is gonna tweet the list. But the point, which I will now butcher and [0:13:08.2] ____ will cringe watching, is that instead of just asking one question, getting the answer and then having this urge to go and give the response, actually ask more questions, do more probing, force yourself to slow down. And I’m wondering if that works as you’re communicating results: if they’re resistant, instead of just jumping to, “Well, let me keep trying, let me keep trying,” reverse it, and actually ask them questions, get information from them.

0:13:38.8 MK: Do you know it’s so funny, this actually came up like two weeks ago, and I’m gonna pat myself on the back ’cause I think I did alright. But we’re dealing with a particular stakeholder who has an experience, a lived experience, an anecdote that is very compelling in their mind of, “I did this thing and it worked, therefore, if I do this thing in any other situation, it’s gonna work again.” And it was really funny, we had this analyst in the meeting who just kept trying to argue with him, and I messaged him on the side and was like, “Woah, he’s really digging in, maybe you just take a step back and ask some questions.” And it was actually really funny because as we started asking questions, it really did unfurl and we could be like, “Well, don’t you think if we’re gonna roll out the strategy to 30 countries, it might be worth a test?” And he then was like, “Oh, maybe we should test it and not eyeball this dashboard,” which is what the intended plan was, which I had convulsions about. And it’s funny how that stopping and questioning thing does work.

0:14:48.8 TW: And Josh, just a note for Josh: this is one where Moe did not give enough identifying information, you can just leave all of that in, she handled it pretty well.

0:14:56.9 MK: Yeah, it was just a random person.

0:14:58.5 TW: That was good, yeah.

[chuckle]

0:15:00.7 TW: Good job.

[laughter]

0:15:02.7 TW: Truth, it’s happened a couple of times, there’ve been some edits. [chuckle]

0:15:05.6 MH: Alright. I’m gonna bring up one that I consider now actually fun. Early in my career, and I think for a lot of people, when you’re asked by stakeholders to come up with an analysis or an insight that’s just not in the data… And actually, some of the earlier sessions talked about this. It’s really difficult because you’re under the gun; it’s like, “Well, you gotta come up with something.” And so we twist and turn stuff upside down and we try to produce something. And at this point in my career, I’m actually really delighted to just be like, “The data doesn’t support what you wanna know, let’s talk about another way to get to that answer.” But it took a long time. I realised, I was at a dinner when we first got here, and I would ask everyone, “When was your first Marketing Analytics Summit or eMetrics?” And I was the person who’d been to the oldest one, and I was like, “How did that happen?”

[laughter]

0:16:01.5 TW: Show-off.

0:16:02.6 MH: I’m just old. I’m really old, okay?

[chuckle]

0:16:05.2 MH: But that is something I would love to hear from you two about as well. How has that changed for you?

0:16:10.7 MK: But Helbs, do you think that that just… I hate when I say this, that there are things that I just think take time, like you getting comfortable enough to be like, “Actually, the data doesn’t support that.” I think it’s very different when you are either a junior analyst or you’re starting out your career, or you’re at a new company and you don’t have that credibility yet to sometimes say stuff like that. I feel like… Yeah, it’s probably, it’s more of a you thing than a stakeholder thing, right?

0:16:37.5 MH: Oh, absolutely. But if you’re a leader, though, of a team, trying to figure out if somebody’s really going through the wringer on some numbers, maybe you can pull ’em aside or something and be like, “Hey, don’t be afraid to say that the data is not there to support the insight or the analysis we’re looking for.” I don’t know. I agree. I think it’s a maturity thing for me, for sure.

0:17:00.5 MK: Can I… I just wanna get a show of hands, which Tim can calculate on the fly for the listeners at home. Who uses the shit sandwich method when they’re giving bad news?

0:17:11.4 TW: Okay, that looks like about… Hold on just a second, about 327.

0:17:16.9 MH: Yep.

0:17:17.5 TW: Okay.

[laughter]

0:17:17.8 MK: Okay. So we’re gonna say, about 40%?

0:17:21.6 TW: Yeah.

0:17:21.8 MH: Yeah.

0:17:22.3 MK: Okay, cool.

[laughter]

0:17:25.7 TW: What is the shit sandwich method? Sorry. Bad…

0:17:27.5 MK: Like good news, bad news, good news.

0:17:28.6 TW: Good news… Okay.

0:17:29.9 MK: You might say to them in the course of the email or the note or whatever, “The data doesn’t support that.” But you’ll be like, “Oh, but I found this other nice tidbit, and oh, you’re doing such a good job at the end,” but in the middle, you’re like, “No, that’s not a thing.”

[laughter]

0:17:45.6 TW: Part of that too, though, is getting the stakeholder on your side. I wasn’t intending to wind up going back to that again, but that idea that it’s not the back and forth, that’s the… I’ve watched, and I’ve certainly done it, the “I have a task, I need to do it, I’m gonna keep spinning until I think I can find something. I’m up against the deadline ’cause I said I’d have it by Friday, I’ve gotta find something. Let me just get something and then throw it out… I just can’t wait to be done with this analysis.” And that’s really unhealthy, because then you want to just throw it over the wall, hope they look at it, answer any questions, and then just move on. Instead, say, “Here’s what I found, I’m not finding what I think you’d like, I don’t think I’m going to find what you’d like, I’m happy to continue digging into this.” And I think in-house it’s…

0:18:43.9 MK: Are you though? Are you actually happy to keep digging into this?

0:18:46.9 TW: I’ve…

0:18:48.0 TW: I am also having, I guess, in that scenario, the luxury of being on the consulting side and saying, “It’s your money… ”

[laughter]

0:18:58.2 TW: “I think we’ve exhausted this, I don’t think it’s productive, I’ve demonstrated that I’ve tried really hard, and I’m trying to give you anything of value that I can. To dig further is gonna be costly in time, and I’m happy to do… I am serving at the will of the client, but I think your money is maybe best invested elsewhere,” and showing them I’m trying to be a good steward of their funds. But it’s just the facts. I think analysts, we get into this mode of like, “Well, in my mind I thought it was gonna take a couple of hours, I’ve been around long enough to know I should at least double that, I told the stakeholder four hours, I’m now 12 hours into it, I can’t see my way out of it. Oh my God, I’ve failed.” It’s like, “Well, okay, maybe your multiplier should be a little higher next time.” But being transparent. I think Riesling in her session said, yeah, admit it if it didn’t work out… And I’m forgetting the exact anecdote, but it was like, yeah, if there was a mistake or it took longer… I don’t know. I just over-index on transparency. And then, again, saying, “We want to collectively get to business value, so let us try to solve this.” And I can’t make incredibly messy data super clean and easy to get to with the snap of a finger.

0:20:27.9 MH: Alright. At this point, I think you’re on to the game we’re playing, so let’s see if any of the 817 people in the crowd here, based on our quick math earlier…

0:20:38.5 TW: Did you literally just punch that in?

0:20:39.1 MK: I was like, “Really? Do… ”

0:20:40.3 MH: Yeah, I did 327 divided by 0.4. And that gets 817.5…

0:20:44.7 TW: Oh, good job.

0:20:45.6 MH: Thank you. I…

[laughter]

0:20:48.0 MH: They have calculators.

0:20:49.2 Audience Member: Is that the new math?

0:20:50.1 MH: Yes, the Common Core, it’s like, no…

[laughter]
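
For anyone following along at home, the on-stage math is just the show-of-hands count divided by the guessed response rate. A minimal sketch, where 327 hands and the 40% share are the stage guesses, not measured data:

```python
# Back-of-the-envelope crowd estimate from a show of hands:
# if roughly 40% of the room raised a hand and 327 hands were counted,
# the implied room size is hands / fraction. Both inputs are the
# on-stage guesses, not measured data.
hands_raised = 327
assumed_fraction = 0.40  # share of the room guessed to have raised a hand

estimated_crowd = hands_raised / assumed_fraction
print(f"Estimated crowd: {estimated_crowd:.0f}")  # -> 818 (327 / 0.4 = 817.5)
```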

0:20:52.8 MH: Whatever. There’s a lot of people in this room. Anybody have ideas or thoughts they wanna share or questions or… Oh, we’ve got a hand. Alright, perfect. Although I just learned, if a woman goes first, it’ll encourage more women to go. So consider carefully, Doug.

[laughter]

0:21:11.0 Speaker 5: Okay, life box ticked, I’m on the Power Hour. This is great, this is awesome. Thanks for having me. Moe, I wanna go back to the test results that you were talking about earlier on.

0:21:19.5 MK: Oh yes.

0:21:20.5 S5: Yeah. It’s all upside. There is no downside. If we are talking about winning or losing on a test, or if it doesn’t go the way you want, then that’s a zero-sum game, and I don’t think that’s right. I think we are seeking to answer a question, and it is either, “Yes, you’re right,” or “No, you’re not right.” “But why?” And it’s the why, it’s the understanding of the reason for that result, that’s all upside. So we’re not gonna give bad news and say, “Sorry, your test tanked,” it’s like, “Awesome news, we found out some really cool stuff here.” And that goes for whether the metric went north or whether it went south. As long as we can say why, then we’ve got good news to deliver.

0:22:01.1 TW: So is that… Are you putting forth the like, “Oh, no… Don’t just stop with the results; segment, slice, and drill until you find something”? Is that…

0:22:09.9 S5: Yeah, yeah, yeah. We don’t test to improve conversion rate, we test to learn.

0:22:14.6 MK: I do think that… If you asked me on another day, I would probably say exactly what you just said. I think it’s very difficult when there are strongly held personal views about particular marketing channels, particular tactics, and you’re trying to challenge that really deeply held belief by being like, “Look, we ran the experiment, we learned something, but you’re still wrong.” I think it is… I do share your view, but I do think it’s more complex than that.

0:22:45.3 S5: It is, you’re right. I absolutely agree in that respect. But in that scenario, with that context, then you make sure the test design and the hypothesis is worded such that we’re exploring this deeply held view, in order to learn more about it, in order to learn what is the impact of that, what else can we do with it? How else can we turn it into opportunity?

0:23:08.5 MK: Yes.

0:23:11.5 TW: I think I still have a little bit of pause when it’s the… We design the test to answer a binary question with a degree of uncertainty, or a degree of certainty, and if we say, “Well, we answered it, but not how we wanted.” But the why troubles me a little bit, because getting to why is getting to causation, and there’s a degree of saying, “I ran an experiment, and now, after the fact, I’m gonna go wandering through the data.” I’m not saying you don’t wanna dig into it, but I think there’s a risk there of setting an expectation that when we run an experiment, we’re gonna guarantee learnings. It’s like, well, your experiment design needs to cover what learnings you wanted to get. If you said, “My experiment was to learn whether this causes a difference or not. And if it doesn’t, now I wanna answer why it doesn’t,” then that can be a slippery slope where you’re maybe setting expectations where, well, you may need to do some more experiments. I think you wanna look. I just think we have a tendency to sometimes over-promise.

0:24:32.0 MK: And I’ll just add. The funny thing is if I thought the business was so resistant to a particular result, as in, it definitely wouldn’t make any changes based off it. I probably wouldn’t let the team do the experiment in the first place because I actually don’t think it’s valuable in that case, you’re wasting resources, you’re running an experiment that’s gonna have no impact, it would be… I don’t know. People are probably gonna tell me I’m wrong, but if I genuinely believed that one outcome would just be disregarded, I probably wouldn’t do it. Do you wanna just introduce yourself?

0:25:06.1 Lexi Keogh: Yeah. Hi, I’m Lexi Keogh. I’m the EWI… Sorry, I shouldn’t even say EWI. Gosh. I’m so used to introducing myself internally…

[laughter]

0:25:15.1 LK: That I have an internal acronym, so I shouldn’t have even said that. So I’m actually Lexi Keogh, I’m the Tagging and Measurement Manager at Eli Lilly. And I’m actually gonna ask a question on behalf of my newly made friend, who was wondering: when you’re approaching a stakeholder, just on the subject of experiments, and they’re in business-as-usual mode, and you’re going, “Okay, well, this is some great information,” how do you even get them to buy in to running an experiment? Or does it really have to come from the stakeholder? Because an experiment is really gonna lead to learnings to then be able to determine what to do next, but if you’re just gonna continue business as usual, what do you do next? So, how do you get someone to really buy in to an experiment?

0:26:05.8 MK: I am happy to take this one but I’m…

0:26:07.1 TW: Please do, ’cause I’ve gotta ramp… So why don’t you take it?

0:26:09.6 MK: Oh, okay. I’ll let him get on his soapbox afterwards.

0:26:11.5 TW: No, I can…

0:26:13.0 MK: I actually think it’s all culture. And that’s a really freaking hard answer, but you… If you don’t have a culture of willingness to experiment in your business, you’re screwed. You might find one odd stakeholder that does wanna do an experiment, and that’s amazing, and I would try and partner up with those people. But it’s definitely an organizational thing. And I very rarely pick organizations that don’t have an experimentation culture.

0:26:39.1 TW: But is it because they expect that you can get the answers without running an experiment, that they want the insights… Now you’re gonna have to…

[chuckle]

0:26:45.8 MH: That’s right.

0:26:46.5 TW: Do you work for a three-letter acronym group within your organization as well that you can…

0:26:51.4 Tabetha Carnahan: So, my name is Tabetha Carnahan, I work with Cornerstone Building Brands as the Digital Marketing Manager. Primary focus is demand generation, but they’ve looped in analytics with that. And we work as a center of excellence on my team across all of these different brands in construction materials and exterior building products. So, long story short, a lot of what I have come across in my career with this company is an assumption that they already know their audience, and a lot of that comes either from years of experience within the company itself and doing the same thing over and over again, expecting different results, and/or working with strategic partnerships where they come in and say they’ve done all this research, and this is the direction you should go in, and then asking another agency to execute on that strategy. So, there’s a lot at play. And most of the time, the biggest challenge is getting them to realize that there’s a percent chance that they might be wrong.

0:27:52.7 MH: Yeah. One of the questions that Wil Reynolds mentioned when he was speaking earlier was, “Can you think of a time where it didn’t work the way you thought it would?” And I was like, “Man, I’m gonna use that question so often.”

[laughter]

0:28:05.9 MH: And I had at one point in my career a time where I was trying to evangelize the idea of doing additional experimentation and testing, and there was no culture of that. And it wasn’t that we were really… It wasn’t really acceptable to just say, “I wanna run a test on this.” But one of the things we did was we just started talking about what we saw as opportunities in experimentation more broadly, and as we got opportunities to test things, then it became a little more okay to discuss the idea of it. And I think my biggest huzzah moment was when we were in a meeting one day and the CMO just turns around and it’s like, “Well, how are we gonna A/B test that?” And I was like, “Yes!”

[laughter]

0:28:46.7 MH: It’s the whole, like, top of the stairs in Philly, Rocky moment, and the freeze frame credits roll. But sometimes I think you have to take… ‘Cause I think Moe is absolutely correct. If the culture isn’t there to support it, then what you have to do is an oblique attack, to come in and start to seed the idea of experimentation into the culture slowly, which is not easy given the position that you described, ’cause you’re like, “Well, we gotta move, move, move right now.” It’s not like you’ve got six months to plot this out. But sometimes that’s the path that you might have to take.

0:29:25.6 TC: Thank you.

0:29:26.7 MH: Yeah.

0:29:26.9 TW: As you describe that… ‘Cause I’m hearing, I’ve run into cases like that, where there’s silos of like, “We’re gonna have this group do the research and come up with the strategy,” and they’ll give us personas and audiences, and then the media agency is gonna say, “Well, we can’t really give you those attitudinal audiences, but we’re gonna map it loosely.” And we assume their audiences are pure, so now we’re gonna do this execution. And so everybody along the line says, “I’m gonna stay in my lane, I’m gonna assume that I’m getting perfect information coming in.” What I’m actually doing is imperfect, but they’re handing it to the next group that assumes it is perfect information coming in. I’ve got a case with a big pharma company where they had done some amazing, clearly very expensive, really good research. They had built microsites that were interactive, that were amazing, and everybody had forgotten about it. So it was the opposite: at one point, like three years ago, somebody had come along and, based on that, did this really, really crude mapping to how they were gonna segment their database, and they’d been moving along with it with labels that came from that research.

0:30:45.3 TW: I’d started asking, like, “This came from somewhere, can we find where it came from?” And I pulled it up, and I was like, “Well, wait a minute, there are so many testable ideas in this, can we pull this back up?” ‘Cause there’s this tendency to always wanna have like, “What’s new? What’s new?” And it’s like, well, they did this… You spent five or six, probably easily six figures on this research, and it seemed solid, but it was like, “Okay, but we never actually tested it.” You ran 70 meters of the race and you’re set up, and then everybody just walked off and went home. So, that wound up being another little bit of a steady drip of saying, “We found this other thing, aren’t you excited about this?” The brand managers had turned over internally, so we were pulling stuff up for them saying, “Do you realize that your predecessor commissioned this and it seems awesome?” And then asking them questions: “Why are we not still looking at that?” And then that ultimately did lead to some tests.

0:31:42.5 MK: So I’ve got one burning in the back of my mind. Advice welcome.

0:31:48.9 TW: Wait, a what?

0:31:50.1 MK: I’m gonna say it now, Tim.

0:31:51.1 TW: Oh, okay.

0:31:51.8 MK: Okay. We’re getting there. I’ve had a situation recently, and I think at every… Every data person has gone through this in their career at some point where people are completely resistant, not to the findings, but the tool that was used. So, I’m not gonna listen to you because it came from Google Analytics, or because it came from Adobe Analytics, or because it came from finance, an old mate, whatever his name… It’s the tool and how we found the information that is not believable. And I actually find that incredibly challenging. The thing that I’ve been thinking about recently is like, “What if I just obfuscate where the data came from so that they don’t know and make it look like it came from the thing that they like, and then maybe they’ll just believe it, and it will all go away?”

0:32:48.7 TW: No. No.

0:32:49.1 MK: That was in my head. I haven’t actually tried it…

[laughter]

0:32:51.7 MK: But I’ve been considering it.

[laughter]

0:32:54.2 MK: Judging by the laughter I probably shouldn’t do it.

[chuckle]

0:32:57.8 MH: Well, you probably shouldn’t talk about it on a podcast, that’s for sure.

[laughter]

0:33:02.4 TW: Well, but I think there’s a part of it, and I think Cory Underwood had chimed in with this one: you do need to know the tool really well, and know the data… You need to be able to defend the tool. There’s the danger of getting sucked into, “We’re trying to have a meeting about a strategically important thing, and somehow we’ve been derailed into how cookie expiration works,” so that’s the slippery slope to not head down. But I do think you need to know it and then somewhat defend the toolset that you have, which I think includes saying, “These are the limitations, I acknowledge these are the limitations.” But… I don’t know. But someone’s gonna bash a tool, like…

0:33:46.0 MK: I just… To be honest, I’m at that point where I’m like, “You know what, let’s just not use that tool.” If that is going to be the friction point to doing the… Or making the best decision, maybe we just use a different freaking tool. And again, not my hill to die on, but I don’t know.

0:34:04.0 Jim Sterne: So I wanna address that. This is Jim Sterne, I run a little conference called Marketing Analytics Summit, and also life goal, I’m on the Analytics Power Hour.

[laughter]

0:34:14.5 MH: Again.

0:34:14.9 JS: In the legal world, they say that you attack the facts, and if you can’t attack the facts, then you use the law, and if you can’t use the law, then you go after the personality of the person. So when somebody says “The tool isn’t trustworthy,” you have to figure out why they’re discounting your results because that’s just an excuse, they’re just hiding, ’cause probably they have no idea how trustworthy the data is.

0:34:39.5 TW: But still challenging, that can still derail the discussion, and maybe that is the, like, “I understand you have some concerns, I will set up time with you after this to go through the concerns. Can we, for now, proceed on the premise that the tool is okay?” Or say, “You know what, if you don’t… We’ll have to have that conversation separately. I’m sorry, we have all eight of you here today, we’ll re-schedule for next week.” That would be brutal. I’d never pull that off, just saying, “Well, you know what, let’s end the meeting.” “Fine.” If you’re gonna be an asshole… I just went straight to the personality part…

0:35:11.5 MK: So you’re gonna take your toys and go home?

0:35:14.4 TW: Well, no. But I mean, more say, I don’t wanna do it publicly… ‘Cause that is the kiss of death, having a discussion about the data and the data source when that’s not the point. Which, some of that goes back to pre-selling. If you know that person is gonna be like, “Going on metric, that’s horrible.” And it’s like, “Okay, what can I… ” If I know that person’s got animosity towards it, then I’ve got to do the painful thing, which I am terrible at, ’cause I don’t wanna go have the… Like, “Okay, I’m gonna set up the one-on-one, I’m gonna talk this through, and I’m gonna try to do it ahead of time,” and at least say, “I wanna make sure that what I’m saying we’re both comfortable with.” So that means maybe I give and say, “I won’t say that,” or “I’ll say it this way… ” And try to get them vested and on your team.

0:36:04.3 MH: Or as Jim Sterne recommends, impugn their character.

[laughter]

0:36:07.1 TW: Yeah, yeah. I think I like that.

0:36:08.1 MH: Alright, we’ve got another question.

0:36:10.0 LK: No, I was gonna make a comment about…

0:36:10.6 MH: Or idea, comment…

0:36:11.9 LK: About your issue. Lexi Keogh here again, from Eli Lilly. When we all started working from home, everyone was like, “Well, the data is now different because we’re gonna have so many people working from home, we can’t filter everybody’s IP address from home,” and I was like, “I get it.” However, here are some stipulations around that. Obviously, most of those people are gonna be visiting the site directly, because you don’t need to be Google searching for us, hopefully not. Hopefully, you’re not clicking on our ads, on our own ads, and driving up PPC. So I think, my best guess and what I went around doing is gathering concerns and understanding what the concerns about the tool were prior to delivering any report. And that goes for both Google Analytics currently and then Adobe, ’cause we had some fun issues when implementing Adobe that were different from Google Analytics. So right now, super fun, I get to do a cross-reference between GA4, GA360, and Adobe and give…

0:37:14.5 TW: Oh, fun.

0:37:15.8 LK: I know, I know, I know. I know. Yeah.

0:37:18.1 MH: A classic.

0:37:19.0 LK: Only do that for a couple of brands, but really giving them that high-level view and going, “Here’s what I understand are the kind of issues between all those tools and all the stakeholders that have raised their concerns, and here’s still my recommendation and my understanding behind the data,” and it still would be the same no matter if we took those stipulations on or not. So, same recommendation, and I can still have a one-on-one with you later to go through the data and your own concerns, but definitely pre-gathering those concerns. That’s definitely a lot of effort, but that’s the best way I’ve handled it so far.

0:37:53.8 MK: I just love that you’re giving me advice that I would give someone else, but I still need to hear it from you, and I’m shaking my head being like, “That’s a great idea, I really need to gather concerns.”

0:38:02.0 TW: But I will say a little pet peeve, because there are analysts who will have to do the IP address filtering, and I would sometimes recommend math instead, especially if you’re a brand that has massive reach. As an analyst, it’s like, “Oh, how can I solve that? Well, I can have an intranet site and they come to this and they drop it.” Or I can say, “If our employees are hitting the site in such volume that that’s meaningfully impacting what we’re seeing, we have a much bigger problem than not filtering out our internal traffic.” But I know there’s this tendency to say, “We’ve got this clever way to filter it out,” but the danger of that is starting to pretend that there’s precision in the data that there isn’t, so…
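
Tim’s “recommend math” argument amounts to a quick upper-bound estimate: assume every employee hits the site some generous number of times and compare that to total traffic. A minimal sketch, where every number is a hypothetical placeholder rather than a figure from the episode:

```python
# A rough version of the "just do the math" argument against elaborate
# internal-IP filtering: put an upper bound on employees' share of traffic.
# Every number here is a hypothetical placeholder, not from the episode.
employees = 5_000
visits_per_employee_per_month = 20      # deliberately generous assumption
total_site_visits_per_month = 10_000_000

max_internal_share = (employees * visits_per_employee_per_month) / total_site_visits_per_month
print(f"Internal traffic is at most {max_internal_share:.1%} of all visits")  # -> 1.0%
```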

0:38:49.2 Barb Kalicki: Hi, guys and gals. I’m Barb Kalicki, I work for Publicis Sapient as the Manager of Data Strategy, and I wanted to add on to everything everybody’s saying. What I usually do, I came across that a lot, especially with our fabulous Google Optimize that we use sometimes.

[chuckle]

0:39:07.5 BK: I had a statistical analysis discussion about statistics and how statistics are statistically reliable because of one or two things, and decided that that route was not getting me anywhere because everybody was glazing over, like you guys are, like, “Oh my gosh, she’s talking about statistics.”

[chuckle]

0:39:28.5 BK: So basically what I just say is, it’s not meant to be like a one-to-one relational tool, it’s meant to give you a direction for further experimentation, and/or to create more experiments, different experiments, out of what you’re seeing. Also, as a directional tool, if you’re gonna change the CTA of an add-to-cart button, it’s not gonna matter, because if somebody’s gonna buy it, they’re gonna add it to the cart anyway. So I always… Especially with Google Analytics, always: don’t worry about it if the employees are coming in directly, because it’s just a way to take your marketing or your campaigns, or whatever ad spend you’re doing, and use it as a directional way to show you how to get to the ROI you want.

0:40:19.9 TW: But it does trigger the challenge we all have, and I think a lot of us have had to… I’ve had to slowly overcome it myself: 20 years ago we started thinking that digital was gonna give us this level of precision and fidelity that it never was gonna give us. Now it’s getting worse, and there’s plenty of consultancies and vendors saying, “We’ve got technical workarounds that are expensive and hard, and they’re gonna get you back to some truth.” And it’s like, there was just that big fallacy. Jim Gianoglio talked about it a little bit yesterday as well: it was never that great, but we let marketers get conditioned to think that because they see a very, very precise number, it must be very accurate.

0:41:08.8 MH: Yeah. Alright, I’m gonna switch… Oh.

0:41:11.4 Speaker 10: Quick question, and it goes to filtering. I have a mentee who listens to your podcast who joined a new organization and they do not filter bots, which they feel, this particular person feels, is about 30% of the traffic. How bad is that? Is that a hill to die on?

0:41:30.7 TW: I’d worry a lot more about the bots… Boy. You guys wanna take that?

[laughter]

0:41:37.1 TW: Talk less.

0:41:37.2 MK: I’m confused. I’m confused by, does this person have to die on a hill for this? I’ve had this problem in companies before, where most people are like, “Yeah, sure, we should fix that, that’s probably a thing we should fix.” But it seems like maybe there is resistance, and that’s why it’s a discussion about whether they need to go up the hill.

0:41:56.1 S10: The response is, “Yeah, we recognize that’s a problem, but it’s not a priority. So back down, don’t harp on that anymore, we’re tired of hearing it”?

0:42:05.6 MH: Yeah. There’s an indication in there, like how important is data to the success of the organization? And then as a result of that, they’ll have to make a judgment call about their role with the data and the organization and whether the organization is worthy of their time. And that’s a great place to be as an analyst in this modern time, where you can maybe talk to some other organizations about whether they value you and their data more.

0:42:33.7 S10: So it’s a hill to die on.

0:42:35.9 MH: Potentially.

0:42:37.3 TW: If you get the bots on the… There are bots on the website, and there are bots that are false impressions, false clicks; there is media they’re paying for that is streaming to devices that are turned off. I would go and die on the hill of bots in your paid media, and trying to find evidence: if there are bots on the website and those bots are showing up as coming from paid media, then their agency or their internal team is reporting a grossly misleading cost-per-click or CPM. And going back and saying, “If this is the issue we have on our website, let’s just apply that number to our media.” You think you’re pretty comfortable, you think you’re below a benchmark for CPM; what if you actually factored in this bot thing? You are throwing money away.

0:43:27.7 MK: And to be honest, when I do that, I often extrapolate out to a year, because sometimes if you do like a day or a week, people are like, “Yeah, whatever.” But if you do it for a year, it’s like, that’s the money you’re wasting. And… I don’t know. Maybe that’s messed up and I shouldn’t do that ’cause it’s manipulative, but it’s worked well.

[chuckle]

0:43:42.2 S10: Wait, we’re analysts. That’s what we do with it.

0:43:45.0 MK: Sure. Sure.

0:43:45.7 MH: A little bit of this, a little bit of that.

0:43:46.7 TW: Again, you didn’t name the role that you were deceiving, so…

0:43:50.3 MK: No, no.

0:43:50.7 TW: So Josh doesn’t have to edit that out…

[chuckle]
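
Tim’s bot argument, and Moe’s annualized framing of it, can be sketched as a quick calculation. Assuming per-click billing, a bot share of reported clicks inflates the true cost per human click proportionally. In the sketch below, all inputs are hypothetical placeholders except the 30% bot rate floated in the audience question:

```python
# Sketch of the bot argument applied to paid media: if a share of reported
# clicks are bots, the effective cost per human click is higher than the
# reported CPC, and annualizing shows the cumulative waste. Assumes
# per-click billing; all inputs are hypothetical placeholders except the
# 30% bot rate from the audience question.
monthly_spend = 100_000.0
reported_clicks = 200_000
bot_rate = 0.30

human_clicks = reported_clicks * (1 - bot_rate)
reported_cpc = monthly_spend / reported_clicks    # $0.50
effective_cpc = monthly_spend / human_clicks      # ~$0.71
annual_bot_spend = monthly_spend * bot_rate * 12  # spend going to bot clicks, per year

print(f"Reported CPC:  ${reported_cpc:.2f}")
print(f"Effective CPC: ${effective_cpc:.2f}")
print(f"Annualized spend on bot clicks: ${annual_bot_spend:,.0f}")  # -> $360,000
```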

0:43:51.7 MH: So I wanna open up another little line of topic. There’s two things that no one ever seems that interested in measuring: one is success, I wanna talk about that. The other is the metaverse, currently, ’cause it’s new and exciting. But no, when things go really well, a lot of times people don’t want you to go into a big retrospective on it, or they don’t care very much. I don’t know. How do you communicate…

0:44:19.7 TW: Don’t.

0:44:19.8 MH: Just don’t?

0:44:21.3 TW: Do I have to do X? You did X.

0:44:22.4 MH: But what if there’s… There’s two things I always think about, there’s, how do we maximize efficiency and how do we make big bigger? If something’s working well, shouldn’t I analyze it to make…

0:44:35.4 MK: Yeah. Look, I work at a company that’s very enthusiastic and positive, so good news is exciting. Yeah. For me it’s actually the inverse: I find it harder to be like, “How are we gonna communicate something that didn’t go well?” than something that did. ‘Cause they’re all like, “Give me the hype reel,” everyone’s really pumped, there’s music made and it goes along with the slides. So I definitely have the opposite problem.

0:44:58.0 TW: But you don’t have… We don’t have infinite resources. There is no world where we can just analyze everything. I push because I…

0:45:09.4 MK: Actually…

0:45:10.8 TW: There is a world where we can?

0:45:12.1 MK: No.

[laughter]

0:45:12.8 MK: But I’m gonna give you an example of where we did.

0:45:14.4 TW: Analyzed everything?

0:45:17.0 MK: No. So actually, we had some very good increases, which again, you might just be like, “Oh cool, hyper growth company doing really well. Growth, yay.” But what we really wanted to understand as the data team was, was this actually growth attributed to things we were doing or was it COVID? And I do think that that is worth an analyst researching, and we did.

0:45:44.2 TW: But that’s still starting, to me… You just articulated a hypothesis of, “Wait a minute, let’s… ” ‘Cause right now we’re at a point where lots of companies just said, “We’re doing great, let’s keep projecting that,” and all of a sudden things are not going well for them. So to me, that’s a very specific “we framed a hypothesis,” which is very different from saying, “This was really successful, can you just analyze why?” Just analyze…

0:46:07.9 MK: But wouldn’t you, in analyzing why have to come up with the hypothesis? Isn’t that the step you would take?

0:46:14.5 TW: Well, and let’s…

0:46:15.5 MK: Okay, hypothetically. Well, not hypothetically. This actually happens. I used to have an exec that I worked with, if I had any graph that just went kinda like that, they would flip out, and the first question would be, “Why?” Oh…

0:46:24.5 TW: For those listening at home, Moe just gestured an elbow…

[laughter]

0:46:29.0 TW: Pointing down.

[chuckle]

0:46:29.4 MK: So a graph that just goes off a cliff, right, and I always had to know why or have a good indication, or a timeline of the answer before I met with this person because that was the first question. The same should happen when the graph is going up. Apparently not.

0:46:48.4 MH: I did not know we were gonna run into disagreement on this one. Very exciting.

[chuckle]

0:46:53.6 MK: Can we… Should we go to a vote? Should we go to an audience vote?

0:46:57.9 TW: But again…

0:46:58.0 MH: I mean.

0:47:00.8 TW: I’m still questioning the framing a little bit. If you’re doing a campaign or doing something and saying, “We’re expecting this to drive growth, drive this to go up,” and it goes up… Versus if you’re saying, “We’re doing nothing,” and it just surprisingly went up, or surprisingly went down. Google has that horrible, awful thing they’ve done, iteration number two or three of “We’ll give you… ” What’s it called? Anybody? Intelligent Insights or Automated… I don’t know what it’s called. It is the, “Go read that, go look at what that is,” and you’ll be like, “This is absolute garbage.” “Revenue has gone up in California by 27%, while revenue overall went up by 10%,” and that is the path you can get yourself sucked down. There’s only so many hours in the day…

0:47:51.4 MK: So you’re worried about more just going down a rabbit hole without a clear direction. Is this…

0:47:55.6 TW: I mean, I…

0:47:57.1 MH: Okay.

0:47:57.8 TW: I about lost my mind when Adobe Analytics rolled out anomaly detection, and there was an odd leader who was like, “Congratulations, analysts, your job just got so much easier because… ” And wrote a lengthy blog post: the analysts can come in in the morning, pull up and see what their anomalies were, and then dig into every anomaly and figure out the why behind every anomaly, and this person said, “That is the role of an analyst.” Well, if you flag anything outside of a basic 95% level as an anomaly, and you look at 100 things, you’re gonna find five anomalies, right? So, point taken. If it goes way up…

0:48:40.8 MK: But isn’t this why all analysts have Spidey-senses?

0:48:44.9 TW: Well, but if you’ve got a stakeholder who says, “If it goes up or down… ” It always goes up or down. I think one time in my life I’ve seen a number stay perfectly flat…

0:48:53.2 MK: But it’s versus your expectation versus your target. Like…

0:48:58.0 TW: Versus the normal variability in the data, right? I mean…

0:49:00.5 MK: Exactly. That’s what I meant.

0:49:00.8 TW: Yeah.
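
Tim’s objection here is the expected false-positive rate at work: scan 100 well-behaved metrics against a 95% band and roughly five will breach it by chance. A minimal simulation of that idea, using purely random data so that every flag is, by construction, a false positive:

```python
# Tim's anomaly-detection point in numbers: flag anything outside a ~95%
# band and about 5% of perfectly normal metrics get flagged by chance.
# Simulated with random data, so every flag below is a false positive.
import random

random.seed(42)
num_metrics, history_days, z = 100, 90, 1.96  # z ~ 95% two-sided band

false_alarms = 0
for _ in range(num_metrics):
    history = [random.gauss(0, 1) for _ in range(history_days)]
    mean = sum(history) / history_days
    std = (sum((x - mean) ** 2 for x in history) / (history_days - 1)) ** 0.5
    today = random.gauss(0, 1)  # same distribution as history: nothing anomalous
    if abs(today - mean) > z * std:
        false_alarms += 1

print(f"{false_alarms} of {num_metrics} well-behaved metrics flagged")  # expect ~5
```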

0:49:02.4 MH: So this is good, ’cause I think the hypothesis as part of the process, I think, is a really good idea and healthy, and I like both those things. Okay.

0:49:11.7 TW: Do you want this to be the last episode? This is where…

0:49:13.6 MH: No, yeah, that’s right. And we’re gonna die on this hill as it’s been spoken before. Alright…

[chuckle]

0:49:17.7 TW: Almost to 200.

0:49:19.0 MH: I’m gonna switch gears a little bit ’cause we’re about to wrap up, but one of the questions that was submitted, I think by someone attending the conference, is, “What is your best piece of advice for new analysts who are looking to make a difference in the field?” I like the question, and I thought I’d ask you two, Moe and Tim: what advice would you give?

0:49:41.5 MK: It depends how they define difference in the field, because when I hear “difference in the field,” I think about the community, and that’s always actually been really important to me. When I started out in data, actually, many of the people that are in this room helped me out; they would get on Zooms with me, they would talk me through technical hurdles. And so for me, when I talk about making a difference, I try and give back to other new people starting out in the industry, whether that’s through a meet-up or a conference or mentorship, because that’s the thing that drives me. But difference in the community can also be contributing to open source code and things like that. I think this person needs to figure out, like, “What is the difference I wanna make?” Do they mean within their own role? Which is very different to, yeah, if they wanna have an impact on other people within the data space.

0:50:33.4 TW: I like that, it’s like, you shouldn’t be trying to make one difference, you should be probably pursuing a few… Well, no, to your point…

0:50:38.6 MK: Well… Okay. Yeah, sure. Yeah.

0:50:40.7 TW: If you say, “I wanna have a singular focus.” But I will… Cassie Kozyrkov who, I think, we all admire, at least the three of us, but she did a recent series of posts where the title… She has a wrap… Overall wrap on this, “10 differences between amateurs and professional analysts.” And I think if I had read that when I was five years into being an analyst, I would have thought, “Oh my God, she’s crazy, this is ridiculous.” I read it now, and there are still a couple of things where I’m like, “Hmm, I don’t think that’s right, so I’m probably wrong. I just don’t understand it yet.” But reading to actually think through what it means to be an analyst, ’cause there are so many people who get, I think, the idea of being an analyst and they’re not actually really analysts. I don’t know if that’s a good answer to the question.

0:51:32.4 MH: Well, I think, if they’re coming here, it’s probably a really great start, being around analytics people, and I think that works. And I think you’re gonna feel the road in front of you is so much longer than the road you’ve already traveled, and that feeling will never go away. In fact, the distance will only increase: the wiser and more experienced you become, the more you realize there is to learn. And so, as an analyst being here, you’re already starting to make a difference. Interacting with the community, learning your craft, and overcoming the challenges that we all face around, “Who’s gonna listen to me? I’m nobody. Well, why should the CEO of this company take my advice?”

0:52:22.2 MH: Those are things we all have to get over in different ways. And probably in a lot of the sessions you’ve heard strategies, and tips and tricks, and we certainly cover that on the podcast. So, thank you for helping us be a little better, and making a little difference, and being part of this podcast today. And thank you, Jim, for inviting us. Thank you all for attending and contributing to this episode. Thank you, Moe, and Tim, for being my co-hosts. And of course, no show would be complete without a big thank you to Josh Crowhurst, who we’re sad couldn’t be here with us. He’s our producer. And usually, we do last calls, but we’d love to hear from you. We’re all here for a couple more hours, so seek us out, but you can also find us on the Measure Slack and what…

0:53:14.8 TW: I think Moe has to head to the airport.

0:53:15.9 MK: I’d have to go to an airport. But sure.

0:53:16.9 MH: Oh. You can talk to Moe at the airport.

[laughter]

0:53:20.1 MH: Or also, we have a LinkedIn page, and we’re also on Twitter. But you could probably find us tooling around at some Marketing Analytics Summit somewhere, sometime. And remember, one of the most important ways you can keep making a difference, and I think my two co-hosts would agree with me, Moe and Tim, you can keep analyzing.

[music]

0:53:41.4 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions, and questions on Twitter at @AnalyticsHour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.

0:54:00.5 Charles Barkley: Smart guys want to fit in, so they made up a term called “analytics.” Analytics don’t work.

0:54:06.1 Tom Hammerschmidt: Analytics. Oh my God, what the fuck does that even mean?

[music]

0:54:14.7 Announcer: These are our people. This is how we really know that we’re in the right room. But there are some fun facts that you don’t know before I ask Michael to come up and introduce everybody. Tim Wilson is a kayaking nature photographer. Moe Kiss is a gardening enthusiast, a new mother. She lives in Sydney, Australia, but has a cat named after her in San Jose, California.

[laughter]

0:54:38.9 Announcer: And Michael Helbling once bought a couch for $44.

[music]
[applause]

0:54:48.3 TW: Rock, flag, and data in Vegas, baby!
