#295: Research and Analytics: the Peanut Butter and Chocolate of Data?

Research and analytics: are they more like peanut butter and chocolate, or more like oil and water? On this episode, we dig into the surprisingly common (and surprisingly unfortunate) divide between these two disciplines with Stefanie Zammit, Global Director of Analytics and Insights at Bang & Olufsen. Stefanie has spent her career bridging the qual and quant worlds, and she makes a compelling case that the best insights come from putting both methodologies to work on the same business problems. From the “never ask a survey question you already have the answer to” rule to why personas are usually terrible (spoiler: it’s not the clustering, it’s the storytelling), we explore how organizations can break down the silos between research and analytics teams. Turns out, the fear of the unknown and a bunch of fancy terminology might be keeping us from some pretty powerful insights. Also, apparently 100% soundproof rooms are absolutely terrifying.

Links to Resources Mentioned in the Show

Photo by Vardan Papikyan on Unsplash

Episode Transcript

00:00:00.00 [Announcer]: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.

00:00:13.24 [Michael Helbling]: Hi everybody, welcome to the Analytics Power Hour, and this is episode 295. You know, a common phrase we hear in our industry is that the data tells us what happened, but not necessarily why. And by and large that’s true. We keep getting better at inference, and some patterns of data are pretty well understood in terms of their meaning, but still, there is simply something so compelling about observing how people interact with the things we’ve built: websites, products, etc. Personally, I remember what an eye-opening experience it was for me 15 years ago, the first time I was sitting in a usability lab watching from behind a one-way mirror as people used the website that I measured every day with my digital data. So we wanted to talk about it, get into the topic of bridging between research and more traditional analytics. And I want to introduce my co-hosts. Val Kroll, welcome.

00:01:12.56 [Val Kroll]: Hello.

00:01:13.56 [Michael Helbling]: Hello.

00:01:14.56 [Michael Helbling]: And I know this is a special topic for you because of your background in customer research.

00:01:20.08 [Val Kroll]: Oh my gosh, doing back flips. Very excited for this.

00:01:22.08 [Michael Helbling]: Yeah, I’m excited too. And Julie Hoyer, welcome. Hello. Hi. Have you done much like market research or customer research?

00:01:32.04 [Julie Hoyer]: Not myself, but I have gotten a chance to like utilize the outputs of some of those studies, which has been nice. So I’m really excited to talk about it more.

00:01:40.36 [Michael Helbling]: Yeah. Excellent. All right. And I’m Michael Helbling. And to bring additional expertise to this topic, I’m pleased to introduce our guest, Stefanie Zammit. She is the Global Director of Analytics and Insights at Bang & Olufsen. Prior to that, she led research and analytics teams at companies like Starbucks and Marks and Spencer’s. She’s worked both as a consultant in the space for many years as well. And today she is our guest. Welcome to the show, Stefanie.

00:02:05.24 [Stefanie Zammit]: Hello. Happy to be here.

00:02:06.24 [Michael Helbling]: Awesome. We’re so glad to have you. And I think this is a topic that, while we cover sort of like data and analytics, research is one of those things that we really like. And so we were really excited when we met you to sort of dig into this topic more. But to kind of catch everybody up to speed, I thought it’d be great to kick off with you explaining a little bit about your background and career and kind of how it bridged these two things and sort of, you know, what your journey has been across research and analytics.

00:02:40.40 [Stefanie Zammit]: Yeah. Absolutely. I’m very much from a pure hardcore research background. That’s where I started my career too many years ago. Actually, my first job was at university. I didn’t even know what research was. I took a part-time job doing the national student survey. That’s like a thing here in the UK; it’s how the university rankings are put together. And when I graduated, and I think a lot of researchers would have a similar story to this, that they sort of ended up accidentally in the field, I graduated during a recession, dark times, and I was desperate. And I thought back to that part-time job I had at university and, like, what is that? Is that an industry? Is that a thing I could do forever? So I did some research, and I was in the UK at the time where, luckily, there’s an amazing research industry. There are so many great consultancies; it’s a thriving industry. I took my first job at a company called Quadrangle, which was a management consultancy that leaned very heavily on their own in-house research function. And it was amazing and eye-opening, and I immediately fell in love with so many aspects of it. I then went to Ipsos, which is one of the big five. That’s where I was like, I need some hardcore research skills, I need to learn about statistics, hit me with the heavy quant stuff, go to a big powerhouse. So I was there for a couple of years. And then after that, I built my own: I co-founded a research consultancy, at this time I was in the Middle East, a company called Intelligence Qatar, which is still very much there today, although sadly I don’t get to play a part in it anymore. And I think that was where, previously at research consultancies, they really divide the teams up. So you had your market researchers, and then you had your analytics department with all the smart people, the statisticians were all there, then you’d have your fieldwork teams, your data processing teams. 
And it was very siloed, and even market research was pretty siloed. You’d have your qual team, and then your quant team, and something that was really hard for me as I progressed agency side was that you’d have recruiters say to you, like, do you want a qual role or a quant role? And I struggled so hard to give them an answer because I genuinely loved both. And my favorite projects were the multi-phase projects where you’d get the best of both. But I was also always kind of looking over the shoulder of my analytics colleagues and statistician colleagues: what are you guys up to? What are you doing? So I always naturally was interested in the entire spectrum. And then when I had my own agency, again, really investing in the analytics side as well as the research side just brought me closer to those worlds. And then I finally went client side. I thought, all right, I’ve had enough of this consulting game, and joined Marks and Spencer. And I was really lucky that they had research and analytics together in one department. That’s all I’ve known, because it was like that in that first company. And we had a great leader at the time who was very adamant that the best deliverables are worked on by both of these teams. So I took that to Starbucks as well, where, again, the organization is huge; in Starbucks you’re looking at around 250 people, but they are all in the same department together, research and analytics. Even if in subteams, at least it’s still in the same family. So it’s something that I’ve been so passionate about to this day, and I think it really set me up for success to do what I do now, which is to lead both the analytics function and the market research function in one team.

00:06:09.76 [Val Kroll]: I love that. The one thing that you mentioned there was about not being able to pick which you liked better, the qual or the quant, and that your favorites were the multi-phase projects that start with the qual and then lead into quant. I feel like that’s a little bit novel, like a little bit inside baseball for researchers. Can you describe what that is? Because I have a follow-up question after you talk about that a little bit, but just kind of describe why someone would have a multi-phased approach to their research project.

00:06:40.60 [Stefanie Zammit]: Absolutely.

00:06:42.20 [Stefanie Zammit]: So every methodology has its benefit. Qualitative research is where you start when you don’t know much about a subject and you need to explore. It’s very exploratory. You are using it to figure out where your hypotheses even are. You might have one or two hypotheses, but they’re fluffy and you need to explore the topic more. So qualitative is where you do that, in this kind of limitless way, a very open data generation phase. And then you need to validate, because you’ve spoken to, I don’t know, max 40 people if you’re doing a really big qualitative project. So you want to validate that, and you need some statistics and some numbers behind it. So you want to do your quant: I have my hypotheses now, I’ll run a survey to measure and size those truths and see how statistically significant they are. And methodologically, these require two different kinds of expertise, because qualitatively you’re trained in moderation, projective techniques, how to read between the lines, how to read people’s faces and emotions and hear what they’re not saying as well as what they are saying. There’s also a much deeper psychology to interpreting those insights because, again, you’re reading between the lines. Quantitatively, you need to know about driver analysis and cluster analysis. In fact, you need to know all of the statistical models that give you the derived insights that are so, so valuable from a survey. So they are usually kept separate. And I think that’s a shame, because the best projects are the ones that do both of these things for a really strong final insight.

00:08:21.36 [Val Kroll]: Yeah, that was very well explained. And I remember in my market research days, a lot of times, if you were to do a quantitative survey, you’re picking the list of options that someone is going to select as the right answer to the question for them, and sometimes that list isn’t always clear, like what should belong in it, even if it’s a list of competitors. A lot of times, clients will think about in-category competitors when they think about who the competitors are for, say, an alcohol brand. But that same dollar could be spent on other things that are out-of-category competition. And so you can use qual to help explore and figure out what the list even is, because you could miss so much if you start just with quant without going broad first, being exploratory, exactly as you said, Stefanie, to figure out what your hypothesis is. And the reason I want to dig into this a little bit is because this is one of the ways that I love illustrating, especially to people in analytics, some of the value of bringing these worlds together. Because it’s not just using one single methodology or tool; each helps you illuminate a different part of your question or your process. And so how those two things come together very naturally inside of research is one of the ways you can illustrate the coming together beyond just the direct consumer or B2B research context.

00:09:49.28 [Stefanie Zammit]: Exactly. And I think from a research-only perspective, I was already, at a very early stage, so powered up by the idea that if you put these two together, you’re getting better insights, because you’re starting broad and then you’re getting specific with the quant. And I took that with me as I gained more seniority in my career and started working, especially in-house, where you have colleagues in other disciplines across data and insight. I took it into that as well, to say, well, how can analytics be part of this? And why are we asking questions in surveys that we already have the answer to? That seems like a huge waste of time. How can we be using all these disciplines to get the best insight? And all of this ultimately comes down to a passion to chase the best insight, right? Methodology should be irrelevant; it’s not about the journey, it’s about where you end up. And I think having that crystallized in my mind from the very beginning really helped me see the world the way I see it now: who cares what you’re trained in. At the end of the day, we use the best method to get the best insight. And it doesn’t matter whether that’s in this team or that team.

00:11:01.92 [Julie Hoyer]: And I feel like you are one of the rarer data leaders out there who recognize it has to be problem-focused, instead of leaders coming to their teams ordering a solution. They have a problem, but they’re not always communicating that to the teams that are going to help serve them. They’re kind of coming with, well, I want you to pull these numbers or ask these questions, instead of, to your point, just anchoring on: I don’t care how you do it, you guys are the experts in that part, but what I’m facing is problem X and I’m looking for ways to solve it. And then letting people get creative with how they might partner together to get them the best solution. I feel like we run into that at kind of all levels of the business. We’ve all had different experiences with trying to overcome that hurdle. So it’s really refreshing to hear you talk about it so eloquently in the way you frame it.

00:12:06.64 [Stefanie Zammit]: And 100%, that’s so normal everywhere. I’ve experienced it everywhere, that you have stakeholders coming to you: we want to do some qual, we want to run a survey, I need a dashboard. They’re very prescriptive. And I think it’s part of our job as insights folks, irrespective of our training, to say, whoa, whoa, hold on there. Let me understand what you’re trying to do. What decision are you trying to make? The answer might actually not be in a dashboard. It might be in a piece of custom analytics, or it might be something else. So consulting work is a big part of this job, to find the best methodology for the best answer. We can’t expect our stakeholders to know what that is, although they’re very welcome to make suggestions, of course. But it’s a broad, broad spectrum of tools that we could use.

00:12:54.40 [Julie Hoyer]: Yeah, absolutely. And one of the questions I’ve been dying to ask you is, why do you think, and Val, I’d be interested in your take, too, because you’ve kind of lived in both of these worlds as well: historically, why don’t research teams and analytics teams play together at all? Or if they are playing together, why don’t they always play nicely? Good question. Get into it.

00:13:20.64 [Val Kroll]: Stefanie, you have to go first.

00:13:22.72 [Stefanie Zammit]: It’s a great question. And I think the answer is fear. I think that there is the fear of the unknown. And there is an assumption that the other world is so mysterious and so different to our world. We don’t understand each other. It’s literally a form of othering within teams, within departments. And it’s a fear I myself had: I could never keep up with these data scientists, and, oh, they’re just so smart, and they understand all these things in a way that I never could. And then you start working together. And you see that you actually have more in common than you have differences. And they see it, too. They can learn from the research process and understand, oh, hey, I thought research was just qual. I don’t know if you’ve ever heard that, any of you guys, especially Val: the assumption that research equals qual, that it’s the qual work. And it’s like, I just did a survey with 8,000 people and this hardcore conjoint statistical model. You can’t call that qual. But there isn’t an understanding. There isn’t enough knowledge. And everyone assumes that the other side of the coin is so different, a whole different world. And I think it comes from the way that agencies are set up and departments are set up to separate these skills, when actually they’re stronger together.

00:14:43.36 [Tim Wilson]: Michael, why does every quick question come with a 20-minute origin story? Well, that’s because our metrics have, I don’t know, lore.

00:14:53.76 [Michael Helbling]: Conversions might mean three different things depending on who’s presenting and how close we are to the next quarterly board meeting.

00:15:00.72 [Tim Wilson]: And I mean, every time you switch tools, you have to re-explain the lore like you’re reciting ancient prophecy.

00:15:07.20 [Michael Helbling]: On the seventh day of Q3, the tracking broke and lo, that metric was doubled for July.

00:15:12.80 [Tim Wilson]: That’s why we’re excited about askY.ai and Prism, because it has memory that actually remembers. You don’t have to repeat the lore.

00:15:21.76 [Michael Helbling]: Yeah, the jam system keeps context across sessions: what your org means by revenue

00:15:27.84 [Michael Helbling]: or conversions, which table is the source of truth.

00:15:31.36 [Michael Helbling]: And the weird exception, you’ll always forget until it’s too late.

00:15:34.56 [Tim Wilson]: Plus, you can save your best workflows as skills. Portable expertise you can actually reuse like a human.

00:15:42.88 [Michael Helbling]: So I don’t have to manually normalize UTMs, tweak that GA4 channel grouping, or deduplicate leads without breaking down into tears.

00:15:57.04 [Tim Wilson]: Yeah, so instead of rebuilding the same process like every week.

00:16:01.28 [Michael Helbling]: Yeah, I guess I just run the skill and go about the rest of my day. Kind of happy.

00:16:07.84 [Tim Wilson]: So go on and head to askY.ai and join the waitlist. It’s in beta, but you can get in on the ground floor.

00:16:14.96 [Michael Helbling]: Yeah, and if you use code APH, you’ll get pushed to the top of the waitlist. That’s ask-the-letter-why.ai and use code APH.

00:16:26.16 [Announcer]: All right, let’s get back to the show.

00:16:28.56 [Val Kroll]: And where you landed on that, I think, is one of the drivers that I see. In my mind, if you think about how these two disciplines grew up in the world, research used to live within marketing. It used to be marketing research. And so we would sit within marketing teams. All of my clients back in the day were CMOs or reported up to the CMO. In my first web analytics job, I was in IT, and it was very technology heavy. It was about the tools and the way the data was collected. And with surveys, people think, oh, like pen and paper, like mailing surveys, or someone chasing you down at a mall with a clipboard: would you like to take a survey? There’s so much more to it now, with panels and different ways that you can contact people. So I think... Times have moved on. Yes, yes. There has been an evolution, yes. And so I think it’s just kind of been just how it grew up,

00:17:25.04 [Val Kroll]: was kind of thought of differently, budgeted for differently, like research,

00:17:29.52 [Val Kroll]: I think, has always had this rap a little bit. I’m interested in your thoughts on this too, Stefanie. That a lot of times it can be costly, not just in dollars, but in time. It takes a long time; you know, we only do the brand tracking study once a year because it’s such a big piece of research, or it’s actually not valuable to track that on a more frequent basis. Whereas a lot of the costs from some of the other analytics practices or areas are more hidden costs, because they’re in the technology or in the people. And so, especially when we’re talking about in-house, I think that it’s just budgeted for differently. And so people aren’t really connecting the dots.

00:18:08.16 [Val Kroll]: But I think the organizations where you can break down those silos,

00:18:11.36 [Val Kroll]: because I’ve actually never worked for a client that’s had the setup that you’re talking about, Stefanie, which would be like, oh, nirvana, to have them coming together. But I always found ourselves making the suggestion, like, did you talk to that team? And they’re like, who? And we’re like, well, I was scrolling through your active directory and I found this person with this title; you should reach out to them to see if they can help us. But yeah, so that’s kind of what I think is part of the rationale. I do hope that this is a movement, though, this evolution towards thinking more flexibly about the methodologies and what’s the right fit for the question at hand and what’s going to serve the business best.

00:18:50.80 [Stefanie Zammit]: Yes. And I should add that the nirvana, I may be making it sound more nirvana-like than it actually is. I mean, again, when you look at huge organizations like Starbucks, which is just massive, there are thousands of people at head office. Although we were all in a department together, the reality is that the silos were still very strong. At the time when I first joined Starbucks, I was in service to the loyalty team, so the rewards program and the app, you know, and how our customers use the app, et cetera. And I realized that I was trying to serve these stakeholders with insights about app usage and loyalty program behaviors and all the rest of it. Meanwhile, you had folks in analytics who were also answering questions for the same stakeholders. And there was just such a clear overlap: we’re both talking about behavioral data, we’re both talking about what the customer wants and needs. And so what we did was we formed, within the department, a little community of people who are in service to this stakeholder group. Irrespective of where in the mega data, analytics and insights department you are, we come together and we talk about all our projects. We formed, I don’t want to call it a steerco because I hate the word steerco, I’m very allergic to it. Oh my God. But just a little round-robin forum: what’s everyone working on? And then we can say, oh, you’re doing that? I’ve got a survey coming up which directly overlaps with the objectives of what you’re looking at. I can tell you the why, and you’re measuring the what. So why don’t we put them together, and, hey, our stakeholder will get something that’s more complete and less confusing, rather than 10 different reports that all overlap but where the actionability is lost because we’re pointing in different directions on similar and yet not quite the same topics.

00:20:43.12 [Michael Helbling]: That’s crazy, because I literally have built something almost identical, but coming from the data side to the research side, at a company I used to work at, which was forming this little team where we met and said, okay, let’s get all of us together so we have a coherent story. And it’s always stuck with me: why is the organization having to be managed sort of bottom-up in that regard, when the reality is the structure or the layout of the org should be thought through to enable that kind of capability from the very top? It’s just one of those things that sticks out like a sore thumb. And I don’t know if I have prescriptions for that, but I will say, Val, you mentioned we walk into a client and you’re kind of looking around like, where’s your research team, and why aren’t we talking to them too? Which I think is really apt. But also, at certain companies, you’re going to do analytics work and there is no research team, or nothing named as such. It’s sort of a lost function or a missing function. And it might be, kind of to your earlier point, Julie, that the leaders of the org just sort of think they’ve got it figured out, so they don’t need someone to think about what the customer actually thinks, because they’re thinking for the customer, if you will. Not a great plan. But anyways, I’m just curious, Stefanie, because you’ve done some consulting in this space as well: how do companies sort of jump the chasm first from not even having a concept for doing research like this? Because everybody’s got analytics; we’ve all got Snowflake or Databricks or something running the back end of all of our data, but a lot of companies have zero going on in terms of either customer or market research.

00:22:45.76 [Stefanie Zammit]: And if they do, they outsource it to agencies. So they would do a one-off project that an external company will run. And this is where, you’re right, it gets complex, because it’s rare that a company would have a fully in-house research function, because the manpower that you need on a per-project level is huge, right? We’re talking thousands of people, okay, not thousands, but at least 50 going out, interviewing in different countries, and then your data processing, and then your statistics. And so the per-project value for money of that much manpower is just not worth it. So they outsource to agencies, who have economies of scale. Those agencies then in turn don’t think to ask, hey, do you have a data team? Do you have an analytics team? They sell analytics. Why would they say, hey, we’re going to build a segmentation for you, we’ll do all the research, but we’ll hand over to your in-house analytics and they can do the clusters? They’re never going to say that. They’ll be like, yeah, we do end to end, you know. So I think that’s the challenge. And what I would advise organizations to do to mitigate that is to have even just one person. At Bang & Olufsen, we have one person. He’s a superhero. He’s a one-man research department. One person to coordinate all research projects. You can still use vendors, but you have an internal knowledge bank being built, and internal consistency even. And then that person knows to work with analytics and to blend the two together, while also getting the economies of scale from the agencies.

00:24:21.92 [Val Kroll]: Closing the chasm: I want to spend a little bit more time on this and your thought of making sure you have someone to represent that perspective, versus, like, does anyone want to add any new questions to this year’s tracker? It has to be so much deeper than that for it to be meaningful. But my first boss when I was in market research, Lynn Bartos, if you’re listening, she grew up at Burke. And so she was hardcore, like you were saying, about having those skills. And one of the things that she always talked about was that she spent two weeks touring all the different departments, like someone who sat within data processing, someone who sat within coding for all the open-ended responses, to get an appreciation for the operations and how the sausage was made, because that makes you smarter when you request your banners or when you think about, how do I develop the questions to get me to a driver analysis, or things like that. And so we did that ourselves on her team, and I really had an appreciation for that. Do you think that closing the chasm is cross-training people so that they have a better appreciation for each other’s skills, when they do both exist in-house? Or is it more, you know, pizza lunches, a dog and pony show of results? Because I love what you talked about, having the little non-steerco steerco that’s aligned to a stakeholder group; I would have never thought of that. But I’m just wondering if there are some other things, like you were also saying, Michael, bottoms-up, that people could do to kind of close this chasm between the teams. What things would you recommend to listeners who have an interest in the other side?

00:26:03.12 [Stefanie Zammit]: Absolutely, yes. So for anyone listening who’s not managing a team but is in one of these roles, whether you’re in data science, data engineering, research, whatever it might be that touches on data, I’d really recommend this: the best thing that person can do for their own career is to have a natural curiosity for the methodologies and the training that the others in the department have. And I always say to people in my team, if you find yourself in a conversation where you have no idea what people are talking about, or it feels scary or just very different to your expertise, that is where you will learn, so you should lean into those conversations. We should go out of our way to understand, not in an annoying way, like, hey, dude, what are you doing every day? But just to have a natural curiosity about how your work fits with my work. And that’s not even just for analytics and insights or for data. That should be for everyone in a corporate job working for a brand where we serve our customers. We should all be understanding how our worlds fit together for the customer. But especially within data, because data is the customer, we represent the customer, have that curiosity. And for anyone who’s listening who’s in a leadership role, then yes, I really recommend fostering that within your teams,

00:27:24.16 [Stefanie Zammit]: that natural curiosity, and getting people to take a moment to question if anyone else in a data

00:27:32.24 [Stefanie Zammit]: related role can contribute to the project to add further insights. So, would the data science team know anything here? Maybe they don’t have a deliverable, but maybe from their investigations or their work they would have context that would add valuable insight to my work. Would the research team maybe have had a project about this? Maybe not, but again, the data folks are in, you know, their hands are dirty in the data every day. The amount of information that they process and gain exposure to, which is never reported, is huge, right? We could never report all the facts, but it’s there, it’s in their heads and in their experience. So it might not be reported anywhere, but that’s a good person to talk to, because they would have context. Same with the researchers, you know, they’re out conducting intercepts, ethnography, focus groups; not everything makes it into the final report. But if you sit down and talk to each other, you realize, oh, yeah, I know something about that. I remember seeing something about that. I have a good quote that brings what you’re doing to life, whatever it might be. So I really encourage curiosity. My first job, I started in qual, actually, which is like the furthest away from data science and analytics. And I was so scared to move toward quant. And I got reassigned to a tracker. So I went from the qual team, you know, ultra open, to a tracker, like 100% only ever working on this one tracker. And at the time, I was so grumpy about it. I was just like, this sucks. Like,

00:29:03.20 [Stefanie Zammit]: where’s the creativity? You know, where’s the art? But actually, it was the best thing that

00:29:09.52 [Stefanie Zammit]: could have happened to me, because I didn’t know anything about tracking. And, you know, I would have been pigeonholed if I’d just followed my natural heart. It forced me to learn about a different world. And then I realized, actually, this is interesting. Actually, quant is interesting. Look at that, who would have thought? Tracking is actually interesting. There are insights here. But unless you’re kind of forced into it, at least when you’re young, you need to be forced a little bit, it’s difficult to just naturally expect that you would find these other worlds interesting. But I guarantee, if you really, you know, scratch at that, or peek into those boxes, you will find a lot of very interesting things that will help your own role.

00:29:50.88 [Julie Hoyer]: You touched on this a little bit earlier too, Stefanie. And I’m curious. So when you’re leading your team and you have this great point of view on how these things can work

00:30:01.84 [Julie Hoyer]: together. And obviously, we talked about trying to encourage the curiosity of everyone on your team.

00:30:08.00 [Julie Hoyer]: But are there some more like formal processes that you’ve also put in place for your team, or how you guys pick up projects or execute projects with your stakeholders that really help these come together in the best way possible? Because you mentioned earlier, like the, you know, starting with the, the qual and then you get a hypothesis, then you follow up with quant. So I wanted to like dive a little deeper in that area and hear what some actual harder, like boundaries or processes you utilize to help.

00:30:39.12 [Stefanie Zammit]: Yeah, 100%. And there’s so many examples, I’m going to try to stay focused here. So first rule of thumb: no survey asks a question that we already have the answer to. And the only way to know that is to go and talk to the data team and know what we have the answers to. So just as a rule of thumb, and even from our customer experience, right, customers shouldn’t be telling us their demographics if they’re signed up with us; you know, we should know who they are. Second rule of thumb: every research project includes a customer subsample, even if it’s not in your objectives to interview clients. Even if, say, you’re doing a new customer acquisition piece, and, you know, you want to go out to a purely external sample, you don’t want any internal sample. Even if that’s the case, a subsection of the survey respondents should be from an internal, known-customer sample. And especially if you have a segmentation or you have certain key questions or certain key profile data points that you want, you want to continue building that knowledge internally; use your surveys in that way. So for example, we’re doing segmentation work now, and we actually started with data. So what do we know about our client? To what extent can we profile them before it becomes a mystery? Right. That’s the point at which we take it to research, we fill in the blanks with research, but we interview as many of our own customers as we can. So that then once the survey is complete, we bring all that data enrichment back in house, it’s tagged to known customers. And we can use both worlds to create the segmentation. So you have your attitudinal stuff, you have your profiling that you would never be able to know just from data, and you have your behavioral data as well. And we have this amazing analytics team, they can do the segmentation. We don’t have to use an agency for the entire thing.
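As a rough illustration of the enrichment workflow described here, a sketch of joining survey responses back to an internal customer table and segmenting on the combined attitudinal and behavioral features might look like the following. All table names, column names, and the synthetic data are invented for illustration; this is not any company's actual pipeline.

```python
# Hypothetical sketch: enrich survey responses with internal behavioral
# data, then segment on the combined feature set.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Internal (behavioral) data we already have -- so the survey never
# needs to ask for it.
customers = pd.DataFrame({
    "customer_id": range(200),
    "orders_per_year": rng.poisson(3, 200),
    "avg_order_value": rng.gamma(4, 50, 200),
})

# Survey responses; the known-customer subsample carries a customer_id.
survey = pd.DataFrame({
    "customer_id": rng.choice(200, 80, replace=False),
    "attitude_quality": rng.integers(1, 8, 80),   # 1-7 scale
    "attitude_price": rng.integers(1, 8, 80),
})

# Bring the enrichment back in house: attitudinal + behavioral in one table.
combined = survey.merge(customers, on="customer_id", how="left")

features = ["attitude_quality", "attitude_price",
            "orders_per_year", "avg_order_value"]
X = StandardScaler().fit_transform(combined[features])

# In-house segmentation -- no agency needed for this step.
combined["segment"] = KMeans(n_clusters=3, n_init=10,
                             random_state=0).fit_predict(X)
print(combined["segment"].value_counts())
```

In practice the clustering step would be preceded by variable selection and often factor analysis, as Stefanie notes later; the point of the sketch is only that once survey rows are tagged to known customers, both worlds sit in one table.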
So we’ve saved money, you know, we know where we can stop the agency to make use of our internal skills. Now we’ve got money left over for a different project. Great. Let’s go do some qual with these segments and get to know them, bring them to life, do ethnography, take video. Now when we go out to our stakeholders, we’ve got our segmentation, it’s attitudinal and behavioral, and we’ve got all this great qualitative bringing them to life. Right. And this goes for every project. So you’re doing a piece of work with drivers analysis. Okay, great. We might do a survey, you know, you do your usual drivers analysis. But then let’s say, okay, some of that survey was internal customer sample, so how do we bring that back to the analytics team and say, well, we want to grow. So let’s identify these people and do an internal business drivers analysis based on the learnings from the survey to see if we can replicate those drivers in our data. Lo and behold, you’ve got an internal business drivers analysis that you can now track because it’s in our data. So if we move the needle, did we actually grow? We can actually say, yes, that insight was successful and we did actually grow. There are many more examples, but wherever we can put both worlds together in a single project, I absolutely recommend we do. The other thing is communities. If you’re a business that’s lucky enough to have a customer community, which is a qualitative research tool, it’s like a panel of customers that you can do quick polls and surveys with, it’s like a social media for your customers. And you get so much qualitative insight. The best versions of those communities are built on top of internal data lakes, so you can follow the strings down to who these people are and how they are transacting. Every project you run in your community now has a behavioral data trail to look at: okay, we’ve got insights, the business made a decision.
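The internal business drivers analysis step mentioned above could be sketched roughly as fitting a simple model of the business outcome on the behavioral variables the survey pointed to, then tracking the coefficients over time. The variable names and synthetic data below are invented purely for illustration.

```python
# Hypothetical sketch: an internal business drivers analysis.  The survey
# suggested which factors might matter; here we test whether the same
# drivers show up in our own behavioral data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 500

# Behavioral variables the survey flagged as candidate drivers.
support_contacts = rng.poisson(2, n)
days_since_purchase = rng.integers(1, 365, n)
app_sessions = rng.poisson(10, n)

# Outcome we want to grow (e.g. annual spend), simulated with a known
# structure so the sketch is reproducible.
spend = 200 + 15 * app_sessions - 5 * support_contacts + rng.normal(0, 20, n)

X = np.column_stack([support_contacts, days_since_purchase, app_sessions])
model = LinearRegression().fit(X, spend)

# Coefficients act as trackable driver estimates: re-fit each quarter
# to see whether moving a driver actually moved the needle.
for name, coef in zip(["support_contacts", "days_since_purchase",
                       "app_sessions"], model.coef_):
    print(f"{name}: {coef:+.1f}")
```

A real drivers analysis would need care around collinearity and causality; the sketch only shows why the result is trackable: it lives entirely in data the team already collects.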
Now you can measure the impact of that decision because we can track these people. Did they actually spend more? Did they actually convert? So the possibilities are endless. And it honestly all comes from recognizing that we’re all after the same goal here. And especially research agencies: their quant teams have analytics people with very similar backgrounds to in-house analytics teams. Like I said, there’s more similarities than there are differences, but the in-house analytics teams might not naturally be tasked with, we’re going to run a conjoint, or we’re going to do things like MaxDiff, which is research analytics. They’re not really well-known methodologies in in-house analytics, but they can learn them, and why shouldn’t we bring those tools to in-house analytics? And then it’s interesting for the analytics teams to learn these methodologies as well. Over time, you save money because you’re spending less on external agencies. That’s awesome. I love those. And you have better data. Yeah, I love the rule of thumb. Yeah, I feel

00:35:22.80 [Julie Hoyer]: like it’s an accelerator the way you’re talking about. I kind of hate the word flywheel, but it makes me think of a flywheel.

00:35:27.60 [Stefanie Zammit]: But the researchers also need discipline. Like they, they need to be really close to in-house data analytics reporting. Like it’s amazing how many researchers I’ve met that have never used the dashboards, you know, they’re not in the BI at all. And like, why wouldn’t you be? Again, as a rule of thumb, if you want to conduct good research, you need your sample to be representative of your customer base. How do you know what that looks like? Well, there’s BI reporting that shows you. You should be feeding that to the vendors to build the sample plan and the weighting plan. So it’s all connected. It’s all one thing. And I think when

00:36:07.12 [Val Kroll]: you’re talking about this connectivity too, you can be smarter about, like, even breaking it up. Like an example, especially if you have the panel or if it’s a known population: instead of asking, like, how likely are you to buy this again over the next six months or how often… you know, I remember working on advertising awareness research for a cruise line. And they would always ask, like, how likely are you to plan a cruise for you and your family over the next year? I’m like, over the next year? Like, these people don’t know what they’re doing, they don’t know what they’re having for lunch. Like, why are you asking over the next year? There’s got to be other data for this. But in the same way, there’s so many people who will be building out a fallout report in Adobe Analytics, like looking at it, trying to discover the friction points and coming up with the why on their own, like, oh, they couldn’t make it to this next step because… and it’s like, well, did you ask them if that was part of the friction? Because usability labs, you know, pop-up surveys, there’s like so many different tools or ways that we can connect with the customer nowadays that, you know, instead of trying to fill in the blanks, there’s a lot of different ways that we could just find out

00:37:18.08 [Stefanie Zammit]: directly. And that is another of my rules of thumb that I didn’t mention earlier: the power of derived research versus stated. Honestly, you’re wasting your money on stated surveys. No one knows, humans do not know, why we behave the way we do, right? It’s all like deep psyche. We’re super weird creatures. We have all these quirks that we don’t understand. So yeah, you’re wasting your dollars on “how likely are you to book a cruise?” Like, whatever, it’s bullshit. It’s not… sorry. Oh, we can swear on this? Yeah. Oh, we’re explicitly rated. Send it. Let’s encourage it. Yeah. Exactly. Absolutely. Like, the best quant research has analytics. If you’re running quant without analytics in your surveys, I don’t know what you’re spending your money on. Honestly, it’s just not good value. And that again,

00:38:10.32 [Michael Helbling]: that ties then to internal analytics. This is so good. So I’m sitting here from a data practitioner standpoint and just loving the conversation. At the same time, I’m going to admit to you that you’re throwing out certain terminology that I’m vaguely familiar with, but don’t necessarily know. Are there resources that could help someone kind of level up and get a better understanding of these topics, structures, stuff like that, any good overall books or

00:38:39.36 [Stefanie Zammit]: resources online, like anything you might recommend? Yes. And this is a really important point. I think another reason for the othering that happens in these fields is purely language. And so you’re right. I’m using, I’m trained in research terminology, which is fine. I just am like

00:38:56.08 [Michael Helbling]: admitting that I don’t know all the words you used. So yeah, but the, the, a lot of the words

00:39:00.88 [Stefanie Zammit]: I’m using, there’s, there’s an analytics or data equivalent of it. It’s just that different terminology is used. So, you know, data might talk about addressable audience, which is like sample plans. The best advice I can give is research is grounded in academia. It came from, you know, like scientific studies or social studies. And so for anyone who was at university doing research projects as part of their university degree, that is the foundation of modern commercial research. But there absolutely are some great tools. There’s a really great book that I recommend. It’s one book. It’s the only one you need. What’s it called? I think it’s the Market Research Society’s main book. I think it’s called an intro to market research. And that’s, yeah, that’s your one reference for just looking up these words. And you’ll be amazed how many of them you look up that you’ll recognize as not actually that unusual. So like an attribution model, how different is that from a customer journey? A research team would run a customer journey study, a data science team or analytics team would run an attribution model or call it customer journey, but you know, depending on what, where your journey

00:40:07.04 [Michael Helbling]: is. It’s a, it’s a customer journey has been used all over the place for all kinds of stuff.

00:40:14.48 [Julie Hoyer]: Really, whatever you want. Just like use case. Oh God.

00:40:20.64 [Michael Helbling]: The journey of customer journey is a little bit tricky.

00:40:24.64 [Val Kroll]: That’s, that’s a cartoon strip right there. Well, so there actually is, so to your point, Michael, there is one, because you were just starting to talk about this, about the difference between stated versus derived importance, which gets to MaxDiff, which I,

00:40:40.88 [Val Kroll]: I literally couldn’t love anything more than studies where we get to do that. Because there’s

00:40:45.68 [Val Kroll]: so many, there’s so many different applications. But when I was, I worked in the telecom vertical in my first job out of college. And we would use that to figure out, like back in the day, cable, internet, TV, what should be involved in that packaging and at what price points. And people would say things like, oh yeah, I need access to like 600 channels, when we asked what’s most important to you. But when we actually did the force ranking, or there’s like different techniques, that actually was one of the things that fell to the bottom. It was about, you know, how long is it going to take for the technician to install the cable box. And there were all these other things that were not something that someone would say necessarily, but it really comes out in the wash. But anyways, that’s just like one example. But could you, especially if you have an example that you can pull upon, talk a little bit about stated versus derived importance or one of your favorite examples of that? It’s a really fun one.

00:41:40.40 [Stefanie Zammit]: I mean, in a nutshell, the difference is asking someone, yeah, how likely are you to do this? Whereas with derived, you basically give them an exercise and then you observe behavior. So for me, derived research is the same as what would be happening in a data team where you’re observing the behaviors, right? Because in data, there is no stated, like you’re not stating it, you’re just watching people behave and you’re tracking their data. So it’s the research version of that, that we give them different exercises or force a response between many, many, many choices, again and again, mixing them up so they could never remember the pattern. And through the continuous, like, it’s a choice between these four things, again and again, different scenarios, again and again, you can derive what the true behavior is or will be. So it could be used predictively to say, when this is true, this is the behavior that we want. Similar to building a predictive model based on behavioral data, where you can, you know, take all your data and map it over time and just run correlations to sort of say, when this is true, this is more likely to happen. So again, very similar outcomes, but just completely different methodologies.

00:42:59.60 [Michael Helbling]: All right, I’ve got another question that’s probably going to reveal how much I don’t know about this topic, but I want to ask it. Why are personas so bad most of the time? Segmentation makes my skin itch. If people are

00:43:18.64 [Michael Helbling]: like, well, let’s look at the segments. I’m like, can we not? Because I honestly think,

00:43:23.68 [Michael Helbling]: I honestly think this is also a driver of the divide in a lot of ways. Like as a data guy, I see people come up with these persona studies and stuff, and they’re dog shit. Like they’re really terrible. And I’m like, scrap your personas. They’re not behaviorally based at all. Because what I observe people do is they just make up who they’d like their customer to be. And then they’re like, this is Sally. And she’s a hip mother of three. And she drives a van. But she’s got this cool thing that we like about our brand. And so that’s one of our personas. And it’s like, Sally doesn’t exist in our database anywhere. That’s not our customer. And the people who buy the products you’re talking about don’t look like her at all. Like there’s no correlation. Anyway, sorry, I’m now getting into my rants. But

00:44:18.16 [Stefanie Zammit]: what’s happening there? Like, why is it so bad? I love it. And I think the words segmentation or persona in themselves can mean so many things; these are words that are overused and not necessarily always used in the right way. And I don’t think there is actually a fixed definition. I think it just depends internally on what definition you choose. But why are they so bad? I have run an uncountable number of segmentations, mostly from my market research background, where I have more years of experience. And I think that the win or lose of a segmentation is in how it is translated or brought to life for the business. So you cannot have a good segmentation without really solid underlying data, you know, hardcore, like the factor analysis and the clustering all have to be good, and you had the right variables and you had the right ingredients. All of that is really important. But that’s not actually what matters or has impact. That’s just designing the output. It’s like a BI report. You can have, you know, the BI report with a thousand million, like every possible data point, it’s amazing. But unless it’s, you know, user friendly, then it just doesn’t have impact. It’s the same with the segmentation. So if you only had a segment descriptively, this is Sally, you know, and Sally does this and Sally does that, without knowing why or without finding Sally in the data and saying, look, this is Sally. Specifically, look, we know what Sally wants. So we’re going to do a CRM strategy around Sally, we’re going to sell to her. And now we’ve found Sally in our data, we can actually say, look at her changing her behavior. That’s when a segmentation is really powerful.
And I think with the best segmentation, to get to that level, you need both your behavioral data and your research, because research gets to how does Sally think, like what matters to Sally. You’re never going to get that from just observing her in data. You need to get into her psyche. So you put the two together. Now the marketing team have a strategy on who Sally is in terms of her psychology, what’s going to get Sally really freaking excited and get her to behave the way we want, but it’s underpinned by existing data. So you can actually see her, you can maybe test with her, and over time see the impact of your segmentation. And video, like I cannot overstate the power of a customer insight video, just like Sally walking down the street, like you can read, this is her life. It makes such a difference to stakeholders in understanding who that person is beyond her being a data point.

00:46:58.80 [Julie Hoyer]: You make it sound so, I mean, it is so ideal, but it makes it sound so like, duh, if you just did this, you know, you’d get everything you want. But then I go work with clients and it’s like, they’re just so far from that

00:47:13.36 [Julie Hoyer]: point. And it feels like such an uphill battle to try to help them fit together the two worlds that

00:47:19.20 [Julie Hoyer]: you’re talking about research and analytics, you know, segmentation is all based off of outcomes. And it’s like, but you want them to change behavior to drive outcomes, but now you’ve split them by outcome already. And then you ask them questions about outcome, it just feels like something’s been lost in a lot of the situation. And it’s, it’s sad to see because they would have to, I really think like start from the ground up to get it to where they’re utilizing analytics and research in the right way to get the benefits you’re talking about. Research hurts because every

00:47:52.16 [Stefanie Zammit]: time you do a study, you need to pay money, especially because like I said, no one has in-house research teams, right? Not like full in-house teams. That’s not a thing. I think maybe Sky TV have one, but most companies don’t. And so you’d have to make a business case to say, why should I spend money doing something that the analytics team can do internally using this data? Why is that not good enough? For stakeholders to understand that without having ever seen what good looks like is really difficult. And it’s something I find really challenging in my job actually, just explaining the value of something without having it to hand. So it’s like hypothetical to a stakeholder, right? And this is where research teams are lucky if you have good relationships with agencies that will send you case studies and they feel sort of safe enough to send you examples that you can use to build your business cases. But when you’re talking in hypotheticals, it’s very difficult to get the budget and it’s not cheap. Market research is expensive. It’s slow. It’s a big investment for any company to make. But what I do find is once you start investing in it and putting the two together, showing the impact of that, the stakeholders will then understand, like, wow, I get it so much more now because I’ve got my data, but I’ve also got my why and like my… I get how this person thinks. Put those two together and suddenly you think, how did I ever make a decision without knowing this full picture? And then that’s where you will whet the appetite and it snowballs from there. But it is very difficult to do the very first one and hopefully agencies can help with those case studies. No, I have kind of a random question. So I’ll save it to Michael when he wants

00:49:30.72 [Val Kroll]: to wrap. Okay. I love this. So I actually had the opposite experience of you, Stefanie. I started on trackers and I wanted to kind of branch out of that. And so I got thrown on the iHUTs, which are in-home usage product tests, which could not be further from the tracking world. But I loved that. Like when you said the power of video, Gillette was one of our clients and they sent out these new beard trimmers to men and asked them to do videos of them shaving. And it was just so funny, like they were watching like, oh, why in God’s name are they holding that clip? Don’t hold it like that. You’re going to cut your nose off. Like, oh my gosh, we need to change our directions. But it was so funny, including testimonials, like the voice of customer, in some of those reports that could be leveraged 100 different places. We also had Hanes as a client and they were testing all the tagless stuff back in the day when everyone was switching. So it was t-shirts and underwear. So we were sending out boxes and boxes of tighty-whities and asking people, tell us about… Did the tag scratch your butt? Like that was the question. But the quote that we got, I’m like, I really wish I could see this used in the

00:50:43.60 [Announcer]: internal decks, like where this went. But anyways, to your point about like it’s an investment in

00:50:51.12 [Val Kroll]: the first one, if you only think about it as like, I’m going to send out a survey and I wonder what they’re going to respond on like likelihood to agree to a certain attitude or statement versus like thinking more creatively about the different ways that you can interact with the customer, I think you can get people really excited about ways it can be injected in. So if you take one thing away, don’t think myopically about what research is and the ways it could be applied because it can actually be pretty fun and pretty enlightening. So… All right, Julie,

00:51:23.20 [Michael Helbling]: lightning round. Random question time. Lightning round. Because we have to start to wrap up here

00:51:27.68 [Julie Hoyer]: actually. Fine, Michael, we have to start to wrap. Okay, my question, because you were saying

00:51:35.20 [Julie Hoyer]: research is not fast and it is not cheap. But nowadays people like fast and people like cheap

00:51:41.52 [Julie Hoyer]: and people like AI because AI is… Oh, that was literally my question, Julie. So I have a slight spin. Let’s see if you went this far too, Michael, because maybe we were totally, totally parallel

00:51:51.60 [Julie Hoyer]: thinking, which I love. Because there were the two things, the faster and the cheaper. So

00:51:58.64 [Julie Hoyer]: we’ve had an episode in the past about synthetic data. I was curious like your thoughts on using synthetic data in this space, maybe some pros and cons. But then it immediately my brain jumped to, well AI in general is the fast and cheap option. And both of those things feel like people are very quickly going to grab for them to fill the gaps of classic research. But we’ve spent this episode saying that we’re weird creatures. And like to actually figure out the why, you can’t just ask them why, you got to take the time to observe. And those things are just at

00:52:29.68 [Stefanie Zammit]: such opposite ends of the spectrum. So it depends who you are as a company, how relevant or how useful synthetic data would be for you. So for example, at Bang & Olufsen, we work in small data, our transaction volumes are relatively low. You know, we’re lucky if we get a data set of,

00:52:52.24 [Stefanie Zammit]: I don’t know, a couple of hundred thousand rows. So our client is so niche and so

00:53:02.88 [Stefanie Zammit]: under… so misunderstood, or so not yet understood, because we are a luxury brand in the consumer electronics space. We can’t learn from other consumer electronics behaviors, and we can’t learn from other luxury behaviors either. So synthetic data is just never going to be relevant for us as a brand. There isn’t enough volume and there aren’t enough lookalike profiles. And we’re still exploring the category, you know, being the pioneers of the category. If you’re a CPG company and you sell in, you know, the typical supermarket or grocery store,

00:53:36.40 [Stefanie Zammit]: then yes, absolutely. That makes sense. I would say though that there is a watch out that

00:53:43.44 [Stefanie Zammit]: we’re living in a changing world. So my rule of thumb when it comes to any kind of behavioral insights work is that it has a shelf life of around three to five years. But that’s been my rule of thumb since pre-COVID. And I do think that the world is changing more quickly now post-COVID than it was pre-COVID. So you have to think culturally, is the world the same enough for me to rely on synthetic data, which might go back, it depends where your cutoff is, right, where you start your data set from. So I would urge caution and think about that. If there’s a huge world event, you’re probably going to need to go in with fresh questions or fresh exploration. And yeah, just the world we live in right now, it is so turbulent that, yeah, the three to five year shelf life thing might not

00:54:31.60 [Michael Helbling]: be applicable anymore. Oh, that’s really good insight. And yeah, Julie, that was basically the question I was going to ask, about AI and its place in this, because I’ve seen startups going around, you know, being like, we can create a 100-person digital twin research panel for you on the fly with AI and you can do your pre-research research with it and stuff like that. And I think there might be a place for it. But like you said, Stefanie, you have to kind of think through the applicability. And I like the way you specified, like, hey, for our brand, we understand how unique we are. So a group of averages is not going to get us to an insight that we could use, which is, I think, very, very relevant. That’s really good. All right, we’ve got to start to wrap up. This is so fascinating. So thank you so much, Stefanie, for joining us. It’s been very good, educational, and really fun to talk about. And I know Val, you probably also are loving this episode. So, okay, what we’ve got to do: last calls, something we do every show, we just go around the horn, share something that might be of interest to our listeners. Stefanie, you’re our guest. Do you have a last call or a couple you’d

00:55:45.68 [Stefanie Zammit]: like to share? I do. I have two last calls. And you know what, of all the prep I did for this episode, this was the thing that stressed me out the most, because you guys’s last calls are so good, I had to come with something good. It can’t just be any old thing. I did lose some sleep over these. But I think I got two good ones for you. So the first one is actually a game. I’m a gamer. I love any type of game, board game, video game. And I attended a leadership training that was organized by our amazing HR team at Bang & Olufsen. And we played, it’s essentially a simulation game. You’re given a group of employees and they have to deliver a project together. And you know, it’s a bit like Monopoly, you get chance cards and things go wrong. And it’s like, oh, the project, you know, somebody went on stress leave, what are you going to do? How are you going to keep to the time and the budget? Uh-oh, your main stakeholder has suddenly decided that they forgot what this was all about. What are you going to do?

00:56:42.72 [Julie Hoyer]: That sounds traumatic. I was like, this is giving me like stress.

00:56:51.44 [Michael Helbling]: It was so fun. I play that game every day, Stefanie. What are you talking about?

00:56:58.72 [Stefanie Zammit]: Sorry. No, but you’re right, we do play that game every day, but the fact of having the safe space where you could have these, oh shit, everything’s going wrong in my project moments, but you’re learning how to deal with those in the safe space so that when it comes to your real life game, you’re prepared. I thought it was a great idea. The game that we played was called the Playmakers game. It’s made by a company called Workz, Works with a Z. And I think they have a bunch of other sort of professional world simulation games as well. Super recommended.

00:57:31.12 [Val Kroll]: Yeah, next onsite.

00:57:37.44 [Stefanie Zammit]: And it was a great way to like build rapport with your stakeholders as well, right? Cause safe space and you play the game with a team of actual colleagues. So it was good

00:57:42.88 [Val Kroll]: for the bonding. Oh my gosh. You should like reverse roles. Like I get to be the stakeholder this time. Yeah. That would actually be hilarious. What should I do with all this power?

00:57:53.60 [Michael Helbling]: My problem would be like, really? You’re going to do that? Like no.

00:57:59.28 [Stefanie Zammit]: You have to all agree on the decision. That’s the game. Like, how are we all happy and all going to do it? Oh no, we lost the team to stress leave. Damn it. So yeah, it was

00:58:11.36 [Michael Helbling]: That’s awesome. That’s very cool. What else?

00:58:17.20 [Stefanie Zammit]: Well, I have to mention my second one just because I am really excited that this is like hot off the press. It literally just went live, I think two weeks ago. As you know, I work for Bang & Olufsen, and we’ve just opened the factories up for anybody who’s interested in audio to go experience the manufacturing of our products. And honestly, if you’re a sound nerd or an audiophile, it’s an incredible experience. Like a really just sort of once in a lifetime immersion into the world of audio in a very beautiful part of the world. So yeah, that was my second one. Fun. Wait, Stefanie, I did see your note. You have to say the freaky part. Oh, the freaky part. Yes. So it’s a tour all through the, you know, how the tonmeisters, as they’re called, find the perfect sound. And there’s a lot of different rooms that are created in a way for you to experience sound in different ways, which is how the products are developed. And one of the rooms is the 100% noise-proofed room. And it is the scariest place. Like honestly, I couldn’t stay in there with the door closed. You wouldn’t believe how scary 100% soundproof is. You can hear your blood flowing through your

00:59:27.44 [Julie Hoyer]: veins. It’s terrifying. Insane. I was like trying to imagine that. And I was like,

00:59:34.96 [Stefanie Zammit]: that’s just like breaking my brain. Honestly, 10 minutes and you’re like, get me out of here. Like I’m going crazy. Yeah, I bet. Okay, now I need to go do this.

00:59:45.68 [Michael Helbling]: I know. Stefanie, great job. You’ve upheld your end of the last calls by far. Yeah, 10 out of 10. Absolutely. Yeah. Who wants to follow that? Val, what’s your last call?

00:59:59.28 [Val Kroll]: So mine is going to be a little research-related. So I thought that one of the things we might discuss, and I do think we spent a good amount of time on it, is how to be curious and how to get broader in your understanding of these different methodologies or teams that might exist inside your own organization. And so my recommendation is to look out for some different communities that you could be a part of or join. The one that I’m still a part of today is Women in Research, the WIRe group, which actually started in Chicago a long time ago. But I have benefited so much from my different mentorship conversations. I still stay in touch with the mentor I was assigned with, I think 13 years ago now, 14 years ago, Sheri Binky, shout out, I know she’s a listener. I’m also a part of Women in Product groups, and it doesn’t have to be like women-only groups either, but they have so many different great events where you can get out and talk to people. So get out, touch some grass, talk to people, learn about how people are putting some of these different ideas to use inside of organizations, because that can totally be an inspiration for the way that you bring it to your own work.

01:01:12.96 [Michael Helbling]: Outstanding. All right, Julie, what about you? What’s your last call?

01:01:17.04 [Julie Hoyer]: Fine, it’s a little bit random. But honestly, this is something I’ve definitely heard mentioned before: quantum computing. And I do think this is like the next leap, probably after AI, if my naive take on it is anywhere close to true. I was reading a newsletter that I always get, and there was this one mention of like the next big leap (“Leap,” I think it was called). And so I clicked on it, and it was all about quantum computing. It was an infographic. I’m like, well, I’ve heard it mentioned, I don’t know what it is, like, sure, I’ll take a look at an infographic. And it was really good. And the way they broke it down within like not a very long read, I totally have a new appreciation of what this means and why so many companies are going after it. Pretty much they’re saying, like, instead of using electrons for zeros and ones, we’re going to use a subatomic particle that is like super finicky to keep stable. But pretty much we go from being able to compute things one at a time, linearly, to doing things, what’s it called, simultaneously. So all these computations simultaneously that you could do, and you think about how much computing like AI is doing, or different industries like finance or supply chain even. And they walked through like a supply chain example. And it was amazing to think that you could go from something that would take a normal supercomputer, like they even said like a trillion years in their example, to doing it so much quicker with quantum computing. And they said that some of this quantum computing power could actually happen in the next two to five years. And some of these people working at companies were like, I was told I wouldn’t see some of the milestones we have hit recently in my lifetime, and like they have hit them. So it was, one, super interesting. I finally feel like I kind of understand what it is. It was a really quick read too. So it got me kind of freaked out and excited.
And I just felt like, Oh, I learned something.

01:03:16.32 [Val Kroll]: Sounds good. I like it. Yeah. What about you, Helbs?

01:03:20.56 [Michael Helbling]: Well, I, as per usual, love everything I read on CommonCog.com, Cedric Chin, and he wrote an article recently about how to make sense of everything that’s happening in AI, because it just sort of feels overwhelming most of the time. And it actually sort of ties back to the episode in a way, because one of the points he was making was sort of like, don’t listen to what people say about AI, watch what they’re doing with it in the real world, and use that as a guidepost for how you should be responding to AI, which kind of goes back to sort of like user research. So anyways, the article is really good, but it’s very practical in terms of just better sense-making around, like, okay, there’s sort of this hype and concern and all these other things. But like, look at actual detailed examples of how people are actually using it, and then ask some questions from there, like, what other outcomes are possible? What actions could I take? What matters most in my context? Those kinds of things. So anyways, really good article, just to like take some of the pressure out of what I think a lot of us are feeling about AI. Like, half the time it’s, is it going to take my job? And the other half of the time, this is so cool, I can’t believe it, you know, I’m just doing everything with AI now. So there’s some balance, you have to find a balance. Otherwise, we’re going to blow up. Anyway, so that’s my last call. Blow up. All right. Yeah. Tune in. What’s the old, whatever, I won’t try to remember what the hippies used to say. Stefanie, thank you so much for coming on the show. This has been so fun.

01:05:01.12 [Stefanie Zammit]: Thank you. Yeah. Thank you for having me. It’s awesome to get to talk to you guys and talk about

01:05:06.40 [Michael Helbling]: nerdy topics that I love. So thank you. Yeah. No, it’s been great. And I’m sure as you’ve been listening to the show, you might have questions or you might have ideas. We’d love to hear from you. The best way to reach out to us is through the Measure Slack chat group or LinkedIn or via email at contact at analyticshour.io. And please feel free to reach out and leave us a review on the platform that you listen to us on, whether that’s Apple or Spotify or whichever one. We’d love to hear from you. We love getting feedback on the show. So definitely do that. And we’re still asking you to send us some questions. Just a couple of weeks to go until we’re going to be recording a show live at Marketing Analytics Summit on April 29th in Santa Barbara, sunny California. And we’ve got a survey which is out on the show notes page. Go fill it out. I mean, perfect example. Hopefully we did a good survey. I’m pretty sure we probably did. I know, right? Although again, I just want to caveat each time: I had nothing to do with the art at the end of that survey. You’ll have to fill the survey out to see what I’m talking about. But I had no editorial control whatsoever. So if you have a question, we want to gather lots of great questions from listeners, whether you’ll be there or not, and we’re going to answer them live on the show when we record it there at Marketing Analytics Summit. So

01:06:29.92 [Michael Helbling]: looking forward to that. It’s just a couple of weeks away. All right. Great show. Very fun.

01:06:37.76 [Michael Helbling]: And I know I speak for both of my co-hosts, Val and Julie, when I say: no matter what your market research says, keep analyzing.

01:06:42.96 [Announcer]: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @AnalyticsHour, on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst. Those smart guys wanted to fit in. So they made up a term called analytics. Analytics don’t work. Do the analytics say go for it, no matter who’s going for it? So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.

01:07:24.72 [Val Kroll]: Julie has this mic that like, she could be across the room and she’d be like,

01:07:28.88 [Val Kroll]: it’s like so loud.

01:07:35.28 [Michael Helbling]: Julie inadvertently must have bought one of those ASMR mics or something.

01:07:39.92 [Julie Hoyer]: Literally, I have like the gain turned as far down. It’s like more than an arm’s length away from me. I’m like speaking, you know, I’m trying to speak on the softer side, and I’ve turned my volume down on my side. No, I think it sounds good. And I’ve made sure my humidifiers are off. So hopefully no background humming. I unplugged my wine fridge, like all the things. Yeah.

01:08:03.28 [Michael Helbling]: And the worst part though, Stefanie, is we have this really great engineer, Tony, who goes through and does like all the audio editing, and he gives very specific feedback about whose audio quality was terrible. And so we’re going to get notes back from Tim like, oh, Michael was awful this episode. And we’re like, oh, thanks. That’s so great. So that was like, so cautious. You sound a little soft, but you sound okay. Yeah. Anyways, no, it’s just one of those things where you’re like, you’d think we’d have it nailed down after so many years, but we’re still like

01:08:38.00 [Val Kroll]: every episode we’re sort of like fine tuning. You know what y’all need?

01:08:42.32 [Julie Hoyer]: A Bang & Olufsen, just saying. Yeah. That’s right. Yeah, you’re the person to ask about that. I

01:08:48.80 [Announcer]: should go look. Well, I’ll wait for Val to stop typing. I know this. Usually it’s Moe and she’s

01:08:58.16 [Val Kroll]: like keyboard cat, like you’re like, Moe. Yeah. We’re a very serious professional podcast.

01:09:06.48 [Michael Helbling]: That’s right. Bringing it all together. Here we go. All right, I’ll give us a five count and

01:09:10.80 [Michael Helbling]: we’ll get started. We’ll go in five, four, three, rock flag and two worlds, one family.
