Data that tracks what users and customers do is behavioral data. But behavioral science is much more about why humans do things and what sorts of techniques can be employed to nudge them to do something specific. On this episode, behavioral scientist Dr. Lindsay Juarez from Irrational Labs joined us for a conversation on the topic. Nudge vs. sludge, getting uncomfortably specific about the behavior of interest, and even a prompting of our guest to recreate and explain a classic Seinfeld bit!
[music]
0:00:05.8 Announcer: Welcome to The Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language.
0:00:15.1 Tim Wilson: Hi everyone. Welcome to The Analytics Power Hour. This is episode number 271. I’m Tim Wilson from facts & feelings, and you’re listening to me because you made a decision to smash that play button in your podcast app. Why did you do that? You could have scrolled through Instagram, checked Bluesky, popped into your favorite media site and checked the latest news. Or, perish the thought, put your phone down and gone outside. Maybe even touched some grass. Really, though, I’m glad you’re here. I’m pretty sure you’re not listening because we ran a perfectly targeted Google Ad, or because we built a predictive model that told us exactly what content to create and when to publish it that, as a result, enabled us to manipulatively induce you to listen. Nope. You’re a human, and humans make decisions for lots of reasons. At best, we might have nudged you a little bit. Which brings me to a trivia question for one of my co-hosts for this episode, Moe Kiss. How are things going at Canva?
0:01:14.3 Moe Kiss: Pretty great.
0:01:15.6 Tim Wilson: Awesome. So, on that topic of nudging, can you name anyone who won a Nobel Prize for their work on that very subject?
0:01:24.5 Moe Kiss: Well, it’s possible. It might have to do with one of my favorite books about nudging, and one of the authors was Richard Thaler.
0:01:31.6 Tim Wilson: Ding, ding, ding. You are correct. That’s right. And he is not our guest today, but Val Kroll, my colleague at facts & feelings, can you name the psychologist that Richard Thaler met at Stanford in 1977 and went on to collaborate with quite a bit? Here’s a tip: he personally responded to an email from Moe several years before he passed away.
0:01:54.3 Val Kroll: Ooh, that would be Daniel Kahneman.
0:01:58.4 Tim Wilson: That is correct. The greatest guest we never quite got. So Thaler, Kahneman, Amos Tversky, and others blazed some real trails when it came to behavioral science. And that’s the topic of today’s show. Luckily (I’m going to use that word again), we were able to nudge our guest to agree to hop on the mic. This may be the first time where the nature of the episode itself will allow us to ask her to dissect why she made that choice. Lindsay Juarez is a behavioral scientist and a director at Irrational Labs, which is a behavioral product design company founded by Kristen Berman and Dan Ariely in 2013. Lindsay uses behavioral insights to design and test interventions in the financial and health domains. Before she joined Irrational Labs, she was a senior behavioral researcher at the Center for Advanced Hindsight (love that name) at Duke University. Prior to that, she worked for the federal Government Accountability Office and at NYU on a series of multi-site school interventions to reduce the racial academic achievement gap. She has a BA in Psychology from Reed College and an MA and a PhD in Social Psychology from the University of Virginia. And today she is our guest. Welcome to the show, Lindsay.
0:03:14.5 Lindsay Juarez: Thank you for having me. I let myself be nudged. I was very excited to come.
0:03:21.2 Tim Wilson: By nudging. I think Val and I were twisting her arm at Experimentation Island, and she was like, okay, I’ll be there. So maybe a good place to start is to actually define what behavioral science is in layperson’s terms. Can you maybe take a swing at that, Lindsay?
0:03:41.8 Lindsay Juarez: Yes. So I would say behavioral science is using insights from psychology, from behavioral economics, from neuroscience, and some other social sciences to understand and then predict, maybe even change, human behavior. And so if you think about the way you take actions or you make decisions, there are times when you just don’t do what you want to do, and it can be systematic. And so using an understanding of behavioral science, you can say, oh, let’s anticipate the problems that you’re going to have in exercising every day or eating healthy, and then what can you do to, ideally, turn it around to help you achieve your goals.
0:04:31.1 Moe Kiss: And this is, like, hands down one of my favorite topics. I feel like people have often heard of behavioral science, but maybe don’t have, like, a super clear understanding. But there always seems to be, like, a couple of examples in the industry that people are like, oh, that study. What is your favorite ‘that study’?
0:04:50.6 Lindsay Juarez: Good question. It’s probably from a while ago. I mean, Google is always doing a lot of behavioral science, but they were doing it in-house and looking at healthy snacks because, famously, right, they have snacks everywhere, and they don’t want people to get too indulgent. And they were basically looking at how can we ensure that snacks, or that coffee breaks, don’t have too many treats coming with them. And they had basically a corner kitchen, and people who happened to have an office on the left side versus the right side would come in, and the coffee station was closer to or farther from the snack bar. It was like an L-shaped kitchen. And basically what they find is, when people have the misfortune to have an office on the side where the coffee and the snack station are closer together, I think it’s 12 feet apart versus 21 feet, I might be getting my math wrong, but basically if you are just steps closer to the treats, more of your coffee breaks involve, I’ll take a muffin too, right? And so it’s just proximity, it’s ease, it’s temptation being readily available, but it matters to the decisions you’re making.
0:06:07.8 Tim Wilson: Can you also, I mean, I think the other, like, study, you use this one, and ‘study’ is definitely using the term in the loosest sense possible, but can you talk about Jerry Seinfeld and nighttime Jerry versus... somebody presented it at a short conference, and there were multiple callbacks to it, and I’ve now used it multiple times. So maybe that’s another one. Can you share that one?
0:06:30.0 Val Kroll: That’s good. Can you do Jerry Seinfeld’s bit?
0:06:32.4 Lindsay Juarez: Exactly. I cannot believe you’d put me on the spot like this. So he’s been workshopping this joke, I think, for 20 years, because you can see it in different clips and you, like, watch his hair change. But so he does this perfectly. The premise is there’s morning guy and night guy. And morning guy is responsible, making thoughtful decisions about the day, evaluating choices. And night guy, what does he say? I’ve got to try to do it justice. He’s like, the party rages on for night guy. And so night guy is impulsive and doing what feels good and staying up too late and spending too much money. And it’s this tension between sort of my best future self and the here and now, and how wildly more compelling it is to do something that feels good now than to abstain and think about your future.
0:07:29.1 Moe Kiss: Oh my God, I’m night guy.
0:07:32.6 Val Kroll: Permanent night guy? You never change to morning guy?
0:07:35.9 Moe Kiss: Often night guy, often.
0:07:39.6 Tim Wilson: But I mean, to me, that’s actually... I’m sure... I’m not sure. I think part of that resonated because it talks about that, in a human, at any given point in time, what the stimulus is and what the intervention is and what the... That we’re not, like, one human being. We’re different ones based on the immediacy. Which, if we flip it to maybe the world of analytics, where we’re just looking at behavioral data: how do you think about, in the work that you do, having hard data on what somebody did, blended with behavioral science as to what their motivations are? And even recognizing that their motivations may... Their motivations are different, their decisions are going to be different based on where they are, when they are, how hungry they are, like all of those other factors. Is that a fair question, or did I just walk all the way around in a big circle?
0:08:47.1 Lindsay Juarez: Was the question how do I think about human behavior or the different factors?
0:08:54.4 Tim Wilson: Well, I guess, how do you think of, like, human behavior and how that fits in with kind of behavioral analytics, I guess, or data that just tracks people, versus the psychology side of behavior and decisions?
0:09:11.0 Lindsay Juarez: I think data that tracks behavior is often better than asking someone what they think or what they want, because you are particularly prone, as a human, to tell stories around your choices and your decisions, and you can tell and explain things that you notice, but certainly there are tons of things in your environment acting on you that you may not notice. Like, right, proximity.
0:09:38.0 Lindsay Juarez: Proximity is the biggest, most obvious influence, and we just don’t think, oh, I got a donut because it was right there. You think, well, I was hungry, I didn’t have breakfast, I’ve been working out a lot, and all of those start to feed into the story you tell. And so if you are a researcher doing user interviews, then you’re very likely to hear something that people say is a reason for their behavior; it might be one factor in their behavior, but it is not going to be the one and only thing that matters. And so I think it’s very important to think about what inputs am I getting as I think about research. Have I just asked people what they feel or what they think they did? And then, even having data, sometimes you’re still getting this moment, and it’s abstracted from everything else that was going on in their life: other pressures, the time constraints. Are they distracted about something else? Are they worried about finances? Does that lead to a different choice? It’s just, how do you figure out as much as you can about the situation and the context?
0:10:46.0 Val Kroll: So I have to say, I started my career in market research, and a lot of times, especially in the CPG world, we would ask, you know, how likely are you to purchase this again over the next three months, whether it was vodka or yogurt or a pair of jeans. And so I was asking people about the future, and when we were reporting this up to, like, marketing organizations, they were thinking about that as an input to their media planning and what the take rate would be. And so it was seen as this, like, very tried and true thing, without the context of, is your competitor going to be on sale at the moment that I go to the grocery store, or is that kids’ yogurt going to do a collab with Frozen, and I shop with my kid, and now we’re definitely not buying your brand anymore, right? So I always craved, like, but what actually happened? And so I felt like when I made the shift into digital analytics, I’m like, oh, now I’ll know. Now I’ll be able to see, like, what people are doing, the choices they’re making.
0:11:41.1 Val Kroll: I’ll have, you know, this BI data at my disposal. But then I found myself just craving the other side of the same coin. Like, I didn’t know why they made some of those choices, or, like, what was the context, like you were just mentioning, Lindsay. And so your answer is interesting about being able to see, like, what they actually did and abstracting that from, like, the other context. It just always feels like, no matter which one I have, I’m always craving the context of the other side of that coin. So what are some of the ways... I guess the question in that is, like, what are some of the ways that you could think about pairing those different types of data or analysis together to give you a fuller picture or to help support better decision-making?
0:12:23.7 Lindsay Juarez: I think you’re right that ultimately you have to use them all, or there’s a time and place for every single one. I came up from a social psychology background, and in grad school our department had a particularly strong quant area, so I think for a long time I was a little bit of a hater around qual research. I was like, people can’t tell you anything. And now, older, wiser, more knowledgeable, we do a mix of all different methods, right? You run a survey, you try to just track clicks on a website, something more objective, you do interviews. I think it’s nice to use the initial click data to say, oh, here’s where something is happening, whatever that might be, and then you can use interviews if you have them. But I think it’s interesting to come up with a hypothesis before you talk to a customer, perhaps, because then you can present them with hypotheses. You could put an idea in front of them and get that reaction, rather than just saying, so tell me, are you going to buy my product next week? Do you like me?
[laughter]
0:13:32.9 Val Kroll: Yeah, especially with the bias of, like, wanting to please the interviewer in some of those qual studies. But even, like, that question, back in the day when, like, online surveys weren’t such a difficult tool to be leveraging, I’ll leave that there, that would be something that you would track, like, every quarter, month over month, or with each new product release or new concepts that you’re buying. And so it was always looking at the likelihood-to-purchase trend over time. I always thought that was such a hard thing to pin down, especially, again, like, how much advertising are you doing? Like, there’s so many factors in that.
0:14:07.8 Val Kroll: But I guess, conversely, the one thing that you also just called out about the hypotheses that you have from the data before you talk to your customer: I notice so many times, and I’ll see myself falling into this trap, that I’ll look at descriptive statistics about something that has happened and I’ll think, like, oh, here’s why this happened. Like, oh, it’s because that wasn’t above the fold that month. It’s because they didn’t see it. When in reality, even if they did see it, even if it was a pop-over message that locked them into the website, if it wasn’t a good offer, then maybe that was the reason. And so I think there’s sometimes this intensity, even with business partners, to pull out, like, the jump-to-conclusions mat about why something happened, versus being curious or even framing it as a hypothesis in the first place. Like, it could be the placement, it could be the offer, it could be, you know, some other context of where they arrived from. But I find it’s really hard to get people to always think about that idea that, like, why that happened is actually just a hypothesis. And framing it that way, I find, is a healthier approach. But do you notice that it’s hard to get people to think in that way? Or with your clients, are they like, oh, no, they’ve got that in the bag? It’s their strong suit because I’ve been working with them.
0:15:24.6 Lindsay Juarez: Oh, put them on blast. No, I think, because behavioral science, and, like, behavioral consulting, is still kind of niche, clients have bought into the premise a little bit, and so they are open to the idea that we’ll suggest you run tests, and what does the data say, and people may not have told you exactly what they’ve been doing. But we’re still a consulting agency, and so people will say, please help us redo this landing page. And we’ll come in and we’ll look at the landing page, and then, what is it that someone is being asked to do? What does the user have to do upon clicking sign up? And we’re like, I don’t know if it’s the landing page. I think it’s this entire flow that is so difficult. And then there is this tension. They’re like, no, it has to be this problem. Like, maybe it’s a system, maybe there’s more.
[music]
0:16:22.2 Tim Wilson: Hey, Michael, you know what every company chasing AI has in common?
0:16:26.9 Speaker 6: A team of retired data engineers.
0:16:31.3 Tim Wilson: Well, that and mountains of messy data, which is why next-gen companies are going data lake: governed, structured, real-time, the works.
0:16:43.5 Speaker 6: Yeah, but without Fivetran, your lake might turn into, I hate to say it, a data swamp, swamp monsters and all. And that is where Fivetran comes in. And now that Fivetran has acquired Census, the reverse ETL wizards, they’re officially the first end-to-end data movement platform.
0:17:03.9 Tim Wilson: Translation: they’ll load your data into your lake, not your swamp, make sure it’s clean, cataloged, and, wait for it, even push it back into the business apps where magic happens. Goodbye swamp, hello governed wonderland.
0:17:21.6 Speaker 6: Fivetran, where your data gets serious and stays smart. Go check out the interactive demos on their site at fivetran.com/aph that is F-I-V-E-T-R-A-N.com/aph.
0:17:40.1 Moe Kiss: Okay. So we’ve been talking a little bit about using the data analytics that we have, and behavioral data. We’ve talked a little bit about surveys and research and we’ve talked about behavioral science. I sometimes like don’t really fully understand where like one thing starts and ends and the other one begins and particularly I suppose when it comes to more like the research and behavioral science side. It does feel like there is this level of rigor in behavioral science that is much higher and it’s much more about the why. But can you help me nut this out just to make sure I’m getting it right?
0:18:22.5 Lindsay Juarez: I guess behavioral science historically, especially if you look at, like, Richard Thaler and Kahneman and Tversky, has really come up through experimental contexts. And so I think that’s where the real focus on rigor, and let’s run a study, let’s run an RCT, let’s run an A/B test, so then we can feel more confident in the conclusions we draw, is sort of built in. Okay, it came up through a soft science, but still a science, and it’s focused on experimentation. And then I think also, because so much of the content and the findings themselves are, people think they’re doing one thing and then it turns out it’s something else, that also really invites, okay, we have to feel more confident in these conclusions. We need hard data, we need a test, we need to untangle these elements.
0:19:17.5 Moe Kiss: I think the thing that I feel like I’m up against, like, I’m not going to lie, one of my favorite books is Work Rules. Like, I love all the Google experiments. They are just, like, the funnest thing ever. And I swear in another life that would be my job. But it sometimes does feel like the outward looking, as in, like, trying to understand our customers, always takes priority over, like, using experimentation to understand our own employees. Is that something that you feel, like, plays out in the consulting you do? Is there, like, a preference of, like, where we most want to use behavioral science?
0:19:54.4 Lindsay Juarez: Good question. We mostly are targeting customers. There have been a handful of projects looking at especially employee well-being and happiness, productivity and engagement. I think as AI continues to develop, that’s a piece where thinking about behavioral science is really interesting. And so if I can use AI tools to summarize emails more quickly or put together a report, that’s great, but there is a real recalcitrance to adopt it. I think people are worried about, will it take my job? Does it do it well? Is it destroying the rainforest? How do all of these pieces come together? And so, like with anything, how do you get someone to adopt a new behavior is partly a behavioral science challenge. And so I think that’s sort of the next wave, perhaps, in behavioral science consulting, especially internally.
0:20:56.2 Moe Kiss: Oh, I’ve just had this huge light bulb. I feel like I’ve been thinking so much more, especially with AI usage, about how do you understand productivity gains? But it sounds like the much more interesting question is, how do you understand adoption, and particularly reluctance to adopt?
0:21:15.3 Tim Wilson: Well, it should be both. I guess, coming from psychology, social psychology, behavioral science, it feels like there are... I hate to say frameworks, but I’m going to say frameworks, but perspectives where you break down a problem, like, how do we get people to get past this page or take this action. And there’s kind of a marketer or a UX person or a designer, and it feels like the ideas can be kind of crude, like, oh, we’re just going to throw a pop-up banner in front of them that they have to click out of, which is kind of based on loose human intuition, or it’s based on what they’ve seen done in the past. And these things: let’s make it blink, let’s make it flash. This is where I put, like, behavioral science up on a pedestal. It feels like there’s a level down of... It may still result in saying, let’s put a pop-up up there, but it’s kind of grounded on, let’s try this thing because it’s a proximity thing, or it’s lowering a barrier, or it’s offering a benefit. So, like, when you’re presented with the challenge, or even the example you said earlier, where it sounded like a client really wants to do this thing, so they’re trying to define the problem as being that thing that they already know how to fix.
0:22:47.5 Tim Wilson: And then it sort of sounds like you may come in and say, well, if I take a more thoughtful process and look at it through lenses of trying to influence behavior, it’s going to point me to somewhere else. Like, do you have sort of... Is that how you’re working, with kind of mental models of, I need to think about... You talked about that experimentation. I don’t know, like, the 3B framework, are there things like that and others where you say, I’m just going to assess it first through a few different lenses and see what bubbles up? Or am I completely botching how that sort of work occurs?
0:23:33.0 Lindsay Juarez: No, so the 3Bs that you allude to are sort of the way we approach a problem and the diagnosis. And it’s, what is the key behavior that you’re trying to drive? What exactly do you want the user to do? At what time? Do they need to repeat it? It’s getting, we call it uncomfortably specific, very, very precise on your key behavior. That then makes it much easier to understand what’s keeping them from it. What are the barriers? That’s the second B. And then, how can you increase motivation, increase the benefits, the perception of benefits that someone will get? Third B. And so for me, I think the hardest piece is that first one, the behavior. Is it, I want people to click on the landing page? Or is it, well, actually, I want them to click on the landing page because I really want them to link their bank account? And so understanding what it is that really, really... What do you want the user to do, what do they want to do, what goals do they want to accomplish, I think that is sometimes very, very painful. And the behavior that you need to fix is not necessarily the one that is the easiest to address. But I think that that’s where it gets interesting. That’s where it’s hard. That’s where it’s fun.
0:24:55.5 Tim Wilson: Interesting. Got it.
0:24:56.9 Lindsay Juarez: Is that a satisfying answer?
0:24:58.7 Tim Wilson: That’s a… Yeah.
0:25:02.5 Moe Kiss: Do you find that some… There was a little comment you mentioned before about companies being bought in. Do you feel like companies have to get to a certain stage of maturity in terms of… It sounds like there’s a really strong emphasis on experimentation, so that needs to be kind of the backbone already that people have a strong understanding and willingness to do experimentation. But I suppose that I’m just thinking of people that I might work with who would be like, oh, but we would just do an experiment. We would make the change and then just do an experiment. Do you feel like some companies are resistant to engaging with behavioral science and that the ones that you do engage with, they just get it and they’re kind of more mature in their thinking here?
0:25:47.0 Lindsay Juarez: I think people are at all different stages. Sometimes they saw a talk. Sometimes they’re deep into it. They’ve read all of the books. They’re excited. You have some champion. I think we have the most success at trying bigger swings when it’s a smaller company because it’s just easier to take risks, I think, when you don’t have giant infrastructures and thousands of people. So that can be, I think, just tough when there are levels of bureaucracy and teams are siloed. So if you do think about this system in an entire flow, it can be difficult to make the sorts of changes that you might want. But I think, I probably flatter myself, but I think behavioral science and the idea of nudging is becoming more popular, more palatable. And so I think there is an appetite, especially if you can say, we think this will make it easier for users. And especially if you couch it in terms of friction, if you’re using the right language for a design team, for the engineering team, I also think that makes it easier to then take those insights and incorporate them.
0:26:53.2 Tim Wilson: So how often... I mean, is it... There’s someone coming in saying, ah, do behavioral science and give me the magic answer. And you say, okay, the first thing we need to do is figure out what behavior, the first B, and we’re going to get uncomfortable. And is there pushback? Do you actually find that they’re like, oh, wait a minute, we actually aren’t really clear exactly what we want to have happen. We just want more customers, or we want more monthly active users. And does that wind up being kind of a sticking point, where you’re like, oh, we haven’t even gotten to figuring out the behavior, we just got to figure out what behavior we’re trying to influence, and you’re getting asked to apply your brand of behavioral witchcraft without clarity on, like, what the result looks like? Or... Because you said it could be uncomfortable. So that made me think, I’m like, oh, is that...
0:27:51.3 Moe Kiss: Yeah, uncomfortably specific.
0:27:52.6 Tim Wilson: Yeah, uncomfortably specific makes it sound like people have gotten uncomfortable, so why?
0:27:56.3 Lindsay Juarez: I think it’s uncomfortable because you have to prioritize a group, or you have to prioritize a particular step. And that can be hard, to, like, kill your darlings and not have these other areas. And it doesn’t have to be forever, right? You can have multiple key behaviors. You can put things in a backlog. You can say, we’ll test this one first and then this other one, but let’s try to hit the most high-impact option that we have in front of us. So I should say, what is uncomfortably specific? It would be something like, I’m thinking about my son’s daycare: we want parents to pick up their child by 5:55 every day. And so there, you have the timeframe, you have who’s doing it, when they’re doing it. And then that invites, okay, well, is it the timing that’s the problem? Is it that they’re sending other people? The example’s going off the rails, but I think that sort of specificity, then, in the tech context means you have to think about, where are people coming in from? Is it a particular keyword group? Are we interested in just people who land on the page? Are we interested in referrals? That then can let you design more specifically, right? Designing [0:29:16.8] ____.
0:29:18.1 Moe Kiss: Yeah. Can I interject? So with this daycare example, is the problem that businesses come to you and say, oh, our staff are doing overtime and we want to fix the fact that we need to pay staff longer hours? And you’re trying to narrow it down to be like, we want parents to pick up children by 5:55 because that is the narrower scope. Am I thinking about this the right way?
0:29:43.3 Lindsay Juarez: It is nice to narrow your scope and get so specific on the behavior, because then different solutions arise. Actually, I’ll give an example. This is a case study. This is work we did with TikTok, now a couple of years ago, but they were really interested in reducing potential misinformation on the platform. If something is verifiably false, TikTok would want me to say they take it off, because they do, but things in that sort of nebulous ‘I wonder if, what if’ territory, those sorts of things, are developing situations. How do they get people to not share that, to not watch it, to not post it in the first place? You can see already, reducing potential misinfo on the platform could come in a couple different ways, right? Is it stopping a poster from posting? Is it a poster takes something down? Is it I don’t share it? Is it I report it to a moderator? All those different mechanisms and potential key behaviors: what do you want to drive? And so we worked with them. It’s an iterative process, and we landed on: they want viewers watching to not view it and not share it. So you sort of stop the spread now that it’s out in the world, but not farther up the funnel.
0:31:00.0 Lindsay Juarez: And so we worked with them then to design an experiment. They ran it in product, and then eventually rolled it out all the way. But basically, if you put a flag, if you put a banner on there labeling that this is potential misinfo, potentially inaccurate, I think, is the exact wording that they used. And then if someone clicked the share button, there was an additional pop-up, add some friction: are you sure you want to share? And the emphasized button was no, cancel, go back. Then you could reduce shares, I think it was by 24%. And the banner reduced views and likes by, I think, 5% and 7%. So depending on your anchors, that’s impressive. For us, it’s impressive. And so that’s the kind of thing where a particular key behavior invites a particular design. If you were focused on creators or some other part of the funnel: different intervention, different tests, different outcomes.
0:31:53.0 Tim Wilson: But I mean, that example does seem like it’s getting to on the one hand, without thinking of it through one lens, you’d say, well, we’re just going to ask them to confirm what they’re doing. It sounds like you said, oh, we’re specifically adding friction, which is the counterbalance to or the flip side of…
0:32:15.5 Lindsay Juarez: Benefits, yeah.
0:32:16.4 Tim Wilson: Benefits. So looking at it, it seems like it would open up... We’re just trying to add friction here in a way that is not offensive. Like, you could be adding friction and, if you said, well, you’d be stupid to share this, or something like that, you couldn’t do that. It does seem like that’s the sort of thing that... I mean, I feel like, as an analyst, to stop and try to think through... We want to inject friction, so let’s stop and have... Like, we know that if we inject friction, that will reduce the number of people who get through it, because, the flip side, if we’re running a checkout process and we remove friction, more people will go through it. So focusing on ideas for, in that case, adding friction seems like it would open up a broader set of testable hypotheses about the specific mechanism for adding friction, as opposed to just saying, let’s put up a second prompt. I don’t know. Like, it feels like really…
0:33:24.3 Val Kroll: Like really canceling your gym membership?
0:33:26.4 Tim Wilson: Yeah. I mean, it just feels like trying to add a little more thought then kind of expands the possible solutions, and in many ways is probably often expanding kind of the testable solutions, maybe. I don’t know. Who wants to go next?
0:33:49.8 Val Kroll: Well, one of the things that I was thinking about, because we've talked about it a little bit, you referenced the funnel and the steps. And so Tim and I were just talking about how we still can't believe that we see people posting things like, the funnel is dead. And it's like, when will "the funnel is dead" be dead? Because we know that we can't just talk about it as like, oh, this is an awareness problem. Like, just wave our hands and increase our ad spend, and hopefully that fixes all the problems downstream, right? And so it seems like a step towards more maturity for organizations to think about more nuanced journeys and where these different audiences are coming from. A primary care physician who is out in Idaho versus a specialist who's practicing inside of a hospital system in a large city, those are not the same doctors. We need to consider them very differently. But I noticed that once we can get to that conversation of this more detailed, nuanced journey, it actually sometimes becomes just as hard to use as saying, oh, awareness to consideration, like, what's that drop-off?
0:34:53.7 Val Kroll: And so I'm assuming that journeys, and thinking about those discrete paths to purchase, or the different conversions, is a part of your work. What are some of the best ways you've seen clients use journeys, or been able to work with journeys yourself, to help people understand some of the nuance and complexity, and the background or context like we were mentioning earlier, to help figure out what those key behaviors are and how to apply some of that?
0:35:19.8 Lindsay Juarez: Thinking about segmentation or?
0:35:23.5 Val Kroll: So yeah, so basically just like some of the best ways you’ve seen using journeys and the exercise of creating a journey to help inform next steps in work or apply some of those practices or principles.
0:35:37.4 Lindsay Juarez: So I guess thinking about behavioral maps and then understanding every step in the flow, it can feel, I think, very obvious, especially when you come in and you're like, oh, we're going to do a behavior map, and people are like, a user journey, we know that. And then you get into it. So what we do in the behavior mapping process is basically all of us on the project are just going through the flow over and over again, signing up, taking screenshots of every page, and then trying to identify, okay, on this screen, what's going on? What do we think is happening from a psychological perspective? What do we know about the way these choices are presented, about mindset? What have they sort of put forth, and what mental model are they building? And so you end up with these very dense things. And I guess I'll use this example. We worked with One Medical. They are sort of a UX interface for then getting primary care, seeing a doctor. And they're an employer benefit. So an employer would hire them, and then the employees get a letter, an email: sign up, please join One Medical. And they were really focused on, can we get people to sign up.
0:36:51.2 Lindsay Juarez: And we said, after they sign up, that's it? And they're like, oh no, actually we need them to then receive care. That's better for the employee. That's better for us, because then people are actually utilizing the product and the contract gets renewed. But you could see that they hadn't thought about it in that way before, because they were really centered on the signup flow: as soon as you got through it, you landed on a homepage, and the homepage said, great, basically, you're done. And so now you have, in turn, told the user they're done. And so trying to disrupt that process to then actually get you to have a checkup, to see a doctor, can be very, very powerful.
0:37:31.6 Lindsay Juarez: So what we did basically is redesign that landing page. If you could see it, you would say, wow, so beautiful. But changing around the CTAs and helping people understand what's the next step and what's the progression can be very powerful, because we're always looking for feedback and coaching. I mean, that's probably an exaggeration, but especially when I'm trying out a new product and going through the onboarding, I need to be told how to get the most out of it. And so saying, Val, your next step is to sign up for an appointment. And here, we have already chosen a doctor. You can change your mind, but here's someone that we think is in your area, they're going to be right for you, and here are the available times. And so now for Val, the choice is no longer, what do I do next? What was One Medical? Why did I sign up? It's, oh, is 8:30 or 10:30 a better time for me to go in and get seen? Right? You have shifted all of the different questions.
0:38:32.7 Moe Kiss: Okay, I'm going to go rogue. This topic has been swirling around in my mind for the last few weeks and months. A really good friend who works in the belonging and inclusion space had lunch with me and was talking about this. And, you know when you're thinking about something, but you're kind of not there yet? And then just randomly I happened to start reading the book called Make Work Fair at the same time.
0:39:00.1 Moe Kiss: And I think what is shifting in my perspective is that I felt that part of my responsibility was to try and I guess, quote unquote, I’m doing the air quotes, “help people understand why like fairness and equal opportunity is a good thing in the workplace”. And I think what’s really being disrupted in my thinking is like, no, you shouldn’t spend your energy on trying to convince people of that. You should spend your energy on trying to set up processes and practices to help people make a better decision. Like whether they believe in it or not is irrelevant. It’s about getting them to make the best choice. And I’m just trying to like triangulate that with the exercise example, or sorry, the medical example you just shared, right? Like the point is not to convince someone that seeing a doctor is going to be good for their health. The point is to just get them to make the appointment because it’s going to be good for their health. So like, do you see the like crossover here or is it just my mind that’s still processing all of this?
0:40:04.9 Tim Wilson: And do climate change next, you know.
[laughter]
0:40:09.2 Val Kroll: On our next episode…
0:40:12.5 Lindsay Juarez: No, no, 100%, 100%. I, as a behavioral scientist, do not, I don't really care why someone is doing something if they are doing it. And so we have a phrase for it, we call it doing the right thing for the wrong reasons. And basically, if you, as an ethical company, or as a person trying to improve health, or an insurer, whatever it might be, if you are getting people to exercise more, to eat more salads, to save for retirement, it doesn't matter if they're saving for retirement because they really care, they understand financial literacy, they value savings and interest rates, or it's just that when they signed up for a checking account, you also gave them a savings account and set up direct deposit, right? That's great.
0:41:09.6 Tim Wilson: Well, and that's the world, like on the financial side, the opting out versus opting in. Is there language for some of those principles? Like, you mentioned proximity earlier with a physical space and the snacks example, you've said friction, and immediacy, I think, is another… Are there kind of macro pillars or high-level things where you're like, look, these are the things; anything else you're doing that doesn't fall into one of these buckets, you are just playing around in the margins, because these are the things that are going to trump any of those other sorts of things you might do? Does that make, is that fair? I have no idea if the answer is, no, I went to a lot of schooling because this is really complicated and nuanced, and I'm not going to give you the four pillars of behavioral science, it doesn't exist.
0:42:12.4 Lindsay Juarez: It's both, it's both is the answer. Certainly you want to make things easy for people, you want to make it fun, you want to make the reward as close to the behavior as you can get. So insurers are always saying, right, do this difficult, hard thing, and we'll give you 5% off of your next six months' premiums. Like, whoa, no, of course people aren't doing this, it's never going to happen. And so certainly, whatever you can do, basically, I think always thinking about the here and now, and what is making it simpler to do, what is making it more enjoyable, that's where you're going to get the most.
0:42:54.8 Moe Kiss: So it’s like, I’m going to give you the 30 day trial first, not give you 10% off in a year’s time?
0:43:01.6 Tim Wilson: Well, that's when I'll give you a 30-day free trial for which you have to put your credit card in, because then you have to turn it off. So you introduce…
0:43:10.7 Val Kroll: Yeah, I’m not going to send you a reminder.
0:43:12.4 Tim Wilson: I’m not going to…
0:43:13.1 Lindsay Juarez: Yeah. That’s behavioral science, but bad. That’s not the goal.
0:43:17.8 Moe Kiss: Wait, what’s other bad behavioral science?
0:43:21.3 Val Kroll: Yeah, using it for bad.
[laughter]
0:43:25.7 Tim Wilson: And give specific kinds of examples, please.
0:43:30.9 Lindsay Juarez: So it even has its own catchy name. So there's nudge, right? Behavioral science for good. And then there's sludge, which is when you use behavioral science to slow people down: you're making it harder, you can't get out of it. I think Dilip Soman is to thank for that term. And it's so intuitive. Now you're like, oh, I understand. You're making it harder for me to cancel. You make it so I don't understand the terms. I click the wrong thing. It's all of that.
0:44:01.8 Val Kroll: I remember when I was, like, first getting into CRO, about a hundred years ago, there was a web series, I don't know if you guys remember this, with Peep Laja and Oli Gardner, and it was called Page Fights. And they would oftentimes invite a special guest to join. And people would submit landing pages or homepages to be essentially ripped to shreds by them. But it was so funny, because it would be like, remember those trust or verified signs? Like, you can submit here. Like, why would you put that so far away from the submit button? Or, like, you should never put this below the proverbial fold. And so they would take all these principles and layer them on top of each other, when you're like, who knows if the combination of all these things together is actually leading to any good or bad performance. But I just remember, it was just one after another, just throwing it at these pages. Do you guys remember? Am I the only one who's seen it?
0:44:58.6 Moe Kiss: No, but it sounds fun.
0:45:00.5 Val Kroll: It was. I mean, they're hilarious. So it was always a good time. But I remember thinking those had some of the… It was before, like, CAN-SPAM and all that. So, like, the auto-checked "email me" boxes and things like that. And so I was reminded of some of that. Now I'll remember that as sludge, thanks to you, Lindsay. Some of them were good ones, too. It wasn't all bad ones, but just thinking about how they're used in combination or, again, all those layered assumptions getting pushed in together.
0:45:30.3 Moe Kiss: Okay, thanks for, like, the trip down memory lane. But still, I've got, like, 50,000 things to go through with Lindsay. So, like, onto the next topic.
0:45:36.7 Tim Wilson: Okay.
0:45:38.7 Moe Kiss: You mentioned earlier that sometimes it's a lot easier to have a big impact at a small company. One of the things that's top of mind for me is, sometimes there's really good intent, especially when it's nudging for the betterment of our employees. The one challenge I do have is, I thought the inverse, I thought the bigger your company, the easier it would be, because you could experiment more easily because you would have bigger samples. It gets very tricky at a smaller company, especially if you want to test something specifically on employees. I imagine this happens all the time when you work with companies, right? Like, we changed something recently in our last round of performance reviews, which we wanted to do to better recognize contributions people were making that, I don't know, you could call glue work, kind of unseen work, that sort of stuff. And my desire is always, let's experiment. And then there are these technical barriers, like, oh, you can't do it because the system doesn't let us do one questionnaire for one group and one for another group. Do you find that that is often a blocker, or is it that if people have a strong enough desire to experiment, so that they understand why the outcome happened, they find a workaround?
0:46:58.4 Lindsay Juarez: Yeah, sometimes, you're right, you can't test it well. You can't have a perfect RCT, because of the tech, the sample size, whatever it might be. I think any measurement is probably better than none, or at least better than vibes only. And so I think that's what we push for. And there are sort of tiers of experimentation and the analytics that you can take on, so it's trade-offs, right? I think the nice thing about a smaller company is simply getting the buy-in, and it's much easier for everyone to have the same language and the same goal, whereas once you're in something really big, it's so siloed: different KPIs, different metrics, everybody's getting evaluated differently.
0:47:47.7 Tim Wilson: So I’m sure Moe has a million more questions, but unfortunately we are running out of time, so I’m going to have to nudge us towards a wrap. So this has been a really interesting discussion. We didn’t have any mention of Katy Milkman and Choiceology, the book and the podcast.
0:48:07.9 Moe Kiss: I was very surprised Tim, you held back.
0:48:09.9 Tim Wilson: That’s why I just slipped it in.
0:48:11.3 Moe Kiss: I do, I do love Katy Milkman.
0:48:13.3 Tim Wilson: Because I was thinking of some of the things where she talks through the studies, and we won't pursue the whole… like, when studies are done in an academic setting, there can be challenges and risks in trying to figure out behaviors. But at least I've said it, so now I've put it out there.
0:48:30.1 Lindsay Juarez: You covered it.
0:48:30.9 Tim Wilson: Without it being a question. But this was a great discussion. Before we wrap up, we like to do a last call, where we go around and have everyone share an article, a thought, a post, a movie, whatever might be of interest to our listeners. Lindsay, you're our guest. Would you like to go first?
0:48:53.2 Lindsay Juarez: Yes. I am so excited to share this. This is an academic paper that I… I know.
0:49:00.4 Tim Wilson: Spoke like an academic.
0:49:02.6 Lindsay Juarez: It's going to be worth it, I promise you. This is a delight. So this is by a researcher who really focuses on communication, and especially miscommunication, and where we don't realize that things are going awry. And so it's very clever. He had Chinese speakers come in and read ambiguous phrases while he recorded them. So it was something like, the question was, oh, what have you been up to? But they were told to say it in a particular way, right? That's ambiguous. What have you been up to? Oh, what have you been up to? So they were assigned a particular intonation, and then listeners listened to those recordings and were supposed to guess which intonation it was. And when it's Chinese listeners, so they speak the same language, they understand the words being said, then you get people saying, I think I'm right 85% of the time, but they're actually right 44% of the time. Chance would be 25%. But then he also got English-only speakers to listen to these recordings. So the recording is in Chinese: I only speak English, I only understand English. And then the people have to guess which meaning it is. And they rated their confidence. The English listeners are getting it right 35% of the time, but they think they're right 65% of the time. And it's that 65%, I don't speak this language, but I feel confident that I know what's being said, that is wild to me.
0:50:39.8 Moe Kiss: It is wild.
0:50:42.7 Val Kroll: Wild.
0:50:45.2 Moe Kiss: Outrage.
0:50:45.3 Lindsay Juarez: Yeah, it’s just nuts.
0:50:46.5 Tim Wilson: What’s the name of that paper?
0:50:48.1 Lindsay Juarez: What is the actual title? The Extreme Illusion of Understanding. And then they also asked the speakers, do you think people will get this? And they were like, yeah, at least half the time. So basically everyone is always wrong. You're not communicating nearly as clearly as you think.
0:51:09.9 Val Kroll: In conclusion.
0:51:10.7 Lindsay Juarez: And you as a listener are just so confident that you’re getting things and you’re not. Even when it’s in a language you don’t speak. I love it.
0:51:20.2 Tim Wilson: I think that describes so much more of my life than just even reading intonation. Oh, wow. That sounds fascinating, an academic paper. I feel like your description of it might have been better than me trying to read an academic paper. But we’ll see.
0:51:39.9 Moe Kiss: I know. I feel like your description was amazing.
0:51:42.2 Tim Wilson: I’m going to give the abstract a shot, but we’ll see.
0:51:45.5 Lindsay Juarez: Report back.
0:51:48.1 Tim Wilson: Moe, what about you? What’s your last call?
0:51:50.9 Moe Kiss: Well, I did have a different one, but I've swapped because I just mentioned this book. So I have been reading Make Work Fair. I'm going to butcher these poor women's names, but Iris Bohnet, I think I did that okay, and Siri Chilazi is the second author. And yeah, I'm just loving it because, like I said, it's really challenging my perspective. I guess I always thought a big part of making work fair was about winning hearts and minds. And it's not, it's about what processes and policies you put in place. And it's been really influential. It's really changing the way that I approach this and how I spend my energy. So I definitely recommend a read.
0:52:32.3 Tim Wilson: Interesting. Are you reading it or listening to it?
0:52:35.6 Moe Kiss: Oh no, this one I'm listening to. I don't read work books anymore. My Kindle is purely for whatever trash I want to read.
0:52:45.3 Tim Wilson: What about you Val?
0:52:48.5 Val Kroll: So mine is a Medium article that was published in the UX Collective, called UX or PX, Why Naming Matters. And this was inspired by, some time ago, the VP of product experience at Duolingo making this big post that they renamed their whole UX function to product experience. And you think about that, it's a pretty big switch from user experience to product experience, but it caused, within their realms, this big kind of uproar. And it even inspired Jakob Nielsen to respond; he's like the father of UX, Nielsen Norman Group and all the heuristics and the norms, right? And in reading this post, it was interesting how they break down that shift and what it could potentially mean, and whether other people will follow suit. And there was a whole section in the article talking about why naming matters and why UX and PX folks care about names. And it reminded me a lot of how we hate the term CRO, but we also can't stop using the term CRO for conversion rate optimization. Tim asks me all the time, like, I've heard that we don't say that anymore, but what do we say in its place? Like, I don't know, experimenting, experimenters? It's such a nice shorthand. But anyway, it's fun to step a half step away from the analytical experimentation side to see that other people are considering and thinking about those things too. So an interesting read.
0:54:18.2 Tim Wilson: But it matters more if the product is having a good time with itself than it matters if the user is having, is that what the product experience is? Is the product fulfilled?
0:54:27.7 Val Kroll: Is the product satisfied?
0:54:29.3 Tim Wilson: Yeah.
0:54:29.5 Val Kroll: Well, because if you think about UX or even CX, right? It's the intersection of design, the product, and what's good for the business, which was actually even a part of your introduction to the way that you apply behavioral science, Lindsay, when we first opened this call, which I thought was interesting. So it's prioritizing the goals of the product and the needs of the business more than individual users, but the byproduct of that is a good user experience. So it's a reprioritization. It's an interesting shift. All right, what about you, Mr. Wilson? What do you got for us for our last call?
0:55:05.8 Tim Wilson: So I'm going to do a quick bit of logrolling and just call out that Analytics the Right Way: A Business Leader's Guide to Putting Data to Productive Use is now available as an audiobook, I think on most major platforms. So if you haven't bought that yet, that's the book I co-wrote with Joe Sutherland.
0:55:23.7 Moe Kiss: Or you could do both.
0:55:25.1 Tim Wilson: You could do both.
0:55:27.3 Moe Kiss: I have both. I bought both.
0:55:27.5 Tim Wilson: Buy one, buy ten. No, buy fifteen. Buy every possible… You can get an e-book. You can get a physical book. So buy the book. When it comes to things that are written by others, I will plug a new Substack that I came across that I think will appeal to a certain audience, and that is the Can't Get Much Higher Substack, by Chris Dalla Riva. It's basically the intersection of music and data. There seem to be a whole bunch of people cropping up who are doing cultural things with data, and the one that I found him on was the greatest two-hit wonders. Because, oh, everybody knows one-hit wonders, but what about two-hit wonders? And it was interesting. A lot of times it's, where do you get the data? How do you actually define a two-hit wonder? Iterating on kind of how that breaks down.
0:56:21.2 Moe Kiss: You have to get uncomfortably specific on that definition.
0:56:25.0 Tim Wilson: You do. Well, it's actually one where you pull at it and then you're like, wait a minute. I think the greatest two-hit wonder… I'm going to forget, because I'm just not enough of a music person. But his first definition included a couple of bands that I totally knew of, and he was like, well, they didn't count because they're more album bands, and nobody would consider them to be a two-hit wonder. Like, they're a legendary band. And of course, my personal music history is terrible, so I can't remember which bands those were. But anyway, with that, Lindsay, thanks again for coming on the show. This was a fun and interesting discussion. It's got me thinking. For listeners, hopefully you enjoyed the show as well. If you have any questions or thoughts or suggestions for show topics, you can reach out to us through our LinkedIn page. You can find any of us on the Measure Slack. Or you can just use good old-fashioned email at contact@analyticshour.io. If you've enjoyed the show, this episode, or multiple episodes, we would love to have a rating and a review. We'll be talking to Lindsay offline about how we can nudge more users to do that. Do it now. Do it now. Urgency. We will give you a 5% discount on the…
0:57:45.6 Lindsay Juarez: No!
0:57:46.6 Tim Wilson: Seven years from now. It'll be perfect. So, no show would be complete without thanking our producer, Josh Crowhurst, who is still working his way through some changes in our production process, which we deeply appreciate him doing. And we will continue to nudge him to be awesome, which he doesn't need any nudging to be. Michael was so much better at these wrap-ups than I am. But with that, I know I speak for my co-hosts on this episode, Moe Kiss and Val Kroll: no matter who you're trying to nudge, or whether maybe you're trying to generate some sludge, you really shouldn't be doing that in most cases, you should always, always keep analyzing.
0:58:37.7 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @analyticshour, on the web @analyticshour.io, our LinkedIn group and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.
0:58:56.6 Charles Barkley: So smart guys want to fit in, so they made up a term called analytics. Analytics don’t work.
0:59:01.8 Speaker 7: Do the analytics say go for it no matter who’s going for it? So if you and I were on the field, the analytics say go for it. It’s the stupidest, laziest, lamest thing I’ve ever heard for reasoning in competition.
0:59:16.9 Val Kroll: You sound good, Tim.
0:59:18.6 Tim Wilson: Okay. Well.
0:59:20.8 Val Kroll: A little quieter than usual, but clear with no static.
0:59:25.9 Tim Wilson: And not even my background hum, which is why I got rid of this mic in the first place. Awesome. Fantastic.
0:59:33.0 Moe Kiss: You do sound quieter than usual, but maybe it’s just because he’s not on a soapbox yet.
[laughter]
0:59:45.3 Val Kroll: Rock flag and nudge or sludge!
[music]