#154: Podcast Movie Club: The Social Dilemma

We didn’t want to have a discussion about Netflix’s The Social Dilemma, but, somehow, we just felt compelled to do so. It was almost like we had a generally unlikable character from a TV series about advertisers’ attempts to manipulate consumer behavior in the 1950s and 1960s transplanted in triplicate into an AI that was optimizing Netflix’s reach and engagement by getting us to talk about the movie. Or, it addresses a very real issue (a…dilemma, even?) in an approachable manner that, if you’re like us, has alarmed your friends and relatives. It certainly seemed worth a discussion, so we had one!

Packages, People, and Podcasts Mentioned in the Show

Photo by Daria Nepriakhina on Unsplash

Episode Transcript

0:00:04 Announcer: Welcome to the Digital Analytics Power Hour. Michael, Moe, Tim, and the occasional guest discussing analytics issues of the day, and periodically using explicit language while doing so. Find them on the web at analyticshour.io, and on Twitter @AnalyticsHour. And now, the Digital Analytics Power Hour.

0:00:27 Michael Helbling: Hi, everyone. This is the Digital Analytics Power Hour, and this is Episode 154. Okay, grab your popcorn, and since we all work from home right now, you’re probably already in your comfy trousers. It’s movie night at the Digital Analytics Power Hour, but first, make sure to go to iTunes and give us a rating and review; it helps the algorithm. Also follow us on Twitter at AnalyticsHour, because that’s good for the algorithm, too. In addition, join our LinkedIn group. We don’t actually know what that means, but I’m sure it helps something. We are discussing the recent film The Social Dilemma, which came out a couple of months ago on Netflix. Discussing a movie is something we’re trying out here at the Power Hour, hence the joke about all of our social channels. Okay, Moe Kiss, Head of Marketing Analytics at Canva, you are my co-host. Welcome.

0:01:17 Moe Kiss: Thanks, nice to be here.

0:01:19 MH: Great to talk to you. Also discussing the film is Tim Wilson, Senior Director of Analytics at Search Discovery.

0:01:27 Tim Wilson: And you can like that or retweet it.

0:01:31 MH: You can smash that subscribe button.

0:01:34 TW: Smash that subscribe button.

0:01:36 MH: If we can get 10,000 likes on this episode, we’ll do a giveaway of a T-shirt or something. That’s what the YouTubers say, right? All right. And I’m gonna say something. I am Michael Helbling, the founder of Stacked Analytics. Okay. Who wants to go first in denouncing social media? I mean, discussing The Social Dilemma. I thought, to kick us off, Tim, we elected you to give everybody an overview of the film just to get us started, and then we can get into some of the topics in deeper discussion.

0:02:07 TW: Just based on my gift for concision? That would be the most expedient way to get through that. Hopefully, many people have watched it, ’cause it did organically come up. But for anyone who didn’t, the structure is kind of interesting. It is a combination of interviews in classic documentary style, but with former product managers and engineers and executives from a whole range of social media platforms, centered around Tristan Harris, co-founder of the Center for Humane Technology, who used to be at Google. That’s kind of the exploring-the-algorithms-within-social-media part, asking them a lot of questions in that style. But that’s interspersed with a fictionalized account of a family: a husband and wife and three kids. And basically there’s the one older sibling who seems to be the voice of reason, has more of a clue than her parents or her younger siblings. And then the two younger siblings: one is a high school teenager, the other, I think, is maybe middle school aged, and both of their experiences with social media show how things go awry.

0:03:20 TW: It’s pretty well done. Probably two-thirds to three-quarters of it is the interviews, trying to stitch together the evils of social media, although, I would say, it’s called out in the framing that some of those are unintentional evils, but all of the various downsides of social media. And then it tries to make it real with this family. And then maybe the third component is it’s got Vincent Kartheiser? I don’t know the actor’s name. They basically have three different people as the AI representatives, so it gets into the AI, which I would say is one of the areas I had a little bit of a beef with. But that’s also part of the fictionalizing piece. But it basically tells this narrative of: we are doomed, thanks to the algorithm.

0:04:12 MH: I did watch that movie and became very interested in finding out more about what extreme centrism is, so at least that is something. Sorry.

[laughter]

0:04:25 MK: I think one of the things… I have a friend actually who’s incredibly passionate about this topic. And so for me it was really interesting watching it because it’s a topic I’ve been talking about for probably a year now with this good friend. She actually went to the Center for Humane Technology, and brought back this concept of humane tech dinners.

0:04:46 TW: If we’re not naming this friend, can we say this friend might not have been allowed to talk about this topic publicly, and therefore will remain the unnamed friend?

0:04:55 MK: She’s a very dear friend. Let’s just go with that.

0:04:56 TW: I wasn’t gonna reveal the gender, but if you wanna narrow it down to half the population, then…

0:05:01 MH: Let’s just remember that anything that Moe discusses on the Digital Analytics Power Hour podcast, or at its associated conference, does not necessarily represent the views of her employer. Okay, there you go.

0:05:14 TW: Or her friends.

0:05:15 MK: Or my friends.

0:05:15 TW: Or her friends’ employers.

0:05:17 MK: But one of the things that we did talk about quite a bit was getting momentum behind people caring about this problem: what are the negative sides of technology, and what is it doing to our society, to our relationships? The movie did a very good job of making this relatable and understandable to the mass population. And ultimately, that’s why so many people participated in the movie, because that was the real aim of it. This is a discussion that has been happening within the tech industry and hasn’t been getting as much traction as it should, because it doesn’t have the buy-in from the wider population. And I do think this was a way to make it relatable. I’ll get into the pros and cons of the relatability, but that’s another thing.

0:06:05 TW: The reality is I’ve got an aunt who… I adore her. We don’t have a lot of regular communication, but she actually emailed me and said, “Hey, my group of friends have watched this. We’re really concerned.” So, I think it definitely did that. And then she was like, “You kind of are in that space. Have you seen it? What’s your take on it?” So, I definitely think partly it’s Netflix as the distribution platform, and frankly, partly it’s Netflix’s algorithm bubbling it up, and I don’t know how much is the algorithm and how much is Netflix saying, “This is one we wanna get behind a little bit.” Facebook, of course, came out and trashed Netflix for doing this, which was an interesting side dynamic. But, yeah, that’s a great point.

0:06:52 MH: Well, it’s interesting. And as a matter of principle, I did not watch the recommended next movie that Netflix thought I should watch after watching The Social Dilemma. I was like, “You’re not the boss of me. It does look interesting, but I’m not gonna watch it.” It’s very interesting because I think one dynamic that came out in a lot of the discussions, and there’s a number of interviews with these people who helped build these companies, like Google and Facebook and Twitter and Pinterest and other companies, is that they didn’t start out with any malice or intent really.

0:07:26 MH: It was pretty clear they didn’t begin with this intensity of, like, “Let’s go capture people’s interest and really make them dependent on us, or maximize our gains at their expense.” But they all, at least the people that were interviewed in this film, have come to realize the end result of their pursuit, and they specifically call out growth hacking, and I just wanna put that on the record. Growth hacking, bad. Okay, good, moving on. No need to discuss that more. But the end of that pursuit was where we’re ending up, which is this intense manipulation, not just of people’s habits on the platform but even their neuro-scientific psychology, whatever brain levels, dopamine, that kind of stuff.

0:08:12 MH: And it seems like there are two parts of this that stand out to me, and I don’t know if I’m diving into the philosophical side of it too quickly. The first is the responsibility of the company, and the second is the agency of the individual. In other words, I have a cell phone, and they made a lot of… The cell phone figured heavily in the movie, which it should, ’cause it’s a pretty ubiquitous device in at least the western, or the developed, world, but I have to make choices about my own cell phone usage. I’m certainly not being helped by these companies, in that they’re really trying to suck me in and get me to… The word “doomscrolling” is a word I learned during the pandemic, and it’s pretty apt sometimes. You just start going through Twitter, and Reddit, LinkedIn, Facebook, whatever it is, and you’re like, “It’s 2:30 in the morning, I should be in bed.” But it’s like, “Yeah, how much of that do I own, and how much of that is on them?” And that part is what’s really hard for me to figure out.

0:09:15 TW: That’s a slippery slope. ‘Cause you’re essentially saying that it’s gonna be hard… That is, if you take it to the extreme, the person who blames the alcoholic for alcoholism, the person who blames the drug user for…

0:09:31 MH: Well, yeah.

0:09:31 TW: For drug use, right? And, oh, it’s the kid, by the way.

0:09:35 MH: Well, or blames McDonald’s for obesity. There’s just all these…

0:09:39 MK: But I do think there is some responsibility on companies… Well, not some, but a big part of it. And to be honest, it is the culture of growth hacking, or whatever you wanna call it. The startup culture or… They’re not really start-ups anymore, they’re scale-ups or now huge businesses; they become obsessed with growth, and they motivate their employees to drive deeper engagement, more users. And that, I think, is the crux of it. It’s like, yes, maybe there wasn’t ill intent, but if you are always incentivizing your employees to increase those metrics, then they’re gonna do whatever they can to increase those metrics.

0:10:22 TW: So, I just wanna clarify that I said you’re heading on a slippery slope of blaming the person with the addiction. And you said, “That’s like blaming McDonald’s,” and it’s exactly the opposite of that.

0:10:29 MH: No, no. I’m saying there are lots of different ways of… I’m agreeing with you and providing a counter… There are all kinds of different ways of looking at this. I agree, when someone has got their brain so wrapped around stimulus and response that they can’t really put their phone down… That is an addiction, and that’s different than making a choice. And again, I’m not a psychologist, I can’t really explain the details of that, but I know there is a difference. And that’s the thing, to the same extent, like we all kind of live… Well, none of us live… Tim, you lived through the cigarette companies basically lying to the public… Tim knows all about that. No, I’m just kidding. [chuckle] I’m trying to make an old guy joke. Anyways, it’s sort of happening again in a certain sense, in that… But the weird thing is, it didn’t start with them doing it. But, Moe, you made a good point. It was like, “We’ve gotta grow super fast. We gotta scale, scale, scale, scale, scale.” And then they arrive at these really unintended outcomes, and as a society, we’re not ready.

0:11:38 TW: But back up to how they got to that, because that’s… The point was made, I think by the former Pinterest President, Tim Kendall, that when they got into this, they weren’t really sure how they were going to make money, and what happened is that advertising didn’t just become a good way to make money, it became… It is such a dominant…

0:12:03 MH: It is the way.

0:12:04 TW: It’s not like, “Oh, we’re gonna have a balanced portfolio. We’re gonna sell some of this, and we’re gonna make some money this way, and we’re gonna do some advertising, and we’re gonna have some memberships.” It became such a no-brainer, so lucrative that… And by then they’re public, and once they’re public and they’ve gotta be looking out for the investors, it just says, “Well, we kinda have to do more of that.” And then the leap from that is, “Well, how do we keep growing? Because we’re public and we have to keep growing, and the only way we’re gonna grow at scale is to drive more of this.” Well, okay. I did love the way they broke down their AI. I liked and did not like aspects of how they represented the AI, but when they broke it down into, “We’ve gotta get more people,” which you get just by… Generally through referrals, people getting peer pressure to be on, getting over the tipping point to where you get critical mass, but you gotta drive more people. You’ve gotta keep them on as long as possible; that’s more impressions that you can sell. And then you have to sell them ads that are effective. And then you do your supervised learning algorithm to say, “This is what we need to do,” and this makes perfect sense on an Excel spreadsheet. If we do more of this, we get more of this, and everybody wins. And that’s where “everybody wins” means everybody except for the kids who are getting fucked up and the adults who are getting fucked up by it.
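[Editor’s note: the “makes perfect sense on an Excel spreadsheet” math Tim describes can be sketched in a few lines. This is a hypothetical illustration of how the three levers multiply; every number is invented, and it is not how any platform actually computes revenue.]

```python
# Ad revenue as a product of the three levers the film's AI trio personifies:
# acquisition (users), engagement (time on platform), and monetization
# (ad load and price). All figures below are made up for illustration.

def monthly_ad_revenue(users, minutes_per_day, impressions_per_minute, cpm_dollars):
    """Revenue ~= users x time on platform x ad load x price per impression."""
    impressions = users * minutes_per_day * 30 * impressions_per_minute
    return impressions / 1000 * cpm_dollars  # CPM is dollars per 1,000 impressions

baseline = monthly_ad_revenue(1_000_000, 30, 0.5, 5.0)
more_engaged = monthly_ad_revenue(1_000_000, 33, 0.5, 5.0)  # +10% time on platform

print(f"baseline: ${baseline:,.0f}/month")             # $2,250,000/month
print(f"+10% engagement: ${more_engaged:,.0f}/month")  # $2,475,000/month
```

[Because revenue is a straight product, every percentage point of engagement flows directly to the top line, which is exactly the incentive the interviewees describe.]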

0:13:35 MK: I had a moment watching it where I actually was on my phone scrolling, and I was like, “Woah,” and dropped my phone and had this, like, “You’re doing it right now. You’re watching it, and you’re doing it right now.”

0:13:48 TW: You become hyper aware.

0:13:50 MK: Yeah, like this tiny second where you’re not interested in what you’re doing, so I’ll turn to my phone. And I’ve tried to institute tech-free nights in my house, but you have been a…

0:13:58 TW: Yes, I’ve just leveled up on Candy Crush! Oh no, oh sorry.

[laughter]

0:14:02 MH: Huh? What was that? I was just checking my phone.

0:14:07 TW: Sorry.

0:14:08 MK: But, yeah, they have been… Tech-free nights in our house have been a total failure, total failure.

0:14:14 MH: I’ve got a niece who… And this was seven years ago. She’s about to graduate from college, but when she was in high school, which was… Not that long ago, her friends would do the thing where, when they went out together for a meal, they would all put their phones in the middle of the table, which I know was a common thing. She’s the only teenager I have personal experience with, but that took a collective action: “We’re all gonna not do this.” Which, in the film, it’s like, well, that’s great if it’s all your buddies who you’re hanging out with anyway; your FOMO is not gonna be bad. But, oh, put me with my parents and it’s like, “Yeah, I’m pretty sure there’s something else going on.”

0:14:54 MK: But that’s the… Okay, I need two small anecdotes. Number one, I now have to lecture my mother about this. It’s actually gone the other way, where she will answer the phone, she’ll look at text messages during dinner. I don’t look at my phone during meals. If I’m with someone, I’m with someone. The other family member who’s a total pain in my ass is one Lee Isensee, because we play that game where we put our phones in the middle of the table and it’s like, “Okay, whoever gets their phone first has to pick up the bill.” And Lee is like, “Fuck it, I’ll pay for it, because I’d rather have my phone.” And I’m like, “That defeats the whole purpose of this exercise.”

0:15:32 MH: I feel like I have to come to Lee’s defense, but I don’t know.

0:15:36 TW: Why?

[laughter]

0:15:36 MH: I don’t know. Stay strong out there, Lee. I don’t know. Maybe get off your phone sometimes, but…

0:15:43 TW: My mother got wind of the exchange I had with my aunt and asked me to forward the email to her, and my mother has not actually watched the film, so I’m like, “Okay, there’s a little bit of… ” When you get into the retweeting without actually clicking through on the link, I’m like, “You’re kind of exhibiting a little bit of the problem they allude to, of sharing without knowing, without the independent thought.” She’s also my example of watching polarization politically. To her credit, in the last month or so leading up to the election, she started making a somewhat active effort to try to understand other perspectives. It is one where the… And I think this is part of Facebook’s objection. They’re like, “We did not create political polarization.” They’re like, “That already existed.” And there is that kind of famous US chart of the parties drifting apart. It’s like, “Yeah, you did. But at the same time, you literally are dropping a feedback loop, like the… ”

0:16:49 MK: You’re exacerbating the problem, significantly.

0:16:51 TW: You’re exacerbating the problem, and you’re just accelerating it.

0:16:54 MH: It’s like, “Oh, the bridge was already vibrating, but you just created a perfect harmonic for it to massively amplify the vibrations.”

0:17:05 TW: What did you guys think of the ways that the algorithm was explained? ‘Cause it was kind of personified, where they were…

0:17:13 MK: I hated it.

0:17:13 TW: Okay. That really bothered me.

0:17:16 MK: I had such serious issues with it because I feel like we’re people who know exactly what this movie is talking about, right? But for the lay person, which is 99% of people who are watching this, I don’t think they would have understood the symbolism of it being an algorithm. They would have thought that there is a literal engineer at this company that is targeting them as an individual, not looking to basically modify or enhance the behavior of aggregated groups based on their behavior. And so I felt this real visceral reaction to how they explained or how they visualized the algorithm, because I felt it could be really misleading, particularly from that really individual focus.

0:18:03 MH: Yeah.

0:18:04 TW: I actually missed it the first time; I didn’t catch it until the second time I watched it, prepping for the show. The AI does, at one point, actually say, “Oh, we’re losing him. If we do this, there’s a 62% probability that he’ll re-engage.” And the other AI says, “That’s not good enough. What else do we have?” And I was like, “Okay, there’s a nod to the probabilistic world of models.” But for that exact same reason, that’s where I get worried that the masses, anybody who walks away, which many people will, will think that every company that has data and has machine learning going on has this identified-human idea and the… And it is. It’s a nuanced thing to try to explain: trying to figure out the next best action for an individual is still different from this, like, “Oh, this guy is about to get up and walk downstairs to the kitchen, so let’s put this much delayed notification on his phone.” I’m like, “That’s not helpful, even though it makes for drama.”
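[Editor’s note: the distinction Tim is drawing, a model scoring aggregate re-engagement probabilities versus an engineer literally watching one person, can be illustrated with a toy next-best-action ranker. The action names and probabilities here are entirely hypothetical; no platform’s actual system looks like this.]

```python
# A toy version of the "62% probability that he'll re-engage" scene: a model
# has scored candidate notifications, and the system simply picks the
# highest-scoring one, falling back if nothing clears a threshold.

candidate_actions = {
    "friend_tagged_you_in_photo": 0.62,  # P(re-engage | notification), per some model
    "new_follower": 0.41,
    "trending_topic_push": 0.35,
}

def next_best_action(scores, threshold=0.5):
    action, p = max(scores.items(), key=lambda kv: kv[1])
    if p < threshold:
        return None  # "that's not good enough": try a different lever entirely
    return action, p

print(next_best_action(candidate_actions))  # ('friend_tagged_you_in_photo', 0.62)
```

[Nothing in the sketch knows the user is “about to walk downstairs to the kitchen”; it only ranks probabilities estimated from aggregate behavior.]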

0:19:13 MH: Yeah. I guess I didn’t sit back and think about it in the way you just described, Moe. So, it didn’t particularly bother me, but I can definitely see where you’re coming from. I just thought they were trying to make the concept of AI understandable. The interesting thing is… And again, I’m going back to that same guy you were talking about before, Tim Kendall. He basically said: at Facebook, we had users, engagement, and revenue, and we basically had them on dials. If we want more of this, we just turn a dial. They even talked about setting up dials in Mark Zuckerberg’s office, so he can just… “We need more people in Korea today,” turn that dial.

0:19:54 MH: The thing that I was thinking about is, ad-driven businesses in information and in entertainment have been around since radio started, maybe even before that, ’cause maybe we could even make a case for newspapers. And radio programs and television, broadcast television, are ad-driven, and they do research and find out, like, “How many ads can we put in between the show and still get you to stick around and not leave for another program? How do we create shows that make you wanna watch more television? How do we make our Thursday night line-up make you watch more TV?”

0:20:34 MH: And so, in a certain sense, the thing that seems to be scaling here, massively scaling, is the precision with which the AI is able to go after you as an individual and tune all of these billions of opportunities for engagement and content specifically to you, to draw that out even more. Whereas we used to go after target demographics and things like that, like, “Okay, what are we gonna put on after Seinfeld?” “Oh, let’s put on this. We know everyone’s gonna watch Seinfeld, so let’s use that to give a bump to this show and plan out… ” I remember when I was a kid, we’d look at certain nights of the week on NBC; it was like the comedy night, and other nights were other shows or whatever, I don’t know. But it’s not a new concept, is what I’m saying. It’s just that the computer algorithms we’ve developed have allowed us to solve the problem at scale so much more effectively.

0:21:33 TW: But if you had to apply… The thing is, because you can do hyper-focused targeting, you can actually… When they talk about… They’re just trying to find which rabbit hole they can take you down, so that gives you this… And it is. Marketers, we talk about this one-to-one bullshit all the time: I wanna make stuff that’s relevant. And then somebody who’s not in marketing says, “No, you’re trying to manipulate me.” The marketer sees it as… We tell ourselves the story that we’re giving you relevant content, but why do we wanna give you relevant content? Because we wanna get you to click on what we have put in front of you, to drive this action and a purchase. So, yeah, marketing is doing the same thing, nowhere near as effectively, although I think that’s what… I do get worried, but the clients I work with are a million years away from what Facebook is doing. And that’s not just because Facebook is smarter. Facebook is a walled garden with an obscene amount of data, and they’re still not doing what was metaphorically represented.

0:22:36 TW: Not that we’re not getting towards that point, but that’s why all the locking down, preventing the data that marketers have and can use… It’s like, well, maybe this does get back to a little bit more mass appeal. Maybe now you’re doing… You’ve got Schitt’s Creek out there appealing to… It’s gotta hit a broad enough audience that you get a few people to be like, “Oh, we can have these gender-fluid relationships, and maybe that’s okay.” If this was all hyper-targeted, there would be a ton of people who would never get drawn into a sitcom that is a little different, a little quirky, and may actually round out their world view a little bit. I think that’s the… ’Cause it narrows world views to the exclusion of everything else, which I think is where it’s different from when you had three channels.

0:23:29 MK: See, the ads piece didn’t stick with me the same way it has with both of you. And obviously I work in marketing analytics, so maybe I’m just, I don’t know, in my bubble, but it wasn’t something that I consciously thought a lot about. I was much more drawn to the side of the dopamine hits of the like buttons. And the way it affected the younger sister in the family, particularly her self-confidence, I thought they actually did an amazing job of demonstrating that, and it made my heart hurt for all of our children and our future children, like, “What is gonna be… ” I’m okay with getting a car ad if I’ve been car shopping. But the idea that my entire self-worth could be tied up in how many people like a particular post, that left me really upset.

0:24:24 MH: Yeah. So, that was probably the hardest hitting part of the movie for me, too. I’ve got two kids, 14 and 12, and luckily they’re both boys and they couldn’t care less about social media so far, but that’s gonna change. It’s gonna change. And as a parent, I don’t know that I know how to do this, and Tim, you can tell me I’m wrong, ’cause he’s better at this, but I recognized pretty early on in having kids that there is a whole new competency required of me as a parent that my parents never had to grapple with, which is: how does digital play a role in our lives? How do we manage screens and screen time? And what’s our perspective on that? How old should our kids be… When you were young, there was, like, what age are you allowed to start dating? And that’s still a question, I guess. It just doesn’t seem like people date anymore; I don’t know what’s going on.

0:25:17 TW: And, Michael, you’re not allowed to date once you’ve been married for a decade.

0:25:21 MH: Oh, my gosh. Maria and I go on dates. Is that allowed?

0:25:27 TW: You’re allowed to go on a date with your wife.

0:25:28 MH: Oh, okay, that’s good. Thanks, Tim, appreciate that. But the next question is, “How old should my kids be before I get them their first cell phone?” And I would say in their social group, right now, they are the ones without the cell phones. Most of their friends have them, but I’m like, “I’m gonna hold on for as long as possible.” And then watching that in The Social Dilemma, I was like, “That’s why I’m holding off, right there, middle school sucks.”

0:25:52 MK: All of the people they interviewed said they won’t give their kids access to social media and phones, which I would…

0:25:56 MH: Yes, at the end of the movie.

0:26:00 TW: Well, one, they have an enormous amount of… It’s somewhat easier said than done. And having had children with mental health issues, there are… It is putting kind of an impossible burden on the parents when you say, “Oh, you’re gonna be the one who is gonna… ” In absolute terms, it sounds great on the whiteboard: we’re not gonna let them have… Well, that does actually mean… And I’m not saying… You wanna hold off as long as possible, but recognize that that is putting a degree of… You’re fighting the most challenging time, socially, in the kids’ experience, and you’re kind of hampering them on that. My oldest went through… One, he’s a guy. Two, he could give two shits about anybody’s opinion. Never got sucked in. I’m like, “Yeah, piece of cake.” Well, guess what? That set unrealistic expectations for the others. But I would say, at the same time, we do have… As they talk about, the increase in mental health…

0:27:12 TW: Causality is a tough thing. We don’t really know what is causing the increases in all of these, but it’s a similar thing: if you are starting to have some depression issues, social media is not gonna help. If you’re on the autism spectrum, there’s a good chance you’re gonna look like you’re addicted to electronics very quickly. And the question is where… Again, you’re fighting chemicals in the brain when you say, “Oh, we’re just gonna not do this,” which I think the movie… To bring it back to that, the older sibling is like, “Aren’t you gonna go to soccer practice? Aren’t you gonna do X? Aren’t you gonna do Y?” She tries. That’s kind of the sense with parenting: you can have that battle and you can rip that cell phone away, but it’s not gonna end… Oh, the mom doing the right thing, “We’re just gonna lock this up and try to have dinner.” You just took somebody off of their drug cold turkey with no warning, with no… And how’d that work out for you?

0:28:15 MH: No. And trust me, in our house, there are many conversations about screens and screen time and usage and those kinds of things that don’t go well because, frankly, we’re not great at it. And then I spend all my time working on a computer and on a phone, and those kinds of things. Maria is like, “Well, they mimic you.” And then I play video games on top of that, and it’s like, “Well, then they’re gonna wanna play video games any chance they get.” And I’m like, “Well, there’s nothing wrong with video games.”

[laughter]

0:28:51 TW: And the video games… I’m not a gamer, but having the kid… That’s how my oldest, for years, as he and his friends scattered to college, they’re playing a game and they’re hanging out; it’s an activity while they’re hanging out. So, where do you draw the line between that and the other kid who’s playing a video game and talking to people, but he doesn’t know them, he doesn’t have a real connection? And he’ll point to that and say… And I think, Moe, you were saying earlier, that’s what’s bothering you more than the advertising, but the movie is making the point that they’re completely interlinked. When you have an algorithm trained to maximize one, it’s going to maximize the other.

0:29:32 MH: And I think that’s the thing in the peer group setting, where the peer group is now not defined by community constraints. It used to be, if Sally says something mean about me, I can at least go to her mom or I could go talk to the school, or something like that; those were the bounds of those things. And again, there are lots of different pictures of that around the world, so I’m approaching that from a prude-y western white guy point of view of living in the burbs and whatnot. There are lots of different ways to look at that, so I wanna be sensitive to it. But at the same time, the internet opens the door to a bunch of other people who are coming in sideways, who may or may not have any relationship or ability for you to interact and be like, “That was not kind, what you just said. You need to apologize.” ‘Cause that’s not gonna happen on… All you get is the negativity. And so if you’re out there… The number one profession, apparently, for kids now, or as of a couple of years ago, was to be a YouTuber. And yet talk to any YouTuber and they’ll be like, “The hate in the comments just wears you down, it just wears you down. I had to step away from YouTube, I was getting so depressed by all the angry comments and the hate.”

0:30:42 MK: Whose responsibility is this? I feel like the companies do have some responsibility. And the truth is, we’re talking about algorithms, but algorithms can also be used for good; they can find hate speech and, I don’t know, a bunch of other awful things that appear online. Oh, my God, I would never wanna be a moderator. That would be the most awful job in the entire world.

0:31:06 TW: Well, those people have had… Facebook’s panel of moderators, I remember. Yeah, that was not good for their mental health.

0:31:13 MK: But algorithms can be used to do that. The thing that I’m finding really frustrating is that the algorithms that are all about how do we get people to engage more, how do we get people to see more ads, seem to be killing it. But when it comes to hate speech or other nefarious activity, the companies are all saying it’s really hard to do, “This is too hard.” To me, those two things just don’t sit together.

0:31:35 MH: Well, and I’m somewhat cynical about it too, because of the companies that are out there doing the most, like Facebook. And Facebook published that response to the movie, Tim, that you mentioned already. They’re like, “Well, we’re doing this, and we’re doing this.” And my reaction to that is, “Why didn’t you design the system for that? You were too focused on how much you could get. And isn’t it great that now all of a sudden you’ve… Now that you’ve scorched the earth, you care about climate change.” It’s that kind of thing, it’s like, “Well, we’ll stop burning the rain forest just as soon as we get all of our economic goals met.” And it’s like, “That’s not gonna cut it.” And I feel like it’s pinpointing for me…

0:32:19 MH: And I don’t know to what extent this is a part of American culture or whatever, but this idea of scaling growth as fast as possible versus sustaining growth in a meaningful way… A sustainable company versus a scaling company is this thing I’ve been thinking about. Because I feel like, as we move forward with technology and where we’re headed with technology, and I always use Neuralink as an example, where they’re trying to create an implant that goes right in your brain, that pumps information in and out: if we don’t think about privacy and neuropsychology and those kinds of things before we do anything with that, what are we gonna do to ourselves? What are we gonna do to humanity that we don’t ever intend to, and then we’re sorry later, and we’re gonna do our best? To your point, Moe, “Oh, it’s super hard for us to mitigate this hate speech and whatnot.” And people are gonna find each other on here and make these negative communities, and we can’t shut them all down, or do these political things and we can’t get to everything. And so disinformation and fake news will proliferate. It’s like, “Well, then maybe you should have thought of that before you started. Think through the problem.”

0:33:32 MK: Yeah. I agree, but I don’t agree. You’re putting the responsibility back on the technologist who’s sitting in the company. And I actually had a conversation with one of my analysts the other day who has been doing this really amazing piece of research on what are the right metrics to measure engagement. That’s a pretty common startup scale-up question. And I said to him, I was like, “The next question, though, that needs to come after that is: is more engagement a better thing for our users? Are we going to have happier users? Are they going to get more value out of our product? Because we think that them coming back more often or them creating more designs or whatever it is, more engagement, is good. But is it actually good for the user?”

0:34:14 MK: And I remember having this conversation at THE ICONIC all the time as well, where particularly male users would come twice a year, they would do a big shop, they’d buy four pairs of pants, three shirts, four pairs of jocks, two pairs of socks, and they’d come back twice a year. And everyone is like, “Great, but so how do we increase their frequency? We need them to come back more often.” And I’m like, “Why? If they come twice a year and buy everything they need and they are perfectly happy, and have had a great experience, and tell everyone how much they love our company, why do we need to get them from coming twice a year to coming four times a year?” But the point is, that conversation is then down to the people in the company. It’s not even at the leadership level. You’re then putting that responsibility on every single person who’s building tech and asking them to lead that conversation for their companies, which, yes, I think there’s a role in that, but I also think that some of this needs to be coming from the leadership of these companies as well.

0:35:10 TW: I think there’s a degree of, “If not us, then who?” But I also think about the slippery slope… I just went and looked: I joined Twitter 13 years ago, a little over 13 years ago, and I basically met Michelle Kiss through Twitter. Not to be the old grump pining for older times, but I still have quick little exchanges with Doug Hall in the UK. Even on Facebook there are friends from high school… There is the positive part. If you ask me to define what do I want versus what do I not want… And it does, it goes into what I personally feel, it goes into my personal mood at the time. Am I looking for a pick-me-up? Am I looking for somebody who agrees with me? Am I looking for somebody I haven’t heard from in a long time? I don’t think it’s impossible, but I think it is easy to see how the snowball happened. And it was literally like, “And that’s the last horse out, we better get those barn doors closed now.”

0:36:21 TW: And it’s like… I do believe that they were starting with a, “We can do this, it’s a good thing, we’ll make money in one of these few ways.” And then it just… They’re very, very complex systems. And even with your example of a retailer, it’s like, “Well, if we don’t get them to come here, what’s gonna keep one of our competitors from getting to them first? And then we’re gonna lose them entirely, and then they don’t come back.” So it is this competition for attention, and everybody just wants more attention, and unfortunately, there is just no good way to say, “That’s terrible.”

0:37:01 MK: But Tristan. Trist… Trist… I can’t say.

0:37:05 TW: Tristan. Tristan.

0:37:07 MH: Tristan.

0:37:08 MK: Tristan. Even the heartbreaking example that he goes through, where he creates this amazing memo internally at Google, and everyone is like, “Oh, wow, this is ground-breaking.” And then nothing happens. I get that the problem has evolved beyond what the initial intentions were, but what’s the circuit breaker now? Because now we’re getting into really dangerous snowball territory. It’s rolling downhill so fast, no one can catch up to it. What is the thing that gets people to care about this now?

0:37:44 MH: Well, I think the thing we’re describing is sort of, “Well, if I don’t do it, somebody else will, and I’ll lose my place in the market.” So then markets tend to, for the good of the consumer, be regulated by some function. I think that’s the place where regulation and legislation come in, right?

0:38:05 TW: Ugh.

0:38:05 MH: I’m just saying that’s kind of the way I look at it. It’s like Facebook didn’t stop at a certain boundary because there was no boundary, and they just kept going and going, and nobody told them, “Yeah, you need to bring that in.” And they go before Congress and they’re like, “We don’t sell anybody’s data.” Well, no, you don’t sell it, you just use it for yourselves. What you sell is advertising off of that data. And so it’s difficult because, in the US anyways, our politicians, well, frankly, are just a huge waste of everyone’s time, most of the time. And so it’s hard not to feel a little defeated. But at the same time, I do believe that’s the proper equation, in my head anyways, philosophically: there are rules, and everybody has to play by those rules. And that then creates the sustainability of the inter-relationship between the consumer, the company, and the state of play. And people who leave that playing field should be tracked down and punished. But to create the playing field… The problem I see right now is that the companies that created the problem would like to be the ones to define the playing field, and that’s not okay to me. That’s all I’m saying.

0:39:22 MK: I don’t think they wanna define the playing field. You look at Zuckerberg. He doesn’t want to have to set rules and regulations for his company, because then he’ll be held responsible for them. He wants someone else to make the rules and regulations, he just wants to be able to influence them.

0:39:36 MH: Well, they wanna put a finger on the scale.

0:39:39 MK: The thing that’s frustrating me is that the companies are now getting so big that if they don’t like the decision the government is making, they’re like, “Okay, we’ll pull out. We’ll pull out of this country, we’ll pull out of this state.” And then all of the users are like, “Actually, we really want Uber to be able to drive cars here, it’s really convenient.” Or like, “We wanna be able to use Facebook because that’s how I talk to my aunt in England.” And so the companies themselves have been able to become these really powerful lobby groups, essentially, which means that, yes, I agree, I do believe that legislation can play a big part in this, but it also becomes impossible to legislate some of these companies that are so powerful.

0:40:18 TW: Well, the thing is the wheel of… Legislation moves slowly anywhere. And obviously I’ve got more experience or visibility in the US, but it was… One, the laughable hearings where you’ve got these ancient white dudes asking about the tubes.

0:40:39 MH: And white ladies, come on. There’s some women in the Senate.

0:40:44 TW: Yeah, there are.

0:40:44 MH: They ask bad questions, too.

[chuckle]

0:40:46 TW: Somewhat, but maybe not…

0:40:48 MH: Anyway, sorry.

0:40:50 TW: But what happens is that… And right now in the US, there’s the Communications Decency Act, which was 1996, and there’s this Section 230, which says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Basically, that got passed and then interpreted as: Facebook and Twitter are not responsible for what gets shared. And all of a sudden now it’s like, wait a minute, we need to revisit and re-interpret that. The kicker is that, again, the horses are out of the barn. We have such polarization that whoever says, “This needs to be done,” there is a default, “Well, that must be for you to get one up on me, and I’m going to fight it.”

0:41:37 TW: So, there are these comical things where you’ve got total adversaries both opposed to Facebook for completely different reasons, which is like, “Well, that’s great, that means they can work together.” No, because their two solutions are gonna be completely opposite, because it’s gamesmanship. I don’t know. The regulatory route, until there is some degree of taking the temperature down, some degree of who is a recognized body… I honestly do think it could still be a private organization, or it could be a non-profit, that says, “We are… ” Just like Mozilla has said, “We’re here not to be driven by the almighty dollar.” Some organization that says, “We aren’t trying to make money, we’re not trying to do this. We’re funded well enough, we can get scale and get critical mass.” The challenging part is they’ve gotta build that trust out, and get to a level of scale with people who fundamentally don’t agree with each other, but who then agree that that third party has the world’s best interest at heart. And I don’t see that happening.

0:42:52 MH: I wonder if that’s why the movie was called The Social Dilemma.

0:43:00 TW: Oh!

[laughter]

0:43:00 MH: ‘Cause you make a really good point, which is one side picks a path and the other side, politically anyways, is like, “Even if that makes sense and aligns with my interests, I’m, on principle, gonna just disagree with it and call you evil, because that’s what we do now.”

0:43:16 TW: I think New Zealand can figure it out, and then everybody else can just slowly wanna be like New Zealand. It’ll be just like the pandemic.

0:43:23 MH: Yeah. I struggle with those comparisons ’cause everybody is like, “Well, Norway does it.” It’s like, “We’re not Norway, just in case you didn’t notice,” whatever. I guess I doubt the efficacy of a private group for the same reasons, like, how will they ever get the credibility? It’s like the horse is so far out of the barn, to reuse somebody else’s expression, I can’t remember who said that already, but maybe the horse will come back on its own someday? We’re so far out. And I tested myself the day after watching that movie. I sat down to work, and my two social media outlets that I’m pretty regular on are Twitter and LinkedIn. I don’t use Facebook anymore. I deleted my account back in the beginning of… Was it 2019 or 2020? I can’t remember. 2020, I think. But I tried to say, “Okay, until noon, I’m not gonna look at any social media.” And literally my brain was just pulling me over there every 10 minutes.

0:44:36 MH: It was incessant. And I was like, “Oh, Michael, you are hooked.” I’m now putting somewhere on my calendar, I’m like, “I’m gonna do a dopamine fast, a withdrawal. I’m gonna take my technology, stash it somewhere, and go out in the wilderness for a week to try to come down off of the behaviors.” That’s the part, Tim, I’d say is our human agency. To the extent possible, we try to do the things that will help our health. I agree with you, not everybody is in a place where, A, they understand the problem, and, B, they have the capacity to do it. Much in the same way, if I’d like to lose weight, I know what I need to do. I need to eat a better diet, I need to exercise more. Those are pretty fundamental things. Not everybody is gonna have the knowledge of that or the capability to do it, so not everybody’s situation is the same.

0:45:27 MK: I do think I’ve been approaching the situation from a personal responsibility perspective, in terms of: I’m gonna do what I can do at my company to shape things to go in the right direction, and I’m gonna do the things that I can do in my private life to shape my own behavior in the right direction. I feel like, yeah, that is, I guess, a big burden or whatever you wanna call it, or a lot of responsibility. But even… We were on holidays the other week, and I always have my phone on silent. And my mom actually… Well, let’s say it’s an amusing but scary story, too: she ended up in hospital. And I didn’t have my phone on me for eight hours, and she was like, “Oh, well, you obviously don’t give a shit about me, because I’ve sent you a photo of me in hospital.” And I’m like, “Oh, jeez, mom, come on.” And I found that people would start calling my husband to get a hold of me, which, to be honest, doesn’t really bother me. But you can put up these boundaries yourself, and then you feel like people come through with a sledgehammer and try and knock them down. And so I also try to have hobbies that see me not using technology… I find I get to the end of the day, and I don’t wanna look at a freaking computer, so all my hobbies are based around not being on technology.

0:46:38 MH: So, let’s just note that there was a time that we had to contact Michelle, who contacted Jamie to get you, because we had a little timing screw-up on the podcast.

0:46:46 TW: That is true.

[chuckle]

0:46:46 MH: That did happen, that’s true.

0:46:49 TW: But, Michael, to your point… My son with ASD, the program that he went through, that was a pretty common… There was a degree of… You mentioned a wilderness program. He wasn’t in one, but there were a lot of people in the program he wound up in, which was residential treatment, pretty intensive. You’re in a living environment that’s residential, and there are no electronics, and you’re surrounded by other people who don’t have electronics. And then one of the criteria to get your electronics back, which is then pretty highly regulated as you progress through it, is that you actually do have to go through education; you have to actually understand what’s going on with your brain and your neurology. You go to him now and he’s got plenty of challenges, but he can say, “Yeah, I should not sit and play a game for more than an hour without getting up and doing some physical activity for at least 15 minutes.”

0:47:41 TW: Some of that information is there, and it can be taught. Unfortunately, the resources to do that at scale… when you have people saying, “That’s also a hell of a good babysitter. Give them the iPad and I can get my work done for four hours.” But there are data and studies and protocols that say, yeah, you can educate, you can help people develop the strategies, even kids, to use electronics in a healthy manner, but it does have to be deliberate. And that, we have not gotten to at all. And you can’t really legislate that; that would set things off for another reason. But that maybe gets us back to the point of The Social Dilemma: get people recognizing this is a real issue. The next part is, “Okay, what do you do about it?” Because, staying with the dilemma… One of Michael’s former co-workers, my current co-worker, Yonatan, has done a lot of educating in his community around: these are the things you can and should do, these are worksheets you could use to come up with this plan. But I think that works for people who have the education, the knowledge, the bandwidth, the capacity to actually go through all of that, which unfortunately is gonna lead to… The generation that’s gonna come out better is gonna be the same generation that was successful for the last 100 years. They were born into privilege of some sort, so we’re back to being first.

0:49:11 MK: Yeah. Particularly with COVID and, God, I would not… I feel for all the parents going through COVID, especially when daycares and the schools are closed, and it’s like you do what you gotta do so you can actually get your job done. And even privileged families were in that situation. I just feel like it’s freaking hard.

0:49:31 MH: Yeah. No, I’m well past any judgment of any parent ever. Everybody is just trying to survive. I remember, this is a stupid story, but before Maria and I had any kids, we would sometimes visit friends, and as we were driving home or whatever, we’d be like, “When we have kids, we’re gonna put them to bed at this time and not let them do this or that.” Every single thing we ever said, we did not stick to, not even close. [laughter] So, I am past… It’s just all… Yeah. You gotta do what you’re gonna do, so no one should feel guilt. But that’s the part about this whole thing, Moe, and you kinda touched on it, that gives me hope: what I can do and what I can help with locally. I can talk to my friends, I can help my kids understand dynamics online, I can do things to try to not make it such a focus in my own life. At the same time, globally, will we just be the same ones who see the world burning around us, and that’s what’s gonna happen? And so this bigger picture is pretty depressing most days: how do we solve this pretty big systemic problem that’s causing some pretty big havoc in, at least, society in the United States, I would say.

0:50:55 MK: And one of the things I was chatting about with my good friend was also the idea that there needs to be a relationship between places like the Center for Humane Technology and technologists who are working within these companies, because ultimately they are going to be the driving force of asking the questions, pushing the product in a better direction. And I think we all, as people that work in this industry, can have a role to play in driving those conversations and being an ally, I guess. ‘Cause when you think about it, the people who are in The Social Dilemma, they can’t influence this stuff within companies anymore. They need partners within companies to work with them and to understand why this stuff matters.

0:51:45 MH: All right. Well, this is a great discussion, but we do have to start to wrap up. I’m glad we were able to solve the problem of The Social Dilemma.

0:51:53 TW: Next episode, Die Hard with a Vengeance.

[chuckle]

0:51:55 MH: That’s right, next movie. Yeah. Hey, if you want us to review a film, please do let us know, we’d love to hear from you. Anyways, one thing we do wanna do is do a last…

0:52:05 MK: Wait, hang on. We haven’t done, would you recommend it to a family member or a friend?

0:52:10 MH: Oh, okay.

0:52:10 MK: This is how I finish book club. You need to give it a score out of 10.

0:52:14 MH: Oh, score out of 10? Moe, you’re so right. That is the missing element. Would you like to lead off for us? Give it your recommendation and your rating.

0:52:23 MK: I’m going to give it a nine out of 10, because I think it does do a good job of explaining some of the issues to the mass population. I took one point off because of their depiction of AI, which I felt could be misleading.

0:52:36 MH: That’s solid.

0:52:37 TW: I feel like when you said that you and I were gonna disagree on something, I had in my mind an eight and a half, and for the exact same reason. I would recommend it and say, “But please talk to me afterwards because I think their depiction of AI is a little… Is begging to be over-interpreted.”

0:52:53 MK: Oh. I can’t believe we agreed.

0:52:56 MH: And I’m gonna give it an 8.76, because I wanted to try to find a different number than either of you. But at the same time, it is a movie I would recommend because of the awareness that it starts to create. And I think that awareness, at least that awareness, is super important. And I think… I think probably…

0:53:19 TW: Oh, I just leveled up again. Oh, sorry.

0:53:21 MK: Oh, Jesus.

[chuckle]

0:53:21 MH: Dopamine for Tim. No, the awareness that it creates is actually pretty important ’cause that’s the first step for a lot of people, and we gotta do a better job. And I think the three of us, when we were talking about even doing this show, the fact we were sort of interested in it bespeaks the fact that this is something we have some awareness of, but how do we go broader with this message? Anyway, I appreciate…

0:53:50 MK: You do realize, by giving it an 8.76 or an 8.5, neither of you would recommend it based on an NPS scale, ’cause only a nine or a 10…

0:54:00 MH: Oh, we’re not using… NPS is bullshit.

0:54:03 MK: Yeah, I am a big NPS girl.

0:54:05 TW: Okay, 9.2, Moe. Are you happy now? I do recommend it. I think people should watch it.
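[Editor’s note: for anyone unfamiliar with the Net Promoter Score convention Moe is invoking: respondents score on a whole-number 0-10 scale, 9s and 10s count as promoters, 7s and 8s as passives, 0-6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch:]

```python
# NPS = %promoters (scores 9-10) - %detractors (scores 0-6); 7-8 are passives.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# The hosts' ratings, rounded to the whole numbers the scale requires:
ratings = [9, 9, 9]  # Moe's 9, Tim's 9.2 -> 9, Michael's 8.76 -> 9
print(f"NPS: {nps(ratings):+.0f}")  # NPS: +100
```

[Which is also why, as comes up next, “there is no 8.5.”]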

0:54:12 MH: Well, I wouldn’t be able to give it an 8.5, ’cause actually technically there is no 8.5. So I would have to give it… It would be a nine.

0:54:19 MK: I was gonna say, technically, you need to give it a whole number, but then you went on a tangent.

0:54:24 TW: Oh, my gosh.

[chuckle]

0:54:25 MH: Then it would be a nine. Fine. The scoring rubric was not explained well ahead of time, and for that, I apologize to our listeners, ’cause as a measurement-focused podcast, that’s a miss on our part. Anyway, we need to do last calls. We love to go around the horn. Share something that you should check out online, because now that that’s… Well, anyways, what’s your last call, Moe?

0:54:50 MK: It’s a really funny one today, just given the topic we’ve been discussing. But I have an amazing analyst internally who’s built basically a causal impact package for us all to use, so we’re using the one package whenever we do this type of analysis. It leverages the Prophet package from Facebook, which is open source and available in both R and Python. Basically, it creates a Prophet class, and then it has methods to fit and predict different models. Yeah, I don’t know. I was reading up on that the last couple of weeks and found it really interesting, and just thought, “If that’s a problem that you’re facing… ” I’m not smart enough to build the actual CausalImpact package like __, but I’m definitely a grateful user of what he’s built. It was just really interesting reading, and I guess it was kind of nice that Facebook have made it open source, given we’ve just spent an entire show slandering them.
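[Editor’s note: a minimal sketch of the causal-impact pattern Moe describes, assuming the open-source Prophet package; this is not her analyst’s actual package. The idea: fit a forecast model on the pre-intervention period, predict the post-period as a counterfactual, and read the gap between actual and forecast as the estimated effect. The file name, column names, and intervention date below are all made up for illustration.]

```python
# pip install prophet pandas
import pandas as pd
from prophet import Prophet

df = pd.read_csv("daily_metric.csv")       # assumed columns: ds (date), y (metric)
df["ds"] = pd.to_datetime(df["ds"])
intervention = pd.Timestamp("2020-10-01")  # made-up launch date

pre = df[df["ds"] < intervention]          # train only on pre-intervention data
post = df[df["ds"] >= intervention]

model = Prophet()                          # Prophet exposes fit/predict, as Moe notes
model.fit(pre)
counterfactual = model.predict(post[["ds"]])  # yhat, yhat_lower, yhat_upper

lift = post["y"].values - counterfactual["yhat"].values
print(f"Estimated average daily lift: {lift.mean():,.1f}")
```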

[chuckle]

0:55:52 MH: They’re not all evil. It’s not a dichotomy.

0:55:55 TW: That’s a good one. Why don’t you go next, Michael? You wind up…

0:56:00 MH: Okay, I will. Thank you. Mine is something I actually got from my good friend, and Tim’s co-worker, Stewart Schilling. He showed me this and I’ve been fascinated with it over the last week or so. It’s a mapping company that takes every 9 square meters on the planet and associates three words with it. It’s called what3words. And basically their idea is that location is this weird construct of streets and city names and numbers, and that’s not the right way for the brain to think about it. And so if you can remember three words, you can plug that into your phone, and it’ll take you right to that spot, within a 3-meter square. And I thought about it, and I was like, “That’s actually brilliant on so many levels.” And then I was looking up, “Where am I sitting right now? What three words am I sitting in?” And I was like, “Can you change them? Could I buy three words that fit my… ” Anyways, it’s a fascinating website, what3words.com, with the “three” being the number 3. But thank you, Stu, for pointing that out. It’s been fun to think about, and just typing in random three-word groupings to see where it throws me in the world was kinda neat. What about you, Tim?
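[Editor’s note: a back-of-the-envelope check on why three words per 3-meter square are enough. Earth’s surface is roughly 510 trillion square meters, so there are about 57 trillion squares to label; the word-list size below is an assumption for illustration, not what3words’ actual list.]

```python
import math

EARTH_SURFACE_M2 = 510e12            # ~510 trillion square meters
squares = EARTH_SURFACE_M2 / 9       # each 3 m x 3 m square covers 9 m^2
print(f"squares to label: {squares:.2e}")        # ~5.67e13

min_words = math.ceil(squares ** (1 / 3))
print(f"minimum word-list size: {min_words:,}")  # about 38,400 words

W = 40_000                           # assumed word-list size
print(f"{W:,} words give {W**3:.2e} triples")    # 6.40e13, comfortably enough
```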

0:57:15 TW: Well, I was gonna have two podcast episode recommendations, but now it’s gonna be three, because there was an episode of 99% Invisible called “The Address Book,” where they talk about addresses and the history of addresses. And they actually wind up, towards the back end of that, talking about what3words and where that fits, and what works and what doesn’t work about what3words. I wonder if Stu heard about that through 99 PI.

0:57:39 MH: It’s possible.

0:57:41 TW: So, that was podcast episode recommendation number one. A new podcast, which, as of our recording, only has two episodes, is called Brave New Planet. It’s hosted by a guy named Eric Lander, and they basically take big issues, big topics, and they look at the benefits and the risks of them. The second episode’s on climate change, and it’s pretty good, too, but the first one, the one I was gonna recommend, is all about deep fakes. They talk about GANs, they talk about GPT-3, and they come at it from various different directions and do a really good job of explaining the nuts and bolts of deep fakes. And then my third recommendation, ’cause it also came up, is a past guest, and she keeps cropping up: Emily Oster. She’s…

0:58:30 MK: Oh, my God! She does! I was gonna talk about her today.

0:58:34 MH: Yeah, she’s good.

0:58:35 TW: And she had her… She was on a recent episode of Planet Money called “Opening Schools and Other Hard Decisions,” and she outlines her five things for making a decision, which she’d written about in The Atlantic. But then she talked through the new project where she just started with Google Forms, trying to collect data on schools. I was just struck by… She, once again, reminded me how thinking like an economist is just so amazing, and she was once again delightful and brilliant. So there is an episode of Planet Money called “Opening Schools and Other Hard Decisions”; it has kind of broader applicability, and she is fantastic on that.

0:59:18 MK: She also has lots of links to articles about that in her newsletter, which I highly recommend subscribing to.

0:59:24 TW: Yeah. And she tells the story of her newsletter, saying, “Hey, we didn’t have any data on this, so I’ll just send out a Google Form,” and with that, she quickly got stuff. And if you watch her on Twitter now, it’s just mind-boggling. It maybe gets back to The Social Dilemma. There are people criticizing her for basically collecting data, and she’s like, “Fine. You got a better solution?” In a direct and somewhat humorous way. But it is one where you’re like, “Why? Where are people getting this? That’s where they need to spend their time? Let’s criticize without any effort to come up with an alternative.”

1:00:00 MH: Yeah, data used badly by somebody must mean data collection of any kind is the problem, right? Anyway, nice last calls, everybody. All right. You have probably been listening and thought, “Wow, this episode makes me wanna like and subscribe… ” No, I’m just kidding. [chuckle] But we would love to hear from you, and there are ways in which we do that, until we figure out a better path. We do have a Twitter account, we are on the Measure Slack, and we have our LinkedIn group. And if you have thoughts about the movie you’d like to share, or something resonated with you, or you’ve got other ideas, we’re always open to that. If you think there are other topics like this that we should go deeper into, we’d like that comment as well. We actually get, from time to time, some really great show topics from listeners, and so that’s something we’re really open to.

1:00:50 MH: I think that this episode is really interesting, but you know what else is interesting? Our producer, Josh Crowhurst, who we always wanna remember to thank because he makes this show come out on a regular, timely basis. The fact that you’re listening right now means Josh did something nice for you, and really nice for us. Okay. So, if you haven’t watched The Social Dilemma, it is a nine out of 10, unanimous recommend by the Digital Analytics Power Hour, and we’re kind of a big deal in data and analytics ’cause we have Moe Kiss and Tim Wilson, the quintessential analysts. And so, go watch the movie, you’ll like it. If you don’t have Netflix in your neck of the woods, then go download it illegally off the Internet… I didn’t say that. Just go use somebody else’s login, but you should watch the movie if you get a chance. All right. I know I speak for my two co-hosts, Moe and Tim, when I say no matter how much dopamine you’re using to get Twitter likes and YouTube followers, just remember, keep analysing.

[music]

1:02:00 Announcer: Thanks for listening. And don’t forget to join the conversation on Twitter or in the Measure Slack. We welcome your comments and questions. Visit us on the web at analyticshour.io, or on Twitter at Analytics Hour.

1:02:14 Charles Barkley: Smart guys wanted to fit in, so they made up a term called “analytic.” Analytics don’t work.

1:02:21 Thom Hammerschmidt: Analytics. Oh, my God, what the fuck does that even mean?

1:02:29 MH: The initial introduction to what we are going to be discussing today, Tim, you have two minutes, your time starts now. Your mission: We’re gonna listen to Tim and not interrupt, and then we’re gonna listen to Moe…

1:02:46 TW: And then I’ll go on for three minutes and you can keep interrupting me, and I’ll keep ignoring you.

1:02:51 MH: Tim? Tim?

1:02:51 TW: And then I think what you should do…

1:02:53 MH: Tim? Are you listening?

[chuckle]

1:02:56 TW: Oh, shut up.

1:02:57 MH: Is that a fly? Oh, come on, man.

1:03:02 TW: Oh, come on, man. Okay.

1:03:07 MH: The election is over as of this episode. It’s not over as of the day we’re recording. All right, are we ready?

1:03:16 TW: Did you see the TikTok of the guy doing the KFC? He was like making gravy, and it’s like… It was written, it was like, “Oh, this is… People like disgusting and horrifying….”

1:03:30 MH: Here’s words I thought I would never hear from Tim Wilson. “Did you see the TikTok?”

[chuckle]

1:03:36 MH: Rock flag and forests.pink.banks. That’s what3words, somewhere in Facebook’s headquarters.
