#196: Offline Customer Data in a Connected World with Angela Bassa

Every consumer is now aware, at some level, that they are constantly generating data simply by moving through the world. And, every organization that puts physical devices or digital experiences into the paths of consumers has to make decisions about what data they collect, how they manage it, and what they do with it (both the immediate plans and what unknown plans may emerge in the distant future). The questions, decisions, and mindsets that this reality brings into play are just one big gray area after another. Angela Bassa grapples with these issues on a daily basis both professionally and personally, so we sat down with her for a lively and thought-provoking discussion on the subject.


Episode Transcript

0:00:05.7 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered, conversationally and sometimes with explicit language. Here are your hosts, Moe, Michael and Tim.

0:00:22.2 Michael Helbling: Well, it’s another day and it’s another wonderful one, but it’s time for Episode 196 of the Analytics Power Hour. In a lot of visions of the future, our refrigerator automatically orders us more eggs and milk because it knows we’re out, our lawn mower schedules its own maintenance when it needs it, and our vacuum cleaner sort of just pops around the house whenever necessary, sweeping things up. Some of those things already happen, but as data people, we know all of that orchestration can only come on the back of a lot of data collection and automation work, and that’s not really all that glamorous once you start digging into it. Sort of like doing a podcast. Hey, Tim?

0:01:08.6 Tim Wilson: Exactly.

0:01:10.1 MH: And Moe, how are you doing?

0:01:12.6 Moe Kiss: I’m good, thanks.

0:01:13.9 MH: And I’m Michael, well, that’s what we wanted to talk about, but we needed a guest, someone with some real experience in this subject. Angela Bassa is the senior director of data science and analytics at iRobot and a data advisor at Mira.

0:01:28.7 MH: She’s also worked in leadership roles in the data consulting space and advises companies on data and analytics topics, and today she is our guest, and I’ll say finally, because we’ve been keen to have her on for many years. Welcome to the show, Angela.

0:01:42.4 Angela Bassa: Thank you so much, I’m so glad to finally be here.

0:01:46.7 MH: We’re delighted. I’ve been a long-time follower of yours on Twitter and have enjoyed so much of what you share there, so it’s just great to have this conversation. But I think this is a very interesting topic, because certainly as analytics people, we do a lot of work to try to connect people’s behaviours in the digital realm with their purchase likelihoods or things like that, but there’s also all this other data that exists in the world, and there’s sort of this dream of, we can put this all together and make this… Usually it’s either utopian or dystopian, depending on the way you kind of envision that future. But I’m curious about your take on this topic, your direct experience and maybe even the big-picture perspective, and then we can drill into some of the details and experiences you’ve had with this.

0:02:34.9 AB: Well, first, thanks for having me and thanks for talking about something that I love so much. This is perfect. Also, I didn’t realize that this was a perfect episode: it’s 196, which is 14 squared. I feel so special.

0:02:53.3 MH: Oh wow. That’s right. And we were saving that one, and…

0:02:57.0 AB: I know you were.

0:02:58.1 MH: There is a subset of our listeners for whom you just became, like, the number one best guest we’ve ever had. Awesome job, so.

0:03:07.4 AB: So back to the topic at hand. I think, like you said, sometimes it’s thought of as dystopic or utopic. I think as an analytics person, sometimes you say the glass is half full, sometimes you say the glass is half empty, and I like to say the glass is twice as big as it needs to be. So we need to just be really aware of the potential, and we need to not be negligent in ignoring it, both in ignoring that it exists and ignoring what it is once we’re aware of it. So data… I think I said it in the… There was another Australian event that I participated in right before I had my daughter last year, where I did a digital provocation talking about the fact that data is not reality, data is not ground truth. Data are artefacts. Data are like fossils. They’re like the bones. So if all you have is bones, you go, “You know what, dinosaurs are like lizards, they have reptilian skin. They have scales.” And then you do a little bit more work, and it’s like, “Oh no. Maybe they had feathers. Could they have had cartilage?” What is all of this stuff that isn’t preserved in the fossil record, that isn’t preserved as digital exhaust? So what we see is part of reality; it’s a projection of reality, a two-dimensional projection of a four-dimensional world, and so we need to be aware that we’re looking at part of what’s really going on. And in terms of AI specifically, there’s a lot of really awful things that are happening right now, not in the future, not…

0:04:51.7 AB: Right now you have faulty recidivism models being used by judges in adjudicating sentencing, you have mass labelling efforts being done in the global south for pennies an image, and all of this unseen work is what allows for things like computer vision at sort of planet scale to be attempted. So, long story short, yes, there’s a lot going on that’s wrong, and there’s a lot going on that’s good. I was born and raised in Brazil, and I look back at how it was when I was 10, 11, and how it is now. There are so many more services that are possible because they’re automated. And yet that’s not perfect by any means, and that makes it so that sometimes things go wrong and there’s nobody to complain to, because you’re complaining to an automated system, a noreply@government.gov, right? But that’s the price for tens of thousands of people getting a benefit sometimes. And I always wonder, is the cost worth it? I don’t know, it’s like this massive trolley problem, right?

0:06:11.8 MK: I don’t know.

0:06:13.4 TW: It’s interesting. It feels to me that the population writ large perceives the data that’s being collected offline, or even online as they’re moving around, as perfect and complete, that they’re going to be known too well, and they kind of worry about that. It almost feels like the pendulum ought to swing to: that’s incomplete data, and it’s going to be wrong, and it’s often going to be wrong in damaging ways, and we should worry about that. Right now, the pressure is, I wanna keep my personal… I wanna keep my activity private because I don’t want you to know about me. I don’t know, it feels like the masses kind of have an over-belief in the truth of the data, and that seeps into the marketers and sometimes into the analysts, and that’s how we wind up chasing this myth of, if we just throw enough data at the machine, we’re gonna get to some nirvana. And that just feels like it’s on the wrong path, it’s not heading in the right direction.

0:07:35.2 AB: You know that animated GIF with the little girl with her arms in the air going, “Why not both?”

0:07:41.0 MK: Oh yes, I definitely know that.

[laughter]

0:07:45.6 AB: I think that’s what’s going on here. I think data is both incomplete, definitionally, and also, when used in aggregate, sufficient to make certain inferences up to different thresholds. So, something that’s going on recently in the US: several weeks ago, there was a leak of a draft Supreme Court opinion that cast doubt onto precedented law regarding privacy and all sorts of implicit matters. And so the question becomes, “Should you be using your phone to search for reproductive care? Should you take your phone when you visit places that might become criminalized? Is your phone telling on you?” And the answer is 100%, it is, it is super telling on you. And sometimes the fossil, the bone, is sufficient: if you don’t need to know what the dino looked like, you only need to know that the dino was there, the bones are enough. Sometimes you don’t need to know the content of the message. The fact that your significant other is texting somebody at 2:30 in the morning, you kinda don’t need to know what it’s about.

[chuckle]

0:09:09.1 AB: You know that there’s something there. Like, metadata is data. And so data isn’t ever complete, but it is also sufficient to make inferences as a marketer, and that’s the power of big data. You don’t need causality; enough correlation is sufficient for you to move the revenue needle. And so you don’t need to know why somebody is doing something, you just need to know that they are purchasing. But different players have different needs.

0:09:41.2 TW: That does remind me. I was at Superweek in Hungary a few years ago and got a notification on my phone that there was smoke in the kitchen at my house in Ohio. I was up later than normal ’cause I was at Superweek, and…

0:09:54.5 MK: Drinking?

0:09:55.3 TW: That resulted in a kind of… Drinking, yeah, somewhat. An ensuing text exchange with my… I mean, smoke in the kitchen is… Our Nest’s location is probably such that… If no one was home, that would have been a quick call to the neighbours; the fact that my wife and daughter were at home, it was a text. They sent back a picture of a house, like, engulfed in flames. It was not our house, but… So I’d say the 2 o’clock in the morning can give you a clue, it’s good to have a little bit of a notification. But I’m envisioning the refrigerator: you’re trying to clear out your refrigerator to go on vacation, you’ve got some smart refrigerator that keeps reorder… I just have this vision of you’re trying to get rid of the eggs and you haven’t told the refrigerator, “No… ”

0:10:42.0 AB: And it keeps ordering eggs, and…

0:10:42.2 TW: It keeps ordering eggs, and I’m like, “Oh.”

0:10:44.4 AB: It keeps getting delivered to your doorstep and you come back and there’s three dozen in different stages of rotting. Yeah.

0:10:51.2 TW: Your refrigerator is still empty. Yeah, sorry, Moe.

0:10:51.4 MK: Angela, I just… I’m completely enthralled by everything that you’re saying, because I feel like you’re thinking about the world that we work in on this plane that I’m just… Maybe I’ve just had a rough day, but I’m not on that plane at all.

[laughter]

0:11:13.5 MK: And I suppose I’m trying to understand how… Have you professionally… It’s like you’re holding two really deep thoughts that conflict with each other simultaneously. How do you get up and do your job every day with these deep thoughts? I’m just, I’m wrestling with it right now.

0:11:39.9 AB: So there’s two ways: there’s a personal way that I get up and exist in the modern intertwined IRL-cyber world, and then there’s my professional approach. My personal approach is, there are large technology conglomerates that I do my very best to avoid. There are some that I can’t, but for the rest, I do my very best. I am thoughtful about my browser choices, I am thoughtful about my search engines, I am thoughtful about the providers that service me, and I pay them so that I don’t become the product, and I…

0:12:20.9 AB: And it’s a pain in the neck, and it is a privilege to have the time, resources, and money to be able to spend trying to protect my personal privacy. I know it makes me sound paranoid, but given what I have seen professionally, I think it’s warranted. I don’t share pictures of my children online. They have their usernames purchased already, and I am parked on their URLs until they are old enough to be able to do things with them. So that’s how I exist personally. And I think that gives me a certain illusion of control, which allows me to continue functioning. Professionally, I think it makes it so that I don’t blindly follow the data. When the data points left, I look at it and go, “Are you telling me to go left, or are you telling me to go right?” I’m much more thoughtful about the questions I ask, and I’m much more thoughtful about the questions I know can’t be answered. So my training is as… Is in theoretical math, so I came into analytics from almost deep nerdom. I think the only way to get deeper nerdom would have been if I had been in philosophy, [laughter] which I think is very adjacent. I can’t turn all of that off, right?

0:13:47.1 AB: And so one of the things that I studied was Gödel’s incompleteness theorem, which states that there are certain things that are true and cannot be proven, and there are certain things that are false and cannot be disproven. And this is like… That statement has been proven, which is nuts to me. Like, it is proven that there are things in the world that you will never be able to prove. And so I approach the data that I am going to be sourcing, or that I am going to be analyzing, or that I’m going to be communicating, and I think about it in terms of… You know the little parable of the two brothers who live at a fork, and one only tells the truth and one only tells lies, and what question would you ask to know which direction you have to go? That’s the same principle: you have constraints, and you have to be really thoughtful about how you interact with that constraint so that it gives you more information than you had to begin with, and you use that information to bet better, to beat the house odds. It doesn’t give you certainty, but in a Bayesian world, it improves your priors. And you’ll always have better and better priors. And blindly trusting data will give you bad decisions. There’s this episode of The West Wing, which will show you just how big of a dinosaur I am.

0:15:25.9 MK: No, I love The West Wing. If anyone doesn’t like The West Wing, they can stop listening now. [chuckle]

0:15:31.0 AB: So you know Marlee Matlin, the pollster? And there’s this one episode where they get these results and the results are really terrible and everybody’s freaking out, and it’s like, “Okay, we need to stop talking about this topic because the public hates it. We need to stop talking.” And the pollster, the analytics genius of the show, comes back and says, “No, no, that means we need to go harder. That means the opposite of what you’re… ” And the numbers were the same, it’s just the interpretation that’s different. And so to answer your question, that’s how I use the knowledge of both of those things: to make sure that I’m asking the question at the crossroads of “What would your brother tell me?” Rather than… Go ahead.
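To make Angela’s “it improves your priors” point from a moment ago concrete, here is a minimal sketch of a single Bayesian update in Python. The numbers are purely hypothetical, invented for illustration; they don’t come from the episode or from any real model.

```python
# Hypothetical example: update a prior belief that a customer will
# purchase, given one observed behavioral signal, via Bayes' rule.
prior = 0.10            # P(purchase), before seeing any evidence
p_signal_if_yes = 0.60  # P(signal | purchase) -- assumed for illustration
p_signal_if_no = 0.20   # P(signal | no purchase) -- assumed for illustration

# Total probability of seeing the signal at all.
p_signal = p_signal_if_yes * prior + p_signal_if_no * (1 - prior)

# Bayes' rule: P(purchase | signal).
posterior = p_signal_if_yes * prior / p_signal
print(round(posterior, 3))  # 0.25 -- better odds than the 0.10 prior
```

The evidence doesn’t give certainty, exactly as Angela says; it just moves a 10% prior to a 25% posterior, which is enough to “beat the house odds.”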

0:16:16.7 TW: So some of what you’re saying… And this may harken almost more back to your Harte Hanks days… is you’re talking about the idea of being intentional about what we’re doing with the information we have. Let me know if I’m putting words in your mouth. What are we doing to both gather more useful information as well as act on the information that we have?

0:16:38.1 TW: I do feel like in the marketing analytics world, and certainly in the direct marketing, direct mail, email marketing world, there can often be a lot of: we’ve just done a bunch of stuff, we’ve sent millions of emails, so just mine the data and execute in smarter ways; or, we’re gonna make assumptions, we’re gonna just design little email triggers for nurturing programs. It feels rare to have a: we’re trying to actually act on the information that we have, knowing that it’s incomplete and imperfect, but also add a layer on that of acting in a way that we do get more information to guide us in the right direction. Is that… And this, I’m just… I think when we were first talking to you, you were, and maybe still are, at Harte Hanks, or maybe you’d recently left Harte Hanks, but that’s a world I’ve spent a lot of time in, not with them specifically, where it does feel like there’s just an assumption that we have millions and millions of rows of data, so we must have all that we need. And it feels like the same thing can happen with offline data: oh, we’re just gonna make these devices smart, that’s gonna generate a bunch of data, so surely we will… We would get to the truth. Which doesn’t feel right.

0:18:00.5 AB: Yeah, I think so. I do have the great fortune at iRobot of having the largest globally deployed consumer robotic fleet in the world. So that doesn’t suck.

0:18:18.5 MK: That’s pretty cool. Yeah.

0:18:22.5 AB: I feel like the character in I, Robot, the Asimov novel, the one that can… The robot philosopher, the android… Not philosopher, psychologist, the android psychologist.

0:18:36.0 AB: Anyway, the thing is, at iRobot we gather data sort of the way that we were talking about at the top of the show: like a fridge gathering data about your home, like a smart assistant gathering data about your home. These vacuum cleaners, these mops, these robotic mops, they are in essence autonomous mobile sensor platforms that are gathering data about your most private space, sort of similar to your iPhone or your Android that’s gathering your most private thoughts through searches and things of that nature. So the reason that at iRobot we’re really careful, really careful, is because that data is so precious, we want to be very, very thoughtful about the relationship. We don’t want customers to feel creeped out. We don’t wanna be creepy.

[chuckle]

0:19:31.7 AB: If you wanna be creepy, there are plenty of companies that want you. There are seven billion of us, it takes all kinds. But for iRobot, it’s strategically aligned for us to be protective of that. It is our differentiator, and for us to use that judiciously to deliver value, because you have expectations. You expect that this appliance, this tool that you brought into your home, is going to help and not cause more issues than it was brought in to solve. And so that’s sort of where that fine line comes in for me specifically, and I think for folks who work at the intersection of digital marketing and product analytics. I think if you’re selling shampoo or if you’re selling furniture… No offense, I’ve done that, and there are lots of different problems to solve there, but a table is not telling you anything about how it is being used after it is sold. The shampoo bottle isn’t telling you how often somebody is showering. Whereas I know how often debris is being picked up in which room. I can infer traffic patterns within the house, so I can make sure that the places that are walked on get cleaned more often. There are ways to use that.

0:20:51.2 MK: I’ve got a scenario which, let’s take our mind back to the eggs-in-the-fridge example, right? Because in my head, the line between creepy and non-creepy, it’s… Well, there isn’t a line. It’s gray, right? There’s a gray…

0:21:06.8 AB: And it’s different for everyone.

0:21:08.1 MK: Yeah. But so I feel like some people would sit there in an office and be like, “Well, knowing that you’re about to go out of town, because we’ve connected to your email or your diary, so we know you have a flight booked, we’re not gonna order you more eggs; we’re doing something good for you.” So that’s not creepy. Whereas I feel like some people would look at that situation and be like, “Oh no, that’s creepy that you know we’re going out of town. You need the user to input to say, ‘I don’t want more eggs,’ so that we don’t… ” What are your thoughts on the scenario?

0:21:38.0 AB: That’s why informed consent is huge. I think informed consent is big. I think the little cookie alerts at the bottom of the page that give you no agency are stupid, and they’re non-compliant with the spirit of the legislation that they purportedly are addressing. But the ones that say, “Accept or refuse; are you comfortable with this?” Do you know what they’re doing? They might bring you value. Do you know where your line is? I think as consumers… I think as professionals, we all probably have spent some time thinking about that, or I would have expected a little bit of introspection, but as consumers, we don’t do that enough. And I think it’s important. Like, my line is, like, two zip codes farther back.

[chuckle]

0:22:29.7 AB: But to some people, it might be totally fine, and that’s okay as long as… If there’s a dial where you can go, I want you to know this much, and I know what that means. I know how much I’m giving.

0:22:47.4 TW: Well, but is that… If I buy a Roomba, presumably there’s a little bit of a selection bias, or a selection effect, I guess, in that I’m buying something because it’s got some gee-whiz technology and connectivity in it. I mean, knowing the company has been around for over a quarter of a century, right? Yeah.

0:23:09.6 AB: It does. It’s kinda awesome.

[chuckle]

0:23:12.2 AB: That’s a lot of bells and whistles.

0:23:12.7 TW: But if I buy a table… If I buy a smart table and say, “Oh, this is the first time I’m buying a table that actually has sensors in it,” I feel like we’re probably in a period where, when people are buying a connected physical object, they’re a little more likely to say, “I’m okay with this. I see the benefit outpacing whatever my concerns… ” When you go somewhere online… I think lots of people got on Facebook, or lots of people do Google searches, or got their free Gmail account, and weren’t really thinking about what kind of tracking that was enabling. So I almost wonder if the offline… Maybe a smartphone being the big exception, in that it’s only now reaching the mass consciousness how much that is a physical tracker that you’re carrying around with you. But I wonder, when it comes to hardware, if you’re getting a connected device, maybe it’s a shorter leap for people to say, “I’m getting it, and I recognize that… I’m making a decision that I’m becoming… I’m adding more connectivity to my world,” maybe?

0:24:22.3 AB: That’s part of why thinking about data as imperfect is helpful, and then bringing in the qualitative side to fill in the gaps in knowledge that the quantitative elicits. So we know how people engage with our product to a large extent, and so they will use the product in a certain way. We can then engage with our qualitative in-house team and go, “Can you run a survey, can you run a journaling exercise, can you run a panel or a focus group to understand why is this happening? Why is that happening?” And so I think we can tell: are people aware of what they’re bringing into their home once they buy a connected device? We can ask that question, we have asked that question, and it turns out, in the same way that people don’t think about it when they sign up for Gmail, they do not think about it when they bring a smart device in-house.

0:25:36.6 MK: Really. Really.

0:25:36.7 AB: To a large extent.

0:25:37.9 MK: Shocking.

0:25:38.5 AB: It varies geographically, it varies according to several persona criteria, but by and large, broad strokes, yeah. But I think the flip side to that is one thing that Tim was saying: you can just start collecting more and more data, just enrich it, and soon enough you’ll get a deep enough picture that will allow for causal inferences if at first you don’t have enough data. And that is true. However, it gets cost-prohibitive really fast, especially if you think about it: you’re selling vacuum cleaners, not satellites. These things can’t cost $10,000. They can’t have an Apple M1, a super souped-up GPU chip, running on them to do all sorts of really, really, really fancy and scary things. They’re running a chip that is manufactured and produced at scale for vacuum cleaners deployed globally. And these things are navigating on the floor for years, so they have to be pretty resilient and robust; they can’t be delicate and finicky.

0:26:52.1 AB: So there’s a whole aspect of, “Sure, but how much data can you actually collect, and how much data can you actually process, and how much does it cost to ingest gigabytes of data from every time zone, and how much does it cost to peruse that data and scavenge and Splunk to find insights, and at what point are you driving more value from that activity than you’re spending on your platform bill?” So that also plays in a little bit, and it makes my type of thinking valuable, because we have to be smart about how we ask the question, not just because that’s the right thing to do for all of the reasons I’ve been saying, but because we don’t have all of this data. We have the data we have, and it is restricted by design, because we don’t wanna pay to store data that’s not gonna be used to deliver value.

0:27:54.0 MK: So, when you mention restricted by design, does that mean, I guess you and the team have to kind of… Yeah, you have to be really smart about what you do collect and the reasons, and is that something you review or is it kind of like, “This is what we’ve always collected, so it’s kind of what we always stick with,” or is it something that’s constantly evolving?

0:28:14.0 AB: I think, yes. There are certain parameters that have been there forever that we use for that reason: they’re anchoring, they’re almost a codex type of measurement at this point, so that we can have longevity for our comparisons. It doesn’t mean that we use those for the inferences that they were originally designed for, but we use them because we have a long-term metric to anchor against. But mostly it’s an evolving marker, and we have test units, we have internal test teams that collect lots of data, and then we look at that test data and we’re like, what’s useful, what’s not useful, what… Let’s test it with actual users, let’s run a qualitative group and see: is this valuable? Would you be interested in something that provided this new bell and whistle? No? Then chuck that table. Don’t need it. Annualized, that’s another 23 grand.

[chuckle]

0:29:19.1 MH: Alright, let’s step away from the show for a free word about our sponsor ObservePoint. Tim, you have that look on your face that has me thinking you’re tickled with yourself about some way of describing ObservePoint. I might regret giving you the green light here, but why don’t you go for it.

0:29:36.5 TW: It’s actually more like white, blue, and red lights, but good enough. I was just thinking that ObservePoint helps keep your data collection processes clean, just like a little robot that you set up and configure for your website, and then all on its own it just roams all around your site, unmonitored, and it checks that the tags are firing as expected. And it repeats that process on an ongoing basis, which keeps your data collection clean.

0:30:03.4 MK: Yeah, I see what you did there Tim, but I’m not sure how I feel about it.

0:30:08.9 TW: Well, ObservePoint also produces great content. I was checking out their Five Pillars of Success, tagging edition, over the weekend. That’s a great guide for anyone who is trying to articulate the why and how of website tagging governance: things like the importance of having a digital governance committee, having a well-maintained tagging plan, and using a data layer and a tag management system. It’s good content.

0:30:33.1 MK: So what you’re saying is that you haven’t found any viral videos of a cat riding around on top of ObservePoint as it audits the website?

0:30:42.8 TW: No, but that’s the spirit.

0:30:44.2 MH: Alright. Good job, you two. To learn more about ObservePoint’s capabilities without any oblique references to autonomous robotic vacuums, you can just visit observepoint.com/analyticspowerhour. Alright, let’s get back to the show.

0:31:02.5 MH: It’s interesting how this data… And you think about it from a company perspective, but also from a society perspective, it just persists too. And that’s also kind of challenging.

0:31:14.6 AB: It shouldn’t, but it can, yeah.

0:31:17.1 MH: Well, right. It’s one really interesting aspect of GDPR, which is sort of the ground-level data privacy regulation: this idea of, “I can ask for my data to be deleted.” And that’s actually really important in a certain sense, because as I’m listening to you talk, I’m browsing the iRobot site, looking at it like, “Oh, maybe I should get one of these.” But let’s say five years from now…

[laughter]

0:31:47.7 MH: Five years from now, there’s a change in corporate leadership and they’re starting to do all these other things with data, and there’s no guarantee that a company will continue to work in a productive way with this data, or in an ethical way with this data. And the same with, I guess, governments too. And I don’t know, I would like to hear your thoughts on that as well, because that introduces a sort of, “Wow, where have I exposed my data?” ’Cause I can make that decision and have that thinking process and then forget about it for years and years. I’m just curious what you…

0:32:22.3 MK: Like many MySpace and Bebo accounts.

0:32:25.6 MH: Just… Yeah.

0:32:29.9 AB: Everything is a double-edged sword. So on the one hand, I’m not sure how many digital humanists listen to the show, or librarians or archivists, but I’m sure they’re like, “All of this stuff that was on zip drives, we can’t access anymore. All of this stuff on VHS, and all of those tapes.” And the same thing is happening to our online archive. And I think, unfortunately, as we’re learning the hard way with things like Twitter, yeah, leadership can change. You can have one benevolent dictator, one billionaire who swoops in and says, “No, I’ll do it different, I know better.” And I think you have to think about it the same way that you think about how you engage in the offline world. If you’re driving and you’re going over the speed limit, somebody might catch you. So if you are going online and you’re navigating the cyber square, there are people looking: there are pickpockets, there are cops, there are marketers, there are all sorts in both environments, and we need to be really diligent. There’s no guarantee that data will be well stored, but as a consumer, you can be informed.

0:34:07.3 AB: And so what I can say is how I conduct my organization, my department. I cannot tell you the quality and the hygiene of the data at Harte Hanks anymore. But when I was there, I could speak to it, and I could tell you whether or not I slept soundly at night. And working at iRobot, I sleep soundly at night. I can tell you that we have been successful in aligning strategically with business interests and with technological limitations and constraints, by making those explicit and by educating the leadership in terms of what that means. So, there is certain data that we persist because it has value. If data doesn’t have value, we have internal processes that delete data, for cost reasons. Like I said, these are vacuum cleaners, and we don’t want to spend $30 a year per unit, ’cause that’s gonna eat into all of our margins; that doesn’t make business sense. And so, like, houses change, behaviors change. You wanna talk about, what if we had been running optimization models that were fine-tuned and validated pre-pandemic? All of our customers would be mad, because they would be like, I’m home now, stop hitting me.

[laughter]

0:35:39.9 TW: Yeah.

0:35:40.0 AB: Right? The models need to evolve, they need to be feeding off of recent, frequent, healthy data, and that helps too in terms of the confidence that somebody can have in how we use our data. But to the original question, yeah, you can only go by what is publicized right now. Like go to our website and read our privacy statement. Go to the Consumer Reports website and look at how the different appliances rank and then make smart, informed consumer decisions.

0:36:14.1 TW: Yeah.

0:36:14.1 AB: And the same thing with the social media choices, right?

0:36:19.3 TW: That’s right.

0:36:19.4 AB: Like, I don’t Zuck, that’s my first “no” choice.

0:36:20.3 TW: I don’t think anyone who’s in kind of a similar position as you are at, say, Meta is saying, “We’re locked out.” They’re like the other end of the spectrum from a… As they came out a month or so ago, and they’re like, “Yeah, we don’t really know where the… What happens to… We collect it and we just kind of throw it off into the lake, and then it just kind of works its way all the way through.” Which has been my experience with Facebook for over a decade…

0:36:51.3 AB: That’s part of why I don’t Zuck. I don’t Instagram, I don’t Facebook, I don’t… I don’t Zuck, I don’t trust them.

0:37:00.0 TW: And a lot of… I think there are aspects that are more sloppiness than maliciousness in many… Well, and we won’t go down the what-is-greed…

0:37:07.7 AB: Why not both?

0:37:10.3 TW: Versus malicious versus… Why not both? [laughter] Wow. This is… Yes. Go ahead. My son, who’s gonna go work there in a few months, is not… doesn’t listen to this.

0:37:22.1 AB: Listen, I don’t judge anybody for what they do personally, in order to establish themselves professionally. The door is difficult to burst through for everyone. I think…

0:37:36.3 TW: Well, I dared him.

0:37:38.6 AB: You can…

0:37:41.4 TW: I told him he could intern there and I wouldn’t have to disown him, just he couldn’t take a full-time job and I think he just decided to test me.

0:37:46.9 MK: Oh wow, I never knew that.

0:37:47.7 TW: Oh yeah, I thought I was just making a joke, but.

0:37:49.9 MK: So Angela, one comment that you just made was really interesting; it was basically like, “I know what we do at iRobot, so I can sleep at night.” Have you, in your career, found yourself interviewing and ending up at companies that you’re aligned with, or do you feel… Have you taken jobs at places and, over time, been able to shape, I guess, the perspectives and thinking around privacy and consent?

0:38:18.1 AB: I think I was a dumb 21-year-old. Like many. I think there are some people who get to be 21 and they have a little bit more sense. Sadly, I wasn’t there yet, and so it took me a while to understand a lot of these themes, let alone to care. So I did take jobs that, in hindsight, I probably today would not, given the rubric that I try to go by, but I didn’t go by that rubric then, and those experiences helped me learn what matters to me. So, would I know had I not done those things? I’m not sure. They made me better, but I’m also not sure that they would have been the choices I would make again with the knowledge I have today. But going forward, I think I have reached a point in my career, of professional security, of notoriety, that I can have a little bit of leverage.

0:39:31.8 AB: And that’s what I was saying about your son, you don’t start out with leverage, and so I don’t judge anybody who’s starting out or who’s building. You punch up. You don’t punch down.

0:39:45.6 AB: And so going forward, I can be judicious, and I can use whatever leverage I have to be at tables to help educate, evangelize, and explain why it’s in everybody’s interest to be intentional and judicious. But like I said, I wasn’t at the table at the start. It rolls downhill. You do as you’re told, and then you’re like, wait, that’s how this works? I’m not sure I’m cool with this.

0:40:19.7 MH: Yeah, it’s interesting, ’cause I think in a lot of ways that mirrors how companies interact with this, and they make a choice at some point, as they become more intelligent about the data they’re working with: well, what am I gonna do with it? Am I gonna continue to sort of be like, I don’t know if I’m really that concerned with some of the ethical implications, or, okay, I’m gonna take this much more seriously. Alright, it’s time for the Conductrics Quiz, that quizzical query, the conundrum that tempts our two co-hosts to compete on behalf of our listeners. Here we go. Let’s talk about the Conductrics Quiz. Well, first things first, the Analytics Power Hour and the Conductrics Quiz are sponsored by Conductrics. Conductrics builds industry-leading experimentation software for A/B testing, adaptive optimization, and predictive targeting. Go to conductrics.com to find out more about how they can help your organization. Alright, Moe, would you like to know who you’re competing for?

0:41:25.6 MK: Definitely.

0:41:26.0 MH: Awesome, you are competing for listener, Brendan Lee, and Tim, you are competing on behalf of listener Hector Ferrer. Alright. Are you both ready?

0:41:40.7 TW: Yes.

0:41:41.3 TW: Ready, as I’m ever gonna be.

0:41:43.7 MH: Perfect. And we’ll follow the same rules as last time, so you can bandwagon onto an answer if you desire to. Alright: you are reviewing with a client a report that Michael put together that estimates dollar sales per customer. In that report, I provided the following: $10 for the point estimate of the mean sales per customer, a $2 estimate of the population standard deviation, and a 95.44% confidence interval between $9.80 and $10.20, so the upper and lower bounds are plus or minus $0.20. The client stops midway and asks, what was the sample size? And as you’re looking over the report, you realize that I have forgotten to give it. You mumble under your breath, “the blah, blah, blah, something terrible,” and there’s a long awkward pause. And then you remember that 95.44% under the normal distribution is the area between plus and minus two standard deviations. Recalling the formula for standard error and doing a few impressive bits of mental algebra, you say confidently, with no small amount of pride, the sample size is: A. 10, B. 100, C. 2,000, D. 400, or E. 10,000.

0:43:10.4 MK: I feel like Gershoff might be trying to punish me, because I swear he had a piece of homework where I was doing exactly this once and I never finished it, and this is him.

0:43:20.7 TW: I am gonna say I’m very fortunate in that literally within the last 24 hours, I was looking up standard error, which is… And this, you can check me on, I’m pretty sure it’s the standard deviation divided by the square root of N, where N is the sample size. Which, again, if I was sitting here without having to try to do something on the fly and do this math, this would be way, way less pressure. But I am going to say that if we’re going $0.20, and we had a $2 estimate of the population standard deviation, then that’s two over the square root of N should be 0.2. So 0.2 times the square root of N would be 2. So it’s 0.2 times 10, which is… Okay, I’m gonna go with a sample size of 100, and Moe can bandwagon or say there’s no way I did that mental math as quickly.

0:44:28.4 MK: Look, I should probably bandwagon, but I just feel like today I’ve been talking a lot about being MECE, Mutually Exclusive, Collectively Exhaustive, and I’m like, I don’t feel we would be doing justice to our listeners by me jumping on the bandwagon, so…

0:44:42.9 MH: Interesting.

0:44:43.0 MK: You go with the answer and I will just live with the consequences.

0:44:46.8 TW: Okay.

0:44:46.9 MH: Well, your instincts, Moe, are 100% correct!

0:44:52.2 TW: Correct.

0:44:54.3 MH: B is not the correct answer. So that means Brendan Lee, you’re a winner. Moe, great job. The answer is…

0:45:01.7 TW: Good one, Gershoff.

0:45:02.5 MH: D, 400.

0:45:06.0 TW: Okay.

0:45:06.6 MH: So, well, Moe…

0:45:06.7 MK: That was clearly my on-the-fly math skills that pulled that one off.

0:45:11.4 MH: That’s right. Well, sometimes it takes more than math anyways. So recall that the formula for the upper and lower bounds of a confidence interval is plus and minus Z-alpha times the standard error, where the standard error is calculated as the estimate of the population standard deviation divided by the square root of the sample size. Since we remember that the…

0:45:33.7 TW: I had that part right.

0:45:35.0 MH: That under the standard normal, or Z, distribution, 95.44% represents the area between minus 2 and plus 2 standard deviations from the mean, we use Z equals 2, and we’re given the population standard deviation of 2, and that confidence interval bound is 0.2. So we solve for N: 0.2 equals 2 times, open paren, 2 divided by the square root of N, close paren. So the square root of N equals 20, and N equals 400. So yeah.
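For anyone checking the algebra at home, here is the same calculation written out as a minimal sketch in Python, using only the dollar figures given in the quiz:

```python
# Quiz givens: mean sales of $10 per customer, a $2 estimate of the
# population standard deviation, and a 95.44% confidence interval of
# +/- $0.20 around the mean. Under the normal distribution, 95.44% of
# the area lies within +/- 2 standard deviations, so z = 2.
z = 2.0
sigma = 2.0        # estimated population standard deviation, in dollars
half_width = 0.20  # confidence interval half-width, in dollars

# The CI half-width is z * (sigma / sqrt(n)); solving for n gives
# n = (z * sigma / half_width) ** 2.
n = (z * sigma / half_width) ** 2
print(round(n))  # 400 -- answer D
```

Tim’s mental math went wrong by dropping the factor of z = 2: with it, the square root of N is 20 rather than 10, which is exactly the difference between his answer of 100 and the correct 400.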

0:46:03.6 TW: Yeah, I was on the right.

0:46:04.6 MH: You were very close.

0:46:05.8 TW: I was on the right path.

0:46:05.9 MH: You were super duper close.

0:46:07.0 MK: It’s almost like you knew just enough to be dangerous.

0:46:10.3 TW: Oh, my.

0:46:10.8 MH: Just enough to be wrong.

0:46:12.4 TW: Well, if that’s worth betting on, then come to Vegas.

0:46:18.3 MH: No. It’s…

0:46:20.4 TW: Thanks for coming to Vegas because…

0:46:21.9 MH: Yeah, exactly, ’cause we were just there. Alright, thank you both. Thanks to Conductrics for sponsoring the Conductrics Quiz. What a great little quiz, and congrats to Brendan Lee and all of our listeners. Let’s get back to the show. I know our conversation has sort of skewed to the dark side a little bit so far, but I think there are exciting things that happen with this data as well. Maybe we could pivot a little bit: are there things you could share where you found some really amazing insights that came from this kind of data, from the work you and your teams have done over the years?

0:47:02.4 AB: I mean, yes. That’s why I stay, ’cause I love it, ’cause it’s interesting and intellectually stimulating and all of those things. There is a dark side, and I think we started out by saying let’s all be aware of it. I think the devil’s greatest trick is convincing people that he doesn’t exist, so you gotta know that he exists in order to avoid him. That being said, my house is covered in smart devices. We had the internet service provider folks come in, ’cause our network was weird and wonky and what’s going on, and he comes over, he’s like, “Okay, I’m gonna replace your router.” And he’s like, “What are… How many, the fuck…”

0:47:53.9 AB: So, you know, we have cameras, we have baby cameras, we have a smart crib. We have…

0:48:01.4 MK: Hey, that thing is magic.

0:48:02.5 AB: Vacuum cleaners. That thing is magic.

0:48:04.9 MK: Magic.

0:48:07.0 AB: The SNOO…

0:48:07.9 MK: Yes.

0:48:08.0 AB: Anyone at S-N-O-O: I’m not… I don’t receive commission.

0:48:13.7 MH: Yeah. We’re not sponsored, but we might as well be.

0:48:15.9 MK: We could be.

0:48:17.8 MH: Yeah.

0:48:18.7 AB: It is two kids now.

0:48:21.8 MK: It’s the best invention ever.

0:48:23.7 AB: We have the Nanit camera over the SNOO crib. We have smart lights that, you know, they flash yellow when the Bruins score, or they just turn on and off according to sunrise, sunset.

0:48:39.1 MH: But even, like, as you were setting up the SNOO, were you going through and… I guess, were there choices you could make from a data collection and sharing perspective, or was it kind of like you’re making a… You just kind of read through and say, okay, to use the features with this crib, I acknowledge that risk… not risk, reward, trade-offs or whatever. I mean…

0:49:05.5 AB: Trade-offs, I think, is the…

0:49:06.6 MH: Yeah.

0:49:06.8 MK: Like, they essentially get data on how your baby sleeps. Like, that is the thing that they get, which is actually incredibly valuable for lots of research and, like, good intent. But I’m sure there are also lots of things it can be used for that are not great intent. But hey, my baby slept through the night, so I’m all good.

0:49:25.6 AB: So four years ago or three and a half years ago when my oldest was done with the SNOO, I actually wrote the company and I said, “Hey, I like data. Can you send me the data? ‘Cause it’s not the whole six months.”

0:49:40.4 MK: You can’t export it. Yeah. Yeah. You can’t export it.

0:49:42.3 AB: And you can’t export it, and they’re like, “We don’t keep it.”

0:49:44.5 MH: Oh.

0:49:44.9 AB: So I don’t know if that’s still the case, I don’t speak for them, I don’t work there, but that’s what they told me. That they, it’s not in, it’s not in their interest, so they just don’t keep it. They can’t give it.

0:49:55.8 MK: Wow.

0:49:56.8 AB: But, right. So, to answer the question: I read through it. I didn’t know that; like I said, I asked later to collect it, and I had assumed they kept it. I think in the language they give themselves wiggle room to keep it if they want to, and they just had chosen not to, probably for cost reasons. I’m speculating purely. But I read through it, and I’m like, “Yeah, but this thing is magic.”

0:50:21.6 AB: Like, if this is what it takes, that…

0:50:24.7 TW: I could see their legal department saying, if we keep this, then there will be tragic cases where we wind up in a legal fight, where we have court orders to turn it over… You know, it’s like when smart speakers… did they witness a crime? What’s the data? I mean, complete speculation.

0:50:43.4 AB: That’s a completely plausible scenario. Again. I’ve not been in any rooms, I haven’t heard anything, but that’s plausible. Yeah.

0:50:51.3 TW: We could wind up in the middle of… And it would be in our best interest if we just said, “We do not keep it. It is purged. It is not available. We’re sorry. That is not gonna be evidence in a custody or any sort of legal… ”

0:51:05.7 MK: Anything. Interesting.

0:51:08.5 TW: I mean, ’cause it… When we go to, lot of times we will…

0:51:11.5 AB: You’re bringing it back to the dark side.

0:51:13.1 TW: I know.

0:51:13.7 MH: Yeah. We’re supposed to be… [chuckle] We’re talking about magic and SNOOs, and I was like, “Should we have another baby?” ’Cause I didn’t get to try this out.

0:51:25.7 AB: I can send you a video. I feel…

0:51:25.9 MH: Yeah, Maria, guess what?

0:51:27.6 AB: I can send you a video. I feel like that’s plenty sufficient. Although I do encourage you to Google SNOO stretches, ’cause when babies come out of the SNOO, they all do this exact same stretch and it’s kind of fucking adorable. But anyway…

0:51:40.1 TW: It is.

0:51:40.6 AB: I digress.

0:51:40.8 TW: It is so…

0:51:42.2 MK: And I can send you videos as well. I have videos of Jack wailing and 10 seconds later, he’s like, “Ohhh, this is so comfy.”

0:51:57.1 AB: Yeah. We have a ton of these smart devices where we made the decision of: I am going to be an exhausted parent, this is a valuable tool, and it is a worthwhile exchange. Same thing with the nanny camera. They are collecting data, and then once a week they send me an email with all sorts of information that helps Milly, my five-month-old, sleep better, learn to sleep through the night, self-soothe, and it gives me tips about what happened when I wasn’t paying attention. So there is a ton of value, and I see value, and I’ve bet my career on it. I just think that, as a rule of thumb, we have all abdicated our own responsibility for paying attention, and I think to a very large extent policymakers, politicians, have abdicated their role in putting up guardrails. We don’t need to quash things, but it would be nice if companies were bowling with the bumper guards on, so the ball didn’t go into the gutter every third pitch. Anyway, enough sports ball. It’s a different sport.

0:53:21.3 TW: Well, there’s a challenge with kind of the understanding. There is complexity in it. Defining what those bumpers look like requires kind of understanding not only where things are now but, with things moving so quickly, where they’re likely to be, and then getting that legislated. And certainly in the US, we’re so broken, like there’s nothing…

0:53:43.9 AB: I don’t even know that we need legislation yet; we need literacy. We need regulators not to legislate yet, but to have the curiosity to go figure out a third of the world that they’re not paying attention to.

0:54:01.9 MH: Yeah. And that goes back to the individual as well today, because there’s not a structure around it yet, of having sort of this higher cost, if you will, of digital citizenship, I don’t know the right way to say it, but… And it’s hard because, you brought this up earlier, a lot of people are suddenly calling into question: is this app on my phone exposing me in a way that I just never even thought about before, given what’s going on in the world? And again…

0:54:32.9 AB: I mean, somebody quantified how long it would take to read all of the terms and conditions and privacy policies for different services.

0:54:39.2 MH: Oh, completely ridiculous.

0:54:43.2 MK: Oh, I don’t think I wanna know the answer.

0:54:44.6 AB: I remember I saw it come across on Twitter, and there was… This is before the Meta name change, so it’s an older analysis. But the Facebook one was, it would take you 22 hours…

0:54:57.7 MH: Yeah.

0:55:00.6 AB: To read the whole thing. And the Google one was…

0:55:00.6 MH: And if you ever wanna play a video game, there’s like… It’s too much all the time. And you have no optionality, like, “Oh, I like this app, but I don’t wanna agree to this term.” Well, then you’re done. There’s no agency there. Anyway, okay, we were supposed to be doing positive. Easy, too easy.

0:55:27.1 MH: But I think it’s very interesting. And you mentioned at the top, Angela, you studied theoretical mathematics, which is sort of like philosophy. Is that what we need to become? Do we need to sort of brush up on our philosophical capabilities a little bit to sort of manage this?

0:55:47.2 AB: Like I said, there are seven billion of us. So if I were to ask that, somebody would ask me to become more mechanically inclined, and that just also seems really difficult.

0:55:58.9 MH: Yeah.

[laughter]

0:56:00.3 AB: I don’t think so. I think just… I say this, and I’m exhausted half the time right now, I’m not sleeping ’cause I have a baby, so for me to say folks should just… is high and mighty. And who am I? I think everybody would like to, but it would take a lot of effort to do everything by the book as defined today. And so people just click “I agree,” ’cause you gotta go on with your day. If you sign up for your 401(k) with your employer, you probably don’t have a choice, and the employer doesn’t have a choice, because the financial service provider sets the terms themselves. And so are you gonna abdicate whatever match you might receive because you’re gonna be forced to share your financial information with a third party? Where is the agency there? The world is what it is, and I think there are trade-offs that at times make sense. When you purchase a house offline, nothing smart about it, nothing analytics, whatever, the whole transaction is in the public domain. There is no bigger financial transaction that most people go through, and it’s all in the open, and you have no agency there either. And I think there are just certain things that are part of the social contract, and they’re okay.

0:57:30.8 AB: There are certain things that are part of the cyber contract, and they’re okay. But there are certain things that are not, and there are places where we have agency and there are places where we abdicate it. And some people have three jobs and they can’t afford to pay for their email, so they have Gmail. And they derive value from that, and thank goodness that Google provides that, ’cause if they didn’t, that person might not have access to something that they do. So it’s not all bad. I said this in terms of Brazil having all sorts of services available because they are made cheap by automation. But automation also creates automated mistakes, and so I don’t think that anything is all good or all evil. I think even Luke Skywalker had that moment of doubt with Ben Solo, right?

[laughter]

0:58:26.1 AB: Everything is a balance. It’s just about being aware as much as we can and having each other’s backs.

0:58:34.7 MK: God, you’re fucking phenomenal.

[laughter]

0:58:40.6 AB: You also have a baby, you need to sleep.

0:58:43.9 MH: [laughter] I would vote for you for sure. Yes, totally.

0:58:50.0 AB: Once I do start sleeping, I’m gonna go for the school board locally, but not yet.

0:58:55.1 MH: It’s a good idea, and you’re getting close to that. Anyway, we do have to start to wrap up. This has been such a great discussion, and actually so profound on so many levels, so I really appreciate that. One thing we like to do is go around the horn and share our last call: something we find interesting, something that maybe our listeners would find interesting too. Angela, you’re our guest. Do you have a last call you’d like to share?

0:59:20.3 AB: I was thinking about what I would want to share earlier, and the first thing that came to mind was this book called The Charisma Myth. It’s by Olivia Fox Cabane. Pretty much everybody that I mentor, on my team or not, I’ve shared this with them. It has a ton of really awesome tidbits, but the thing that jumps out at me from this book is the concept of cognitive reframing. And so, I am Latin, I was born and raised in Brazil, and I live in Boston, also not known for the most sane traffic in the world. And so I am prone to being a little aggressive behind the wheel. [laughter] I confess. And so this book helped me think through: if somebody cuts you off, you can get really pissed off. You can get annoyed, you can ride their ass, and you can let them know what they did to you. Or you can imagine it’s a parent with a kid in the back seat who punched through a window, and that parent is losing their mind and racing to the hospital. And the key thing here is, it doesn’t matter if it’s true. All that matters is for you to give yourself that beat, to imagine that being true, and suddenly you can find the space to go, you know what? It doesn’t matter. I’m not mad anymore.

1:00:55.9 MH: Not a big deal, nice.

1:00:56.5 AB: Yeah. And so just telling yourself fake stories that help you create the space to stop being driven by the amygdala, and to give your brain time to move to the prefrontal cortex again. So this is super helpful.

1:01:19.9 MK: I’ve already added it to my Audible listening list, so.

1:01:22.9 MH: Yeah.

1:01:23.2 AB: It’s amazing. And then the other one, I know you said one, but I’m sorry.

1:01:26.2 MH: No, that’s totally… It’s totally fine.

1:01:29.9 AB: So this is Gut Feelings, because it was sitting right next door on my shelf, so I just grabbed both. This is Gut Feelings: The Intelligence of the Unconscious, and it’s by Gerd Gigerenzer. He is a researcher in Germany on behavioral economics, and this is all about how to learn to trust your intuition, to recognize that even if you can’t explain where your intuition comes from, it comes from somewhere, and sometimes it is hard-earned and it carries a Bayesian prior with it that it would be negligent not to listen to. And then one thing that was really interesting from this book, too, is the concept of anchoring. And so this is where I learned that in a lot of the Nordic countries, organ donation is opt out instead of opt in, whereas in the United States and in Brazil, it’s opt in rather than opt out. And because of that anchoring, the proportion of the population that actively opts in or opts out is mirrored, is about balanced, which tells you that it’s not about the number of people who want one outcome or the other; there’s probably some proportion of people who want to have agency, and it’s that proportion of people who want to make a choice, an affirmative choice, one way or the other. So anyway, those are my last calls.

1:03:03.5 MH: No, outstanding. That was like… Man, that book is good. Took me years of therapy to get there. I should have just… [laughter] Okay. Moe, what about you? What’s your last call?

1:03:17.8 MK: I had two, but they’re half-baked, in that I’m still processing. So I had this moment tonight where I was doing some stuff for work, where I was like, yeah, shit, I am still thinking about measurement, like marketing measurement. And there must be people who’ve been in the industry way longer than me that have that, like, oh, we’re still trying to do this. But yeah, so anyway, I’ve been reading a bunch of papers on measurement, and I’m bringing it up because I’m really happy for anyone else to read them and weigh in and give me thoughts. One is called The Effectiveness Code, by James Hurman with Peter Field. And it’s a really interesting article, because it’s a lot more about, like, creative measurement. And I still… Like, my friend who sent it to me, I said to her, I was like, I don’t entirely agree with it, but I can’t yet articulate why I don’t agree with it.

1:04:19.7 MK: Like, they basically have come up with this roadmap for how you measure, like, creative, to have, like, the best creative ever, which has been on my mind. And then the other one is actually by Google, called Measuring Effectiveness: Three Grand Challenges, the state of the art and opportunities for innovation, which is by Matthew Taylor, their effectiveness specialist, and a bunch of other effectiveness specialists. It’s talking about, like, the typical three measurement things, like incrementality, cause and effect, how you measure, like, long-term brand, and then kind of how you pull them all together. And that one I’m definitely still reading and processing. The summary is, we’ve been doing this shit a long time, and still no one’s figured it out. [laughter]

1:05:05.9 MH: I think, arguably, there are a lot of financial and economic incentives from the ad tech industrial complex to not figure it out.

1:05:16.7 AB: That’s why some of us do robots ’cause humans are hard.

[laughter]

1:05:20.7 MH: That’s right. Alright, Tim, what about you?

1:05:26.9 TW: I am gonna go with one kind of simple one. It’s a relatively new podcast from Jacob Goldstein and Pushkin Industries. I can make the link: like what you were talking about, Angela, that’s reminiscent of Blink. I think Gladwell might have quoted some of, or used, his research in Blink, so I’ll draw a connection to Pushkin Industries, which is Gladwell’s kind of podcast thing, but they’ve got a…

1:05:53.4 AB: Production.

1:05:53.8 TW: Production company, and it is called What’s Your Problem? It’s Jacob Goldstein, who used to be at NPR doing Planet Money. Each episode is kind of like the How I Built This format, but he sits down and talks with one founder, and he tries to hone in on one specific problem they’re trying to solve. So he has a really fast grocery delivery person, he’s got self-driving cars, he had one that was Planet, the satellite company, a pretty fascinating story of how they got into the… How do you take a picture of the earth once a day? And they started by throwing, I think, a Samsung Galaxy into a box and basically launching it into space and seeing what happened. So it’s interesting.

1:06:42.9 TW: I think they are like 20 or 30 minute episodes, so they’re reasonably short one-on-one conversations, pretty well produced and kinda interesting. And I’m gonna stop at one.

1:06:54.5 MK: I love that that was one like little thing.

1:06:57.3 MH: I know. I admire your restraint.

[laughter]

1:07:03.0 TW: Michael, what’s your last call?

1:07:03.9 MH: Alright, well, this topic brought up someone in my mind whose reporting I really enjoy, and I follow her on Twitter: a reporter named Shoshana Wodinsky. She writes a lot about, especially, the ad tech space, but since we’re talking about data and those kinds of things, I think if you’re gonna start to build up your knowledge of what’s going on in the world, that’s someone you should definitely be reading. She works for Gizmodo, and you can follow her on Twitter.

1:07:34.2 AB: She was in ad tech for years and years, so she knows the industry from the inside as well as externally, reporting on it.

1:07:44.4 MH: Yes. Perfect. And so, yeah, I really enjoy reading her writing. She does a great job as a reporter, and I think there’s not enough reporting, or investigative reporting, going on in our world that really digs into some of these topics. I’ve just found her work to be always very informative and very well done. So big shout-out to her, but also, she’s well worth a follow; keep track of what she’s working on and writing, ’cause I think it speaks right to this topic of creating a better-informed self, to then go out into the world and try to make a better data world out there.

1:08:23.9 MH: Okay, so you’ve probably been listening, and you’re probably like, “Hey, you didn’t tell enough positive stories,” or, “You’re absolutely right, the world’s ending.” But we’d love to hear from you. This is such an expansive topic, and I’m sure there are things you’d like to say, and the best way to reach us is through the Measure Slack group or our Twitter, or you can also reach out to us via email, contact at analyticshour.io. We wouldn’t mind it at all. And if you are smart, you will also follow Angela on Twitter. Angela, would you mind sharing your Twitter handle with people, if that’s okay?

1:08:54.9 AB: Yeah, of course. It’s at A-N-G-E-B-A-S-S-A, so Angebassa.

1:09:00.5 MH: Yes, an excellent follow and a leader in the analytics community, someone I’ve followed for many years. Actually… I think we sort of touched on this at the top, but we’ve sort of been trying to bring you on the show for quite some time, so this is also like a big moment for us. We’re like, yeah! So thank you so much.

1:09:19.5 AB: I am so glad, it finally worked out.

1:09:22.1 MH: It’s been a pleasure.

1:09:23.7 TW: We tried for Episode 121. We tried for episode 144, we tried for episode 169…

1:09:29.7 MH: Oh, yeah. That’s real. [laughter]

1:09:31.0 TW: We’re like, Don’t make us wait till 225.

1:09:34.0 AB: Oh, wow, Tim.

[laughter]

1:09:36.3 MH: Way to harken it back, Tim. Nicely done. But no, thank you so much, we really appreciate it, and the insights have been amazing as we long suspected. So that’s really good. Alright.

1:09:49.5 AB: It was a really wonderful conversation. Thank you for having me.

1:09:52.4 MH: Yeah, absolutely. And obviously, no show would be complete without checking in and thanking our producer, Josh Crowhurst. Thank you for all you do to make the show possible. And I know that, no matter where you’re sending your data, and no matter what a company or your government is doing with it, there’s one thing I know both of my co-hosts, Moe and Tim, would agree with me on: you gotta keep analyzing.

1:10:20.5 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @analyticshour, on the web at analyticshour.io, our LinkedIn group and the Measure chat Slack group. Music for the podcast by Josh Crowhurst.

1:10:38.2 Charles Barkley: So smart guys want to fit in, so they made up a term called analytics, analytics don’t work.

1:10:43.6 Tom Hammerschmidt: Analytics! Oh, my God. What the fuck does that even mean?

1:10:52.6 TW: Rock frog and lizard skeletons.
