Subscribe: Google Podcasts | RSS
“The people” are often the most valuable asset for a company, so getting the ones who are a good fit, supporting them in their work and their careers, and figuring out what motivates (and demotivates) them is critical. And data—both quantitative and qualitative—can help with that. It’s a topic we’ve wanted to tackle for a long time (well, Moe and Michael have; Tim was confused, as he thought it couldn’t be that hard to analyze a data set consisting of a single “Do they do their f***ing job?” boolean flag), and we finally got to it, with Andrew Marritt from OrganizationView!
0:00:05.7 Announcer: Welcome to the Analytics Power Hour. Analytics topics covered conversationally and sometimes with explicit language. Here are your hosts, Moe, Michael and Tim.
0:00:22.2 Michael Helbling: Hi everybody. Welcome to the Analytics Power Hour. This is episode 191. There are probably very few things more disappointing as a people leader than when you sit down with someone who’s leaving your team or your company and find out they had some dissatisfaction that could have been addressed if only you had known what their problem was. One very clear learning over the years for me is that people don’t always feel comfortable telling you exactly what’s bugging them, and even in this area, data can be your friend. Using surveys and other tools to gather feedback is just one tool in the people analytics arsenal, and human resources departments and managers everywhere are putting data to work in more and more creative ways to solve employee concerns before they reach a breaking point. Hey, Moe. Welcome.
0:01:15.4 Moe Kiss: Hi.
0:01:16.0 MH: You’re the Marketing Data Lead at Canva, and I bet some of that resonates with you too as a people leader.
0:01:23.2 MK: It really, really does.
0:01:25.4 MH: And Tim, you’re the Senior Director of Analytics at Search Discovery and I bet some of that resonates with you but in maybe a different way.
0:01:32.2 Tim Wilson: What? I didn’t realize that… Was I supposed to hold back and be uncomfortable telling people exactly what’s bugging me?
0:01:36.9 MH: No, I am very glad when people tell me. It’s the people who don’t feel comfortable, for whatever reason, that…
0:01:42.8 TW: You just say that ’cause you’re not managing me now.
0:01:45.3 MH: Well, I didn’t know how to manage you then either. Alright. And I’m Michael Helbling, I’m the Managing Partner at Stacked Analytics, but obviously to progress this conversation, we needed a guest. We wanna give a quick shout out and say thank you to listener Matthew Brant, who was the inspiration behind this episode. Thanks for reaching out and recommending our guest, someone who could fill us in on what’s going on in the people analytics space, and that is Andrew Marritt. He is the founder and CEO of OrganizationView, a pioneering people analytics consulting practice that helps HR become more data-driven. Andrew has held numerous people and human resources leadership roles at companies like UBS, Reuters and JPMorgan, and today he is our guest. Welcome to the show, Andrew.
0:02:31.6 Andrew Marritt: Thank you very much, glad to be here.
0:02:33.7 MH: Awesome. Well, I think to kick us off, just to lay some of the groundwork, I’d love to hear a little bit about the kind of work you do and a little more about some of the baseline stuff. How did you get into this space and what’s your passion behind it?
0:02:49.9 AM: So I’ve been doing people analytics with OrganizationView since about 2009. These days we specialize really in projects which involve text or mixed types of data, quite a lot of unstructured data, quite a lot of graph or network-based data. Partly that’s because the sophistication of most of our clients has gone up tremendously over the last 10 years, so we always get called in to help them with the stuff that they can’t do. To give you a very quick summary of how I got to probably being one of the first people doing data within HR, especially here in Europe: I came from a sort of technical background, I had a father in technology. We built a home computer in the ’70s from a kit, and I learned to program when I was like seven or eight, in BASIC, on something with like two and a half K. Went off to do Maths at university, gave up very quickly and moved to economics because I wanted something more applied.
0:03:51.1 AM: I spent quite a lot of the ’90s in management consultancy before coming via that route into HR. And I never really was a proper HR person; I was often sitting between marketing and HR. And I saw how marketing developed, how it had gone from the early ’90s, where I always used to joke that marketing seemed to be about working in the morning, going out to lunch with the advertising agency, an alcoholic lunch, and you’d come in late and that was how it was, to the noughties, where it was very, very data-centric. And I’d looked at HR and it was rapidly becoming the only part of the business which wasn’t bringing numbers to the table in discussions, and I thought it’s inevitable, it will happen at some stage, so let me use my numbers background and my knowledge of marketing analytics techniques and apply them to the workforce and to employees. So that’s how I got here. And as I said, we moved towards text, which is what we’re known for, but we’ll often be doing analysis which includes text as just one of the sources within the data. So we tend to have quite a broad approach.
0:05:02.7 MK: So I do have to say a big thank you to Matthew firstly, because, for those who don’t know, I am obsessed with people analytics, it is my favourite topic. And I swear our HR team keep asking me, and one of these days I’m going to say yes to moving to that space. But the thing, Andrew, that we really came to get your perspective on is just as an industry it does still kind of… Well, the outside perception anyway, and I’ve been trying to hire a couple of people for some time in this space, it does seem to be a bit younger and less mature maybe than some other more traditional areas of analytics. Definitely recruitment of people in people analytics seems quite tough.
0:05:45.4 AM: Yeah, I think that’s true. The best people I know didn’t come from people analytics. They came from elsewhere. And I remember when I was learning 15, 18 years ago how to do this sort of stuff, I had a very close mentor who was the head of brand research at UBS, a statistician by background, and I would spend a lot of time working with him, and I found that that was much more valuable from an HR perspective. When I teach people analytics, I use an old job advert, which I think came from Amazon from way back when, seven, eight years ago, when they were starting, and it was basically saying, you may not have considered a job in people analytics, however, this is a way of applying your analytic skills elsewhere. I remember, let’s say five, six years ago, I had some good acquaintances at Goldman Sachs in the people analytics team.
0:06:43.3 AM: At the time, they had their people analytics in the general analytics community within Goldman’s. They had the same pay structure, they were hiring the same sort of people, and the career path was not to be an HR person; it was, you’re gonna be a good analyst in this firm and you’ll spend some time doing people analytics, and I think that’s the way of getting, certainly on the more technical side, some good people. When I was at UBS and we had what was then called an HR analytics team, I was the only non-psychologist there, and a lot of the good people analytics folks have got a sort of IO psychology background. There are some really good analytics folks in that area. I think the danger potentially is that some of those folks will think that the techniques they learned in IO psychology are the only analytics techniques that we use, and I think we all know that some of the time our best inspirations come from looking at a completely unrelated field and going, “Oh, that would work really well in my situation.” But IO psychology is a big influence in the field, I think.
0:07:55.0 MH: You said IO psychology? What is I?
0:07:58.5 AM: Industrial-organisational psychology.
0:08:01.1 MH: Industry… Okay.
0:08:01.2 MK: I think that’s what Adam Grant studied.
0:08:03.6 AM: Yeah, I think it is.
0:08:06.1 MH: Okay. I’m gonna go a little even farther back, as kind of the total new… Or higher level. So where do organisations… Moe, you said it’s kind of newer space on the…
0:08:16.3 MK: It feels new to me, that doesn’t necessarily mean it’s new.
0:08:20.3 MH: Right, yeah, yeah. So perception, and this definitely feels new to me. I’ve worked in organisations that have various little tools that do little surveys or little feedback things, and most organisations have some sort of review system, and I assume those are getting logged into some system, but where does… Are there organisations that do no people analytics beyond what their retention is, versus ones that are super deep, versus in between? Where is even the awareness of the field, and does it vary? I assume large enterprises have long since figured out that if they’re not doing some people analytics, they’re just burning money, but as it moves downmarket to medium-sized, small businesses, where does people analytics happen, and happen robustly?
0:09:13.2 AM: I think that’s a great question, and even if we take it a bit further back, the history of the field is longer than just psychology. Arguably, a lot of what we would have called, and what I used to do, is operations research; Taylorism, about measuring work and seeing how it happens, which is what, 100 years old, has at least as much basis in people analytics today, in terms of at least the problems we are trying to solve. In terms of the firms, what we saw was an evolution from technology and banking, and pharma is another big area for us, the sort of numerical, science-led, or at least data-led industries, started applying people analytics ’cause they applied data to pretty much every one of their problems, and that’s where we see it.
0:10:04.0 AM: It is much harder to deal with when you’ve got a small population size. Most firms aren’t recording multiple data points; your HR system has one or two changes a year, so if you’ve only got 100 people or 150 people, there’s not much data you’ve got, and if you only lose five people a year, how are you gonna build an attrition model from that? We’ve always said, as you get to smaller or rarer instances, you probably need to go more towards qualitative data and bring that in, and we’re really passionate about a sort of mixed-methods approach of using data in that way. In the end, we’re trying to solve problems, not doing statistical tests or running machine learning models, so you have to be careful: can you apply these types of things there? Or you look for instances where there are higher volumes.
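Andrew’s small-numbers point can be made concrete: with five leavers out of 100 staff, the uncertainty around the annual attrition rate swamps the estimate itself. A minimal sketch, using only the standard library and the Wilson score interval (the headcount and leaver figures are hypothetical, taken from his example):

```python
import math

def wilson_ci(k, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion of k successes in n trials."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Five leavers out of 100 staff: the true attrition rate could plausibly
# be anywhere from roughly 2% to 11% -- far too wide to model on.
lo, hi = wilson_ci(5, 100)
```

With an interval that wide, any attrition model fit to one year of data from a 100-person firm is mostly fitting noise, which is why the qualitative, mixed-methods route he describes makes sense at that scale.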
0:11:00.7 AM: So maybe if you’re a smaller firm, you don’t look at analyzing the very end of the funnel in terms of who you hire. You might go, “Okay, well, what could we do to analyze who is applying in the first place?” And therefore we have much bigger data sets and we can solve that problem easier than we can solve the harder problems. When somebody says, “How do I identify what makes a great person for the executive team?” How many people have you hired for the executive team in the last 10 years? How many time instances have you got? There are better ways of trying to solve that problem than trying to look in historical data.
0:11:38.2 MK: It’s interesting you mention that because… We went down this path of trying to hire people in people analytics, and I think there was this really strong preference that they had to have HR experience, and I got into this conversation where I was kind of like, I mean, do they really? Because when I think of people analytics, I do really think about the combination of lots of qual and quant data, and definitely very strong communication skills, because typically… Like, yes, I do think that all data people need strong communication skills, but particularly because this seems to be such an action-orientated, business-facing, leadership-facing area of the business. Do you feel like you can find, or even teach, talent that quant and qual combination? I feel like with people analytics, it’s such an art form to get it right.
0:12:30.7 AM: Yeah, it is, and if we look at some of the… One of the pioneers in our area, certainly publicly, was Google, who wrote a lot a long time ago. I remember chairing a conference where one of their people analytics teams was presenting, 2014, 2015. My understanding is that they have a bunch of ex-strategy consultants working at the interface between the client and the data scientists, and that sort of communication is really challenging because, unlike a very data-rich area, you’re not working with data that has got a huge amount of accuracy in terms of what it is. You’re always gonna have to be blending different sources in to be able to create a story, especially when… If we look at the…
0:13:25.8 AM: As an economist, I’ve got a loss function. What’s the value if we get this right and what’s the cost if we get this wrong? In those types of situations, it’s super asymmetric. It’s probably heavily regulated, so managers need to have really strong confidence, and that’s not just model confidence, that’s “I’m not gonna lose my job if I get this decision wrong” sort of confidence. So the communication skills are hard. In terms of the technical skills, I think it’s rare you’re gonna find somebody with super deep knowledge of HR and super deep technical skills; maybe one of the psychology folks who’ve come into HR and still have got that background could fit in there. But if I think about my team here, I have a guy who did his PhD in Computational Physics, I have another one who did a PhD in econometrics, which obviously appealed to my biases of what makes a good analyst. I came into HR relatively late and didn’t have a sort of formal HR training, but I came through organisation design and working on consultancy projects around there. So yes, you need to have mental models of what you’re analyzing, you need to understand how these things work, but I don’t think that means you have to be an HR person to do that. I suspect a lot of managers have good mental models about how an organisation works and how people interact in that business.
0:15:00.3 TW: On the flip side, the econometrics kind of perspective, which I think is like an amazing way to approach everything in life, but when it comes to bringing an analytical approach to the HR team, is there a resistance for… Are there people in HR who are resistant to the analytics side of things because they have a perception that, “Oh, you’re trying to boil things down to hard numbers and this is a softer skill.” Is their resistance in HR teams sometimes to the analytic side of things, or are they usually saying, “I’ll take whatever I can get”?
0:15:46.1 AM: No, I think that goes two ways. I saw more resistance five years ago than I see now. Nowadays in organisations, it’s almost the opposite. They want to use AI to solve every single problem, and we’re going, “Don’t use it for this situation, get somebody to just do this humanly, ’cause it’ll take you so much more time to build a model than to pay some junior person to just do the job manually.” We used to see quite a lot of resistance and people not doing it. I think it comes back to those communication skills: you need to know what the data can do and you need to know what it can’t do, and it’s that balance between wanting to use it in a smart way and over-using the data, and I think that’s the challenge that we have. And if I think about the data that was coming off my colleague’s experiments in computational physics, his level of accuracy in measurement is at completely the opposite end from HR data. So you need to be able to think about the data and how it’s being captured, what it actually does say and, importantly, what it doesn’t say. Yeah, it’s a blending sort of thing, so I don’t think you can necessarily run it there.
0:17:05.9 AM: In terms of the econometrics part, there are two parts to it. There is an obscure little part of economics called personnel economics, and there’s some really good academic work in there, especially around things like game theory and how promotions work in terms of games and stuff like that. I look at that as somebody who’s interested in the field; the other thing about econometrics is you…
0:17:29.8 TW: I feel like managing through the prisoner’s dilemma seems like the perfect way in this.
0:17:33.2 AM: Yeah, you should get everyone using, you know, game theory as a way of managing. The other thing that I found interesting about econometrics is that economists often deal with found data, right? So psychologists will often be creating experiments and doing things in a sort of controlled manner, whereas econometricians will often get given a data set and be told, “Analyze this,” and they’re looking for different things, such as natural experiments and stuff like that. So yeah, it’s useful to have it, but as I said, you wanna get as many different types of people into your analytics team, because they all come with a different set of lenses, and therefore when you then combine them together, you get really, really interesting results.
0:18:19.4 TW: So here’s then my fundamental question. If we look at just text analytics, NLP, there’s found data and there’s kinda gathered data, and we all get told, “Hey, you work at the company, your emails, your Slack messages, all those are technically fair game.” When we’re thinking people analytics and analysis, does it ever fall into a, we’re gonna comb through the stuff that we legally have a right to go through, the found data? Or is it typically really limited to, this is survey data, this is data that we collected either through an experiment or through a survey where we were informing the employees? Or is it not black and white? Do you not go there? Is that a no for people analytics in general, unless there’s a…
0:19:15.1 MK: Judging by people’s faces right now, Tim, this feels like a weird area.
0:19:20.0 TW: I think it is but you just kind of, when you’re like, “Oh, found data,” I’m like, “Well, okay, there’s a big data set. I feel like I wanna ask it.”
0:19:27.0 AM: Yeah. I think when I was thinking found data, I would be thinking about events in an HR system, about somebody changing jobs, where that system was never designed to capture data for analysis. In terms of the sort of data that you mentioned, yes, there are a lot of conversations in the sphere about that, and I think there are two issues. One is we’re in a space where we have to be really ethically strong, and just because you’re legally permitted to do it doesn’t mean you should do it. I’ve always said to clients and to my team, unless you can stare the employees in the eye and say, “This is why you’ll benefit from us doing this analysis,” we shouldn’t be using that type of data, right? And some of that might be hard. If you’re doing an analysis to help somebody support a redundancy program, arguably some people are gonna lose their jobs, right? But if the alternative is that you do a random sample or, as is usually the case, whoever shouts loudest keeps the job, arguably you’re serving people better by taking a more objective approach. I think there’s another aspect there in terms of the quality of data.
0:20:33.8 AM: If I ask a text question such as, “What could we do to improve customer experience,” we’ll get really good quality data about customer experience and ideas for improving customer experience. If I go and scrape the internal Slack, I’ll get such noisy data. And yeah, I might find some interesting stuff, but our job is not about… I don’t have the time or the money, and clients don’t pay me to just go and find interesting stuff, they want a business problem solved. If we can go and ask somebody specifically, how would you fix this, and ask 30,000 people that and get perspectives from around the world, you get much richer data, ’cause you can analyze the text in the context of what’s being asked, and especially on the ambiguous statements, that context is really, really important.
0:21:28.0 TW: So I just wanna… One, I’m not saying I think this would be a good idea, I wanna make sure I’m closing the loop a little bit. ’Cause there’s precedent for anonymizing data. I mean, that was super helpful, that’s kind of what I suspected the answer was. I could see an organization saying, yes, we’re taking your messages, we’re separating them from you, we’re taking steps to make sure that names are scrubbed, and we’ve rolled out a new process and we just wanted to comb through it. I do feel like what you hit on at the end was that it’s so noisy that the signal versus noise in that probably just isn’t worth it. That feels like the sort of thing where somebody in a management position would think that with the magic of AI you’d surely find some golden nugget, and really that’s chasing after something you’re not gonna find. So okay.
0:22:22.2 MH: It’s like when they wanna digest your Twitter feeds and your Facebook posts and stuff like that and it’s like, what does that have to do with anything?
0:22:30.2 MK: I think the thing is though, Tim, that when I think of what our people team have in terms of data requests, the truth is they’re actually swimming in data. Of any area of the business, they are the ones that have so much data and don’t necessarily have the resources to do anything with it. Even if you could do something like that, and even if you could find value in it… Think of the applications that you get. Your whole hiring pipeline: what applications you get, through to interviews, through to actually getting a job, through to who’s successful in the job. Who actually becomes high performing? What does that mean about your referrals? That’s just one tip of the iceberg of the hiring process. And then you talk about engagement and you talk about retention. What’s gonna keep staff? There are just so many ways you can go that I’m like, “Why would I wanna analyze people’s Slack messages?”
0:23:25.6 TW: Yeah. And again, I was asking for… I wasn’t…
0:23:28.7 MK: I know. I know.
0:23:28.9 TW: Promoting it or thinking it was happening. But when you say swimming in data, so, coming from the hiring point forward, outside of data that you… They’re swimming in data?
0:23:38.5 MK: Yes.
0:23:38.9 TW: I get that, over time, there’s promotions. And there’s reviews. So what’s the data they’re swimming in that happens to them, versus the data they collect, I guess… What is the data you’re working with?
0:23:52.3 AM: I think, in most people analytics applications, you can start with the core, which is usually the Human Resources system, which records who’s in each role, what their career histories have been, some demographic information about each individual; that’s usually the starting place for people when they’re really, really starting. I think they’ll do stuff as early pilots such as attrition analysis, seeing if they can spot any patterns for why people are leaving the organization just from that. Then you start looking at other Human Resources systems, such as talent management systems for the reviews and the whole performance management process; that’s usually in a different system. There is a recruitment system; that’s usually a different system again. The recruitment system is a really interesting one, actually, because there’s a lot of data for a lot of firms. You’re getting a large number of applications there.
0:24:48.4 AM: When I used to look after recruitment strategy at UBS, I think we got somewhere between half a million and a million applications a year, so there is a vast quantity of data. What’s interesting, if you think about the data, though, is that you only record people who are successful at each stage. So you don’t know that the person you never bothered to take to interview wasn’t the perfect person who would’ve been your high flyer. You can’t possibly know that; you haven’t got any data. So people get a bit blinded in thinking, “Oh, I’ll just analyze this thing,” but your data is a creation of the processes you’re usually trying to analyze. And by the fact that you’re managing them and making decisions along the way, you’re also biasing the data as part of that. So you often have to think about those aspects as well. I don’t think enough analysts do think about that, though.
0:25:42.3 TW: Of all of the white dudes we hired, which was mostly what we hired, the white dudes were most successful, therefore the white dudes must be the… Yeah. Okay.
0:25:50.3 MK: Can I ask about that? Speaking of demographics, one thing that has sort of surprised me, and I’ve kind of reflected on: we do annual culture surveys, which is employee satisfaction, very typical, lots of different quant questions with opportunities for comments. And the bit that always surprises me is how much the business wants to understand the results from a demographic perspective. And the reason I find that really surprising is because in marketing analytics, yes, we can cut data by demographics, sometimes yes, sometimes no, that’s a long-winded rabbit hole I won’t go down. But I feel like generally we have shifted our thinking that behavioral data is actually more important.
0:26:38.7 MK: But when it comes to people analytics, what I can’t wrap my head around is, is it because we’re trying to figure out if there’s a specific group of people that we’re either discriminating against, that we’re not treating fairly, that we’re not giving enough opportunity? Is that what’s driving people to wanna see that demographic breakdown? Or is it that maybe their understanding is more limited and therefore we don’t have those behavioral cues? It’s a question that keeps coming back in my mind that I haven’t found the answer to.
0:27:11.0 AM: I think it’s predominantly the latter. The way we’d usually be analyzing data from a sort of reporting perspective, which companies have done for many, many years, is by splitting on this sort of demographic data, because it’s easy to get access to. It can also be misinterpreted. So for example, we did a piece of work with a financial services firm in Asia late last year, and in a country where they were legally allowed to, the recruitment team decided they were gonna hire people over the age of 35, because people over the age of 35 left less than people under that age. But that was also significantly because people over the age of 35 had 10 years’ tenure, and we know that the probability of leaving drops every single year.
0:28:01.1 AM: So if we then cut the data set differently, and instead of analyzing how old they were when they left, we analyzed how old they were when they joined, which is something you can control and they were trying to control, the difference disappeared overnight. So you suddenly find that with a lot of these demographics, you’re looking at something completely different, and it’s a sort of confounder that you haven’t been thinking about that’s really, really the challenge there. I prefer having a look at behaviors. There are some demographics that do make a difference; gender has more of an effect than I’d like it to have, in so many different ways. But often it’s not just gender, it’s other things linked to it.
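The re-cut Andrew describes amounts to controlling for tenure as a confounder: the apparent age effect on attrition vanishes once you hold tenure fixed. A minimal sketch with entirely fabricated numbers (pandas assumed available):

```python
import pandas as pd

# Fabricated toy data: the over-35s with low attrition are exactly the long-tenured staff.
df = pd.DataFrame({
    "age":    [28, 29, 30, 27, 45, 46, 44, 47, 45, 46, 44, 47],
    "tenure": [ 1,  1,  1,  1,  1,  1,  1,  1, 10, 11, 10, 12],
    "left":   [ 1,  1,  1,  0,  1,  1,  1,  0,  0,  0,  0,  0],
})
df["over_35"] = df["age"] > 35

# Naive cut by current age: under-35s leave at 75%, over-35s at 37.5%,
# which looks like "hire over-35s".
naive = df.groupby("over_35")["left"].mean()

# Hold tenure fixed (short-tenured staff only): the age difference disappears,
# because tenure, not age, was driving attrition.
controlled = df[df["tenure"] <= 2].groupby("over_35")["left"].mean()
```

The same logic underlies analyzing age at joining rather than age at leaving: age at joining is not entangled with tenure, so the confounding drops out.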
0:28:48.3 AM: I think it’s right to understand things by gender and the other protected classes, just to make sure, to see, are our people getting less promotion, but then to ask the question, why is that happening, and to really study that. Tenure, as I said, is always an important one to look at. Management grade is also important: the analysts tend to sit in headquarters, and they tend to be medium to senior level. If you’re the person working in the warehouse, moving boxes around, how the organization looks to you is very different from how it looks to somebody in headquarters, and it’s worth addressing that; an executive will never hear those voices unless we work out a way of doing that.
0:29:33.2 TW: So you’ve sort of said that you’re collecting the text or the qualitative data around… And I think you said it’s around kind of employee experience or employee satisfaction. What is the mechanism… I mean, to combine that with where we were with demographics, what is the mental model or the framing of… You’re trying to ask the right questions and get sort of open-ended feedback. I think I saw in one of your talks somewhere, you’re like, “Look, asking one open-ended question is gonna give you richer information than asking a bunch of drop-down selects.” So is that kind of the nature of it: get them talking in their words, the data is messier but more valuable, and then combine with that? Are you slicing it basically by whether someone is a warehouse worker versus an executive, but are you also slicing by race and gender and other factors? How do you actually analyze that?
0:30:42.7 AM: Yeah, let me start from the beginning. I think it’s worth having a think about the survey and what it does. So a traditional survey, you’ve got, let’s say, 50 employee questions and maybe a couple of open text questions, if you’re lucky, that they’ve tacked on at the end, probably without much thought about how they’ll use them. Usually, some of the big providers I know of…
0:31:08.1 TW: Anything else you’d like to say?
0:31:10.2 AM: Yeah, precisely. So the scale questions are based on some form of model of what they want to understand about it. It’s often from an engagement perspective, it’s some hopefully carefully validated studies that a bunch of psychologists have done.
0:31:27.2 TW: Are those the ones like the, do you have a best friend at work that… Was it Gallup or somebody came…
0:31:31.5 MK: What? I thought it was like, I’m proud to work at this company, was like…
0:31:35.4 AM: Yeah, both of them have gone through the same sort of process, right? So the way I look at the open and the scale questions is that the scale questions come from a model first. You have a model of what you want to understand, you ask about those things and you get ratings. With the open text, you develop your model when you see your data, right? So certainly we have taken a very much inductive approach of, what’s in here? And then you often find that the employees talk about something that you would have never asked about in an employee survey, right? And I think that’s a really interesting…
0:32:15.5 TW: What’s an example of a prompt that would drive something, like how do you… What kind of prompt drives those sorts of responses?
0:32:23.4 AM: We always recommend people ask open text questions in pairs; we say, “What’s great about working for this organization, and what would you do to improve working at this organization?” The reason we ask in pairs is that sentiment analysis is really inaccurate, especially on survey data. It’s quite hard to work out whether a person is saying something in a really positive way or a really negative way, especially because a lot of those answers are super ambiguous, right? If I give a bunch of people the same responses and get them to rate them, they’ll rate them differently. So we ask a pair, whether it’s about working at the firm or, over the last two years we’ve done a lot on working from home: what’s good about working from home? What could we do to make it easier for you or make it more valuable?
0:33:14.2 AM: You need to keep it open enough that people feel they can write stuff in without you leading them, but not so open, like “Is there anything else you want to tell us?”, that it prompts people to be sarcastic and jokey and not give any serious responses. We’ve seen…
0:33:35.4 TW: We’ve seen them all.
0:33:39.2 AM: So we then get the comments, and let’s say it’s 30,000, 50,000 or 100,000 responses, because our clients are on the larger side. We pre-process this and use automated translation; we find that’s good enough for understanding the responses, but it’s not good enough for translating the questions, because you need a human’s granularity for that. We then go through a text analysis process. And as I said, for us it used to be 100% inductive; these days it’s a hybrid.
0:34:08.7 AM: So we’ve got a standard model and training data of somewhere in the region of 230 themes that people talk about in open questions, and then we’ll look at the remaining data and ask, “Are there other themes that we haven’t got in this dataset?” Because it’s client-specific or industry-specific. And then we analyze every single one of those themes by those categories. We like using binary classification models, so we’ll build 230 binary classification models (is this about training, or isn’t it about training?), and we find that a lot more accurate than doing a multi-label model. We then take each category and we look at how people are describing it. So how are they describing leadership? What are the different ways… I was just explaining, right before this call, some results we’d had with a client…
0:35:05.5 AM: The clusters had identified that a bunch of people were talking about too many leadership changes and then another bunch of people were talking about how leadership needed to change. And the clustering would split those two things out and they’re completely different, but the words are pretty much identical just in different orders.
0:35:24.2 MK: Totally.
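The one-binary-model-per-theme idea Andrew describes (a separate “is this about training, or isn’t it?” classifier for each of roughly 230 themes) can be sketched with a toy Naive Bayes classifier. The comments, labels, and function names below are all made up for illustration; this is not OrganizationView’s actual pipeline:

```python
import math
from collections import Counter

def train_binary(comments, labels):
    """Train a tiny Naive Bayes model for ONE theme: 'about it' (1) vs 'not' (0)."""
    counts = {0: Counter(), 1: Counter()}
    totals = {0: 0, 1: 0}
    for text, y in zip(comments, labels):
        for word in text.lower().split():
            counts[y][word] += 1
            totals[y] += 1
    vocab = set(counts[0]) | set(counts[1])
    priors = {y: labels.count(y) / len(labels) for y in (0, 1)}
    return counts, totals, vocab, priors

def predict_binary(model, text):
    counts, totals, vocab, priors = model
    scores = {}
    for y in (0, 1):
        score = math.log(priors[y])
        for word in text.lower().split():
            # Laplace smoothing so unseen words don't zero out the score.
            score += math.log((counts[y][word] + 1) / (totals[y] + len(vocab)))
        scores[y] = score
    return max(scores, key=scores.get)

# Hypothetical labelled comments for one theme: is this about training?
comments = [
    "we need more training on the new tools",
    "training for new starters is hard to find",
    "the canteen closes too early",
    "leadership changes far too often",
]
labels = [1, 1, 0, 0]

model = train_binary(comments, labels)
print(predict_binary(model, "please offer more training"))  # prints 1
```

In a real system you would train one such model per theme on properly labelled data; the point of the sketch is the shape of the approach, a bank of independent yes/no classifiers rather than one multi-label model.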
0:35:24.7 AM: So we then get a much bigger list. Now, I think you can get from an employee survey to there without what we would call an exam question, but to go further you need some specific idea of what you’re trying to get out of it. If we had the head of IT asking what they need to know about this survey, it’s not just the times people mention technology; he or she probably also needs to know about the times people don’t mention technology but you’d have thought they would. Say we’re doing a lot of manual paperwork: the head of IT wants to know about areas where people are doing lots of manual work. So we get it to that level, and then we do an exploratory pass and ask, are any of these topics or sub-clusters more likely to be mentioned by a particular group than others?
0:36:11.2 AM: And we have a sort of probabilistic model that goes through all of the demographics and says, people are talking about… Well, I’ll give you a real example from last week. If you’re a junior person, you talk about career development in terms of training and learning new skills. As you get more senior in the organization, it starts to become about navigating the organization, about the politics, about open opportunities, about: how can we have gone to fully working from home over the last two years, and yet you still need to relocate to Cambridge to do a particular type of job, right? So you start to see the nuance in this sort of data. But to get good communications, to get good reports, and therefore to take action, you need a really strong exam question: what am I trying to answer? And you could have the same dataset answering a bunch of different questions.
0:37:01.4 AM: And you have to be happy that you only use a subset of that data to answer a particular type of question, because a lot of it just won’t be relevant to the question you’re trying to answer. And I think that’s the sort of challenge that… We’ve been working really closely with somebody who teaches academics how to do qualitative research, and I’ve been saying that we can get the technology to understand and split up this stuff, but everybody’s just going, “So what?” So we’ve been asking: if somebody gave you this with an unlimited budget, how would you do it manually? Now let’s see how we can automate the parts that would be deadly boring for you to do, because you can’t do everything completely automated.
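The “more likely to be mentioned by a particular group” check can be approximated with a simple two-proportion z-test. The counts below are invented, and this is only a sketch of the kind of probabilistic comparison Andrew describes, not his actual model:

```python
import math

def two_proportion_z(mentions_a, total_a, mentions_b, total_b):
    """Z-statistic for 'group A mentions this theme at a different rate than group B'."""
    p_a = mentions_a / total_a
    p_b = mentions_b / total_b
    # Pooled proportion under the null hypothesis of equal mention rates.
    p = (mentions_a + mentions_b) / (total_a + total_b)
    se = math.sqrt(p * (1 - p) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Made-up counts: juniors vs seniors mentioning "career development".
z = two_proportion_z(mentions_a=120, total_a=400, mentions_b=45, total_b=300)
print(round(z, 2))  # prints 4.63 -- a large |z| flags the theme as group-specific
```

Run over every theme-by-demographic pair (with a multiple-comparisons correction, since 230 themes times many demographics is a lot of tests), this surfaces the kind of junior-versus-senior contrast in the career-development example.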
0:37:47.4 MK: You’re completely blowing my mind because I… I suppose the thing is, I’ve done a lot of survey stuff in my career and, I’m not gonna lie, I always lean towards quant stuff just because it’s easier to work with. You have a number and you can look at that number last year versus this year and break it down multiple ways, and as soon as you start getting into open text, I’m like, “This shit’s hard. NLP is hard.” One of my biggest frustrations is you always end up with some word cloud that someone then is like, “But how many people does that affect?” And you go, “Good question, I have no fricking idea, because all I’ve got is a bunch of words.” Anyway, I’m mind-blown by this whole process and you’re obviously doing completely incredible work and I’m so fascinated. This is amazing.
0:38:45.5 AM: I think one of the fascinations on the qual side for me, ’cause I was very much a quant person in the beginning, doing this marketing-HR stuff back in the day, was doing usability tests. We used usability testing on all of our HR systems, which was frankly…
0:39:03.0 MK: Quite fun.
0:39:03.6 AM: Great fun. And you’d produce all of the web reports of how they’re using your career sites and how they’re using the tools, and then you’d show one video of somebody struggling and not being able to… “How on earth do I just apply for this role?” And you show that video to the executives and suddenly you get your budget, right? You can produce as many numbers as you want, but it’s the qual stuff that has power. We look at trying to identify those sorts of quotes as well, because we know that if you’re building a story, we can quantify the split of people talking about leadership and how it shows up in different clusters, but finding that needle in the haystack of a really powerful, well-written quote, where we go, “This explains what this group of people is going through,” that’s important. But you can’t fully automate it.
0:39:54.9 AM: You have to use the human who ends up pulling all this data together, but we hope we can quantify and give a sense of scale and be able to say, “This is an issue in France, but it’s not at all an issue in the States.” Or, “These are the things that people at headquarters are talking about.” We had a wonderful example with a sportswear company, where the sales team thought the product teams weren’t listening and should spend more time with the customers, because they weren’t getting the right products, and the product team was saying, “Why in the hell aren’t these salespeople selling our products? We’ve been developing them like crazy and we think they’re really fantastic.” They’re both talking about the products, but in completely different ways. So we see that time and time again within that sort of information.
0:40:47.2 MH: Let’s step aside for a brief word about our sponsor, ObservePoint. Sure, you could just have some poor intern or junior analyst click around your entire website every couple of days with developer tools open, looking for missing or mis-firing tags but that would be pretty brutal.
0:41:06.0 TW: Definitely! Downright cruel, actually. Not only would it be prone to error, but if you’ve got even the most basic level of people analytics going on in your organization, they would definitely throw a red flag when it comes to the job satisfaction of those poor souls. Much better to automate a task like that.
0:41:23.7 MK: And luckily, automation is what ObservePoint does best. It automates the ongoing monitoring of your website for data collection issues.
0:41:32.8 TW: Automation for the win! ObservePoint doesn’t remove the need for people, it just removes the drudgery of constantly monitoring key pages and user paths for proper and privacy-compliant data collection. People need to set up the auditing, of course, and people need to be recipients of automated alerts when something goes wrong. But let’s let the machines do the repetitive stuff. That’s where ObservePoint shines.
0:41:53.8 MK: And of course, it’s human beings who will need to look at the reports that trend the results of the auditing, over time, so that they can see if their data governance and QA processes are really working well.
0:42:04.4 MH: Exactly. Basically, ObservePoint not only helps ensure the integrity of your data collection, it reduces the risk that your people analytics will turn up repetitive data auditing as a driver of employee dissatisfaction. So if you want to learn more about ObservePoint’s many capabilities, go request a demo at observepoint.com/analyticspowerhour. And give your junior analysts a break. Let’s get back to the show.
0:42:34.9 MH: So switching gears just a little, or maybe not really: obviously you tend to work with pretty large organizations to do this. Smaller companies, like you said earlier, you can’t really do this with their data. But there are companies that do aggregates: they’ll run surveys and then aggregate your results up with all of their customers’. Could you talk a little bit about your perspective on those? Are they useful? Have you found them to be useful for smaller companies? Or is it just like, “Well, how do you know?”
0:43:08.9 AM: Yeah, I think it’s mixed. I think sometimes the survey providers try and persuade people that benchmark data is a lot more valuable than it is because it…
0:43:20.3 MH: Well, of course. That’s what you’re paying for. Yeah.
0:43:24.3 AM: And it’s context-free. So you can’t look at those answers and assume they’re the same sort of thing. We’ve done quite a lot of work showing we can predict survey answers from the population giving them, because people obviously have different needs at different life stages. If you have a group of people who’ve just joined, they’ll tend to be really happy about the firm ’cause they’re in a honeymoon phase. If you’ve got a team that’s full of eight year old… People who have been eight years within the firm and they’re all a cynical bunch, they’ll just naturally have a much lower score and you have to accept that. You’re not comparing apples with apples.
0:44:04.9 TW: I was briefly imagining having a team of eight-year-olds. That’d be fun. [chuckle]
0:44:07.2 AM: Oh, that would… Sometimes, I think that could be a benefit. The US-based firm ADP runs payroll for a large percentage of the US workforce, and their aggregate data is phenomenal. They publish jobs reports that are, in some ways, more accurate than the official statistics, ’cause they have such a big item-level dataset. So I think it’s mixed. There are things with people-analytics-type techniques that you can do in smaller organizations. One thing that we’ve been doing a lot of, and I think a lot of people are doing it at the moment, is what’s called organizational network analysis. There are two different ways of collecting the data there, and the gold standard is probably both. One is you can look at communication flows: you can get the metadata out of your email system, or Microsoft have a tool that enables you to do this automatically. The other way is you can ask via a survey. The automated way will have a much more complete record of the communication, as long as that communication happened electronically.
0:45:27.5 AM: If it was two people physically talking to each other, it won’t have any record of that whatsoever, and that data doesn’t tell you the quality of the conversation. The survey data will tell you about the quality of the relationships, but people will have their normal human biases, recency and all those types of things. So the idea is to bring them together. But I would prefer the survey basis, ’cause from a legal perspective it’s usually a lot easier to get through an organization, since you’re asking people and they can choose not to provide information. You can do a nice use case from 100-150 people; you get something reasonably interesting, and then you use that data like you do in social network analysis and start to analyze relationships between people within firms. And this comes on to a conversation, which I think we’ve touched on before, around collaboration, and actually, what is performance?
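A minimal sketch of the survey-based network analysis Andrew describes: treat each answer to a “who do you go to?” question as a directed edge and look at who is named most often (in-degree, the simplest centrality measure). The names and edges here are entirely hypothetical:

```python
from collections import defaultdict

# Hypothetical survey answers to "who do you go to for help?"
# Each (person, contact) pair is a directed edge in the network.
edges = [
    ("ana", "raj"), ("ben", "raj"), ("cho", "raj"),
    ("raj", "dee"), ("dee", "ana"), ("ben", "cho"),
]

in_degree = defaultdict(int)   # how many colleagues name this person
for _, contact in edges:
    in_degree[contact] += 1

# People named most often are candidate "hubs" the team quietly depends on.
hubs = sorted(in_degree, key=in_degree.get, reverse=True)
print(hubs[0], in_degree[hubs[0]])  # prints: raj 3
```

Real analyses use richer centrality measures (betweenness, eigenvector) on weighted graphs, but even this toy version shows how relationship data, rather than individual metrics, surfaces who a team actually runs through.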
0:46:29.8 AM: And I think this is something that’s started to come into the field. Traditionally people had a performance review and the HR system was designed to say, “Tim or Michael or Moe, this is your rating over the last year. This is how you did.” If you think about the data, that’s assuming everything is individual-based and that the way you work together as a team is irrelevant, whereas the networking-type approach puts much more emphasis on understanding how those relationships happen. And look, I’ve certainly seen instances where a team of high performers had an admin person in the team, and none of them would’ve been a high performer if you took that admin person out of the team, ’cause…
0:47:19.5 MK: Wow!
0:47:21.4 AM: She was…
0:47:22.6 MK: That’s fantastic.
0:47:25.5 AM: Yeah, so you look at constructing teams. And if you look at sports, whatever our favorite sport is, we all know you get some huge superstar moving teams and suddenly they go from high performer to not high performer. There’s a book, I think I’m gonna get this right, I’ll send you guys the links so you can put them in the show notes, by a guy called Boris Groysberg, where they studied equity analysts. Equity analysts are an interesting study group ’cause they all rate each other, and that’s public data across firms. And one of the findings was, if a superstar analyst moves from, let’s say, JPMorgan to Goldman’s, there’s a lower probability that he or she will be a success in the new environment, right?
0:48:13.0 AM: And what would change that success? Well, recruiting the whole team increases the likelihood, ’cause you keep that sort of social glue together. Women are more likely to be successful in that environment than men, because women tend to build broader networks, often outside the firm, whereas men rely on closer networks of people within the firm. So it’s fascinating to look at some of this stuff, at performance as a measure of collaboration. We’ve always said, “We want you to collaborate,” and then we always measure people as individuals, right?
0:48:50.5 MH: Yeah.
0:48:51.5 MH: Oh my, you just picked at a scar that needs a whole other episode. That’s so good.
0:48:57.1 MK: Oh my God, this is perfect. I’m actually doing my performance reviews later tonight, so like, yes, this is such a good thing to get the mind going.
0:49:08.7 TW: So I just wanna close on the whole network piece and the communication. That’s like dipping the toe into pulling data from the email system, but just at the communication-flow level. Does that tend to anonymize the person, if you’re trying to find the communication networks and nodes? It just doesn’t have the content of the communication?
0:49:31.8 AM: It doesn’t have the contents of the communication, but there’s still some really useful stuff you can do.
0:49:34.4 TW: Okay.
0:49:35.5 AM: You can see when people are getting messages, so you can see things like who’s sending out lots of stuff at the weekend, and then you start joining that to engagement data: people whose boss sends them messages on Saturday night, are they as engaged as people who are expected to have a more normal separation of work and life? So we can start measuring those. Or you can spot teams, and I’ve been in some of these environments, where everyone CCs their boss on every single thing.
0:50:08.5 MH: It’s a big red flag.
0:50:12.0 AM: Yeah, and you see that stuff in the data. So you don’t need to see the content, but you can’t tell whether the message is “Should we go for lunch today?” or “Can we deliver this project? You’re late.” That’s the sort of thing you don’t get from the metadata. And if a client asked us anything about reading the text of emails, we would just say, “No, don’t…”
0:50:34.8 TW: Yeah, don’t go there. [laughter]
0:50:35.5 MH: That’s right.
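The metadata-only analysis Andrew describes, spotting weekend senders without ever reading message content, can be sketched like this. The senders and timestamps are made up:

```python
from datetime import datetime

# Hypothetical message metadata: (sender, recipient, timestamp) -- no content.
messages = [
    ("boss", "tim", datetime(2022, 3, 5, 21, 30)),   # Saturday night
    ("boss", "moe", datetime(2022, 3, 6, 9, 0)),     # Sunday morning
    ("moe", "tim", datetime(2022, 3, 7, 10, 0)),     # Monday
]

weekend_sends = {}
for sender, _, ts in messages:
    if ts.weekday() >= 5:  # weekday(): Monday=0 ... Saturday=5, Sunday=6
        weekend_sends[sender] = weekend_sends.get(sender, 0) + 1

print(weekend_sends)  # prints: {'boss': 2}
```

Joined against engagement survey data per team, a count like this becomes the kind of feature Andrew mentions: does getting Saturday-night messages from your boss correlate with lower engagement?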
0:50:36.5 MK: Can I ask you an odd question, Andrew?
0:50:41.2 MH: Moe, you’re killing me right now. [laughter]
0:50:42.4 MK: Oh, come on, this is like my favorite topic.
0:50:44.8 MH: No, I know.
0:50:47.2 MK: Sorry.
0:50:47.4 MH: And this is your last one, Moe.
0:50:47.6 MK: Okay, it’s my last question.
0:50:48.7 MK: So I was chatting to a psychologist from an employee satisfaction survey tool, and we were talking about the general statements people typically score, like “I’m proud to work here” or “I see myself working here in five years’ time,” general indicators of engagement or whatnot. One of the things he said that was really interesting is that previously the topic of systems and processes didn’t have a big impact on employee engagement or retention, but since COVID happened, and the change in how companies work now, with people working from home and yada yada yada, systems and processes seems to be an area where, if people are not happy, that does seem to be correlated, I’ll use that word lightly, with lower engagement or retention. I’m just curious, have you seen in your work any shifts in trends, or things that have changed as a result of the different ways that we’re all working now?
0:51:58.1 AM: I’m gonna answer that in two parts, ’cause I think what we’re seeing in terms of systems and processes is almost completely different. If you ask somebody in open text what’s great about working here and what’s not great about working here, the really engaged people are the ones who talk about systems and processes as needing improvement. A really engaged employee will talk about things that get in the way of them performing: they’ll talk about bureaucracy, they’ll talk about speed, they’ll talk about decision-making, they’ll talk about tools. Disengaged people talk about bullying and a whole range of different things; talking about the HR department is definitely linked to being disengaged. And likewise, if you look at the positive things, an engaged person will talk about buying into the vision or the mission of the organization, whereas disengaged people will talk about, “Well, I get paid a lot and it’s nice, it’s not far from my home.” So it’s different there.
0:53:00.3 AM: In terms of the other part, have we seen any difference? It’s changing so, so frequently, and it changes depending on the firm. Certainly, people want more flexibility, but that’s always been the case; flexibility has been a big, big issue. When COVID started two years ago, we moved working from home out of the flexibility category, where we used to assign them together, into its own category. And you see different patterns depending on the group of people within the audience. Actually, I looked yesterday at one of our clients, and I say one of our clients, it could have been any of our clients, and the top theme we have linking to working from home is micromanagement.
0:53:51.3 MK: As in people are getting more micromanaged because of working at home. Wow.
0:53:56.9 AM: Yeah. Not in terms of the absolute amounts, but when we look probabilistically at which themes are unusually likely to live together, which are coming together much more than you would expect, micromanagement is the number one. So we’re seeing those two themes coming up time and time again. But it really depends on the organization. I’ve got clients where the exec team wanted everybody to be in the office for the last two years, and this has been a battle between them and the employees. We find people saying, “We’re growing so much as a business, we’re totally overcrowded. Why are we having everyone in the office if we’re all gonna get…” ’Cause the office is overcrowded. Or we’ve seen firms cutting down the hours the canteen is open, so everyone’s going, “We’re trying to stay safe, but the canteen used to be open two hours and it’s now one hour, which means it’s twice as busy as it used to be.”
0:54:44.9 AM: So we’re seeing all those types of patterns. And yeah, tools and technology: people have relied more on tools and technology. It was a big theme at the beginning of COVID. I think most firms did amazing jobs of getting people up and running at home quicker than they ever could have imagined, especially in places like India, where people are less likely to have broadband, and have kids at home, and stuff like that. I think the vast majority of our clients have done amazingly well, but people are now saying things like, “Why should I have to relocate, when I’ve been able to do this job from wherever for the last few years? Why do I now have to move me and my family halfway across the country to be able to do it?”
0:55:35.2 MH: Alright. It’s time for the Conductrics quiz, the quizzical query that presents a conundrum to my two co-hosts as they go toe to toe on behalf of two listeners. Alright, Tim and Moe, are you ready?
0:55:48.2 MK: As ready as I’ll ever be.
0:55:49.5 MH: Excellent.
0:55:52.1 MH: The Analytics Power Hour and the Conductrics quiz are sponsored by Conductrics. Conductrics builds industry-leading experimentation software for A/B testing, adaptive optimization and predictive targeting. Go check them out at conductrics.com for more information on how they can help you. Alright, Moe, are you ready? You are competing for our very good friend of the show, Robert Petrovic.
0:56:17.1 MK: Yes. Hey, Robert.
0:56:20.9 MH: Yes. And Tim, you are competing for our listener, Adam Woodhouse.
0:56:26.8 TW: Ooh. Excellent.
0:56:27.6 MH: So very cool. Alright, here we go. Here’s the quiz. The Power Hour gang is getting prepped for their next guest, but before the guest pops on, Michael mentions to the team, “Hey, I’ve got a question. I have sales data for a product whose demand doesn’t vary over time. However, the demand does differ by geographic region, and I calculated a mean sales value for each geography. I know these are just samples, so I wanted to give the client a measure of uncertainty around them as estimates of the true expected sales in those regions. I thought I would use the standard error, but I noticed that in geos with very large outliers, the standard error is huge. I noticed that in the calculation of the standard error, after we subtract the mean from each data element’s value, we then square all of those mean-subtracted values before we sum them up. The squaring has the effect of more heavily weighting the outlier observations in our standard error calculation.”
0:57:30.0 MH: “So as to not have the outliers have such a huge impact, I removed the squaring and just added all the differences up, but I must be doing something wrong because now I’m seeing…” A, the Helb’s error measure is now larger in the geos with outliers; B, the Helb’s error measure is now larger in the geos without outliers; C, the Helb’s error measure is infinity for all the geos; D, the Helb’s error measure is zero for all the geos; or E, the Helb’s error measure is now larger in geos with a larger number of sales values that are greater than zero but less than one.
0:58:15.9 TW: I would say repeat the question but I don’t wanna subject any of our listeners to that.
0:58:20.8 MK: I’m really interested in the answer though.
0:58:23.4 MH: It’s a Gordian knot for sure.
0:58:25.6 TW: I’m like, this is where median absolute deviation, MAD, would be awesome. I think I always just got so excited that that would be the… And that’s not at all the…
0:58:34.8 MH: That’s not what I’m seeing. Yeah.
0:58:36.6 TW: Oh my God. He removed the outlier… Okay, good Lord. Well, who’s starting?
0:58:43.1 MK: Oh, I was very confident about removing one, and now I’m thinking of something that always happens in R and I’m like, maybe I shouldn’t remove that one.
0:58:52.5 MH: Interesting.
0:58:55.1 TW: Well, maybe I’ll start. And I’m gonna say I’m gonna eliminate the last one because it was like the longest and most detailed and wasn’t paired with something else. And so even though that…
0:59:06.3 MH: Okay.
0:59:07.9 TW: Just seems… Yeah, I’m gonna remove the…
0:59:10.4 MH: Remove E, the error measure is now larger in geos with a larger number of sales values that are greater than zero but less than one. That was a great choice. It was a mouthful. Definitely it’s out of here. So now we just have A, B, C and D.
0:59:25.6 MK: Okay. I have got two guesses to what it is, so I’m gonna try and eliminate one of the other two.
0:59:33.7 MH: Okay.
0:59:34.5 MK: Which is D, the Helb’s error measure is zero for all the geos.
0:59:40.5 MH: You want to eliminate that one?
0:59:42.5 MK: Yes, but now you’re making me question myself.
0:59:45.5 TW: She’s like, “Wait, wait, don’t tell me.”
0:59:47.0 MH: I just wanted to be clear… I just wanted to be super clear that’s what you wanted to do. Alright, so we’ll go ahead and eliminate D and… Pum, pum, pum. Unfortunately that is the correct answer.
1:00:01.6 MH: Error measurement is zero for all the GEOs.
1:00:02.9 MK: Stop it.
1:00:04.5 MH: I’m so sorry, Moe.
1:00:05.1 MK: Wow.
1:00:06.4 MH: I’m so sorry.
1:00:06.6 TW: I’m literally gonna go back and listen to this quiz again and I may actually be…
1:00:10.6 MK: Me too. So he removed the squaring and now the error measure is zero. That’s interesting.
1:00:17.3 MH: Yeah. So here’s the answer: the average difference between each element and the mean is zero. If we multiply the mean sales value by the total number of records, it equals the value of total sales. Since we’re subtracting the mean from each element, it stands to reason we’re removing the total sales value from the data, so the mean of the updated data, after we remove the mean from each element, should now be zero. It starts to be clear once you play around with a simple dataset. If you want a measure of loss that is robust to outliers, you can use measures like the mean absolute error or the Huber loss, which is a combination of the quadratic and mean absolute errors.
1:00:57.0 TW: I still think MAD is an option, median absolute deviation. Now, I do want a sample dataset created to fiddle around with this one.
1:01:04.6 MK: I’m pretty sure Kirchhoff will do that.
1:01:06.7 MH: I’m pretty sure he actually already included one. [chuckle]
1:01:09.3 MK: Oh, stop it.
1:01:10.7 MH: No, but I wasn’t allowed to give it to you. I just did actually create one. Oh, it’s not gonna come through there, I have to find some way to show it to you.
1:01:17.5 TW: You know what? I’ll call an audible. We will post this dataset in the show notes for this episode, if you would like to go and play around with it.
1:01:23.5 MH: There you go. If you would like to, yes. Alright. So that means Adam Woodhouse, you’re a winner. Tim, great job.
1:01:33.7 MH: Robert, we’re so sorry. But such is life. Alright. Thanks once again to Conductrics for being a sponsor and another awesome Conductrics quiz. Let’s get back to the show. Okay.
1:01:47.2 MH: Okay. This time we’re definitely gonna start to wrap up, because that’s what the employee survey says we have to do. No, I’m just kidding.
1:01:55.7 MH: Alright. Awesome conversation. Awesome. Thank you so much, Andrew. It’s phenomenal. Alright. One thing we do love to do is sort of go around the horn and share a last call, something of interest to our listeners that maybe we could share. Andrew, you’re our guest, do you have a last call you’d like to share?
1:02:13.9 AM: Yeah, I was having a think about this. I’m very fortunate in that my wife gave up corporate life and went to run a book shop locally, ’cause she was distraught that it was gonna go out of business. So I have access to a book shop, which is nice but it’s really dangerous, right? And…
1:02:27.7 TW: Nice. Wow. Yeah. You don’t get a discount?
1:02:32.5 AM: I get a discount…
1:02:35.1 TW: Okay.
1:02:35.4 AM: I get a discount, but my mental model is that means I can buy more books. So I buy a lot and read a subset of that. One that I really enjoyed last year, which is probably about 12 months old now, is a book called “The Great Demographic Reversal” by Charles Goodhart and, I believe, Manoj Pradhan. It’s a macroeconomics book looking at the changes in demographics in our society. One of the big things they argue is that there’s a whole bunch of implications, including higher inflation, as a natural consequence of having an increasing number of retired folks in society, folks who aren’t working, compared to those who are. I found it a complete mind-opener, and it was an easy read. Charles Goodhart especially is well-known in the economics space, and I think Manoj was head economist at one of the big banks, so they’ve got some good stories. As a different way of thinking about our societies, it was good, and it survives COVID, if that makes any sense, right? Some of the numbers might be different now, but the general themes I think are the same, so I enjoyed that.
1:03:50.7 MH: Outstanding. Okay. Tim, what about you?
1:03:54.9 TW: So I’m gonna go with a… Back when we had Cassie Kozyrkov on, part four of her Making Friends with Machine Learning series was about to come out, and then it subsequently came out, and I subsequently had it on my to-do list for several months. But I did finally get around to watching part four of Making Friends with Machine Learning, and it was like an hour and a half, and it was really good. It’s the one where she runs through a quick survey explanation of k-means, k-nearest neighbors, classification and support vector machines. As I recall, in parts one through three she was very dismissive of part four and the specific models, but as much as I find myself harkening back to things from those earlier parts as I’ve marinated on them, I was happy to get through part four, and I do recommend it. I recommend the entire series; even though it’s a multi-hour commitment, it’s free, it’s on YouTube, and it’s Cassie, who’s just delightful.
1:04:54.3 MK: I was about to say, “Oh my God, Tim, I’m so surprised you didn’t describe part four as delightful” and there she rolls.
1:05:04.7 TW: It’s just so fun to watch somebody that smart explain stuff that clearly.
1:05:12.6 MH: We’ll definitely make sure that gets into the sentiment analysis.
1:05:16.8 MH: Alright. Moe, what about you? What’s your last call?
1:05:19.5 MK: I’ve got a twofer because, one, I’ve already done this one before but, given today’s show, I’m gonna revisit it, and the other one is just a fun one. So firstly, if you’ve enjoyed today’s conversation, I really encourage you to read “Work Rules!” by Laszlo Bock, who led Google’s People Operations team. I mean, they get extreme about measuring things like whether people will eat less food if you have smaller plates, and all sorts of stuff. It really goes next level. But it is just a really fascinating read about how deep people analytics can go, and it’s a fun one.
1:05:56.1 TW: And they haven’t had an employee walk out in a few years, so that’s good.
1:06:02.0 MH: Okay.
1:06:02.1 MK: Oh jeez. But it’s what got me really interested in the topic in the first place, I think. Anyway, the other thing I wanted to do is to do a shout out to Rachael Gerson because…
1:06:12.2 MH: Speaking of Google, right?
1:06:14.6 MK: Oh yeah, I was like what did I do wrong? You were giving me a strange look, that kinda makes sense.
1:06:19.7 MH: No, she’s at Google, right?
1:06:20.0 MK: Yes, she’s at Google. But she posted a bunch of data memes the other day and it just made me really happy. So if, like me, you’re just not good at jokes and you wanna give your team a laugh on Monday morning, check out Data Science Dojo on Instagram and look at the data science one-liners. You could even just have them up your sleeve for when you need a good data joke. It’s just light and harmless and it was fun, made me smile.
1:06:45.9 TW: Very nice. What about you, Michael?
1:06:48.5 MH: Well, I’m glad you asked. So at Super Week this year, David Vallejo demoed a tool that he had built and it’s still kind of in beta, but I really was blown away by it. So if you live in the world of data implementation, especially in mobile apps, doing debugging of Google Analytics specifically is really annoying and hard to do, and he’s built this amazing tool to be able to do that, and so I would suggest checking it out, I thought it is… Well, it’s kind of a labor of love on his part, but he’s building such a great tool for the community. And if you do have anything to do with sort of implementation, it’s an Android debugger for GA, Google Analytics related things, so a little bit specific, but there’s plenty of people in our listener base who are having to deal with that and don’t have great tools to help. Alright. Well, I’m sure as you’ve been listening, you’ve been getting some ideas or thoughts and we’d love to hear from you. The best way to do that is through the Measure Slack group or on Twitter or you can also email us at firstname.lastname@example.org. Andrew, I don’t know, do you do much in the social media realm? If people wanted to reach out, maybe they could reach out via LinkedIn or something like that.
1:08:08.9 AM: It always falls down during work times and we’re super, super, super busy.
1:08:13.2 MH: Totally understandable.
1:08:14.4 AM: So LinkedIn is best and… Yeah. I’m the only Andrew Marritt, I think, or one of the only Andrew Marritt, certainly the only one there, so you can find me.
1:08:23.3 MH: And OrganizationView is the name of the company. So yeah, I think people might wanna reach out and get some more information as well, and so we wanna make that available. Andrew, once again, very stimulating conversation, thank you so much for coming on the show today, really appreciated.
1:08:40.2 AM: I’ve had a great time, thank you.
1:08:41.7 MH: Awesome. And we also wanna thank our producer, Josh Crowhurst, who does so much behind the scenes to make this show possible, so Josh, thank you for all of your excellent work. And thanks to our listeners. In this case, Matthew Brandt, thank you so much for the suggestion, we really appreciate it, and if you’ve got suggestions as a listener, we’d really love to hear ’em. It turns out pretty good sometimes.
1:09:05.2 MH: Really good, really good in this case.
1:09:08.2 TW: We will provide Matthew with that feedback, we’ll…
1:09:10.9 MH: That’s right. And obviously, we’d love to see, if you like the show, give us a rating and review on whatever platform you listen to it on, we appreciate that. Alright, I know that I speak for both of my co-hosts, both Tim and Moe, when I say no matter how qualitative the data is, no matter your engagement level, keep analyzing.
1:09:36.5 Announcer: Thanks for listening. Let’s keep the conversation going with your comments, suggestions and questions on Twitter at @analyticshour, on the web, @analyticshour.io, our LinkedIn group and the Measure Chat Slack group. Music for the podcast by Josh Crowhurst.
1:09:54.5 Charles Barkley: So smart guys want to fit in, so they made up a term called Analytics. Analytics don’t work.
1:10:01.1 Tom Hammerschmidt: Analytics. Oh my God, what the fuck does that even mean?
1:10:09.4 TW: Oh well, I guess I will throw in, there’s also a thing that I’m gonna… I will yell out at the end that is maybe Rock flag and something which is just a goofy thing that goes in the…
1:10:19.9 MK: You are gonna start telling guests that and not shocking them?
1:10:23.0 TW: It’s… You guys, I thought I was forbidden to tell guests that and you guys are like, “No Tim, you’re fine to tell guests.”
1:10:28.2 MH: It’s totally okay. I think it’s fine.
1:10:30.1 MK: No, I don’t think we should. I like how awkward it is.
1:10:31.8 MH: It doesn’t… It doesn’t change the experience one way or the other for the listener.
1:10:36.5 TW: This is what it’s like working for this show, getting conflicting information from what’s going on to the next…
1:10:39.3 MK: Poor Andrew was like, what the hell have I signed up for here?
1:10:42.4 MH: Yeah, he’s sort of like, could we just do the podcast?
1:10:49.3 MK: Interestingly, our survey tool actually is like… It’s a bona fide system because, yes, I believe in privacy and obfuscating the user, but basically there is no way that you can look at any row-level data, even totally anonymized. Because they’re like, basically, if we had a couple of different questions stitched together, someone might hypothetically be able to figure out who it was, which would still be quite tricky. And so therefore, you can’t actually pull all the data out, and it kills me, kills me.
1:11:24.4 TW: ‘Cause you really wanna know exactly who said exactly what, deep down?
1:11:28.2 MK: No, because I wanna be able to look at relationships between different questions, which you can’t do.
1:11:33.8 TW: No, no, I was just trying to goad you to saying something while we were still recording the… I’m not gonna be able to…
1:11:38.3 MK: No, I would never fall for that, Tim. Ha-ha-ha.
1:11:47.4 TW: Rock flag and fill out your employee survey.