Subscribe: Google Podcasts | RSS
Have you ever worked in a large organization where the data team(s) are perfectly structured to deliver efficient, harmonious, and meaningful results to the business with nary a gap nor a redundancy? If you answered “yes,” then we’ll go ahead and report you to HR for being a LIAR! From high-growth startups to staid enterprises, figuring out how to organize the data and data-adjacent teams is always chock-full of tradeoffs. And that’s the topic of this episode.
0:00:05.7 Announcer: Welcome to the Analytics Power Hour, analytics topics covered conversationally and sometimes with explicit language. Here are your hosts, Moe, Michael and Tim.
0:00:21.7 Michael Helbling: Hey, everyone. It’s Episode 174 of the Analytics Power Hour. What do you think of that intro, Tim?
0:00:31.7 Tim Wilson: I’ll provide feedback through the normal channels and let your manager…
0:00:37.3 MH: Let my manager know?
0:00:39.8 TW: Yeah.
0:00:39.9 MH: Moe, would you be willing to represent me to upper management in case Tim’s request goes through?
0:00:44.6 Moe Kiss: Well, it depends on your performance.
0:00:48.0 TW: I am gonna need the latest org chart, by the way, just so I know who that is.
0:00:51.3 MH: The latest org chart? Okay. Well, Tim Wilson, you are my co-host, senior director at Search Discovery. Moe Kiss, you are also my co-host, and you are the marketing analytics lead at Canva. And I’m Michael Helbling, and I am the managing partner of Stacked Analytics and this, you’ve blundered into the next episode of the Analytics Power Hour, and we’re gonna talk about something that happens to all of us.
0:01:23.7 MH: See, at work, all of us data professionals usually work at a company of some kind. And the one thing that a company has is some sort of org structure. Honestly, if people would just let us do our jobs and leave us alone, I kind of think maybe things would be alright, but that’s just not how it usually is. We put a brave face on it and try to give a sense of order around who does what and where, and where they fit in the organisation, and frankly, it’s a big topic. We’ll do the best we can to cover it. Tim, you are the quintessential analyst.
0:02:01.8 TW: Oh, brother.
0:02:03.4 MH: Do you wanna kick us off with your vision for how the analytics org should be set up now? It’s probably worth saying, we’ve done shows that have covered aspects of this before, maybe even exactly this before, but actually the world of analytics keeps changing, and so the org structure, I think, changes as well. So I think this is a refresh of something we’ve talked about before.
0:02:26.2 TW: And I think no organisation happens in… Like we can say the analytics organisation or the data team, or the data team is… That’s what Moe would say ’cause…
0:02:35.5 MK: The data team, yep.
0:02:37.2 MH: The data.
0:02:37.3 TW: But the fact is that, because inherently, analytics supports the enterprise, there are interdependencies with how the stakeholders are organised. So to me, it’s like, well, hire people who are super motivated, super accountable and super bright, and then just kind of let them all run and follow their passions and figure it out, with one person to address random friction points as they emerge. I’ve never actually seen that model work…
0:03:08.7 MK: Huh?
0:03:10.6 TW: So that just falls back on the, “Do your fucking job… ”
0:03:14.5 MH: We’re gonna shift gears ’cause Moe, you’re the most authoritative voice on this.
0:03:18.4 TW: No, no, no, my point is that I think a lot of shifting around of who does what is, one of the biggest drivers, is basically organisational or individual incompetence, so there’s an idea that if you have mediocrity, you wind up saying, “Oh, HR thinks we can fix that by changing a management structure or reshuffling things where… ” that’s the easy solve. Go in and change the way it’s structured, as opposed to saying, “Do we have the right people, doing the right job, pointed in the right direction?” So while I was being quite facetious, I do think that a lot of times, we’re trying to use an organisational structure to fix what are actually other issues.
0:04:06.4 MK: I will say in semi… It’s gonna sound like agreement, so I’ll call it that. But…
0:04:15.4 MK: We did have someone start on my team pretty recently and basically, his job description is… I’m like… Yeah, my team has a marketing function and what we call engagement, which is like after you land on the site, and then we have like a data platform team, and his function is basically to fix fucking problems. He’s not responsible to a particular stakeholder, he’s just very good at being like, “There is a problem here and I’m just gonna go in and fix it.” And I’m like, “Cool”, that can be your remit. Your remit doesn’t have to be like, “I look after paid search” or “I look after the login team or whatever”, it’s like, “No, just fix problems.”
0:04:57.0 TW: He’s fixing data problems, data integrity problems or fixing… Is it broader than that?
0:05:01.7 MK: Yeah, at the moment, he’s working on brand measurement, he’s done some stuff with improving the quality of our data with specific reference to geography data, that sort of thing. But I don’t think that works for everyone. I think that works for a very unique, highly motivated individual. I feel like lots of times, if you don’t have clear descriptions or org charts or boundaries, shit just falls in a gap because neither team owns it.
0:05:30.8 MH: Yeah.
0:05:31.5 TW: But I think that’s the challenge when it’s like trying to draw… When we think of boundaries, we envision boundaries as being clear and hard, and we expect or think collaboration will happen. The problem with trying to work it that way, which I think is a reality, is that if anybody doesn’t quite work up to the boundary, then you wind up with a gap, and if people say, “Well, I’m gonna go a little bit extra into your area so that we’ve got… ” Well, then you wind up with the risk of people getting territorial, or confusion, or something becomes the norm, and then everybody says it’s confused. But…
0:06:08.2 MK: I think this is the thing, like if you do have a really good team of highly motivated individuals, having an org chart isn’t gonna be restrictive, like…
0:06:16.7 MH: I agree.
0:06:18.2 MK: Yeah, I feel like they’re gonna do their fucking job, in the words of Tim Wilson.
0:06:21.4 MH: Yeah, well, it’s interesting, Tim, because your initial description to me sounded very much like a startup or entrepreneurial space. You see a problem, go after that problem, fix that problem, or take care of that thing. And I think that does work really well in some companies, but I also think there’s companies of really large size where you can’t afford to leave your post, so to speak, to go fix something you see, because you’ve got enough to do in front of you; that’s why you’ve got the job you have, and so you do have to have people you depend on. And I think what makes so many analytics orgs fall apart is you get leaders who don’t understand their role in the process or in the org, and so they let things slip or they don’t pursue the right things on behalf of the analytics folks.
0:07:15.5 MH: You get analytics folks too caught up in their projects, so not to call out data scientists, but they totally do this, and analysts do too, so it’s all of us. I’m not a data scientist, so I see it easier in them. But it’s sort of like, “Oh, I’ve been fiddling with this model,” while the whole rest of the thing is burning down around you; like, go work on the thing that’s the problem. But that’s what I mean, I think that’s the other thing: you’ve gotta have some sort of structure, depending on the org and its size. I feel like every time you come back to this, you always have to talk a little bit about org size and org momentum, and use of data; all those things seem to make an impact on what you need to do to structure a team.
0:08:05.3 TW: And the type of organisation, so you take a consultancy and because those are… The bulk of the work is gonna be engagement-driven, so to me, there’s inherently more, there’s a structure in order to support compensation and reviews and stuff, but for the most part, you’re defining engagements on a finite period of time, and you can kind of structure the right team with the right resources. I do mostly work with really large enterprises, and the challenge I see is if we… Go beyond the analyst, but you’ve got the DevOps people, you’ve got the data engineers, you’ve got… And when you’ve got these organisations that have a ton of data from a bunch of different places, some of it varying degrees of sensitivity, they’ve gotta have some governance around it, but you wind up… If you sort of try to visualise it, you realise you’ve got a whole bunch of nodes like, oh, these people own the data lake, and that’s a great idea, except now literally everybody has a request of things that need to go into the data lake.
0:09:07.3 TW: And then they build a mart off of the data lake for one purpose, and that’s awesome, except then other people realise, oh, that’s the mart we use for executing our marketing activities, but that’s perfect data, we wanna report on those, so then the analysts are using that mart and the analysts are like, there’s data that’s not in the mart, can we get it added into the mart? So it’s like a whole bunch of interconnected nodes that are all… Everyone is saying, “I’ve got 50 different stakeholders feeding requests into me and I need to prioritise them,” and I don’t know how you describe that concept, but I do agree. You’ve gotta define what is your responsibility, but I think one of the challenges in enterprises is that there are interdependencies out the wazoo. One analyst supports three different business groups, so those business groups aren’t worrying about each other’s relative priorities, they’re just saying, “I’ll prioritise mine.”
0:10:02.3 TW: Now the analyst team has to figure out how they’re gonna prioritise, but maybe that gets to a centre of excellence versus kind of an embedded model versus… Even what those different structures are, I admit, I sometimes get confused. Like, we say centre of excellence, and I’m like, are there an infinite number of ways to actually structure a centre of excellence and call it a centre of excellence?
0:10:25.9 MK: You could just say centralised team.
0:10:28.4 MH: Yeah, centralised team, or… Well, except there are times where I’ve seen it’s like, we’ve got a core team that are the experts, but then they’re still providing ad hoc consultative services to the separate teams.
0:10:45.2 MK: The embedded ad hoc analysts. Yeah. That’s a hybrid version.
0:10:48.8 TW: That’s a hybrid. Okay.
0:10:51.3 MH: But I think there’s a real big challenge. Moe, I think you’re the one that actually first commented on this in our notes: in that construct, prioritisation happens, and if a certain team isn’t happy with their priority, that’s where you get sort of siloed other stuff; they’re just gonna go, “Well, we’ll go fire up our own analytics capability over here, because we’re just not getting the attention.” It’s funny to me that actually small organisations are just as apt to do this. And honestly, maybe even more so, because they’re just not as sophisticated sometimes as enterprise organisations; it happens in smaller companies, to me, as much as it happens in large ones, because people feel like they have a need, so they just go solve it. And the funny thing is…
0:11:39.4 TW: Short term versus long-term trade-off, in both cases is…
0:11:43.7 MH: Well, ’cause we needed to do some sort of data strategy layer to basically understand those needs and then make sure that we were able to meet those in a timely fashion as a data organisation, but a lot of times, I found two problems, I think. I’d love to hear your reactions to this. One is nobody takes the time to actually do that work. The other is…
0:12:04.9 MK: Often though, they don’t have the time, I mean, you should make time, but a lot of the time, it’s because people are drowning.
0:12:11.8 MH: Or they can’t tell you; they have no idea how to communicate what their needs are. I’ve literally had people basically tell me sometimes, “I’ll know it when I see it.” I was like, “That’s no framework. That’s no way to work. I can’t build you anything that I know is gonna be like, ‘Yeah.’ We don’t have an agreement on what success looks like.”
0:12:36.3 TW: Data strategy is not pornography, damn it.
0:12:38.3 MH: Yeah, well, that might be the… Hey, maybe that’s the title. Moe, you’re the one with boots on the ground here. So I know you’re leading this, how do you navigate this?
0:12:53.6 MK: My experience is very siloed, in that basically when I joined The Iconic, we went through all this stuff. We were like 150 people; by the time I left, we were about 600. Same thing at Canva. When I joined, we were about 500 people, and now we’re well over 1,000. And in terms of analysts, I was the ninth hire, and we’re now over 50 data analysts. And that’s not even including machine learning engineers, data engineering… Structure is becoming a really big problem as you start to grow, and I think the thing I probably learned the most is watching as we built the teams out. My team in marketing engagement was the first, I guess, team to service a group or a department. We call them groups. And then teams got stood up in other groups, and it was really interesting because the different team leaders didn’t necessarily have the same structure. So in marketing engagement, we decided to do an embedded structure, or cross-functional, where one analyst focuses on paid marketing and another analyst is embedded with the login product team or the sign-up product team.
0:14:07.3 MK: And another area of the business decided to do centralised teams. So they had a pool of analysts, they had five analysts, and would just get tickets and it was fascinating, watching those two different structures within one organisation and see the different teething pains that we had. In particular, one thing that I found is the area of the business that I was in was the unhappiest with data out of any area of the business, and now they are the happiest. And I think a big part of that was like they felt that they had their person, they had their analyst to go and talk to. We had an analyst that specialised in mobile. I can’t say it like an American.
0:14:51.5 TW: Mobile.
0:14:52.1 MK: Mobile. I say “Mobile.” You say, “Mobile, mobile”?
0:14:55.4 MK: I don’t know, anyway…
0:14:57.4 MH: There’s also Mobile, Alabama.
0:15:00.2 TW: I mean, it’s interesting ’cause I’m the opposite, I like to go into teams that are actually pretty happy with what they’re doing and by the time I’m done, they’re really unhappy with me… [laughter] They’re like, “Eh.”
0:15:10.9 MK: But it was funny how them just feeling that they had an analyst to talk through the problems and to spitball ideas and, “This metric’s going down, like what’s going on here? Can we try and understand it?”
0:15:24.4 TW: You were in more of the former mechanism where you were focused on… Which… Of those two structures you just described, which one were you operating in, in that case?
0:15:33.9 MK: My team was doing the embedded. So, for each area of our group, there was a dedicated analyst.
0:15:42.0 TW: Okay. Which I think… I hate the hybrid thing, but having the pure pool of resources where you get a ticket and work it just has such gross inefficiency. One, consistency of delivery, and two, the poor business user has to re-explain what’s going on. But it has the benefit of like, well, now you’ve got a broader understanding of what’s going on in the organisation. So there’s potential to say, “Oh well, this other group was doing something similar, maybe we should look at that.” So I have worked in cases where you’re kind of embedding them, but there’s a little bit of a flex in that, yeah, other things may come in. So I feel like all of these wind up being very bespoke, kind of custom: what seems like it would work, and now let’s not have it so locked in and so rigid that we can’t sort of adjust a little bit from that.
0:16:44.5 MH: Let’s step away for a minute and talk about something that absolutely impacts any data professional regardless of org structure, and that’s the quality of the data itself, and you can’t do an analysis on bad data. And a lot of times, data just sits there and no one knows it’s bad until all of a sudden, you try to pull something out of it and that’s probably the worst. I don’t know Tim, what… [chuckle] What have you seen in this area?
0:17:11.7 TW: Definitely don’t like the bad data. Do we have any way to get around that?
0:17:17.6 MH: Well, as a matter of fact, our sponsor, ObservePoint, has products that are actually specifically designed to help with this: their Digital Data Governance Suite, privacy compliance tools, ways to do journeys and audits of rules, and comparisons of data and data collection. So yeah, there’s a pretty nice tool out there that people can use for… [chuckle] For this.
0:17:43.1 TW: They have stuff you can automate: automatically audit your data collection for errors across the whole site, and you can basically pick your most important pages and user paths for functionality and check them for accurate data collection. You can get alerts if something goes wrong, and you can basically track your data quality and QA over time.
0:18:02.6 MK: And data quality really matters to me, so I recommend checking out their demo at observepoint.com/analyticspowerhour. And there you can learn lots more about ObservePoint’s full data governance capabilities.
0:18:15.4 TW: Keep that data clean.
0:18:18.6 MH: I love it. Let’s get back to the show.
0:18:21.4 MK: The funny thing is that we did go towards a hybrid model. So of the pool of 50 analysts, we went towards a hybrid model, where we had a centralised team, shockingly called Analytics Foundations. Because one of the biggest problems, and this happens every time you have embedded analysts, is that the cross-functional data work that needs to be done falls away, because you have a main stakeholder, and your main stakeholder doesn’t give a shit about data integrity across the whole business; they care about their particular subject. So that always ends up being a massive flaw of an embedded model or a cross-functional model, and so a hybrid model makes sense.
0:19:00.0 MK: Which we did, and then we’ve actually disbanded that team, I’m not entirely sure why. So that’s now gone, but now my team is actually moving towards a hybrid model. So we kind of have this thing called an M&E data platform, or M&E platform, which is gonna do basically the data foundations for our entire group, and then we’re gonna have embedded analysts for marketing and embedded analysts for engagement. But it’s… I don’t know, I guess from…
0:19:24.6 TW: So what… What’s the M&E?
0:19:29.2 MK: Marketing and engagement.
0:19:30.7 TW: Marketing, oh, okay. Marketing and engagement.
0:19:31.2 MK: Yeah, sorry.
0:19:32.7 TW: That was the data, that was the… Got it, okay.
0:19:34.8 MH: Here in the US, we use the word experience now.
0:19:37.6 TW: Well.
0:19:38.6 MH: Yeah, so the experience economy… No, I’m just kidding, I’m sorry.
0:19:41.0 MK: The point is that maybe this stuff just has to change all the time, as needs in the business change, like, I don’t know.
0:19:48.8 MH: Well, one thing we’ve been discussing but actually haven’t really said something about is even from five years ago, data orgs or analytics orgs have vastly changed with the emergence of data engineering roles and analytics engineering roles…
0:20:04.2 MK: 100%.
0:20:05.7 MH: Becoming pretty much common. Every company and org is starting to wrap their arms around that and trying to figure out where that lives. And what’s interesting to me is, organisationally, it’s actually started to create this tension where analytics lived over in marketing for so long, and now it’s starting to shift back over to IT in a lot of organisations. It’s really interesting because, well, web analytics started in IT with server log files and parsing them, then moved over to marketing as marketers got ahold of that, and now it feels like the pendulum is swinging back over to IT, as we’re having to build out all these data lakes and ETLs and data imports and all these different things, which involves a lot of time-consuming tasks. And most of the reporting systems on top of that are highly technical too, so it requires a lot of technical work to maintain those operational systems. And it’s like, well, that’s not really a marketing function anymore. And I wonder, do we think that analytics people are moving from marketing back over to IT, or?
0:21:11.3 MK: Well, for me, they’ve only ever been in technology or engineering, and that’s at both of the places I’ve worked. But we’re actually going through this at the moment, where we are changing our data engineering structure a little bit as well. We’re basically gonna have a machine learning engineering function, which we’ve always kind of had, but it’s just getting refined a bit and acknowledging that it needs to be a bigger place, and that sits in hard engineering. Whereas the rest of the data team, as we do now, is gonna have analytics engineers, specifically called that. And they sit within tech, but not necessarily as engineers, whereas the machine learning engineers are really sitting in that system. But that’s my experience, and I’ve been in startups, so… I feel like startups are always the ones trying the new shit.
0:21:58.0 TW: Yeah. Well, and maybe there’s part of what this… ‘Cause then, I’m thinking of another situation now that I’ve realised as we have talked many times, like great analysts are, with sufficiently broad and deep skills, are in high demand and hard to find, and I think there are organisational pressures to try to figure out how to do a structure, where the work that that kind of analyst would do is sort of dispersed. So if we put in this BI platform and make all this… Like the citizen data scientist, that’s basically a punt to say, we just want our business users to be doing analysis.
0:22:48.3 TW: And so I think that may be part of it too, that there is such a scarcity of resources. You’re a good example, Moe, word gets out, somebody in the M&E team goes somewhere else, and they’re like, “Man, we really wanna get Moe over in our group” and so there starts to be sort of a constant pressure, we don’t have enough analysts, oh, the technology team says that if we put in this new tool, we won’t need those analysts because the data will be at the fingertips of the marketers or the product owners, which it’s easy to kinda cross that line between yes, you want the marketers and the product owners to be comfortable with data and able to do self-service to a certain extent, but there’s a big leap from that sort of self-service to what a skilled analyst is going to deliver. And so I guess I’ve seen that too, I’m thinking of another… Yeah.
0:23:44.0 MK: We… This happened, well, probably last year, we had a very new sales function, sales team’s like, “Hey, we’re gonna get this tool, it’s gonna let all the salespeople have the data they need,” and we kind of had to say, “Pause, hang on a minute, you really, someone has to implement this tool, and someone has to actually make all those connections to have the data flow in and the data flow out,” and they’re like, oh. And I think sometimes there is this naivety around what needs to happen largely from an engineering side to make some of these tools work and to make sure that they’re compliant, and you have the data flows that you need.
0:24:24.1 TW: That is the… It is easy to say, “Oh, this one platform is where we’re gonna just house all of our data,” but no matter who you ask, that “all of the data” is at most 60% of all of the data that’s needed in the enterprise. And so that can happen too, where somebody is trying to take the longer view; as you said, maybe they knew the project would collapse under its own weight if they truly went to everyone in the enterprise and figured out their data needs and tried to architect a solution. Organisations try that, and then they say, “Okay, well, it’s gonna take us two years to get the foundation built to start putting data into it,” which is fine, except all these other groups then start saying, “Well, I need to do stuff now,” so they start getting their own bespoke solutions and causing all sorts of problems. So I’ve seen where it’s like the solves… Someone says, “Oh, this is all the data from marketing,” and it’s like, “What do you mean by marketing?” “Oh, well, marketing means A, B and C.” And it’s like, “What about D and E?” They’re like, “Oh, yeah, it doesn’t include that.”
0:25:28.1 TW: Like, what? That’s clearly part of marketing, so there is a precision of language around how all this works. And they’re like, “Well, we knew Salesforce said that there are built-in dashboards and we just have to pull our data in,” or, good Lord… Adobe. Adobe’s Experience Edge or whatever, you can just pump all your data into that, and it’s like, well, but that doesn’t just happen. Any time you’re moving that data around, there is a cost to stand it up and to maintain it, and who’s owning that? And boy, I hope the person who’s owning it architected it in a scalable way, ’cause that person is not gonna be owning it forever; they may own it for six months or three years, but they’re not gonna be in that role forever.
0:26:14.0 MK: How big, Tim, are the companies you’re working with? ‘Cause that sounds like big company stuff, I mean it happens a little bit but…
0:26:20.4 TW: These are… There’s a multi-billion for the…
0:26:24.4 MH: Tim only works with the top of the top.
0:26:32.1 TW: No. But even in my… When I worked in the insurance industry, with companies that have been around for 50 years and have had these bespoke solutions, or they’ve done acquisitions… Like a healthcare system, where they literally buy another hospital and then they’re integrating; integrating the data is one piece of it, they’re integrating all of their processes, and it takes multiple years. And there’s what is the pristine, pure way to do it right, and then there is the absolutely wrong, short-term way of thinking.
0:27:07.5 MK: One of the things that we did kind of play around with is “you build it, you own it”, which basically… We have a lot of analysts, obviously, that do engineering and data… Sorry, analytics engineering, I need to specify.
0:27:23.2 TW: Is it the person or is it the function or is it the group?
0:27:25.5 MK: Well, that’s actually where we had some teething problems. So we have an analyst on our team who’s a rockstar, shoutout to Fincy, and she built a lot of our foundations, like sessionisation tables and things like that, and it kept going. She ended up moving teams, but she was still getting pulled into code reviews because she was the owner of that table and the owner of that data. And she was the expert, and it got to a point where things were breaking and we kind of had to be like, hang on a minute, this system of “you built it, you own it” doesn’t work anymore; there needs to be a hand-off at some point, and a hand-off happens when you move teams or whatever it is, and then it becomes the new team’s responsibility. And we only learned that, I guess, the hard way: you make the mistake and realise, “Oh shit. This doesn’t work.”
0:28:15.1 MH: Yeah, and then we go back to the individual, like, “Fincy, how come you didn’t transfer all this knowledge?” It’s like, well, A, we need a process… How is that one person supposed to know everything they’re supposed to communicate to the next in line? We have to assist them. The thing I was gonna mention, Tim, is, kind of describing what you were talking about before, this comes back… You hear a lot about an iterative approach or a minimum viable product when it comes to some of these attempts, and I think the vendors do play a part in convincing people how “simple” it is. Do we just misestimate, by a lot, what is actually needed across analytics projects?
0:29:03.9 TW: Always. That’s a rule, I think that is an absolute truism.
0:29:08.4 MK: But I also think we misestimate, not the time, but like the competing priorities, we’ve done multiple migrations and every time they blow out, and it’s always because you get pulled in other directions.
0:29:21.0 MH: So then it’s a throughput or a stick-to-itiveness problem; it’s like either an execution, focus or bandwidth problem, and you can solve those problems: you can go build or get more bandwidth, you can give people the ability to focus… And so that’s what I wonder, sort of like, “Do we just go into every conversation and be like, ‘It’s actually 10X what you’re thinking’?” Well, let’s start there and then we can talk about it. I don’t know, what do you think?
0:29:57.0 TW: I think we sometimes under-appreciate the value of, I’m gonna say, brilliant architects. I’ve said this about Adam Greco forever, and we’re talking just Adobe, which is just one Adobe Analytics system. It’s just one stupidly complex system, but I’ve always said that the process he goes through… What he delivers is typically more than what somebody would wanna say is their shortest way to actually implement the platform, but very consistently, what he designs and says to implement has long-term scalability; it will answer questions that nobody’s explicitly asked yet, and that they’re not gonna ask for a year or two. It doesn’t mean the implementation doesn’t need to be maintained and looked at, but I think when we’re talking about data environments, the number of people who buy the bullshit that the vendor sales guys shovel is insane.
0:30:58.9 TW: Like, if you’re over 30, you should know the size of the grain of salt you should be sucking on the whole time any platform vendor is telling you, “This is all that’s involved,” and there are an incredible number of people who… They wanna hear that shortcut; they’re easy pickings. But when implementing something, it’s gotta be that trade-off of, how is the minimum viable product heading in the right direction? It needs to be built out; it’s not a proof of concept, it’s not, we’re building a POC that we know we’re gonna have to throw out ’cause it’s not gonna stand the test of time. It needs to be something designed… I’ve watched those people who were like, “Boy, that database structure seems a little… Does that have to be, I don’t know, de-normalised or something?” And they’re like, “Trust me, this is gonna pay dividends.” So trying to balance the, “I can’t build the complete pristine system, I don’t wanna take too many shortcuts,” it comes down, to me, to human beings who say, “This is the path that is gonna get us to value quickly enough, but it’s also gonna be extensible, not heavy, heavy rework; even some degree of rework is fine.”
0:32:13.6 MK: And I think that that path to value depends on the organisation. Like, we probably only build for a year or two? I don’t think we would plan three years out, because the company changes so rapidly that we know we’re gonna rebuild in two years anyway; it doesn’t make sense for us to have a five-year data strategy because it’s a completely different company by then. And we also know that the patience of the organisation wouldn’t let us… If we said we’re implementing something that’s gonna take two years to build, everyone would laugh in our face. But if we say, “Hey,” and we did this, to be fair, we went through a migration and we basically said, “You’re not gonna have an analyst for a season; we’re gonna pretty much go offline for three months and move a whole bunch of data, but guess what, after that’s over, things will be usable,” and that actually worked for us.
0:33:05.8 TW: For 12 to 18 months.
0:33:07.3 MK: Well, yeah.
0:33:09.9 MH: But I also think, Moe, it’s really interesting, but the organisation that you work in is a fast-growth startup unicorn, very unique. And I listened to the two-year remark you made, and I was like, “Two years?” But actually, even in enterprise organisations, turnover happens that often, and turnover is one of the biggest drivers of redoing everything. Because the next person you bring in is like, “Well, I’ve always used Google, so we’re gonna go on to Google Analytics now; even though you spent the last five years setting up an entire ecosystem with Adobe, we’re going full Google now, ’cause that’s what I know how to use.” Shifting people is just as likely to change everything up as shifting strategies.
0:34:03.0 TW: But I think that even architecting a, we don’t know what we’re gonna need to do in two years, but the way we’re gonna build this is going to allow us to potentially rip out pieces and parts. Versus building a monolithic system, we’ve built this all on our AS/400 and two years from now, we’re gonna basically have to start from scratch and do a massive migration, so it’s another sort of thing. I think there are designs that are like, “No, no, no, we do feel like there’s a very good chance that this is our core underlying data lake, okay, let’s build that. Now how are we gonna build hooks into that, so that we could swap out the data lake or we could swap out the hooks?”
0:34:45.4 MK: Yes.
0:34:47.8 MH: Yeah. No, it’s a well-made point, for sure.
0:34:48.8 TW: How could we… Which is I guess it’s like having a data layer versus the… Not that you see a lot of people saying, “I’m swapping out my digital analytics tool and using the same data layer,” but at least there was the promise of that.
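(Editor’s aside: the “hooks” idea Tim describes can be sketched in code. This is a minimal, hypothetical Python sketch, not any vendor’s API: pipeline code depends on a small interface, so the backing store can be swapped later without rewriting callers. All class and function names are invented for illustration.)

```python
from typing import Protocol

class DataStore(Protocol):
    """The 'hook': the only surface the pipeline is allowed to depend on."""
    def write(self, table: str, rows: list[dict]) -> None: ...
    def read(self, table: str) -> list[dict]: ...

class InMemoryStore:
    # Stand-in backend; a warehouse- or lake-backed class would satisfy
    # the same Protocol, and the calling code would not change.
    def __init__(self):
        self._tables: dict[str, list[dict]] = {}

    def write(self, table, rows):
        self._tables.setdefault(table, []).extend(rows)

    def read(self, table):
        return list(self._tables.get(table, []))

def load_events(store: DataStore, events: list[dict]) -> None:
    # Pipeline code depends only on the interface, not the backend.
    store.write("events", events)

store = InMemoryStore()
load_events(store, [{"id": 1}, {"id": 2}])
print(len(store.read("events")))  # 2
```

Swapping out the data lake then means writing one new class against `DataStore`, rather than chasing storage calls through every pipeline.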
0:35:00.5 MK: How do we go from organisational structures to a whole bunch of tooling and implementation stuff? I feel like this is a true lobby bar conversation…
0:35:08.9 TW: Yeah.
0:35:09.0 MH: It is a true lobby bar conversation and those things impact the org structures.
0:35:16.2 TW: Well, but who owns that? That is the sort of thing that it’s like, yeah, it should be kind of an enterprise function, but it’s gotta be connected and tapped into those different groups, and yet organisationally… Again, I’m thinking specific examples, that centralised team of which there were… By the way, there are four centralised teams that are all with an enterprise mandate and all working with data, so that’s a little fucked up, but the one that says, “We don’t have the capacity to go around to all these hundreds of different constituencies and explain to them what we’re doing, so that they can provide us their needs in a good form,” so I don’t know, maybe I’m just… This is why big ass enterprises just have so… There are things that are… They are inefficient, but they’re not… They’re not inefficient due to incompetence of the org design, it’s just you’re trying to get a shitton of people to coordinate.
0:36:19.3 MH: Yeah, it’s not easy.
0:36:19.5 TW: I mean, I managed the budget once on a project where I asked our technology team to stop hiring people to support my… And they were like, “Oh yeah, we added three more people to your project,” and I’m like, please stop. They just generate more documentation…
0:36:31.6 MH: You’re making it worse.
0:36:32.8 TW: I was like, I would like us to cut this in half, get the whole team in a room and say, “Can we run in that direction?” And that wasn’t on a… There was data involved, but that was more of a transactional system. So yeah, there’s just some degree of how many people can actually collaborate efficiently before you start saying, “You’re just gonna keep re-orging, trying to figure out your… ” It’s a whack-a-mole.
0:37:00.5 MH: Alright. It’s the time in the show when we do the Conductrics quiz, the quizzical query of wonder that is the thing that we ponder. Tim and Moe, going toe-to-toe on behalf of our listeners…
0:37:12.4 TW: Oh, geez! [laughter]
0:37:14.6 MH: Sponsored by Conductrics. Hey, did you know that your AB testing vendor often promises a silver bullet to make experimentation easy, but instead leaves you with a lead slug? I assume that’s sort of like the thing that crawls across the road and leaves that slimy trail. Anyways, running an effective experimentation programme is really hard work; that’s why partnering with technology vendors that pretend otherwise just leads to failure. You need a technology partner that is not only innovative, but forthright about the challenges of building and maintaining a successful experimentation programme, and that’s Conductrics. They’ve written one of the first APIs for AB testing, contextual bandits and predictive targeting, and what really sets them apart is that they go above and beyond to exceed expectations and provide honest feedback to help their clients achieve their goals.
0:38:03.6 MH: Alright, let’s do a quiz. Moe, do you wanna know who you’re representing?
0:38:10.0 MK: Sure.
0:38:13.5 TW: It is none other than Fred Pike.
0:38:14.6 MK: Oh, I love Fred. I’m so sorry, Fred. I don’t wanna let Fred down.
0:38:20.2 TW: Now the pressure’s on.
0:38:20.5 MH: No, no, no. This one actually, Moe, I feel like maybe more your direction than… We’ll see, we’ll see. And then Tim, you are representing another listener, who is Marissa Goldsmith. So thank you, Fred and Marissa, and I’m excited to see which of you win the exciting prizes. Okay, so let’s get into the question. Are you ready?
0:38:45.9 TW: Nope.
0:38:50.1 MH: We’ve been talking about collecting and storing and retrieving data as core activities for analytics, that’s why it’s a bit surprising, the name Edgar Codd isn’t inscribed on every analyst’s heart. I don’t write these. You know that, right? Anyways, in 1970, Codd ushered into the world, the modern relational database. Associated with his relational database model was development of a query language.
0:39:19.5 MH: Codd based his proposed query language on a mathematical formalism called the Relational Calculus, which roughly has the same expressive power as first-order logic. You may be thinking, this query language was SQL, S-Q-L, but that wouldn’t be quite correct. You see, Codd had written another paper that showed that relational calculus, which is a declarative language, has the same expressive power as relational algebra, which is a procedural language. This relationship is known as Codd’s theorem. Unlike Codd’s proposed implementation, SQL is not based on relational calculus but on relational algebra. As a haddock guy myself, I don’t often eat cod… Anyways, okay. So this is the question. What was the name of Codd’s original proposed query language? Was it, A: Alpha, B: CalcQL, Calc-Q-L, C: OWL, O-W-L, D: SparQL, Spar-Q-L, or E: QUEL, Q-U-E-L? Okay, I think I’ve got a guess, but it’s…
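(Editor’s aside: for listeners curious about the declarative-versus-procedural distinction in the question, here is a toy sketch of relational-algebra operators in Python. It is not Codd’s Alpha and not SQL; the employee relation and operator implementations are invented purely for illustration.)

```python
# Relations as lists of dicts; selection (sigma) and projection (pi)
# as composable, explicitly ordered -- i.e. procedural -- operators.
employees = [
    {"name": "Codd", "dept": "Research"},
    {"name": "Boyce", "dept": "Research"},
    {"name": "Gray", "dept": "Systems"},
]

def select(relation, predicate):
    # sigma: keep only the tuples satisfying the predicate
    return [row for row in relation if predicate(row)]

def project(relation, attrs):
    # pi: keep only the named attributes, deduplicating rows
    seen, out = set(), []
    for row in relation:
        key = tuple((a, row[a]) for a in attrs)
        if key not in seen:
            seen.add(key)
            out.append(dict(key))
    return out

# pi_name(sigma_{dept='Research'}(employees)) -- the algebraic form of the
# declarative query: SELECT name FROM employees WHERE dept = 'Research'
print(project(select(employees, lambda r: r["dept"] == "Research"), ["name"]))
# [{'name': 'Codd'}, {'name': 'Boyce'}]
```

The declarative version states *what* rows are wanted; the algebraic version spells out the *sequence* of operators that produces them, which is what query planners actually execute.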
0:40:34.1 MK: I’ve got a guess, but I think it’s bad.
0:40:37.8 TW: You go first.
0:40:39.3 MH: All right, Moe, what is your guess?
0:40:41.2 MK: I think it’s E. But only ’cause it sounds vaguely familiar, but…
0:40:48.7 MH: E.
0:40:48.7 TW: And? Is she a winner?
0:40:50.2 MH: Well, let’s hear what your guess is.
0:40:51.5 MK: Well, you’ve gotta guess first, Tim.
0:40:52.3 TW: Dammit.
0:40:52.5 MH: Because when we’re both wrong, then we know what’s been knocked out.
0:40:55.8 TW: Okay, well, I’m gonna go with OWL.
0:40:57.8 MH: OWL. Okay, so we’ve got one vote for Q-U-E-L and one vote for OWL. Let’s consult our judges’ panel.
0:41:05.2 TW: I believe we’re both wrong.
0:41:10.0 MH: Well, QUEL technically should be given half points, because that is a real database query language, like you proposed, but actually you’re both technically wrong. But Moe, you’re the rightest. [laughter] So Fred Pike, you’re a winner. We’re gonna go with that one.
0:41:27.3 TW: I’m sorry, Marissa.
0:41:27.4 MH: But the answer is A, Alpha, apparently is the correct answer.
0:41:30.8 MK: What? Oh wow.
0:41:31.2 MH: Yeah, well, who knew? But obviously, we all have a lot to learn about Edgar Codd, and if you’re listening and you wanna understand… I don’t mean to say diabolical, but that’s really the only thing that comes to mind when I talk about the brain of Matt Gershoff. That’s a brain you want in your corner, he and his team at Conductrics guiding your AB testing. So Edgar Codd, Alpha…
0:41:54.5 TW: He is definitely the best guy to hang out with in a lobby bar.
0:41:57.1 MK: That’s true.
0:41:58.8 TW: Yeah, ’cause he’s writing these when he’s sober as far as we know. Get a couple of drinks in him…
0:42:03.6 MH: And Tim it’s unsurprising that you picked OWL because that is the Web Ontology Language, which I imagine you probably just…
0:42:09.8 MK: That’s… Okay.
0:42:11.0 MH: Had those two just flipped. Yeah, so anyway, that’s the Conductrics quiz. Thank you both for participating. Congratulations, Fred, you’re a winner, and we out. Let’s get back to the show.
0:42:23.0 MK: I have to confess, I have to confess, and this stays in the sanctity of the podcast, of course…
0:42:32.4 TW: That’s right. So everybody listening…
0:42:34.5 MH: Okay. Mum, cover your ears.
0:42:35.1 TW: Okay, now who are the other listeners we need to worry about?
0:42:38.3 MH: The circle of trust.
0:42:39.5 MK: The joy of working from home is that you also get to see a bit of your partner’s work, and my husband often carries around like a little printout of the team structure, and they recently went through a reorganisation. He works at a very large bank, and the notes on the thing are like, “Do not call it a restructure, this is not a restructure, it’s a reorganisation” and I looked at these and just lost it laughing. It was like, are you kidding me? Like what’s the difference between a restructure and a reorganisation?
0:43:18.0 MH: Yeah. Different words mean different things to people. I found that out one time, really early on in running the practice at Search Discovery, when I let everybody pick whatever title they wanted and was like, “We don’t care about titles here.” But some people felt like the title they picked actually gave them certain privileges. [laughter]
0:43:42.0 MH: And so then all of a sudden, I started getting all this feedback like, “So-and-so’s telling me what to do about this, this and this.” And I was like, “Well, that’s weird, they’re not your boss.” And everybody was like, “Why are you doing that?” “Well, my title is this, so I get to do all the… ” [laughter] And I was like, “No, that’s not how we’re organised.” The thing is, I don’t think like that, but they obviously do, and so… Yeah. Little words mean a lot to people sometimes, it’s really funny. The other thing that’s surprised me over the years: I thought pretty much everyone was familiar with RACI matrices or diagrams, and I’m constantly surprised by people’s lack of knowledge about that. It’s a good way to just sort of try to hone in on who’s responsible for what and how we’re gonna get to… And it’s useful for product ownership, table ownership, or maintenance or data feeds, whatever the case may be. It’s just sitting down and knocking out a few of those across the team, so that at least internally, you have something to work with as a starting point sometimes.
0:44:45.0 TW: Well, the exercise of saying to do a RACI, you have to figure out what goes in the rows which means you actually have to figure out what needs to be done and you have to agree on the language for it. I feel like more often than not, when we’ve gotten to the point where we need to do a RACI, just for anybody… And you guys know, Responsible, Accountable, Consulted and Informed.
0:45:05.3 MH: That’s right.
0:45:05.7 TW: And there’s an S sometimes thrown in as Supporting; it still has the same phonetic ring.
0:45:09.4 MH: I see DACI too which I forgot what the D stands for, but yeah.
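(Editor’s aside: since a RACI matrix comes up here as a practical tool for table and data-feed ownership, below is a minimal sketch of keeping one as plain data so it can live in a repo next to the assets it covers. The deliverables, team names, and the one-accountable-owner check are invented for illustration.)

```python
# RACI codes: R = Responsible, A = Accountable, C = Consulted, I = Informed
RACI = {
    # deliverable: {role: code}
    "tracking implementation": {"analytics": "R", "engineering": "A", "marketing": "I"},
    "data warehouse tables":   {"data eng": "A", "analytics": "C", "engineering": "R"},
    "weekly KPI dashboard":    {"analytics": "A", "marketing": "C", "leadership": "I"},
}

def accountable(deliverable: str) -> str:
    # A common RACI rule of thumb: every row has exactly one accountable owner.
    owners = [role for role, code in RACI[deliverable].items() if code == "A"]
    assert len(owners) == 1, f"{deliverable} needs exactly one accountable owner"
    return owners[0]

print(accountable("weekly KPI dashboard"))  # analytics
```

Even this trivial structure forces the conversation Tim mentions next: you cannot fill in the rows without first agreeing on what the deliverables actually are.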
0:45:14.5 TW: I’ve had cases where we have gone through that for several groups and it’s taken multiple meetings, and it’s some difficult discussions. And then we get it figured out, and then nobody ever needs to look at it again. The process of going through it gets some clarity around what’s done. When reorgs or restructures happen, I think this is another disaster waiting to happen. Oh, you reshuffle it, give general terms as to what these groups are gonna do and then you put it on the teams to go and define the way that they’re going to operate. And that doesn’t work because you have some people who are off trying to make a land grab, and then you’ve got other people who are just avoiding the stuff that they don’t wanna do. And you’re like, “Well, who’s actually making sure that this actually makes sense?” And then you definitely wind up with personalities who just don’t work well together.
0:46:11.4 MK: Yeah, how much do you think the org structure gets impacted by personalities? Like…
0:46:19.9 TW: I think a decent amount.
0:46:23.8 MH: A lot. I think leaders matter a lot to an org. I use or used my experience at Lands’ End, we swapped out CMOs right during my tenure there, the org changed quite a bit. Our roles changed quite a bit from one to the other. So I think it matters a lot who your leaders are, speaking of that, who should we be really reporting to?
0:46:44.6 MK: I definitely think…
0:46:45.3 MH: There’s a lot of…
0:46:47.0 MK: The CTO, unless there’s a CDO.
0:46:51.7 MH: Really?
0:46:52.4 MK: Yeah.
0:46:52.9 MH: See, I don’t know that the CTO always is the person… Like an analyst should be reporting into a CTO…
0:46:57.9 MK: Oh, no…
0:47:00.7 MH: Does the CTO care about…
0:47:01.2 MK: Like the Data Lead should.
0:47:02.2 MH: So like data engineering and all the technical sides of it. What about data science? What about people doing modelling and statistics and things like that, CTO?
0:47:14.8 MK: I think the thing that I have found really nice about being part of the tech team is that, number one, you need to work really closely with engineers. If you’re doing experimentation, you need to work with the engineer to make sure that the tracking is in place, that they’re measuring the experiment properly, etcetera, that they don’t faff it up, like multiple product launches have been faffed up. We’ve had an engineer… there’s literally a thing in their code that says “analytics tracking on,” and someone switched it off, and you’re like, “Why would you do that in your codebase? But sure.”
0:47:50.0 MK: And I think you get a lot of respect from engineers when you’re a part of their group, but I also think you build relationships with them, and I think sometimes, that relationship is one of the most important… Yes, I think the stakeholder relationships obviously also really matter, but if engineers don’t understand the value of data or making sure that they track things properly, the whole thing fucks up. I don’t know, but I feel like Tim’s gonna massively disagree with me.
0:48:22.9 TW: I feel like you’re… No, I think it winds up making the case for the value of a good Chief Data Officer, who can do the reading… I think where we’re winding up is, it really is dependent on the organisation, what data there is, how heavy the governance seems to be, where the organisation is on its kind of analytics maturity, to use that overused word, because it needs somebody who’s got a bird’s eye view of, what are the key components? Do I need to have analysts who are really holding the hands of the business stakeholders, or do I have analysts who can support the business stakeholders? And so where do they fit?
0:49:07.4 TW: I need to figure out whether my analysts should be… They may report up to me, but they’re embedded and sit with the groups they support or nope, they need to be supporting those groups, so they need to sit with the people who are building the data lake, but even figuring out what those roles are. We have so many roles out there, and I don’t think any organisation needs all of them, so you’ve gotta sort of pick your suite of, is this an organisation where we need to have one-fifth of the staff for analytics engineers, or do we need zero analytics engineers? And that’s kind of somebody who understands all of the parts from the data collection, to the moving the data around and storing it and governing it and stewardship, all the way through working with the data, through Data Science, through ML, through self-service dashboards.
0:50:04.3 TW: I feel like that’s kinda like, what do we need based on the nature of our business? And I guess I’m gonna land where I started, I think, which is, that winds up having a huge dependency on the way the entire business is structured and organised. Okay, how does the data work fit within… Because you’re probably not gonna be influencing how the marketing team’s organised and the product team’s organised. So maybe that’s a cop-out, I just sound like a consultant: “It depends.”
0:50:35.3 MH: Yeah. The “It depends” is strong. [laughter]
0:50:36.7 MK: I think the one thing that you most need to consider as you come to this question of the perfect dream-team structure or whatever, is how analysts share knowledge, because that’s often kind of an afterthought; you don’t realise until months have gone by in a particular structure, and then you’re like, hang on a minute, analysts are not collaborating as much as we want, or they’re not sharing information. And whatever structure you have, you need to answer that question of, how are we gonna make sure… Because for example, if you have an embedded or a cross-functional structure, you might have a very junior analyst in one team working with one set of stakeholders, and you might have a really senior analyst in another. How are you gonna get that junior analyst to eventually be a senior analyst if you don’t have a really good way of sharing knowledge? I think it’s something that’s really just left to the very last minute, but the learning and how you’re gonna coach your staff is really important.
0:51:38.4 MH: Totally agree. The one thing I always insist on with all of my staff is that we end meetings on time, and that includes podcasts, so we’ve gotta head to wrap up. [laughter] No, I’m just kidding. I’m the worst. When I pay attention to it, I’m actually pretty good at it. But we do have to wrap up. I actually think we knocked a couple of things off of my list on this conversation, so Moe and Tim, thank you. This was actually an effort I feel proud of. I don’t know if we feel like we’ve got the title we want. No, I’m just kidding. [laughter] All right, so…
0:52:13.9 TW: Is the chief moderating officer going to…?
0:52:15.8 MH: Chief, oh… Yeah, that role has no power at all. I don’t want that title. [laughter] Let’s head into doing some last calls, and we like to go around the horn, share a little bit about what we’ve got going on. Moe, what’s the last call you’d like to share?
0:52:33.6 MK: Okay, I’ve got one intense one and one fun one. So, the intense one is actually a topic that I’m really passionate about, which is organ donation, I’m gonna totally butcher this poor guy’s name. [laughter] But Ronghuo Zheng. He’s a professor at a university in Texas, he’s been doing some really interesting…
0:52:54.7 TW: He’s at the University of Texas.
0:52:56.0 MK: Oh the University of Texas. Perfect.
0:52:58.0 TW: Okay. I just was an asshole for people who…
0:53:00.3 MK: No worry, that’s fine.
0:53:01.1 TW: Now there’s a whole thing…
0:53:03.3 MK: Oh, okay.
0:53:04.4 TW: It’s as bad as the Ohio State University. But…
0:53:06.9 MK: Oh okay… Well anyway.
0:53:08.9 TW: Okay, carry on.
0:53:08.9 MK: He has done some analysis and there’s a few articles floating around about how direct flights can influence organ donation levels and… Yeah, it’s pretty fascinating. So I will show the link if you wanna have a read at that.
0:53:24.7 TW: Can I throw something in on top of that?
0:53:25.6 MK: Sure.
0:53:26.8 TW: Because I scanned it as well.
0:53:27.4 MK: Sure.
0:53:27.4 TW: Moe and I both saw the paper, but it was… What was kind of interesting is you’re like, okay, where’s the analytics angle? And they basically try to figure out, like some challenging questions on, how do you actually predict how many more organ transplants would happen if there’s a direct flight? So like trying to figure out what their unit of analysis is, they actually use like a diff-in-diff approach to predict it, so it’s like the first part of the paper goes through, these are the challenges of even trying to predict this, ’cause you can’t do it in like an experimental design, so I will plus one that as well.
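(Editor’s aside: the difference-in-differences approach Tim mentions can be shown with a few lines of arithmetic. The numbers below are entirely made up for illustration; they are not from the paper.)

```python
# Hypothetical mean transplant counts: regions that gained a direct flight
# ("treated") vs. regions that did not ("control"), before and after the
# new route opened.
treated_before, treated_after = 40.0, 52.0
control_before, control_after = 38.0, 42.0

# Difference-in-differences: the treated group's change minus the control
# group's change, which nets out any time trend shared by both groups.
did_estimate = (treated_after - treated_before) - (control_after - control_before)
print(did_estimate)  # 8.0
```

The point of the design is exactly what Tim describes: since you can’t randomise which cities get direct flights, you lean on the control group’s trend to stand in for what the treated group would have done anyway.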
0:54:09.9 MK: I like the Tim explanation better.
0:54:14.5 MH: Nice.
0:54:15.5 MK: Okay, and then the fun one, it’s a podcast I’ve been listening to called Algorithm, which is actually about a data journalist who… It’s really fascinating, but he basically does analysis, and his analysis then becomes a story versus like data journalism where it’s kind of sometimes supporting, but it’s basically about some work he did to find whether you could use different analytical techniques to predict if there is a serial killer in the area. And in one particular case, they found that there was a serial killer and he notified the local police and then it was only like five years later that he was caught, so yeah, that’s an interesting one. Algorithm.
0:54:57.6 MH: Is it specifically around serial killers?
0:55:00.1 MK: Yes, it is around serial killers.
0:55:02.1 MH: So… Okay.
0:55:02.9 MK: I probably shouldn’t refer to that as my fun last call, though.
0:55:05.8 MH: That was your fun… [laughter]
0:55:06.3 MK: It’s kind of fun…
0:55:08.0 MH: You know what’s fun?
0:55:09.1 MK: But you know what it… Like it’s a podcast, it’s like interesting.
0:55:13.3 MH: Yeah.
0:55:13.3 MK: It’s not super work-y, like the algorithm side is very lied on, it’s much more about serial killers.
0:55:20.4 TW: Awesome, true crime.
0:55:21.7 MH: Alright, what about you, Tim?
0:55:25.3 TW: Well, I will go with a book that I am reading, and I will say right now, I am not going to ever finish reading it, but it’s called Field Experiments by Gerber and Green, so it is a deep… Yes.
0:55:42.5 MH: That’s on my wish list. Yeah. I don’t think I’ll ever read it, but I want that book. [laughter] Yeah, sorry keep going.
0:55:49.9 TW: I was counselled to read the first three chapters to get a deeper understanding of randomised controlled trials, and now I can just spout out “unobserved heterogeneity,” it rolls right off the tip of my tongue. I’m actually into the fourth chapter, and I think I’m gonna finish the fourth chapter, and I may decide to go back and re-read the first three chapters, or I will continue. But it’s interesting and it’s… Well, it’s a slog; I’m there with a highlighter, re-reading and trying to understand equations and understand the notation, but it’s actually… There are little things, like the sharp null hypothesis, where all of a sudden I’m like, that’s so clever. So it’s a…
0:56:31.6 MK: I feel like someone recommended this book to me as well.
0:56:33.9 MH: I wanna say did Joe Sutherland bring this up on… When he was on the show?
0:56:38.3 TW: There’s a very good chance that he did, ’cause he definitely is the one who…
0:56:41.9 MH: I feel like that’s the only way I would have known about it and added it to my Amazon wish list.
0:56:46.7 TW: He might have.
0:56:47.5 MH: I’m not sure it’s the exact same book, but he did say.
0:56:49.4 TW: No.
0:56:49.6 MH: There is a book that he did bring up…
0:56:51.5 TW: It probably is ’cause he knows the authors or at least one of the authors and still prefers…
0:56:58.7 MH: Here’s what you can do, Tim. You’re doing great. You’re doing something really great. The next major step would be to write sort of the “don’t make me think” version of Field Experiments for the rest of us. If you can just take that on after you get through make the dumb Michael version, then that would be delightful.
0:57:18.2 TW: They’re a little bits and pieces where I’ve already done that, and there are definitely some other ones that I think you have to really know it to actually be equipped to like really design and run a randomised controlled trial.
0:57:30.6 MH: I sense a potentially amazing epic super week talk on this maybe in the future.
0:57:38.4 TW: Yeah, I think there are pieces of it that it’s like, if you just understand this cool little thing, all of a sudden, it makes sense why doing blocking in your design will reduce the variance, which makes it more likely to detect an effect and you’ll be like, “Well, that’s really clever.” So some of it’s just like Algebra like, what if we substitute this equation in? Then look, it doesn’t matter what the whatever, whatever is, so.
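(Editor’s aside: the blocking point Tim makes can be demonstrated with a small simulation. This is a hypothetical sketch, not an example from the book: outcomes vary a lot between blocks, so pairing a treated and a control unit within each block and differencing removes that between-block variation from the effect estimate.)

```python
import random
import statistics

random.seed(0)

n_blocks = 200
tau = 2.0  # true treatment effect

diffs, treat, ctrl = [], [], []
for _ in range(n_blocks):
    block = random.gauss(0, 5)          # large differences between blocks
    y_t = block + tau + random.gauss(0, 1)  # treated unit in the block
    y_c = block + random.gauss(0, 1)        # control unit in the same block
    treat.append(y_t)
    ctrl.append(y_c)
    diffs.append(y_t - y_c)             # within-block (paired) difference

# Variance of the estimated effect: the unpaired analysis carries the block
# variance along; the paired (blocked) analysis differences it out.
unblocked_var = statistics.variance(treat) / n_blocks + statistics.variance(ctrl) / n_blocks
blocked_var = statistics.variance(diffs) / n_blocks
print(blocked_var < unblocked_var)  # blocking yields the tighter estimator
```

Lower variance on the estimator is precisely what makes a real effect easier to detect at a given sample size, which is the “well, that’s really clever” moment Tim is describing.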
0:58:00.1 MH: Nice.
0:58:00.8 TW: But it’s like a $50 book, so… It’s like textbook pricing.
0:58:07.1 MH: Yeah. It’s definitely a textbook.
0:58:08.3 TW: So what do you have? Do you have anything light and fun?
0:58:11.9 MH: I’ve got two things, it’s a twofer and… Yeah, maybe you…
0:58:16.1 TW: Oh crap… I forgot I had a second one.
0:58:18.4 MH: Oh, would you wanna share?
0:58:19.7 TW: It was really just gonna be a plug ’cause we’ve only done it on Twitter, but we do, with the new look, we do actually have stickers, so if you go…
0:58:26.6 MH: Oh yeah.
0:58:27.8 TW: Go to bit.ly/aph-stickers, or if you go to the FAQ on our website, there’s a link to it, but you can get podcast stickers. We also have a store up for overpriced merch, but it’s overpriced purely because of the store we’re using, not because we’re making bank on it.
0:58:47.5 MH: But the mug that you… I may need to order one. Yeah they’re… That’s really nice. The new mugs look very nice.
0:58:55.2 TW: Yeah.
0:58:55.7 MH: Okay, I do have a twofer. So I recently came across the Institute for the Quantitative Study of Inclusion, Diversity, and Equity. So that’s an institute that exists, and I was not aware of it before, but I became aware of it because they recently published a paper about the most discriminatory federal judges and the sentencing lengths they give to black and Hispanic defendants, and they specifically analysed all of the sentencing structures across all of these federal judges, and were able to basically highlight, “Here are the judges that basically give massively different sentences” and so this is an amazing piece of initial scholarship and analysis, really interesting. I’m still kind of reading through. It just came out like, literally, I found it today so I haven’t read the whole thing yet. But it’s definitely worth checking…
0:59:45.5 TW: Who was it who did it?
0:59:47.7 MH: I found it on Twitter, it was a retweet from Samsing Yangwei, which I always feel like I mispronounce his name, but I know I’ve used him on the last call before, and I wanna say the guy that he was tweeting was a guy by the name of Chad Topaz, who’s a co-founder of that institute. So that’s the guy on Twitter I found, but anyway, this paper really interesting read, and it looks like the Institute for the Quantitative Study of Inclusion, Diversity, and Equity, kind of fascinating as well, so I think there might be some really cool stuff. This is right up my alley, I like this a lot. And then my other one is from my good friend and Tim’s co-worker, Jim Gordon so…
1:00:32.4 TW: Oh, he’s a good one. Damn it, I’m gonna cross this off my list.
1:00:33.2 MH: Adobe Experience Platform is a thing that Adobe, of course, has been marketing very heavily and telling everybody to talk about and use and buy and whatever, and unless you’re really in the weeds with Adobe, it is really hard to get your arms around what exactly it is that they’re talking about, like, “What does it do?” And Jim wrote this blog post, and honestly, I think this may be the best he’s ever done.
1:01:00.7 TW: It’s got cartoons in it.
1:01:01.4 MH: Well, not only that, but it’s super useful, and he lays out exactly what’s in there, what the products are, what it means, in very friendly language. Honestly, it’s a really great job. Jim, if you’re listening, top-notch, great work. It was a pleasure to read this, and I feel like I increased my understanding. I kinda knew some of this stuff, but I now feel like I really know it. So that’s a shoutout to Jim. So yeah, you can check out his blog post if you wanna get sort of the real English-language version of what AEP is. There are also really amazing people at Adobe who explain this stuff really well too, I don’t mean to… I’m not saying they don’t, I just…
1:01:43.4 TW: Well, he credited Eric Matisoff for helping him out with the content, so.
1:01:46.2 MH: Yeah, Eric and Jen Lasser, Ben Gaines… People we love on this podcast, but I just don’t go to their webinars, I don’t have time usually, so… [laughter] Asynchronous content delivery, it’s my thing. Okay, that’s enough of last calls. Moe and Tim, really good conversation. I hope, as all of you out there are thinking about your org and how to make it better, that some of this helped and maybe gave you some food for thought. If it did, or you’ve got thoughts and comments, or you’re like, “Hey, you didn’t address any of this,” we’d love to hear from you. Reach out on the Measure Slack or on Twitter or in our LinkedIn group, we’d be happy to hear from you. And throw us a rating on whatever platform you listen to us on; we love to find out that people are listening, and we love to hear that you’re enjoying the show, if you are. Oh, if you hate the show too, you could probably say that as well, there’s a…
1:02:41.4 TW: But you don’t have to…
1:02:41.5 MH: We’re pretty sensitive about it [laughter] but we would take that commentary. Anyway, obviously, the show and all that goes on with it, never be possible without our amazing producer, Josh Crowhurst. Josh, hats off to you. And as with all things org structure, and whether you’ve got the star-crossed lovers or the data centralised observable ports, or the whatever structure you’re working with, I know I speak for my two co-hosts, Moe and Tim when I say keep analysing.
1:03:17.1 Announcer: Thanks for listening. Let’s keep the conversation going with your suggestions and questions on Twitter at @AnalyticsHour on the web, at analyticshour.io, our LinkedIn group, and the Measure chat Slack group. Music for the podcast by Josh Crowhurst.
1:03:35.0 Charles Barkley: So smart guys want to fit in, so they made up a term called analytics, analytics don’t work.
1:03:41.6 Thom Hammerschmidt: Analytics. Oh my God, what the fuck does that even mean?
1:03:50.4 TW: I think, more importantly, did your Prime Minister shit his pants or not?
1:03:55.5 MK: What? Oh, totally. Pfizer reached out last year and were like, “How many Pfizer doses do you want?” He’s like, “No, we’re good.”
1:04:06.5 MH: Oh boy.
1:04:07.1 TW: I meant more the McDonald’s thing from, like, 1987.
1:04:09.7 MK: Oh, I don’t know what you’re talking about.
1:04:12.0 TW: What? Have you not followed any of this?
1:04:14.3 MH: I don’t even know what’s going on.
1:04:15.9 MK: I was three, I was three then.
1:04:17.1 TW: No, it’s recently bubbled up that he had to actually make a statement that he did not shit his pants at McDonald’s.
1:04:28.2 MK: Wait, our current Prime Minister?
1:04:31.6 TW: Yes.
1:04:31.7 MH: What?
1:04:32.1 MK: He did shit his pants at McDonald’s in 1987 and you’re worried I’m not keeping up with the news?
1:04:39.0 MH: Okay. Let’s take that energy and put it into a podcast episode.
1:04:53.1 TW: Rock flag and data strategy is not pornography.
Will disagree with Moe on having all “technical” people reporting up to the CTO. Some should, but I’ve seen so many cases where some techies are better positioned under other verticals.
As a mostly techie person, my main concern with me and my team of developers sitting within IT is the mindset of most tech leaders. My team are the ones who do most of the digital data collection. We work fast and flexibly, trying our best to quickly react to the ever-changing needs of the business, constantly changing course and adjusting as those needs change yearly or even monthly.
Most IT management is still stuck in a very task-oriented mindset with long-range fixed structures. Flexibility and speed are not something they seriously give a lot of thought or effort to. They think stability with an extreme level of accuracy, which is fine for some aspects, such as building a complex data lake or other such structure. But so much of today’s data world is not meant to last for years and years. Change in data is a constant, and a rather quick constant. Quality digital data is good; perfect data is not worth the extra effort required. So most of what my team does is about balancing stability and scalability, but also how quickly we can get things into production and then pivot quickly. Under most IT management that would be highly frowned upon and resisted. I can’t count the number of times I’ve been sharing digital data with some IT managers and they question some small discrepancy. When I explain that data is naturally messy, they don’t like that response and say if it is not fully accurate then it can’t be trusted or used.
My intent is not to disparage IT leadership. They do some things very well and have very valid reasons for much of how they operate. But there are also times when having techies operate within that model is counterproductive for the business, so they are better placed under different types of leadership.